But for the most part this shouldn't really matter much. A huge number of things these days are properly color managed, so as long as the thing that wrote the profile actually wrote what it intended, it'll display just fine regardless of how many different "sRGB" profiles there are floating around. We're largely past the days of just hoping that the image and the display happen to agree on roughly the same colors.
That would be calibration, and it's still necessary if you want color accuracy. That's about ensuring that what your monitor thinks it's displaying and what it's actually physically emitting are the same. The main thing that's changed here is that factory calibration has become a lot more common and is often more than good enough for anything short of serious professional work, even on things that aren't professional displays. Most flagship and even midrange smartphones are factory calibrated with dE values that would make reference monitors from 20 years ago blush - right up until the OEM intentionally shoves a shitty color curve on top to make it "pop" or look more "vibrant" (Samsung calls this "Vivid", Pixel calls it "Adaptive", etc. - but they at least usually have a "Natural" option that gets you back to the properly calibrated display).
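Aside for anyone who hasn't stared at a calibration report: "dE" is CIE delta-E, the distance between the color that was requested and the color the panel actually produced, measured in Lab space. Here's a minimal sketch of the original 1976 formula in Python - real reports usually quote the fancier dE2000, but the idea is the same, and the numbers below are made up purely for illustration:

    import math

    def delta_e_76(lab1, lab2):
        # CIE76 delta-E: plain Euclidean distance in L*a*b* space.
        # Rough rule of thumb: dE < 1 is basically invisible, dE < 3 is hard
        # to spot in normal content, and a good factory calibration report
        # brags about an average dE around 1.
        dL = lab1[0] - lab2[0]
        da = lab1[1] - lab2[1]
        db = lab1[2] - lab2[2]
        return math.sqrt(dL * dL + da * da + db * db)

    # target patch vs. what the colorimeter measured (made-up values)
    print(delta_e_76((50.0, 20.0, -10.0), (50.8, 21.1, -9.4)))  # ~1.5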
<Heath Ledger Joker>Ah haa ha ha haaaa!</Heath Ledger Joker>
We're nowhere near past that point, we haven't even begun to approach that point. That point is something I would like to reach before I die, but since that's maybe just a couple of decades away, it's not looking likely.
In general, Windows and Linux either don't color manage at all, or do it so badly that it's counter-productive.
Most sub-$500 monitors do not report their native gamut! By default, operating systems assume monitors are sRGB (they typically aren't) and send uncalibrated 8-bit RGB as-is.
On Windows and macOS, enabling HDR mode typically sets the correct gamut etc. and mostly makes things "just work", but that's at the OS level only.
Almost all applications either map wide-gamut images down to sRGB even on HDR monitors, or simply reinterpret the RGB values as if they're sRGB without even bothering to do a color space conversion.
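To make the "reinterpret vs. actually convert" distinction concrete, here's a rough numpy sketch assuming a Display P3 source and the usual published D65 matrices (Display P3 shares sRGB's transfer curve, which keeps it short). The lazy path an unmanaged app takes is to just keep the numbers; the correct path converts through XYZ and clips whatever falls outside sRGB:

    import numpy as np

    # Linear-light Display P3 -> XYZ (D65), and XYZ -> linear sRGB.
    # Standard published matrices.
    P3_TO_XYZ = np.array([[0.4865709, 0.2656677, 0.1982173],
                          [0.2289746, 0.6917385, 0.0792869],
                          [0.0000000, 0.0451134, 1.0439444]])
    XYZ_TO_SRGB = np.array([[ 3.2404542, -1.5371385, -0.4985314],
                            [-0.9692660,  1.8760108,  0.0415560],
                            [ 0.0556434, -0.2040259,  1.0572252]])

    def srgb_decode(c):   # encoded [0,1] -> linear light
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    def srgb_encode(c):   # linear light -> encoded [0,1]
        return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

    def p3_to_srgb(rgb):
        # The "doing it properly" path: decode, convert through XYZ, clip, re-encode.
        linear = srgb_decode(np.asarray(rgb, dtype=float))
        srgb_linear = XYZ_TO_SRGB @ (P3_TO_XYZ @ linear)
        return srgb_encode(np.clip(srgb_linear, 0.0, 1.0))

    print(p3_to_srgb([1.0, 0.0, 0.0]))  # P3 red is outside sRGB -> clips to plain sRGB red [1, 0, 0]
    print(p3_to_srgb([0.8, 0.4, 0.4]))  # in-gamut color -> roughly [0.86, 0.37, 0.39]
    # The "lazy" path is just srgb = p3: same numbers shoved at different primaries.

Reinterpreting P3 data as sRGB without that conversion makes everything come out duller than the author intended; the reverse mistake (sRGB data blasted at a wide-gamut panel with no conversion) is how you get the cartoonishly oversaturated look.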
Firefox ships with full color management off by default. Microsoft Edge defaults to "crush to sRGB". Apps with embedded web view controls are a "who knows?"
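If you want to poke at the Firefox side yourself, the relevant about:config prefs are (to the best of my recollection - defaults have moved around between releases):

    gfx.color_management.mode       -> 0 = off, 1 = manage everything, 2 = manage tagged images only
    gfx.color_management.enablev4   -> ICC v4 profile support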
In general, wide-gamut, 10 bits per channel, and HDR support are all a total shit show. I'm perpetually surprised if any of it works!
As a random example, my Nikon Z8 mirrorless camera can natively record 10-bit wide-gamut HDR HEIF files in-body. Windows can't display those at all. macOS and iPhones can... sometimes... but then the viewer apps will often "get confused" and the brightness will jump around randomly and non-deterministically as you switch between thumbnail and full screen views. You can't forward such an image to anyone via iMessage, they'll get gibberish on their end, and SMS/MMS is hopeless.
Meanwhile, YouTube HDR generally "just works" on most devices, so I've started sending people my still photography by converting it to an HDR 4K slideshow in DaVinci Resolve and giving them a YouTube link.
It's sad and pathetic that Meta set $80 billion on fire for the Metaverse and the rest of the industry found a decent chunk of a trillion dollars under the couch cushions to throw at AI slop, but nobody can "afford" to have one or two engineers fix their imaging pipeline.
Upload an HDR or wide-gamut image to Facebook successfully and then tell me it "just works".
Or send one in an email.
Or do anything with it other than view it on your own device.
The rest of your rant seems mostly anchored on the fact that most things still produce sRGB, which is true, but they still typically map inputs into sRGB appropriately (that is, out-of-gamut colors get clipped), which is still proper color management.