Windows has historically been very good on low-DPI, but they also managed to be great on HiDPI.
Linux, well, it depends on so many things … you can achieve good results on both, but you'd better be OK with integer scaling and not have multiple displays with different DPIs.
Then Mojave switched to Metal and removed subpixel AA, and now it’s the worst.
Thread from when it happened: https://news.ycombinator.com/item?id=17476873
Luckily, Mac is only for work, and it is passable with 32" at 4k, but I can use Linux everywhere else for much nicer fonts. Unluckily, work is 8+h :)
Now that Apple sells their own (very decent) monitor at a somewhat more affordable price, it makes sense to use it as an external display, I agree.
No it hasn't.
Maybe to you "always" means "since 2014" but if so that means you are very young and you should not generalise from that.
I've been using Macs since 1988 and Mac OS X since 2001, and it used to be great on standard-DPI screens. I used to use Safari on Windows XP because its font rendering was so much better than the built-in Windows TrueType renderer.
This change is new and recent.
It is absolutely not "always".
In this case the UI is scaled up by an integer factor on all screens so that it looks right on the highest-DPI screen. It is then scaled back down from that higher resolution by a factor that needn't be an integer. If that factor is proportional to the difference in DPI, UI elements end up exactly the same physical size across monitors of different sizes and DPIs (roughly: render at 2x everywhere, show it 1:1 on a ~160 DPI panel, and downscale by ~1.7 on a ~92 DPI panel).
All monitors share a single scaling factor and DPI from the apps' point of view. Apps thus need to support high DPI, but they needn't do anything smart to support mixed scaling because the downscaling happens outside of the app's remit.
This can be achieved, again, with xrandr --scale OR in the nvidia-settings GUI by setting ViewPortIn to a higher resolution than ViewPortOut. No, the result isn't blurry.
In short: xrandr --scale, an xorg.conf entry, or the nvidia-settings GUI (and save the result to the xorg config).
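A minimal sketch of the xrandr route (the output name DP-1, the native 1920x1080 mode, and the 2x factor are assumptions; adjust them to your panel):

  # Assumptions: a 1920x1080 panel on output DP-1, desktop UI rendered at 2x.
  # Render this output's region of the desktop at 3840x2160 and let the GPU
  # downscale it to the panel's native 1920x1080:
  xrandr --output DP-1 --mode 1920x1080 --scale 2x2
  # (With several monitors you may also need --fb to size the framebuffer.)

  # Rough nvidia equivalent (xorg.conf Screen section, or set via nvidia-settings
  # and saved to the xorg config), with ViewPortIn larger than ViewPortOut:
  #   Option "metamodes" "1920x1080 +0+0 {ViewPortIn=3840x2160, ViewPortOut=1920x1080+0+0}"

The desktop still has to render its UI at the 2x factor (e.g. GDK_SCALE=2 for GTK apps) for this to match the mixed-DPI setup described above.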