In this case the UI is scaled up by an integer factor on all screens, chosen so it looks right on the highest-DPI screen. On the lower-DPI outputs it is then scaled back down from that higher resolution by a factor that can be, but needn't be, an integer. If each output's factor is proportional to its DPI, UI elements end up physically the same size across monitors of different sizes and DPIs.
From the app's point of view, all monitors share a single scaling factor and DPI. Apps thus need to support high DPI, but they needn't do anything smart to support scaling, because the downscaling happens outside the app's remit.
This can be achieved, again, with xrandr --scale, or in the nvidia-settings GUI by setting ViewPortIn to a higher resolution than ViewPortOut. No, the result isn't blurry.
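A minimal sketch of the xrandr variant (the output names and resolutions are hypothetical; pick the factor from your monitors' actual DPI ratio):

    # Desktop rendered at 2x for a 4K laptop panel (eDP-1). The 1080p
    # external monitor (HDMI-1) shows a 2880x1620 region of the framebuffer
    # downscaled to its 1920x1080 panel, i.e. an effective ~1.33x scale there.
    # (You may also need an explicit --fb if xrandr doesn't grow the
    # framebuffer on its own.)
    xrandr --output eDP-1  --mode 3840x2160 --pos 0x0 \
           --output HDMI-1 --mode 1920x1080 --scale 1.5x1.5 --pos 3840x0

The nvidia-settings route is the same idea expressed as a MetaMode, roughly ViewPortIn=2880x1620, ViewPortOut=1920x1080 for that output.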
Out-of-band solutions are effectively band-aids, and they unnecessarily increase the development difficulty of GUI programs on Linux, since developers now have to be aware of the various side channels (fragmentation!) that communicate the DPI scale.
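To give a sense of that fragmentation, here are a few of the side channels an X11 app or toolkit may have to consult (the names are real; the values are just illustrative):

    xrdb -query | grep Xft.dpi                            # X resources, e.g. "Xft.dpi: 192"
    echo "$GDK_SCALE $GDK_DPI_SCALE"                      # GTK's environment overrides
    echo "$QT_SCALE_FACTOR $QT_AUTO_SCREEN_SCALE_FACTOR"  # Qt's equivalents
    dump_xsettings | grep -i scaling                      # XSETTINGS, e.g. Gdk/WindowScalingFactor (needs xsettingsd)

None of these is authoritative on its own, and different toolkits read different subsets of them.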
This is why SDL2 for the longest time did not support HiDPI on X11, while it does on Wayland, macOS, and Windows. A COSMIC dev recently complained about XSettings too! [0] You can't just ignore these problems; this is the "Linux is hard to develop for, blah blah fragmentation blah blah" complaint I'm sure you've heard.
Another thing: per-output HiDPI is fine when all your programs support high DPI, but it's unworkable if you want to mix LoDPI and HiDPI applications on a single screen. If a LoDPI application gives a better user experience upscaled and blurry (!) than rendered tiny, you are SOL unless you want to apply scaling to your entire desktop.
You also lose the opportunity to implement some neat features, like temporarily scaling up a window while it is being magnified or screenshotted. (The idea has been floating around in the KDE community.)
Finally, we can argue for days, but the HiDPI page on the Arch wiki already says a lot when a good 90% of the article is about achieving good DPI scaling on X11 [1]. Even the Wayland sections have an XWayland subsection in them...