>A lot of things have changed in the last quarter-century – in 1997 NVIDIA had yet to even coin the term “GPU”
[1] https://www.anandtech.com/show/21542/end-of-the-road-an-anan...
I prayed for programmable blending via "blending shaders" (and FWIW programmable texture decoding via "texture shaders" - useful for custom texture formats/compression, texture synthesis, etc) since I first learned about pixel shaders way back in the early 2000s.
Somehow GPUs got raytracing before programmable blending, when the former felt like a summer night's dream and the latter was just about replacing yet another fixed-function block with a programmable one :-P
(still waiting for texture shaders though)
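For context on what's being wished for here: fixed-function blending only lets you pick factors for a hardwired equation, while a "blend shader" could run arbitrary code with the destination pixel as an input. A minimal Python sketch of the difference (function names are hypothetical; real blending runs in GPU hardware, this just models the math):

```python
# Fixed-function blending: hardware evaluates a fixed equation,
#   out = src * src_factor + dst * dst_factor
# and all you get to choose are the factors.
def fixed_function_blend(src, dst, src_factor, dst_factor):
    return src * src_factor + dst * dst_factor

# Classic alpha blending falls out of one particular factor choice.
def alpha_blend(src, dst, src_alpha):
    return fixed_function_blend(src, dst, src_alpha, 1.0 - src_alpha)

# A hypothetical "blend shader" could instead run arbitrary logic that
# reads dst -- e.g. a conditional no fixed factor combination expresses.
def custom_blend_shader(src, dst):
    return dst if dst > 0.5 else src
```

The point is that the second function can't be encoded as a (src_factor, dst_factor) pair, which is why it needs a programmable stage rather than the fixed-function block.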
https://medium.com/pocket-gems/programmable-blending-on-ios-...
This is a great example of using it: https://vulkan.org/user/pages/09.events/vulkanised-2024/vulk...
OTOY does all their rendering with compute nowadays.
It's vaguely like comparing a full CPU emulator with something that implements the ADD and MUL instructions.
Just to clarify, Dolphin's specialized shaders simulate fixed-function blending/texturing too. What's different about ubershaders is that a single shader can handle a wide variety of fixed-function states whereas a specialized shader can only handle a single state.
Thus, whereas specialized shaders have to be generated and compiled on the fly, resulting in stutter, ubershaders can all be pre-compiled before running the game. Add to this the capability to asynchronously compile specialized shaders to replace ubershaders, and the performance loss of ubershaders becomes negligible. A rare case of having your cake and eating it too.
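The control flow of that scheme can be sketched in a few lines of Python (all names hypothetical; real shaders are GPU programs and compilation is genuinely asynchronous, this just models the caching logic):

```python
# A specialized shader bakes the blend state in at "compile" time:
# no run-time branching, but each new state triggers a compile (stutter).
def compile_specialized(blend_mode):
    if blend_mode == "add":
        return lambda src, dst: src + dst
    if blend_mode == "multiply":
        return lambda src, dst: src * dst
    raise ValueError(blend_mode)

# The ubershader takes the state as a run-time input (a "uniform")
# and branches on it, so one pre-compiled shader covers every state.
def ubershader(blend_mode, src, dst):
    if blend_mode == "add":
        return src + dst
    if blend_mode == "multiply":
        return src * dst
    raise ValueError(blend_mode)

# Hybrid scheme in the spirit of Dolphin's: draw with the ubershader
# immediately, compile the specialized shader on the side, and use
# the cached specialized version on later draws.
shader_cache = {}

def draw(blend_mode, src, dst):
    shader = shader_cache.get(blend_mode)
    if shader is None:
        # In a real engine this compile happens on a background thread;
        # it's done inline here for brevity.
        shader_cache[blend_mode] = compile_specialized(blend_mode)
        return ubershader(blend_mode, src, dst)  # no stall on first use
    return shader(src, dst)
```

Either path returns the same pixel; the only difference is when the compile cost is paid, which is exactly the "having your cake" part.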