To this general principle you can add browsers and websites; what the browser giveth, the websites taketh away. You may think browsers are slow... they really aren't! There's a staggering, even arguably insane, amount of optimization in there. But then we write websites that are barely adequate, and load them up with ad scripts that aren't even barely adequate, and blame the browsers for being slow.
Write yourself an old-school 1998-style static website without a big pile of fancy features, give yourself solid .css and .js caching and use it judiciously, and the browser can blast content to the screen blazingly fast, for all the work it's doing.
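A minimal sketch of one common way to get that "solid caching": content-hash the asset filename so the URL changes whenever the file does, then serve it with a far-future `Cache-Control` header. The helper below is hypothetical, not from any particular framework:

```python
import hashlib

def asset_headers(content: bytes, filename: str) -> dict:
    """Hypothetical helper: fingerprint a static asset so it can be
    cached 'forever' -- a changed file gets a brand-new URL, so stale
    caches are never a problem."""
    digest = hashlib.sha256(content).hexdigest()[:12]
    name, _, ext = filename.rpartition(".")
    return {
        "filename": f"{name}.{digest}.{ext}",
        # Safe to cache for a year and mark immutable, because any
        # edit to the file produces a different filename.
        "Cache-Control": "public, max-age=31536000, immutable",
    }

print(asset_headers(b"body { margin: 0 }", "site.css"))
```

Build tools like webpack or esbuild do this fingerprinting for you; the point is that with it, the browser only ever fetches each asset once.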
If you could even feed a 2024 website to a 1998 browser, you'd probably be able to eat a meal while it was trying to render Facebook.
I don't. I use uBlock Origin, which blocks ad scripts and the like. My everyday machine is a PC more than 10 years old, still on Win7, and everything runs just fine.
I also use a top-of-the-line, recent PC on Ubuntu, mostly for development. Websites there feel instantaneous. I sometimes wonder what a subpar browser would feel like on that machine.
Maybe I should just try to run Ladybird on this to see how it goes.
But what if we had an AI agent dedicated to improving performance? It wouldn't need to be capable of solving every problem; it could just address the low-hanging fruit: problems that aren't hard to solve but that nobody has time to look at.
There is also a lot of accidental complexity that, unfortunately, you might only be able to get rid of through BC (backward-compatibility) breaks.