I think we just have to accept that that's how websites are built now. It drove me nuts for a while, too. But modern JS engines are blindingly fast, and 2-3mb of JS download (that will be cached aggressively) is a non-issue for the vast majority of users.
I started talking to a junior developer the other day about server side rendering in the days of Rails/PHP/etc. and he looked at me like I was crazy. Couldn't even grasp the concept. I think for better or worse this is where we are headed.
> 2-3mb of JS download (that will be cached aggressively)
It's a giant problem on mobile. Connection quality varies and the larger the payload the greater the probability that it just doesn't all load. Caching? Mobile Safari will just reload the entire damn page periodically if you swap apps -- I don't think it caches in the same way that desktop browsers do.
I don't think we should simply accept that's how things are done now.
> I started talking to a junior developer the other day about server side rendering in the days of Rails/PHP/etc. and he looked at me like I was crazy.
A web developer like the one you mention, who can't even conceive of server side rendering, is a bad web developer. That lack of core understanding means they have no idea how a web browser works and only view the world through JavaScript. They have no concept of progressive enhancement.
So this means they'll likely spend the next five years poorly implementing features a web browser already has. They're going to make websites, er "apps", that won't work properly in micro browsers. Hyperlinks won't work, or will be flaky enough that they might as well not work. And best of all, their stuff won't work on the entry level devices that are extremely popular.
> But modern JS engines are blindingly fast, and 2-3mb of JS download (that will be cached aggressively) is a non-issue for the vast majority of users.
You say this but it's not the common case. A lot of people have shitty devices because they're cheap. Whether they're entry level phones or shitty bullpen office systems, people everywhere are stuck with them.
How fast Chrome executes some JITed inner loop is immaterial when the next line adds the DOM's billionth div pretending to be a button.
Thanks to cargo-culted CI/CD there's little guarantee an app will reference the same JavaScript file on different days, so the cached version will be tossed and yet another copy of Doom will be downloaded.
There are plenty of uses for JavaScript and cool web technologies. There are also lots of places where JavaScript is indispensable and enables awesome things. But requiring 3MB of JavaScript to read a static blog post is just poor craftsmanship. It's not even interesting as a project, because you've signed the reader up to execute unsolicited code to do who knows what. If you love JavaScript, render all your markdown with it on your device and just send me the static output. Don't make me download Doom and the markdown just to turn it into a few paragraphs of text I can't actually read. Web developers should respect their audience enough not to make them spend unnecessary resources to use their stuff.
Rails does this surprisingly well using Stimulus with web sockets to mediate the exchange of events and data between the client and server layers.
Similar strategies are used in Phoenix Live View apps.
Load static markup and data -> request more data if you need -> send events and data to the server -> respond with the new state to display if different than the client’s version.
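The loop above can be sketched in plain JavaScript, with a local function standing in for the WebSocket round trip (the event shape and `version` field here are illustrative assumptions, not Stimulus or LiveView APIs):

```javascript
// Sketch of the client/server exchange described above: the server owns
// the state, the client sends events, and the client only swaps in (and
// would only re-render) state whose version differs from its own.
function makeServer(initialState) {
  let state = initialState;
  return {
    handle(event) {
      // Server-side reducer: apply the event, bump the version on change.
      if (event.type === "increment") {
        state = { version: state.version + 1, count: state.count + event.by };
      }
      return state; // respond with current (possibly unchanged) state
    },
  };
}

function makeClient(server, initialState) {
  let clientState = initialState;
  return {
    send(event) {
      const next = server.handle(event); // would be a WebSocket round trip
      if (next.version !== clientState.version) {
        clientState = next; // re-render only when the server's version differs
      }
      return clientState;
    },
    get state() { return clientState; },
  };
}

const server = makeServer({ version: 0, count: 0 });
const client = makeClient(server, { version: 0, count: 0 });
client.send({ type: "increment", by: 2 });
console.log(client.state.count); // 2
```

The version check is what keeps the wire quiet: no-op events produce no new state on the client, which is the "respond with the new state to display if different" step.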