If you're talking about Google's homepage, the answer is "a lot". You can check for yourself - go to google.com, select "view source" and compare the amount of Closure-compiled JavaScript against HTML markup.
> Google's primary web search feature could, in theory, be implemented without a line of JavaScript
...and yet, in practice, Google defaults to a JavaScript-heavy implementation. Search is Google's raison d'être and primary revenue driver, so I posit it is optimized up the wazoo. I wouldn't hastily assume incompetence given those priors.
Google homepage is 2MB. Two fucking megabytes. Without JS, it's 200K.
I can't be the only person who remembers when Google was known for omitting even technically optional HTML tags on its homepage to make it load fast - they even documented this as a formal suggestion: https://google.github.io/styleguide/htmlcssguide.html#Option...
This was back when a large fraction of search users were on 56k modems. Advances in broadband connectivity, caching, browser rendering, resource-loading scheduling, and front-end engineering practices may produce the non-intuitive result that the 2MB Google homepage of 2024 has the same (or better!) 99th-percentile First Meaningful Paint time as a stripped-down 2KB homepage did in 2006.
The homepage size is no longer that important: how much time do you save by shrinking a page from 2MB to 300KB on a 50Mbps connection with a warm cache? Browser cache sizes are much larger than they were 10 years ago (thanks to growth in client storage). After all, page weight is mostly used as a proxy for loading time.
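As a rough back-of-envelope illustration (ignoring latency, connection setup, and parse/execute time): 2MB is 16 megabits, which takes about 0.32s to transfer at 50Mbps; 300KB is about 2.4 megabits, or roughly 0.05s. The raw transfer saving is on the order of a quarter of a second, and with a warm cache it approaches zero.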
- We're on the verge of another browser monopoly, cheered on by developers embracing the single controlling vendor.
- We already have sites declaring that they "work best in Chrome" when what they really mean is "we only bothered to test in Chrome".
- People are not only using UA sniffing with inevitable disastrous results, they're proclaiming loudly that it's both necessary and "the best" solution.
- The amount of unnecessary JavaScript is truly gargantuan, because how else are you going to pad your resume?
I mean, really, what's next?
Are we going to start adopting image slice layouts again because browsers gained machine vision capabilities?
Since you're replying to my comment and paraphrasing a sentence of mine, I'm guessing I'm "people".
I'm curious to hear from you what better alternative - if any - can be used to determine the browser's identity or characteristics (implied by name and version) on the server side. "Do not detect the browser on the server side" is not a valid answer; it suggests the person offering it isn't familiar with large-scale development of performant web apps or websites for heterogeneous browsers. A lot of browser inconsistencies have to be papered over (e.g. with polyfills or alternative algorithm implementations) without shipping unnecessary code to browsers that don't need it. If you have a technique faster and/or better than UA sniffing on the server side, I'll be happy to learn from you.
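To make the server-side technique concrete, here is a minimal sketch, assuming a Node/Express server and a hypothetical legacy-polyfills.js bundle (the browser names, version cutoffs, and file names are illustrative; real code would use a maintained UA-parsing library):

    import express from "express";

    const app = express();

    // Crude UA check: anything that isn't a reasonably recent Chrome or
    // Firefox is treated as "legacy". The version cutoffs here are arbitrary.
    function needsLegacyPolyfills(userAgent: string): boolean {
      const chrome = userAgent.match(/Chrome\/(\d+)/);
      if (chrome) return Number(chrome[1]) < 90;
      const firefox = userAgent.match(/Firefox\/(\d+)/);
      if (firefox) return Number(firefox[1]) < 90;
      // Unknown or old browser: err on the side of shipping the polyfills.
      return true;
    }

    app.get("/", (req, res) => {
      const ua = req.get("User-Agent") ?? "";
      // Only browsers that need the extra code pay for downloading it;
      // modern browsers never see the polyfill bundle at all.
      const polyfillTag = needsLegacyPolyfills(ua)
        ? '<script src="/legacy-polyfills.js"></script>'
        : "";
      res.send(`<!doctype html><html><head>${polyfillTag}</head><body>...</body></html>`);
    });

    app.listen(3000);

The decision happens before any bytes reach the client, so the polyfill (when needed) is referenced in the initial HTML rather than discovered after a round of client-side detection.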
"Do feature JavaScript feature detection on the client" is terrible for performance if you're using it to dynamically load scripts on the critical path.
Two, it was an accessibility nightmare.
At least modern grid layouts fix those problems.