(For those unfamiliar with it, HTTP/0.9 is the label for a protocol where the client opens a TCP connection to the server and sends “GET /”, and the server responds with the body, verbatim, and closes the connection. No status codes, no headers, nothing.)
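The whole exchange fits in a few lines; here's a loopback sketch (the served body is made up, and a local socket stands in for a real host):

```python
# HTTP/0.9 in miniature: the client sends "GET /", the server replies with
# the bare body and closes the connection -- no status line, no headers.
import socket
import threading

def serve_once(listener):
    conn, _ = listener.accept()
    conn.recv(1024)                        # the entire "request": b"GET /\r\n"
    conn.sendall(b"<html>hello</html>")    # the entire "response": just the body
    conn.close()                           # closing the socket ends the response

listener = socket.socket()
listener.bind(("127.0.0.1", 0))            # throwaway loopback server
listener.listen(1)
threading.Thread(target=serve_once, args=(listener,)).start()

client = socket.create_connection(listener.getsockname())
client.sendall(b"GET /\r\n")
body = b""
while chunk := client.recv(1024):
    body += chunk
client.close()
listener.close()
print(body)                                # the verbatim body, nothing else
```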
For instance, here is an untested hypothesis: ~30 of the hostnames on the list have abandoned DNS A records, pointing to EC2 servers. Those EC2 IPs have since been repurposed as honeypots of some kind. The honeypots present themselves as HTTP/0.9, in order to look more like low-grade IoT devices.
That hypothesis is almost certainly wrong, but you could quickly invent another and at some point one of them will be correct. The internet is just a very messy place.
* I block much more than ads.
It seems like most people who use Safari use full system-level ad blockers like Wipr. But I was also Googling around and found some complaints about Wipr, like it taking forever to be updated to block YouTube ads when they switch how they're displayed, or not being as good at getting around ad-block detection on certain sites.
I actually use Chrome, Firefox, and Safari on a daily basis, so I have some pretty solid anecdata that Safari is the best experience.
But since this extension is banned from the Chrome Web Store, you'll have to install it manually there. On Firefox it's available from their add-on store.
Plus I do not like ad fraud. I get lots of captchas. I also don't wanna give money to Google and Facebook.
If uBO sells out (I don't think gorhill will, but I don't know him) it'll be the top link on HN
https://www.lowyat.net/2020/224264/chromium-versions-nano-ad...
https://arstechnica.com/information-technology/2020/10/popul...
If this silently clicks ads in the background, the click rate goes up but the goal action (buying, signing up) stays at 0, which smart-prices the ads down and tanks their value for the real clickthroughs.
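The dilution can be put in toy numbers (all figures made up for illustration):

```python
# Toy model of smart-pricing: advertisers ultimately pay for conversions,
# so fake clicks that never convert dilute the measured value of a click.
# All numbers here are made up for illustration.
def effective_click_value(base_cpc, real_clicks, fake_clicks, conversions):
    diluted_rate = conversions / (real_clicks + fake_clicks)
    baseline_rate = conversions / real_clicks
    return base_cpc * diluted_rate / baseline_rate

print(effective_click_value(1.00, 100, 0, 5))    # no fake clicks: full value
print(effective_click_value(1.00, 100, 400, 5))  # 4x fake clicks: ~5x less value
```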
If you must block ads just use uBlock Origin so you are treated like your visit doesn't exist.
A deeper question: is it moral to consume content while blocking the ads? Wouldn't that be the same as pirating a movie? (Not that there is anything wrong with pirating.) Are there people who would block ads yet complain about pirating, or even about stealing from a store if you can get away with it?
> Does AdNauseam's automatic Ad-clicking create billable events for advertisers?
> It depends on the advertising business model and the degree of effort they are willing to put into filtering. Some might, others would not.
So it seems like the extension only "generates revenue for site owners" if it's either
- small enough for ad selling companies to not know it exists (and sophisticated enough to defeat their baseline filtering measures), or
- large enough for ad selling companies to notice it exists and sophisticated enough to defeat active attempts to filter clicks from AdNauseam specifically
I don't know how this arms race usually turns out, but it seems like there could be a chance that this is just a uBlock Origin that consumes more bandwidth.
I block scripts on news sites since they serve no purpose. It removes things like popups, autoplaying videos and comments.
And I have so many custom filters set up to block cookie prompts, annoyances and other stuff.
For example looking at en.wikipedia.org I can just remove that COVID box. Scalable? No. Smart? Eeeyuh... Satisfying? Hell yes.
They've seemingly gone out of their way to make this difficult by putting each letter of "Sponsored" in its own div, but you can make a filter based on the aria-label instead.
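For reference, a uBlock Origin cosmetic filter keyed on the accessible label rather than the obfuscated markup might look something like this (the hostname and exact label text here are illustrative, not copied from the site):

```
example.com##div[aria-label="Sponsored"]
```

Since cosmetic filters accept arbitrary CSS selectors, an attribute selector like this survives the one-letter-per-div obfuscation.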
Or am I missing something?
https://web.dev/interactive/#what-tti-measures
https://developer.mozilla.org/en-US/docs/Web/API/Performance...
If you wanted to measure the effect of protocol, you could compare requesting with support for HTTP/N to requesting with support for HTTP/N+1. Since this is all synthetic testing, it shouldn't be too hard to run a controlled experiment instead of a correlational one.
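A minimal sketch of such a controlled run, assuming a curl built with HTTP/2 support (the URL, sample count, and injectable `fetch` hook are my own illustrative choices, not anyone's actual methodology):

```python
# Fetch the same URL repeatedly, forcing HTTP/1.1 vs HTTP/2 via curl,
# and compare median total times. Everything else is held constant, so
# the protocol is the only variable.
import statistics
import subprocess

def curl_time(url, proto_flag):
    """One fetch; curl prints total time in seconds to stdout."""
    out = subprocess.run(
        ["curl", proto_flag, "-o", "/dev/null", "-s", "-w", "%{time_total}", url],
        capture_output=True, text=True, check=True)
    return float(out.stdout)

def compare(url, n=20, fetch=curl_time):
    h1 = statistics.median(fetch(url, "--http1.1") for _ in range(n))
    h2 = statistics.median(fetch(url, "--http2") for _ in range(n))
    return h1, h2

# compare("https://example.com/")  # -> (median HTTP/1.1 secs, median HTTP/2 secs)
```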
1. Ads
2. Client-side rendering. YouTube nowadays is 400-1000 kilobytes of JSON + ~8 megabytes of JavaScript(1). Pre-Polymer (the YT client-side rendering engine update) you would receive 50KB of pre-rendered pure HTML that displayed instantly. Nowadays scrolling the comments means watching them appear one by one while the browser struggles with constant DOM updates.
1) >7MB desktop_polymer_inlined_html_polymer_flags.js + 1.6MB base.js
Edit: After reading the article
>Top linked URLs
That's amazing. I've got the same combination on my luggage ^^^^^ tracking blocker, every single one of those (even the yt iframe one).
Nothing about ads or tracking.
>What marketing strategies does Itnext use? Get traffic statistics, SEO keyword opportunities, audience insights, and competitive analytics for Itnext.
oh, Itnext is all about that user tracking
HN discourse would have you think single-page apps are pervasive, but they're only on a tiny fraction of websites: https://css-tricks.com/how-the-web-is-really-built/
I'd bet the majority of developers working on SPAs are doing so for internal dashboards/tools.
Exactly as anyone should have expected.
Can you not.
JQuery was first released in 2006, which is 14 years ago in human years, but much longer in JavaScript years. Measured in Angular versions, it is probably hundreds of versions ago.
For example, the myth that page weight and big SPA JS websites are the source of web performance issues is one I see so frequently here. Even in this thread others have started to claim the problem is all JS (take a closer look at the article). And it's good to see some actual data to back up what I have seen myself optimizing many websites (spoiler: a lot of badly performing sites use jQuery).
For speed (which is also a complex subject that can't be captured in one metric[2]), the problem isn't page weight or JS frameworks, it's network latency[3]. Because you can't optimize the speed of light. This is especially true for websites that connect to many different origins (usually for different types of analytics services) and for many things related to the critical render path[4].
The issues I see most often are not the huge frameworks loading, but inefficient loading of all resources, especially those that will affect user-centric timing metrics. I frequently see many render-blocking CSS and JS files loaded which increases First Contentful Paint. I see images, script code, and other resources that affect below-the-fold content loaded before above-the-fold content and resources. And of course my favorite: above-the-fold content loaded with JS. These affect Largest Contentful Paint. Etc etc.
Of course we can all claim the problem is something else and collectively decide to switch to server-side rendering as an industry but this won't fix issues with web performance. Understanding how browsers load pages and how your users perceive the load will.
0. https://developer.mozilla.org/en-US/docs/Web/API/Performance...
1. https://web.dev/interactive/
2. https://developers.google.com/web/fundamentals/performance/s...
3. https://www.speedshop.co/2015/11/05/page-weight-doesnt-matte...
4. https://developers.google.com/web/fundamentals/performance/c...
I have a WIP branch that caches things locally, but my boss tells me the app is fast enough and I should be working on other things. And I'm loath to sneak it in without proper testing, as caching has a tendency to break in horrible ways if you're not careful.
If that were the case, Electron apps with content available locally would not be so slow.
Technically it's not a metric for one-off rendering, but memory leaks from an open browser tab have bugged me a bit lately.
Safari does a per-URL report on macOS, and the ancient MacBook with "only" 8GB RAM gets a tad hot and bothered when a local PC store page runs up a 1.5 GB tab just sitting there, GMail snarfs 1 GB if not opened in a fresh tab once in a while, etc.
I ask because I don't understand why a zero download time for a cached document couldn't be simply masked by some (random) wait by the browser instead of downloading the file again.
From the chrome update page linked in the article, the explanation is:
> However, the time a website takes to respond to HTTP requests can reveal that the browser has accessed the same resource in the past, which opens the browser to security and privacy attacks, [...]
which seems to indicate that only time matters in the attacks. Yet, the third bullet point suggests:
> Cross-site tracking: The cache can be used to store cookie-like identifiers as a cross-site tracking mechanism.
as a possible attack based on the cache, which doesn't seem to involve document download time.
https://developers.google.com/web/updates/2020/10/http-cache...
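In the real attack the probe runs as JavaScript in the page, but the timing-classification half is simple enough to sketch (the 20 ms threshold and the URL are illustrative assumptions, not measured values):

```python
# Timing side of the attack: fetch a resource, then guess from elapsed
# time whether this browser already had it cached. The threshold is an
# illustrative assumption.
import time
import urllib.request

CACHE_THRESHOLD_S = 0.020  # loads faster than this "look" cached

def probably_cached(elapsed_s, threshold=CACHE_THRESHOLD_S):
    return elapsed_s < threshold

def probe(url):
    t0 = time.perf_counter()
    urllib.request.urlopen(url).read()
    return time.perf_counter() - t0

# probe("https://example.com/logo.png") -> seconds; feed to probably_cached()
```

As I read it, the third bullet is a separate mechanism: the server hands each user a uniquely cached resource that acts as a cookie-like identifier, so no timing is measured at all. A random wait would mask only the timing attack, whereas partitioning the cache per site addresses both.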
I'd not encountered Amazon Publisher Services but this article makes them look very bad.
Oh and a lot of sites load jQuery synchronously in the <head> tag, usually pointing to some external CDN. That means the browser stops parsing the HTML mid-way through the page load, resolves the CDN's domain to IP, does a TLS handshake and establishes a connection to the CDN, waits to download the script, parses, runs it, and then proceeds on to the rest of the HTML. All because the developer(s) didn't know about the script `defer` attribute[1].
This has not been true since about 2008. All major browsers parse the HTML without waiting for synchronous <script> to load. https://developer.mozilla.org/en-US/docs/Glossary/speculativ...
If the <script> uses document.write() and what's written contains unbalanced tags, or if the document is modified in some other ways, then they re-parse but that combination is rare. This page explains what to avoid to ensure re-parsing doesn't happen (the rules may differ with other browsers): https://developer.mozilla.org/en-US/docs/Glossary/speculativ...
[1] - https://addons.mozilla.org/en-US/firefox/addon/localcdn-fork...
It's not feature-complete with jQuery as a drop-in replacement, but the API is about the same and it covers the 90% case.
AFAIK Rocket is a Rust framework and my guess is that the average Rust dev cares more about perf. Which would imply that perf could be more about mindset than technology.
But that's just my humble interpretation...
(Also, the Rust Rocket is not primarily focused on perf, but developer ergonomics.)
The historical reality is that HTML served as a "backdoor"/gateway letting the Linux community compete with Microsoft: Linux devs could ship their code to Windows clients.
Now, HTML webapps cannot run properly on a mobile platform without depleting expensive batteries. So each mobile platform is making bucks on the back of developers who have to work twice as hard.
I have very high hopes that webassembly could somehow solve all those problems, but there are several problems:
* Not all browser vendors like WASM and I doubt that any WASM progress would be supported equally.
* WASM toolchains are not really mature for most languages. Compiler makers don't have much incentive to work on WASM.
* The DOM would still exist.
It’s excellent for documents, and very capable for apps. Games and other intensely custom-graphical interfaces are really the only situation where it’s not great, and in that case you can drop into a canvas and/or SVG (SVG admittedly still being DOM, but not HTML DOM) with no difficulty.
HTML was not initially designed for interactive applications, but for many years it has progressively been pushed in that direction.
The DOM gets you things like support for accessibility tech in a way that few other technologies can rival; the only real defect it has is that it strictly uses a push model and you can’t query the screen reader—so you can’t do things like tweak things for particular screen readers (for good or ill), or skip parts of the accessibility tree for efficiency based on where the tool is looking at present, but must present the entire accessibility tree up front. But although this is limiting, there are also reasons why it is so, around privacy and robustness.
WebAssembly never had anything at all to do with the DOM, let alone replacing it.
I suspect you'd quickly find there's a lot more complexity there than you predicted.
Anyway - if that's what you want check out Qt + QML. You can do it badly, but when you use it well it's the best declarative UI framework I know.
I'm quite tired of the hot-air, pointless, uninformed web bashing. I don't know why the web attracts such angry, angry people, but for all the complaining, the grandparent post is about as real a counter-idea to the DOM as I have ever seen.
what we do with the web is often bad. fully agreed there are terrible experiences everywhere. but starting fresh, starting with different source materials, different rendering data structures: I don't think that targets at all what is wrong. so I just see such complaining as misdirected, complaints against a pop culture that instead focus on the tech that allows that pop culture. and I see it as sabotaging the best tech humanity has going, the freest most expressive most versatile data we've got that we often yes use quite poorly. but that we do get better at. that we continue to evolve our architectures of use around. and I see such grumbles as undermining this great thing, while supporting something limited & domineering & utterly in corporate control, something apart from the greater connected cyberspace: native (awful) apps.
lmao. sources?