Your smart thermometer isn't making Reddit posts trying to sound like a human who's just concerned that the bedroom is a bit too warm.
I think there are different kinds of traffic: raw packets and user-like interactions. The OP isn't very clear about where that line falls, but it's significant.
If most of the traffic on the internet is akin to shell scripts making bulk FTP transfers, that's probably not news. If most of the comments being made are written by bots, or most of the streams on Netflix are being consumed by bots, that's pretty big news.
If you perform a simple extrapolation, M2M traffic only surpasses the other kinds around 2029.
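That kind of extrapolation can be sketched in a few lines. All figures below are illustrative assumptions (starting volumes and growth rates are made up, not real measurements); the point is just how a crossover year falls out of two compounding growth curves.

```python
# Hypothetical sketch: find the year when M2M traffic overtakes
# person-initiated traffic under assumed exponential growth.
# All numbers are illustrative assumptions, not real measurements.

def crossover_year(m2m_now, other_now, m2m_growth, other_growth, start_year=2025):
    """Return the first year in which M2M volume exceeds the other volume."""
    year, m2m, other = start_year, m2m_now, other_now
    while m2m <= other:
        year += 1
        m2m *= 1 + m2m_growth      # compound M2M traffic each year
        other *= 1 + other_growth  # compound everything else each year
    return year

# Assumed: M2M at 40 units growing 25%/yr vs. 60 units growing 10%/yr.
print(crossover_year(40, 60, 0.25, 0.10))  # -> 2029
```

With those made-up parameters the curves happen to cross in 2029; different assumptions shift the year, which is exactly why simple extrapolations like this are fragile.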
Coincidentally, in the original timeline of the Transformers movies, 2029 is the year the Resistance, led by John Connor, destroyed Skynet and ended the war against the machines.
I’d love to see that Terminator/Transformers crossover movie. Optimus Prime vs. the T-800, anyone?
Leaving the original timeline uncertain.
Was it really the original? Or the 10th loop, or the millionth?
Skynet can still be in our future.
Who is this official making this pronouncement?
It's just that now the official numbers say so.
But anyone on Twitter or Reddit can tell you the dead internet theory has been progressing at a swift pace for a decade.
AI just made it more apparent.
Glad I found this quote. It is quite helpful to have an AI search the web on my behalf... even if it was just finding where I can locally buy peanuts similar to the ones I got abroad.
In fact, even ads ingested into the training data set at this very moment could be useful. Go to Gemini and tell it you want to buy a jacket or whatever, and it will recommend some products it ingested from the training data.
The issue in this particular case is that this content and the web servers hosting it are set up for human traffic. In the worst case, a human consumes a few megabytes of data from the server and then leaves. A few of those visits will convert into a job or business opportunity - a fair bargain. LLM scrapers are not like that. They're greedy resource hogs. Not only do they want everything you have; a whole bunch of them hit your server repeatedly and endlessly. There's no possible way to justify the cost of such massive bandwidth consumption for a bunch of parasites that never give anything in return. And what do we get? A crappy user experience from all those sites putting up protection measures. This is the tragedy of the commons.
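One of the most common "protection measures" those sites reach for is per-client rate limiting. A minimal token-bucket sketch, assuming hypothetical limits (1 request/sec with a burst of 5; real deployments key buckets by IP or session and tune these numbers):

```python
import time

class TokenBucket:
    """Minimal token bucket: allow `rate` requests/sec with burst `capacity`.
    All limits here are hypothetical, for illustration only."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True   # request served
        return False      # request throttled

bucket = TokenBucket(rate=1, capacity=5)          # hypothetical limits
burst = [bucket.allow() for _ in range(8)]        # 8 back-to-back requests
print(burst)  # the first 5 pass, the rest are throttled
```

A patient human never hits a limit like this, while a scraper hammering every URL does immediately - which is also why those same limits degrade the experience for humans behind shared IPs.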
So who is the culprit? The greedy bunch who created the technology that behaves like this and then benefits immensely from it. Are those bad people? Absolutely! Naturally, we need them and their ill-intentioned creations off our shared spaces. This isn't anything new. This game has been playing out in different forms forever.
Playwright launched in 2020; similar projects have launched since, and similar projects existed before it.
It used to be automated with scripts; now you even have AI.
We now have a dead internet.
It's also important to understand what's happening. What would a scripted bot actually be doing? It's not just reading; otherwise nobody would notice. They are actually posting things. And it's not just posting cat pictures; nobody would notice that either.
Each bot has a different intention, but universally they all share the aim of manipulating some subject en masse. Reddit is bad because the bots have the power to curate content with downvotes.
So online discussions have synthetic content intending to change opinions. How does that interact with the various subjects you're interested in?
What's even crazier is the intersection of echo chambers and bots. There are people who have blocked essentially all humans and live in a world of bots who agree with them. It is causing insane social problems.
The current internet is awful, and there's so much AI/bot content, but I can find far more detailed information using AI-enabled search that isn't covered in ads. I can get an initial overview of a methodology without trawling through SEO articles.
I think AI has been almost a natural response to the enshittification of the internet - ChatGPT wouldn't seem so transformative if Google Search had been working like a search engine rather than Ad Generator 5000 before it was released.
Best thing to do is to avoid idly browsing social media and curate your internet experience.
But honestly, if Google provided me with good search, I would probably seriously reduce my AI usage when researching.
If AI slop is replacing the content you were consuming, it was already slop.
Which means filtering and ranking systems become the main bottleneck.
That pushes platforms toward stronger algorithmic selection and sometimes stronger convergence of attention.
Once content gets cheap, the winners are less likely to be the best creators and more likely to be the strongest gatekeepers.