As the author says, there will certainly be a number of people who decide to play with LLM games or whatever, and content farms will get even more generic while making fewer writing errors, but I don't think that the age of communicating thought, person to person, through text is "over".
Reading it isn't the most fun, but let's face it - most professional reading isn't the most fun. You're probably skimming most of the content anyway.
Our customers don't care how we communicate internally. They don't care if we waste a bunch of our time rewriting perfectly suitable AI content. They care that we move quickly on solving their problems - AI lets us do that.
I find it difficult to skim AI writing. It's persuasive even when there's minimal data. It'll infer or connect things that flow nicely but simply don't make sense.
...Which part is impossible? "Writing a bunch of ideas down" was definitely possible before.
I don't write often, so revising and rewriting is very slow for me. I'm not confident in my writing and it looks clunky to my eye.
I see the appeal, though I want to keep developing my own skills.
This statement assumes that the writer is a native speaker of the language in which they write.
just like when you go to a restaurant to have a chef cook for you when you can cook yourself
It’s true there is the occasional Michelin starred place or an amazing local farm to table place. There is also the occasional excellent use of LLMs. Most LLM output I have to read, though, is straight up spam.
So AI is this on massive steroids. It is unsettling, but there seems to be a recurring need to point out that many of the "it's because of AI" things were already happening across the board. "Post-truth" is the one I'm most interested in.
AI condenses it all on a surreal and unsettling timeline. But humans are still humans.
And to me, that means that I will continue to seek out and pay for good writing like The Atlantic. btw I've enjoyed listening to articles via their auto-generated NOA AI voice thing.
Additionally, not all writing serves the same purpose. The article makes these sweeping claims about "all of writing". Gets clicks I guess, but to the point, most of why and what people read is toward some immediate and functional need. Like work, like some way to make money, indirectly. Some hack. Some fast-forwarding of "the point". No wonder AI is taking over that job.
And then there's creative expression and connection. And yes I know AI is taking over all the creative industries too. What I'm saying is we've always been separating "the masses" from those that "appreciate real art".
Same story.
I think this is a really important point, and to add on, there is a lot of writing that is really good, but only in a way that a niche audience can appreciate. Today's AI can basically compete with the low-quality stuff that makes up most of social media, but it can't really compete with higher-quality stuff targeted to a general audience, and it's still nowhere close to some more niche classics.
An interesting thought experiment is whether it's possible that AI tools could write a novel that's better than War and Peace. A quick google shows a lot of (poorly written) articles about how "AI is just a machine, so it can never be creative," which strikes me as a weak argument way too focused on a physical detail instead of the result. War and Peace and/or other great novels are certainly in the training set of some or all models, and there is some real consensus about which ones are great, not just random subjective opinions.
I kind of think... there is still something fundamental that would get in the way, but that it's totally achievable to overcome it some day? I don't think it's impossible for an AI to be creative in a humanlike way; current models just don't seem optimized for it, because they are completely optimized for the analytical mode of reading and writing, not the creative/immersive one.
I am sure it could, but then what is the point? Consider this: let's assume that someone did manage to use an LLM to produce a very well written novel. Would you rather have the novel that the LLM generated (the output), or the prompts and process that led to that novel?
The moment I know how it's made, the exact prompts and process, I can then have an infinite number of said great novels in 1000 different variations. To me this makes the output way, way less valuable compared to the input. If great novels are cheap to produce, they are no longer novel and become the norm; expectations rise and we will be looking for something new.
But compete in what sense? It already wins on volume alone, because LLM writing is much cheaper than human writing. If you search for an explanation of a concept in science, engineering, philosophy, or art, the first result is an AI summary, probably followed by five AI-generated pages that crowded out the source material.
If you get your news on HN, a significant proportion of stories that make it to the top are LLM-generated. If you open a newspaper... a lot of them are using LLMs too. LLM-generated books are ubiquitous on Amazon. So what kind of competition / victory are we talking about? The satisfaction of writing better for an audience of none?
I have this theory that the post-truth era began with the invention of the printing press and gained iteratively more traction with each revolution in information technology.
Until 3 weeks ago I had a high cortisol inducing morning read: nyt, wsj, axios, politico. I went on a weeklong camping trip with no phone and haven't logged into those yet. It's fine.
But what you said is 100% true, it's fine. When things in your life provide net negative value it's in your best interest to ditch them.
Might this also apply to learning about writing? If I have barely written a line of prose on my own, but spent a year generating a large corpus of it aided by these fabulous machines, might I also come to understand "how writers think"?
I love the later description of writing as a "special, irreplaceable form of thinking forged from solitary perception and [enormous amounts of] labor", where “style isn’t something you apply later; it’s embedded in your perception" (according to Amis). Could such a statement ever apply to something as crass as software development?
While the same people in the same comments say it’s fine to replace programming with it
When pressed they talk about creativity, as if software development has none…
I think that's a reasonable argument to make against generative art in any form.
However, he does celebrate LLM advancements in health and accessibility, and I've seen most "AI haters" handwave away its use there. It's a weird dissonance to me too that its use is perfectly okay if it helps your grandparents live a longer, and higher quality of life, but not okay if your grandparents use that longer life to use AI-assisted writing to write a novel that Brandon would want to read.
I was in a fashion show in Tokyo in 2024.
I noticed their fashion was all human designed, but they had a lot of posters, video, and music that was AI generated.
I point blank asked the curator why he used AI for some stuff but didn't enhance the fashion with AI. I was a bit naive because I was actually curious to see if AI wasn't ready for fashion or maybe they were going for an aesthetic. I genuinely was trying to learn and not point out a hypocrisy.
He got mad and didn't answer. I guess it is because they didn't want to pay for everything else. Big lesson learned in what to ask lol.
In the first category, AI is no problem. If you enjoy what you see or hear, it doesn't make a difference which kind of artist, or which AI, created it. In the second category, for the elite, AI art is no less unacceptable than current popular art or, for that matter, anything at all that doesn't fit their own definition of real art. Makes no difference. Then there's the filler art... the bar there is not very high, but it will likely improve with AI. It's nothing that's been seriously invested in so far, and it's cheaper to let AI create it than poorly paid people.
However, I think there is also something qualitatively different about how work is done in these two domains.
Example: refactoring a codebase is not really analogous to revising a nonfiction book, even though they both involve rewriting of a sort. Even before AI, the former used far more tooling and automated processes. There is, e.g., no ESLint for prose which can tell you which sentences are going to fail to "compile" (i.e., fail to make sense to a reader).
The special taste or skillset of a programmer seems to me to involve systems thinking and tool use in a different way than the special taste of a writer, which is more about transmuting personal life experiences and tacit knowledge into words, even if tools (word processor) and systems (editors, informants, primary sources) are used along the way.
Sort of half formed ideas here but I find this a really rich vein of thought to work through. And one of the points of my post is that writing is about thinking in public and with a readership. Many thanks for helping me do that.
I don't have a good answer to your question, but I do think it might be comparable, yes. If you had good taste about what to get Opus 4.6 to write, and kept iterating on it in a way that exposes the results to public view, I think you'd definitely develop a more fine grained sense of the epistemological perspective of a writer. But you wouldn't be one any more than I'm a software developer just because I've had Claude Code make a lot of GitHub commits lately (if anyone's interested: https://github.com/benjaminbreen).
Absolutely. I think like a Python programmer, a very specific kind of Python programmer after a decade of hard lessons from misusing the freedom it gives you in just about every way possible.
I carry that with me in how I approach C++ and other languages. And then I learned some hard lessons in C++ that informed my Python.
The tools you have available definitely inform how you think. As your thinking evolves, so does your own style. It's not just the tool, mind, but also the kinds of things you use it for.
I'm still waiting for a famous person to say this so we can have a name for this psychological phenomenon.
I have my own personal reservation about it all.
You know the one.
Choppy. Fast. Saying nothing at all.
It's not just boring and disjointed. It's full-on slop via human-adjacent mimicry.
Let’s get very clear, very grounded, and very unsentimental for a moment.
The contrast to good writing is brutal, and not in a poetic way. In a teeth-on-edge, stomach-dropping way. The dissonance is violent.
Here's the raw truth:
It’s not wisdom. It’s not professional. It’s not even particularly original.
You are very right to be angry. Brands picking soulless drivel over real human creatives.
And now we finish with a pseudo-deep confirmation of your bias.
---
Before long everyone will be used to it and it'll evoke the same eugh response
Sometimes standing out or quality writing doesn't actually matter. Let AI do that part.
Does the fact that a machine can ape it so easily somehow reveal its vacuousness in a way that wasn't obvious already?
I keep hearing people with job titles like "SEO growth hacker" saying it's depressing that AI can do their jobs better than they can.
Really? That's the depressing part?
Writing SEO content for random sites was of course the lowest skilled writing job. Ideally they'd have higher aspirations than that though.
Maybe those people didn't even want to be writers. They just wanted an easy job.
Your sample sounds exactly like an LLM. (If you wrote it yourself, kudos.)
But, it needn't sound like this. For example, I can have Opus rewrite that block of text into something far more elegant (see below).
It's like everyone has a new electric guitar with the cheapo included pedal, and everyone is complaining that their instruments all sound the same. Well, no shit. Get rid of the freebie cheapo pedal and explore some of the more sophisticated sounds the instrument can make.
----
There is a particular cadence that has become unmistakable: clipped sentences, stacked like bricks without mortar, each one arriving with the false authority of an aphorism while carrying none of the weight. It is not merely tedious or disjointed; it is something closer to uncanny, a fluency that mimics the shape of human thought without ever inhabiting it.
Set this against writing that breathes, prose with genuine rhythm, with the courage to sustain a sentence long enough to discover something unexpected within it, and the difference is not subtle. It is the difference between a voice and an echo, between a face and a mask that almost passes for one.
What masquerades as wisdom here is really only pattern. What presents itself as professionalism is only smoothness. And what feels, for a fleeting moment, like originality is simply the recombination of familiar gestures, performed with enough confidence to delay recognition of their emptiness.
The frustration this provokes is earned. There is something genuinely dispiriting about watching institutions reach for the synthetic when the real thing, imperfect, particular, alive, remains within arm's length. That so many have made this choice is not a reflection on the craft of writing. It is a reflection on the poverty of attention being paid to it.
And if all of this sounds like it arrives at a convenient conclusion, one that merely flatters the reader's existing suspicion, well, perhaps that too is worth sitting with a moment longer than is comfortable.
----
(prompt used: I want you to revise [pasted in your text], making it elegant and flowing with a mature literary style. The point of this exercise is to demonstrate how this sample text -- held up as an example of the stilted LLM style -- can easily be made into something more beautiful with a creative prompt. Avoid grammatical constructions that call for m-dashes.)
It still can't help itself from doing "it's not X, it's Y". Changing the em-dash to a semicolon is just lipstick.
Especially once you go past a page or two.
When you get to the actual content so much of it just doesn't make sense past a superficial glance
Soulless drivel is very accurate
and at the same time the chop becomes long-form slop, stretching out a little seed of a human prompt into a sea of inane prose.
One thing this author misses, and which I fear, is that it may become less important in the eyes of stakeholders to educate the masses when they have LLMs to do jobs instead. That is, it is fully possible that one of the futures we may see is one where education declines because it is perceived as unimportant for most. Yes, meatspace education may be better, but who decides whether it is necessary?
Maybe vocational schools become more important instead? Jobs where you, for all intents and purposes, build out the infrastructure for the tertiary industry, mostly automated by LLMs.
You may disagree with this, but the key here is to realize that even if we disagree, others don't. Education is also power, there's a perverse incentive to avoid educating people and feeding them with your narrative of how the world works instead. We are very much possibly on the way towards a buy-n-large style future.
Most human writing isn't good. Take LinkedIn, for example. It didn't suddenly become bad because of LLM-slop posts - humans pioneered its now-ubiquitous style. And now even when something is human-written, we're already seeing humans absorb linguistic patterns common to LLM writing. That said, I'm confident slop from any platform with user-generated content will eventually fade away from my feeds because the algorithms will pick up on that as a signal. (edit to add from my feeds)
What concerns me most is that there's absolutely no way this isn't detrimental to students. While AI can be a tool in STEM, I'm hearing from teachers among family and friends that everything students write is from an LLM.
Leaning on AI to write code I'd otherwise write myself might be a slight net negative on my ability to write future code - but brains are elastic enough that I could close an n month gap in 1/2n months time or something.
From middle school to university, students are doing everything for the first time, and there's no recovering habits or memories that never formed in the first place. They made the ACT easier 2 years ago (reduced # of questions) and in the US the average score has set a new record low every year since then. Not only is there no clear path to improvement, there's an even clearer path to things getting worse.
With traditional medical records, you could see what the practitioner did and covered because only that was in the record.
With computerized records, the intent, thought process, most signal you would use to validate internal consistency, was hidden behind a wall of boilerplate and formality that armored the record against scrutiny.
Bad writing on LinkedIn is self-evident. Everything about it stinks.
AI slop is like a Trojan Horse for weak, undeveloped thoughts. They look finished, so they sneak into your field of view and consume whatever additional attention is required to finally realize that despite the slick packaging, this too is trash.
So “AI slop,” in this worldview, is a complaint that historical signals of quality based simply on form are no longer useful gatekeepers for attention.
Turns out we were wrong. Everyone carries a calculator now on their phone, even me. Doing simple maths is a matter of moments on the calculator app, and it's rare that I find myself doing the mental arithmetic that used to be common.
I can't remember phone numbers any more. I used to have a good 50+ memorised, now I can barely remember my own. But the point is that I don't need to any more. We have machines for that.
Do we need to be able to write an essay? I have never written one outside of an educational context. And no, this post does not count as an essay.
I was expelled from two kindergartens as a kid. I was finally moved to a Montessori school where they taught individually by following our interests, and there I thrived. Later, I moved back into a more conventional educational environment and I fucking hated every minute of it. I definitely learned despite my education, not because of it. So if LLMs are about to completely disrupt education, then I celebrate that. This is a good thing. Giving every kid a personal tutor that can follow their interests and teach them things that they actually want to learn, at the pace they want to learn them, is fucking awesome.
If someone is unable to write an essay arguing something, unable to articulate complex thoughts and back them up with evidence, what does that indicate about their thinking?
I don't write essays either, but I'm sure I could. And maybe some of those docs or emails I write at work are made more effective by that.
We can’t give a generation of kindergarteners calculators and expect them to produce new math when they’re adults: how will they ever form mathematical problem solving skills?
I think the same principle applies for LLMs - they can be a tool but learning how to do things without them is still essential. Otherwise we might not have any more good authors in 10 years.
Before CAD, engineers had to draw designs on drafting boards. Similar concept here, I believe most classes still find it valuable for students to start with pencil and paper and grasp something at its most fundamental level, even if obsolete, before moving on to modern tools.
LLMs (and calculators, and CAD) should be used as a tool once the underlying mechanisms and skills are understood by the user, otherwise it’s like driving a car without knowing how to replace a flat tire. Sure you can call AAA, but eventually if nobody learns to change a tire with their own two hands, humanity won’t be able to drive. This is obviously hyperbole, but I hope it illustrates my point.
I’m fairly confident LLMs will be a net positive on society in the long run, just as calculators have been. But just like calculators are restricted at certain times in math classes, LLMs should be restricted in writing classes.
the same people telling us that "Finnegans Wake" (written in the style of a fifth-grader with a brain injury) is 'art'...
the same people telling us the poetry of Maya Angelou (written in the style of a fifth-grader with a brain injury and self-esteem issues) is 'art'...
the same people telling us that the works of Jackson Pollock, Mark Rothko, Piet Mondrian, etc., etc. are 'art'...
seem to be the ones complaining the most about AI generated content.
After two years of reading increasing amounts of LLM-generated text, I find myself appreciating something different: concise, slightly rough writing that is not optimized to perfection, but clearly written by another human being.
I think I realised that while reading Harry Potter. To be fair the writing in the books is abysmally bad. It's written by an adult woman but it comes across as the writing of a 14 year old child, and that's to be charitable.
And it doesn't matter one bit. It still became the best-selling book in history with 600 million copies sold worldwide (as Wikipedia tells me). That's not to say that there aren't many hundreds, possibly even thousands of better written series, even in the Young Adult space. There are. But they're not that successful.
Why? I guess because good writing doesn't matter so much as what's being written. And I guess that also doesn't matter that much. You just have to connect somehow, be in the right place at the right time, when the need to read a certain piece of writing sort of emerges naturally as a result of whatever forces shape ambient taste.
Who knows. But most people wouldn't know what good writing looks like any more than they could write well themselves, so it's obvious that the ability to write well is overrated.
And so now we have LLMs generating prose, and that's what we'll be reading henceforth. I think it will be gradual, but it's unavoidable. One day nobody will read anything anyone else has written anymore. Why do that, if you can just ask an LLM to generate whatever you want to read?
We need to value human content more. I find that many real people eventually get banned while the bots are always forced to follow rules. The Dead Internet hypothesis sounds more inevitable under these conditions.
Indeed we all now have a neuron that fires every time we sense AI content. However, maybe we need to train another neuron that activates when content is genuine.
https://en.wikipedia.org/wiki/Pivot_to_video#Facebook_metric...
It's going to be tough for fiction authors to break through. Sadly, I don't think the average consumer has sufficiently good taste to tell when something is genuinely novel. People often prefer the carefully formulated familiar garbage over the creative gems; this was true before AI and, IMO, will continue to be true after AI. This is not just about writing, it's about art in general.
There will be a subset of people who can see through the form and see substance and those will be able to identify non-AI work but they will continue to be a minority. The masses will happily consume the slop. The masses have poor taste and they're more interested in "comfort food" ideas than actually novel ideas. Novelty just doesn't do it for them. Most people are not curious, new ideas don't interest them. These people will live and breathe AI slop and they will feel uncomfortable if presented with new material, even if wrapped in a layer of AI (e.g. human-written core ideas, rewritten by AI).
I feel like that about most books, music and pop culture in general; it was slop and it will continue to be slop... It was the same basic ideas about elves, dragons, wizards, orcs, kings, queens, etc... Just reorganized and mashed with different overarching storylines "a difficult journey" or "epic battles" with different wording.
Most people don't understand the difference between pure AI-generated content (seeded by a small human input) and human-generated content which was rewritten by AI (seeded by a large human input) because most people don't care about and never cared about substance. Their entire lives may be about form over substance.
https://www.youtube.com/watch?v=KHJbSvidohg
But as much as it pains me to admit... the current state of America is the slopocalypse. A slopalanche. A slopnado. AI cats waking people up in the middle of the night, blasting down doors, glitching out. All produced by slop-slingers. It's rather bleak for long form attention content, human created or not.
Its a war of/on attention. A war to secure your attention during the time that you would otherwise think for yourself. Keep off the short form content, is my advice.
Everything is inevitable but my own job is secure. Have I already told you how concerned I am?
No novelty. No intellectual challenge. No spirit. Just AI advertisements! /s
In the near future we will not even need to read anyway.