In 1998, I'm sure there were newspaper companies that failed at transitioning online, got no web traffic, had unreliable servers that kept crashing, etc. That says very little about what life would be like for the newspaper industry in 1999, 2000, 2005, 2010, and beyond.
AI will get better at writing good, maintainable, explainable code, because that's what it takes to actually solve problems tractably. But saying "code quality doesn't matter because AI" is false both as a description of current experience and as a prediction. Will AI do a better job in the future? Sure. But because its code quality improves, not because quality becomes less important.
Where we're headed is toward a world where a ton of software is ephemeral, apps literally created by AI out of thin air for a single use, and then gone.
Guns, wheels, cars, ships, batteries, televisions, the internet, smartphones, airplanes, refrigeration, electric lighting, semiconductors, GPS, solar panels, antibiotics, printing presses, steam engines, radio, etc. The pattern is obvious, the forces are clear and well-studied.
If there is (1) a big gap between current capabilities and theoretical limits, (2) huge incentives to improve things, (3) no alternative tech that will replace or outcompete it, (4) broad social acceptance and adoption, and (5) no chance of the tech being lost or forgotten, then technological improvement is basically a guarantee.
These are all obviously true of AI coding.
It isn't even good cherry-picking: we never got mainstream supersonic passenger aircraft after the Concorde because aerospace technology hasn't advanced far enough to make it economically viable, and the slowdown in progress and rapidly rising costs of cutting-edge semiconductor processes are very well known.
There's no broad social acceptance of supersonic flight because it creates incredibly loud sonic booms that the public doesn't want to deal with. And despite that, it's still a bad counterexample, as companies continue to innovate in this area e.g. Boom Supersonic.
At best you can say, "It's taking longer than expected," but my point was never that it will happen on any specific schedule. It took 400 years for guns to advance from the primitive fire lances in China to weapons with lock mechanisms in the 1400s. Those long time frames only prove my point more strongly. Progress WILL happen when there is appetite, acceptance, incentive, and room to grow, and time is no obstacle. It's one of the more certain things in human history, and the forces behind it have been well studied.
Just as certain: the people whose jobs are made obsolete by these new technologies often remain in denial until they are forgotten.
The death of newspapers is quite the spectacle too. No one seems to understand how bad it is... the youngest generation can't even seem to recognize that anything is missing. We've effectively amateurized journalism so that only grifters and talentless hacks want to attempt it, and only in tiny little soundbites on Twitter or other social media (and they're quickly finding out how it might be more lucrative to do propaganda for foreign governments or MLM charlatanism). When the death of the software industry is complete, it too will have been completely amateurized, the youngest generation will not even appreciate that people used to make it for a living, and the few amateurs doing it will start to comprehend how much more lucrative it will be to just make poorly disguised malware.
It is absolutely the case that virtual reality technology will only get better over time. Maybe it'll take 5, or 10, or 20, or 40 years, but it's almost a certainty that we'll eventually see better AR/VR tech in the future than we have in the past.
Would you bet against that? You'd be crazy to imo.
See https://simonwillison.net/guides/agentic-engineering-pattern...
All else being equal, of course you'd rather have good code than bad code.
But millions of non-developers who can suddenly build simple software for themselves (or use AI assistants that generate simple ephemeral software on the fly) aren't going to care about the underlying code quality so long as it works, which it does.
And hundreds of thousands of software engineers, who can suddenly build singlehandedly what in the past took a team of 5-10 to build, are going to be okay with the tradeoff of getting massive speed boosts but with quality not quite as high as if they were to build everything themselves. Same for software engineers who are now churning through their list of side project ideas, which has mostly sat dormant for the past 10 years.
Previously they lived in a world where the only axes were "build more things vs. have good quality" or "build ambitious things vs. have good quality." Now they have a new axis, "allow AI to help vs. have total personal control of quality," which is quite similar to the pre-existing axis of "hire employees vs. have total personal control of quality," but way cheaper and more accessible. Of course some will take advantage. Which means, de facto, a world where code quality (or at least personal control of code quality) is on average less important and less prioritized than it used to be.
We're seeing all of this happen right now. People are making these choices in large numbers today.
Whether what they're using in 20 years is produced by the company formerly known as Facebook or not is a whole different question.