> The thing is that even if I was wrong (I'm not) and AI was somehow helpful for software engineering (it isn't), I still wouldn't want to use it.
So even if you were wrong on the facts (you are) you still wouldn't change your mind? In other words, you're unreasonable and know you're unreasonable and think that's totally fine?
Well, cool. Next time, lead with that.
At one point the author writes
> AI is a tool that can only produce software liabilities
which I would argue stems entirely from misuse of AI. Sure, you can have AI write a ton of code that often comes with subtle bugs. But using AI doesn't mean it has to write any code for you at all. I've often used LLMs for security analysis, and the results are quite good: they surfaced vulnerabilities we had collectively missed, and we fixed them ourselves.
In this case, instead of creating liabilities, we used an LLM to get more information about our code. We could conceivably have deduced that information on our own, but we didn't, and an LLM can do it far more quickly than we can.
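That workflow, asking the model to review code rather than write it, can be sketched in a few lines. Everything here is illustrative: `ask_llm` is a stand-in for whatever chat API you actually use, and the prompt wording is just one way to steer the model toward findings instead of patches.

```python
# Hypothetical sketch of an LLM-assisted security review.
# ask_llm() stands in for whatever chat API you actually call;
# only the prompt-building logic here is concrete.

def build_review_prompt(filename: str, source: str) -> str:
    """Assemble a review prompt that asks for findings, not patches."""
    return (
        "You are doing a security review. Do NOT rewrite the code.\n"
        "List potential vulnerabilities with line references and a\n"
        "one-sentence explanation of each.\n\n"
        f"--- {filename} ---\n{source}"
    )

def review(filename: str, source: str, ask_llm) -> str:
    """Send the code off for analysis; a human fixes anything found."""
    return ask_llm(build_review_prompt(filename, source))

if __name__ == "__main__":
    # Stubbed model so the sketch runs without a network connection.
    fake_llm = lambda prompt: "L2: user input reaches os.system() unescaped"
    print(review("deploy.py", "import os\nos.system(cmd)", fake_llm))
```

The key design point is that the model's output is a report for a human to act on, never a patch that lands in the tree, which is exactly the "information, not liabilities" distinction the comment is making.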
I only commit code that is roughly the same as I would have written anyway.
It feels as good for developer ergonomics as the move away from CRT monitors.
It’s about the same for AI coding; I just get better results.
- Vendors get to know everything about you
- Chips are becoming more politicized; I fear artificial scarcity, as with housing, will be imposed on chips, driving up prices.
- It causes a lot of centralisation. No, I cannot run DeepSeek at home; I don't have $100,000+ lying around, and 1TB of VRAM is not chump change.
- It can be a threat to the flourishing of open source. There is no longer a reason for me to work with other devs to build something in public together. I just have the LLM write what I need. It isolates.
These are the only drawbacks. Everything else is clearly the artisans' ego getting in the way. That said, if a piece of code is critical infrastructure on which many other things hinge, I will still hand-code it.
People used to drive manual. Now it’s all automatic transmission. Some cars even drive themselves.
People used to proudly use Vi to write code. But now IDEs are commonplace.
People used to write asm by hand. Transport Tycoon was written in assembly. But these days that would be insane.
Technological progress is an absolute thing. It produces too much convenience and wealth to ignore.
Do you want a delivery service that takes 2 days instead of 30 minutes to bring you a pizza, just so you don't forget how to ride your horse?
Your craft can be typing out code on a keyboard; or it can be building things in the best possible way with the best available tools.
When doing agentic development, you need to be in control, at least for now. Every frontier model will still do incredibly stupid stuff, and if you let it cook unchallenged, you'll have a codebase that doesn't scale. Claude will happily keep piling turds upon your tower of turds, but at some point, even an LLM will have a hard time working in it.
When you are at the wheel, the engineering hasn't changed. You're still solving all the same problems, but you can iterate a lot faster. Code is now ~free, and the cost of having a bad idea is now much cheaper, because you can quite literally speak the solution out loud and fix it in a few minutes.
When it comes to employment and other people paying you to code, though, not using AI is increasingly a non-starter for most of us.