A guy I went to high school with complains endlessly about AI-generated art and graphics (he's an artist) and, like you, just wants to bury his head in the sand.
Consumers don't care if art is generated by AI or humans and in a short period of time, you won't be able to tell the difference.
With the money being poured into AI by all major tech companies, you will be unemployed if you don't keep up with AI.
Maybe not yet. The real "art" consumers have always been very sensitive to originality (and thus scarcity). It's an essential principle of art that it is the result of thousands or millions of deliberate choices. If you use a machine for creation, you make fewer choices. You delegate most of your talented/crazy/hard choices to the model (which is built on the choices of people who were already talented, but recombines them in a random way). The result is thin and diluted, even if it looks deliberate. In my opinion, most art lovers will continue to seek out dense, human-made art, and will ask for some kind of proof of it. :) The real art will be even more appreciated. I guess.
Anyone who spent time learning the AI tools over that period has basically wasted their time. Working with agents is nothing like prompt engineering, and I imagine whatever comes after agents will be nothing like agents, either. Sounds like those who try to keep up with AI will be equally unemployed.
I suppose I shouldn't care too much. Less competition for people like me who have embraced the change.
I mean, you're not wrong: serious people drop into assembly when they need to. And even if you work in a context where you can't or don't drop down into assembly, being able to write your own compilers is incredibly useful.
My point was that comparing the rise of AI tooling to the rise of HLL compilers is a much better comparison than comparing it to crypto.
HLL compilers were originally seen as crutches and inferior tools; "real" programmers used assembly. Compiler-generated code was derided as inefficient and ugly.
And it was! In the early days, a good programmer who knew the machine could outdo the compiler. But that didn't stop a huge expansion of new programmers who could write COBOL and FORTRAN but never learned assembly. And the compilers got better over time. These days it's a rare wizard who can outdo a compiler's optimizations, and it takes multiple orders of magnitude longer for those rare humans to achieve it.
LLM tooling isn't going away. Even in these very early days, it enables non-programmers to build basic, working applications using English requests. And the tools have been getting better on almost a monthly basis.
You can like them or not like them, just like the early programmers could like or not like compilers. But dismissing them as analogous to empty crypto hype is a bad comparison.