I propose calling it "thunking"
If you do math in your head or math with a pencil/paper or math with a pocket calculator or with a spreadsheet or in a programming language, it is all the same thing.
The only difference with LLMs is the anthropomorphization of the tool.
also, sorry but you (fellow) nerds are terrible at naming.
while "thunking" possibly name-collides with "thunks" from CS, the key is that it is memorable, 2 syllables, a bit whimsical, and just different enough to indicate both its source meaning and some possible unstated difference. Plus it reminds me of "clunky", which is exactly what it is - "clunky thinking" aka "thunking".
And frankly, the idea it's naming is far bigger than what a "thunk" is in CS
you guys would have called lightsabers "laser swords" like Lucas originally did before Alec Guinness corrected him
That's absurd on its face, so I don't know what it means. You would have to be a philosophical zombie to make such an argument.
Humans are special, we emit meaning the way stars emit photons, we are rare in the universe as far as empirical observation has revealed. Even with AGI the existence of each complex meaning generator will be a cosmic rarity.
For some people that seems to be not enough; due to their factually wrong worldviews they see themselves as common and worthless (when they empirically aren't) and need this little psychological boost of unexaminable metaphysical superiority.
But there is an issue, of course: the type of thinking humans do is dangerous but net positive and relatively stable. We have a long history in which most instantiations of humans can persist and grow themselves and the species as a whole; we have a track record.
These new models do not. People have brains that, as they stop functioning, stop persisting the apparatus that supports the brain, and they die; people tend to become less capable and active as their thinking deteriorates, and they hold less influence over others except in rare cases.
This is not the case for an LLM: they seem able to hallucinate endlessly and, as they have access to the outside world, maintain roughly the same amount of causal leverage. The clarity and accuracy of their thinking isn't tied to their persisting.