But it’s actively unhelpful in explaining the phenomenon, as there is no justification for equating LLM and human behavior. It’s just confusing and misleading.
This is obviously wrong. LLMs are trained on material humans created. Everything they output is a result of human input, even if not a direct one.