"Bullshitting" seems like a good term for both accurate and inaccurate responses.
Let's extend "LLMs have no intention of being wrong" to "LLMs have no inherent sense of being correct" - sometimes their predictions happen to be correct, sometimes they aren't. But they're all hallucinations generated by the same process.
Nah, it can just mean talking without much rigor or verification of facts, often with a loose boundary between opinion and fact - the "to talk in an exaggerated or foolish manner" definition.
As in "my buddies and I were bullshitting about movies the other day."
ChatGPT definitely talks in an exaggerated manner, confidence-wise.