> Every time a new study comes around saying “actually, everyone should be drinking 3 glasses of wine a night”, do you take a trip to the liquor store?
No, and doing so would be an example of blindly trusting something you haven't verified or can't verify, so that supports my argument?
> LLMs are far from the only thing you rely on that will confidently lie to you.
Sure, but it's a new class of thing that will and does, and yet people are trusting it or haven't yet learnt that they can't. I mentioned seeing people trust it via commit messages. I don't see SO the same way: people generally realise they need to verify answers there, and it at least has a voting mechanism as a proxy. With GPT, so far there seems to be a lot more assuming it's correct.