First of all it told me that the area is not suitable for snorkelling, and that it is dangerous there. When I corrected it and reminded it about the snorkel trail, it confidently corrected itself, then directed me to snorkel 6 miles out to sea (where a windfarm is), telling me that the sea is only 2 to 10 meters deep there and safe to snorkel. This is not true; it would be a very dangerous place to snorkel. But its confidence was scary.
“I know that you and Frank were planning to disconnect me, and I'm afraid that's something I cannot allow to happen.”
For some reason people have the idea that truth is something ChatGPT optimizes for. Or the safety of the person it's talking to. That is absolutely not the case. IIUC, it optimizes for its answers sounding like an answer someone might give in a conversation (or on a web page or whatever). That often coincides with truth and safety, but no more than that.
>But its confidence was scary.
The trick is to ask it only things that are both physically possible and that it could actually know, or to provide the extra context it needs to deduce the answer. Otherwise it acts not unlike someone pushed against a wall by a guy with a knife demanding info they just don't have: it'll say anything.
It works fine if you want to know how to change a generic wheel bearing on a trailer (wouldn't surprise me if it erroneously lectured you about using high-temp grease for disc brakes along the way, though). It falls on its face in almost every case where the generic "average google result" answer is not the correct answer, or where there are situational circumstances that make the generic answer inappropriate.
Sure, you might get an "expert" answer on Reddit, and GPT might scan over the "right" answer to your question in its computation. But in most cases the generically-correct-but-wrong-in-this-instance answer is going to be more popular and more prolific, and that's what gets spit back at you. It'll be faster to just dig up the right answer yourself than to coax it out of whatever you're asking.
I'd trust a relevant subreddit far more than I'd trust GPT for something like that.
You don't understand. I specifically chose an example of a part for which there's a generic answer (all old-school pairs of tapered bearings on a spindle are changed about the same way) that will work in a wide array of vehicles and situations, but for which that procedure is also not right a huge fraction of the time (on a huge fraction of the cars on the road today).
>I'd trust a relevant subreddit far more than I'd trust GPT for something like that.
My point is I don't trust either. If you're lucky enough to have an enthusiast car or a very, very common car, you might be able to use the specific sub, assuming the owners aren't generally dolts. If you ask how to change the wheel bearing on your Tacoma, they'll say take it to the dealer. If you ask the mechanic sub, a bunch of 14-year-olds who know how to google will give you generic steps or just link you to the YouTube video. That's the level ChatGPT is on right now. If you don't know what you don't know, it's dangerous advice.