Is it really possible to fix that just from a plug-in? All it has to do is admit when it doesn't have the answer, and yet it won't even do that. This leads me to think that ChatGPT doesn't even know when it's lying, so I can't imagine how a plug-in will fix that.
E.g. if you could just ask [THING] for the true answer, or verify an answer trivially with it... just ask it directly!
I ran into this issue with some software documentation just this morning: the answer was helpful but completely wrong in some intermediate steps. Short of a plugin that literally controlled (or cloned) a dev environment similar to mine and took it over, it had no way to tell that the intermediate result was different from what it claimed.
As an example, I once asked it to show me the diff between two revisions of the code it was writing, and it produced something that looked like it might be a valid patch but did not represent the actual difference between the two versions.
Of course, this specific problem could be fixed with a simple plug-in that runs the unix diff program, but that wouldn't fix the root cause. I would argue that providing a special case for every type of request is antithetical to what AI is supposed to be, since that's effectively how Wolfram Alpha and Google already work.
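To illustrate what such a plug-in would actually do, here's a minimal sketch of the idea: instead of letting the model invent a patch, you compute the real diff deterministically. This uses Python's standard `difflib` rather than shelling out to unix `diff`; the function name and file labels are my own invention, not any actual plug-in API.

```python
import difflib

def real_diff(old: str, new: str) -> str:
    # Compute an actual unified diff between two revisions,
    # rather than trusting the model to reconstruct one from memory.
    # (hypothetical helper; labels "revision_a"/"revision_b" are arbitrary)
    return "".join(difflib.unified_diff(
        old.splitlines(keepends=True),
        new.splitlines(keepends=True),
        fromfile="revision_a",
        tofile="revision_b",
    ))

print(real_diff("a\nb\nc\n", "a\nB\nc\n"))
```

The point being: the diff is trivially computable by a deterministic tool, which is exactly why having the model hallucinate it instead is so jarring.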