> But of course the output contains errors sometimes. So do search engine results.
That's not true.
Search engine results are links and (non-AI-generated) summaries of existing resources on the web. No search engine returns links to resources it generated itself in response to the query. Those resources can contain inaccurate information, yes, but the search engine itself does not return errors.
An LLM's output does not contain errors "sometimes". The output of an LLM is never true nor false in itself, in the same way that the next word your keyboard suggests on a mobile device is never true nor false. It's simply the next suggestion based on the context.
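To make the keyboard analogy concrete, here is a toy sketch (made-up corpus, not any real keyboard's algorithm): a bigram-frequency "next word" suggester. Its suggestion is a statistical continuation of the context, not a claim about the world, so it can't meaningfully be called true or false.

```python
from collections import Counter, defaultdict

# Toy training text (entirely made up for illustration).
corpus = "the cat sat on the mat and the cat slept on a mat".split()

# Count, for each word, which words follow it and how often.
followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def suggest(word):
    """Return the most frequent continuation of `word`, or None.

    The result is a frequency-based guess given the context,
    neither truthful nor false in itself.
    """
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(suggest("the"))  # "cat" — it followed "the" most often in the corpus
```

An LLM is vastly more sophisticated, but the nature of the output is the same kind of thing: a continuation conditioned on context, not a statement checked against reality.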
These two methods of accessing information very clearly do not have the same limitations. A search engine provides links to specific resources. An LLM generates some approximation of some average of some information.
It's up to intelligent, thinking people to decide whether an LLM or a search engine is currently the best way for them to parse through information in search of truth.