That’s true, but I think he’s suggesting it generates ideas which you can then research. You’d know it was hallucinating when you go to research a topic and find nothing. So it’s basically used as a discovery tool.
Heavy caution... I tried this with GPT-3 on a topic I know well (electric motors), and beyond what you might find on the first page of a search engine, it went to hallucination station pretty quickly.