Previously, grounding was $35 per 1,000 queries.
1) AIs are not reliable. Even if they are not hallucinating, their training may have been compromised by misinformation, either accidentally or deliberately.
2) People who fall victim to misinformation are often not interested in fact checking, so your product may be preaching to the choir.
3) The most important claims that need to be fact-checked are often about something that happened recently, or about facts that lie buried deep in large government reports. AI is not going to use such information in real time to give an accurate response.
1) 100% agreed. I would never release a tool like this that just used a bare LLM. But this tool uses Grounding with Google Search, so Gemini uses Google Search as a tool to try to verify whether the statement is true. If it cannot ground its response, it returns a message saying it can't fact-check the statement - it will never return an LLM's "opinion". If it can ground the response, it returns an answer with citations and links to Google searches.
2) That may be true. But even I have found it useful as I'm reading comments on YouTube posts and in Facebook groups to verify whether what people are saying is true or not. Even people I agree with spread (intentionally or not) mis- and disinformation since it's so easy to do.
3) Per the first point, since it's grounded in Google Search, it gives real-time responses. Try it with the headlines on any news site.
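The grounding flow described in point 1 - answer with citations when the model grounds its response, refuse otherwise - can be sketched roughly as below. The response shape, field names, and refusal message here are assumptions for illustration, not the tool's actual code or the Gemini API's exact schema.

```python
# Hypothetical sketch of the grounding check: field names are assumed.
def format_fact_check(response):
    """Return a cited answer when the model grounded its reply, else a refusal."""
    meta = response.get("grounding_metadata") or {}
    chunks = meta.get("grounding_chunks") or []
    if not chunks:
        # No grounding found: never fall back to the LLM's own "opinion".
        return "Sorry, I can't fact-check that statement."
    citations = "\n".join(f"- {c['title']}: {c['uri']}" for c in chunks)
    return f"{response['text']}\n\nSources:\n{citations}"

# Example with mocked responses (real ones would come from the model call):
grounded = {
    "text": "The claim is accurate.",
    "grounding_metadata": {
        "grounding_chunks": [{"title": "Example News", "uri": "https://example.com"}]
    },
}
ungrounded = {"text": "Probably true?"}
print(format_fact_check(grounded))
print(format_fact_check(ungrounded))
```

The key design point is that the refusal path depends only on whether grounding metadata is present, so an ungrounded model guess is never surfaced as an answer.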
Synonyms: detain, stem, clog, bottleneck, balk, baffle, hobble, bridle, block, obstruct, hamper, hinder, impede. Definition: to cause a reduction, such as in rate or intensity; diminish.
Use it in a sentence.
He checked their facts.
fact-check: to verify the factual accuracy of
"Our fact-check work is supported in part by a grant from Meta."