Call us back when the verification layer is verified to detect errors and hallucinations.
So exactly the wrong way to use LLMs? Cool story bro.