A use case we have been working on with LLMs is notifying people when the answer to their query changes because a source document was revised. Obviously, we want to avoid periodically re-running every query through the LLM.
Why I think it’s cool:
- We don’t spin in a loop re-querying the LLM.
- Alerts are LLM-deduplicated, so users aren’t spammed over typo fixes.
- Best of all, our framework, Pathway, takes care of handling the updates; the example looks almost like a regular, static RAG chatbot.
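The deduplication idea can be sketched roughly like this: when a revision produces a new answer, a judge decides whether the change is substantive before alerting. Everything here is illustrative, not Pathway's actual API: `should_alert` is a hypothetical helper, and `toy_judge` is a trivial stand-in for what would be an LLM call in the real pipeline.

```python
# Sketch of alert deduplication (illustrative only; not Pathway's API).
# When a source document changes, the old and new answers are compared,
# and a judge decides whether the change warrants notifying the user.
# `judge` is any callable returning True/False; in production it would
# wrap an LLM prompt like "do these answers differ in substance?".

def should_alert(old_answer: str, new_answer: str, judge) -> bool:
    """Return True if the revised answer warrants notifying the user."""
    if old_answer == new_answer:
        return False  # nothing changed at all
    prompt = (
        "Do these two answers differ in substance (not just typos "
        f"or formatting)?\nOld: {old_answer}\nNew: {new_answer}"
    )
    return judge(prompt)

def toy_judge(prompt: str) -> bool:
    """Stand-in for the LLM: flags a change only if the lowercased
    word sequences differ, so pure capitalization fixes are ignored."""
    _, old_line, new_line = prompt.split("\n")
    norm = lambda line: line.split(": ", 1)[1].lower().split()
    return norm(old_line) != norm(new_line)

print(should_alert("The price is $10.", "The price is $12.", toy_judge))  # True
print(should_alert("Ship date: May 5", "Ship Date: May 5", toy_judge))    # False
```

Swapping `toy_judge` for an actual LLM call is what keeps typo-level revisions from turning into user-facing noise.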
More context + GIF of how it works for Google Drive document alerts: https://pathway.com/developers/showcases/llm-alert-pathway
Happy to hear your thoughts!