It's one thing to have private information at rest, another to have it indexed, and interpreted by a LLM. What if some virus orders the LLM to search for blackmail material and email it to them? The very act of putting a LLM near your data is a security concern. If someone else orders your Siri to reveal something, it can get to the prize in seconds, with AI help.