Edit:
This line from the keynote is also suspect: "And just like your iPhone, independent experts can inspect the code that runs on the servers to verify this privacy promise."
First off, do "independent experts" actually have access to closed-source iOS code? If so, we already have evidence that such inspection is insufficient (https://www.macrumors.com/2024/05/15/ios-17-5-bug-deleted-ph...).
The actual standard for privacy and security is open source software; anything short of that is just marketing buzz. Every company has an incentive not to leak data, yet data leaks still happen.
>> Independent experts can inspect the code that runs on Apple silicon servers to verify privacy, and Private Cloud Compute cryptographically ensures that iPhone, iPad, and Mac do not talk to a server unless its software has been publicly logged for inspection. Apple Intelligence with Private Cloud Compute sets a new standard for privacy in AI, unlocking intelligence users can trust.
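The quoted claim describes a binary-transparency scheme: the client only talks to a server whose software measurement appears in a public, append-only log. A minimal sketch of that check, assuming a SHA-256 measurement and a set as a stand-in for the log (all names here are illustrative, not Apple's actual API):

```python
import hashlib

# Stand-in for a public, append-only transparency log of release hashes.
PUBLIC_TRANSPARENCY_LOG = set()

def publish_release(software_image: bytes) -> str:
    """Operator publishes the measurement of a server build to the log."""
    digest = hashlib.sha256(software_image).hexdigest()
    PUBLIC_TRANSPARENCY_LOG.add(digest)
    return digest

def client_will_connect(attested_measurement: str) -> bool:
    """Client refuses to send any request unless the server's attested
    software measurement has been publicly logged for inspection."""
    return attested_measurement in PUBLIC_TRANSPARENCY_LOG

logged = publish_release(b"server build v1")
print(client_will_connect(logged))           # logged build: True
print(client_will_connect("deadbeef" * 8))   # unlogged build: False
```

The real system additionally relies on hardware attestation to bind the measurement to what is actually running; this sketch only shows the log-membership gate the keynote describes.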
If Apple says it, do they have any disincentives to deliver? Not really. Their ad business is still relatively small, and already architected around privacy.
If someone who derives most of their revenue from targeted ads says it? Yes. Implementing it would directly hurt their primary revenue stream.
IMHO, the strategic genius of Apple's "privacy" positioning has been that it costs them little. It might make things more inconvenient technically, but it doesn't threaten their revenue model, in stark contrast to their competitors.