Now: you upload images to iCloud Photos. When doing so, your device also uploads a separate safety voucher for each image. Once there are enough vouchers for CSAM-matched images in your library, Apple gains the ability to access the data in those vouchers. One of the data elements in the voucher is an "image derivative" (probably a thumbnail), which is manually reviewed. If the image derivative also looks like CSAM, Apple files a report with NCMEC's CyberTipline. Apple could (for now) access the image you stored in iCloud, but it doesn't need to: all the data it needs is in the safety voucher.
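To make the shape of this concrete, here's a minimal sketch of the voucher and the threshold gate it implies. All names and the threshold value are illustrative, not Apple's actual format:

```python
from dataclasses import dataclass

# Hypothetical sketch of the per-image "safety voucher" described above.
# Field names are illustrative; Apple's real format is not public here.
@dataclass
class SafetyVoucher:
    blinded_match_payload: bytes   # reveals nothing unless the image matches the CSAM hash list
    encrypted_derivative: bytes    # the "image derivative" (likely a thumbnail)
    key_share: bytes               # one share of a decryption key (threshold scheme)

THRESHOLD = 30  # illustrative number; Apple only says "a threshold of matches"

def can_review(matching_vouchers: list[SafetyVoucher]) -> bool:
    # Apple can only combine the key shares and decrypt the derivatives
    # once enough matching vouchers have been uploaded.
    return len(matching_vouchers) >= THRESHOLD
```

The point of the gate is that a single match reveals nothing; only crossing the threshold makes the derivatives reviewable.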
A lot of words have been spilled on this topic, yet I'd be surprised if a majority of people are even aware of these basic facts about how the system operates.
I think Apple has botched the rollout of this change by failing to explain clearly how it works. As a result, rumors and misunderstandings have proliferated instead.
https://digit.fyi/apple-admits-scanning-photos-uploaded-to-i...
So the author of the article is technically correct: Apple intentionally uploads CP to their servers for manual review, which is explicitly forbidden by law.
He even describes the issue with thumbnails.
When you choose to upload your images to iCloud (which currently happens without end-to-end encryption), your phone also generates some form of encrypted ticket. In the future, the images themselves will be encrypted, with a backdoor key encoded in the tickets.
If Apple receives enough images that are considered a match, the tickets become decryptable (I think I saw Shamir's Secret Sharing mentioned for this step). Right now Apple doesn't need that, because it has the unencrypted images; in a future scheme, decrypting these tickets would allow Apple to decrypt your images.
(I've simplified a bit, I believe there's a second layer that they claim will only give them access to the offending images. I have not studied their approach deeply.)
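For readers unfamiliar with the threshold step, here is a toy t-of-n Shamir's Secret Sharing over a prime field; this is a textbook sketch of the primitive mentioned above, not Apple's actual construction. Each matching voucher would carry one share of a key; with fewer than t shares, the key is information-theoretically hidden:

```python
import random

# Toy Shamir t-of-n secret sharing over a prime field.
P = 2**127 - 1  # a Mersenne prime, big enough for a 16-byte secret

def make_shares(secret: int, t: int, n: int):
    # Random polynomial of degree t-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret
```

Any t shares reconstruct the secret exactly; t-1 shares are consistent with every possible secret, which is what makes the "below threshold, Apple learns nothing" claim meaningful.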
At no step of the proposal does Apple access the images you store in iCloud; all access goes through the associated data in the safety voucher. This design allows Apple to switch iCloud storage to end-to-end encryption with no protocol changes.
They could instead send the list of hashes to the device (which they must already trust to faithfully compute the local hash) and simply let the device report when there are hits. That would also be far more CPU- and bandwidth-efficient.
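That simpler alternative is essentially just a set-membership check on the device. A sketch, with made-up stand-in hash values (a real system would use a perceptual hash like NeuralHash):

```python
# Hypothetical on-device check: the hash list ships to the phone, and the
# phone reports only when there are hits. Cheap and simple, but it exposes
# the list, so anyone could inspect what is actually being searched for.
KNOWN_HASHES = frozenset({0x1A2B, 0x3C4D})  # stand-ins for real perceptual hashes

def local_hits(photo_hashes):
    # Plain set intersection: O(1) per photo, no server round-trips.
    return [h for h in photo_hashes if h in KNOWN_HASHES]
```

The efficiency claim follows directly: a local set lookup per photo versus a cryptographic protocol exchange per photo.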
The PSI serves the purpose that if Apple starts sending out hashes for popular lawful images connected to particular religions, ethnicities, or political ideologies, it is information-theoretically impossible for anyone to detect the abuse. It also makes it impossible to tell whether different users are being tested against different lists (e.g., whether Thai users are being tested against political cartoons that insult the king).
Not having backdoors is a hard requirement for end-to-end encryption to offer any privacy guarantees.