i dunno, reading this sure makes it sound like it's only for photos stored in icloud. and the existence of such an elaborate system seems to suggest that photos are end to end encrypted on their servers; otherwise they could just hash them server side.
it does the matching client side, so the match results never leave the client in readable form. what it sends alongside each photo is an encrypted safety voucher, plus this threshold scheme: if enough of the vouchers are hits, apple can decrypt them, otherwise they see nothing.
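to make that concrete, here's a toy sketch of the on-device matching step. apple's real system uses a perceptual hash (NeuralHash) against a blinded database, and the match bit is encrypted inside the voucher; sha-256 and a plain set here are just stand-ins, and every name below is made up:

```python
import hashlib

# toy stand-in for the hash database apple ships to devices (the
# real one is NeuralHash-based and "transformed into an unreadable
# set of hashes"; a plain set is purely for illustration)
known_hashes = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def make_voucher(image_bytes: bytes) -> dict:
    """Runs on-device before upload: match locally, emit a voucher.

    In the real system the match result is encrypted inside the
    voucher (via private set intersection), so the server can't
    read it below the threshold. Here it's in the clear just so
    you can see what is being computed.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return {"image_hash_matched": digest in known_hashes}

print(make_voucher(b"vacation-photo-bytes"))    # {'image_hash_matched': False}
print(make_voucher(b"known-bad-image-bytes"))   # {'image_hash_matched': True}
```

the point is just that the comparison happens before upload; everything the server actually receives is opaque until the threshold logic kicks in.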
it seems pretty unequivocal that this is all a system to support scanning of e2e encrypted images on icloud. there would be no need for something this elaborate if they could just scan the images on the server. as far as i know, google does not e2e encrypt photos, and by all accounts they do scan them server side.
to quote their statement https://www.apple.com/child-safety/:
> To help address this, new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.
> Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
> Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.
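the "private set intersection" bit is the interesting part. apple hasn't published the exact protocol in this statement, but the textbook DDH-style PSI construction gives the flavor: both sides blind their hashed items with secret exponents, and because exponentiation commutes you can compare doubly-blinded values without either side ever seeing the other's raw set. toy sketch with deliberately tiny, insecure parameters (apple's variant additionally keeps the result hidden from both sides until the threshold; this only shows the blinding trick):

```python
import hashlib
import secrets

# toy DDH-style private set intersection -- NOT apple's published
# protocol, just the classic construction. p = 2q + 1 is a safe
# prime; real deployments need thousands of bits or a curve.
P, Q = 1019, 509

def hash_to_group(item: bytes) -> int:
    # hash into Z_p*, then square to land in the order-q subgroup
    h = int.from_bytes(hashlib.sha256(item).digest(), "big") % P
    return pow(max(h, 2), 2, P)

def blind(points, secret):
    return {pow(pt, secret, P) for pt in points}

# each side holds a secret exponent it never reveals
device_secret = secrets.randbelow(Q - 2) + 2
server_secret = secrets.randbelow(Q - 2) + 2

device_photos = [b"cat.jpg", b"dog.jpg", b"bad.jpg"]
server_db     = [b"bad.jpg", b"other-bad.jpg"]

# device sends its hashes blinded once; server blinds them again,
# and separately sends its own set blinded once
device_once  = blind((hash_to_group(x) for x in device_photos), device_secret)
both_blinded = blind(device_once, server_secret)   # g^(x*a*b)
server_once  = blind((hash_to_group(y) for y in server_db), server_secret)

# device raises the server's blinded set by its own secret; matches
# line up because (g^x)^a^b == (g^x)^b^a
server_both = blind(server_once, device_secret)
print(len(both_blinded & server_both))   # -> 1 overlapping item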
> Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.
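"threshold secret sharing" is standard crypto, and shamir's scheme is the textbook instance: split a key into shares so that any t of them reconstruct it, while t−1 reveal literally nothing. apple doesn't say which scheme they use, so this is just the generic construction the quoted paragraph is describing; the apparent idea is that each matching voucher carries one share of a per-account key, so apple only collects enough shares to decrypt anything once the match threshold is crossed:

```python
import secrets

# textbook shamir secret sharing over a prime field -- apple hasn't
# published which threshold scheme they use, this is the generic idea
PRIME = 2**127 - 1  # a mersenne prime, big enough for a demo key

def split(secret: int, n: int, t: int):
    """Split `secret` into n shares, any t of which reconstruct it."""
    # random degree-(t-1) polynomial with the secret as constant term
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the constant term."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

key = secrets.randbelow(PRIME)           # stand-in for a decryption key
shares = split(key, n=10, t=3)           # e.g. one share per matched photo
assert reconstruct(shares[:3]) == key    # any 3 shares recover the key
assert reconstruct(shares[2:5]) == key
assert reconstruct(shares[:2]) != key    # 2 shares yield garbage, not the key
```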
> Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.
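and on the "one in one trillion" claim: the threshold is what turns a merely-small per-image false match rate into an astronomically small per-account one, since an innocent account needs dozens of independent false matches before anything decrypts. back-of-envelope binomial arithmetic with made-up numbers (apple hasn't published the real per-image rate or the threshold value):

```python
from math import comb

# probability an innocent account crosses the threshold, assuming
# each photo independently false-matches with probability p. all
# three numbers below are invented for illustration.
p = 6e-4          # hypothetical per-image false match rate
photos = 10_000   # hypothetical photos uploaded in a year
threshold = 30    # hypothetical match threshold

# P(at least `threshold` false matches among `photos` uploads);
# tail terms beyond +50 are negligible here
flag_prob = sum(
    comb(photos, k) * p**k * (1 - p) ** (photos - k)
    for k in range(threshold, threshold + 50)
)
print(f"chance of wrongly crossing the threshold: {flag_prob:.3e}")
```

with those invented numbers the per-account rate comes out around 2e-12 per year, i.e. the order of magnitude of the quoted claim. the real takeaway is that the account-level number is exponentially sensitive to the threshold, which is presumably how they tuned it.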