Not only that, it commits a felony when it transmits found CSAM to a party other than NCMEC (i.e., to Apple itself).
Your phone and Photos app ask for this, and you have to accept it before it is enabled. Sounds voluntary to me.
Worth reading: https://pingthread.com/thread/1424873629003702273
I know someone at Apple who knows their head of privacy... so #2 may be in question, given the design of this system and its capability of further compromising the privacy of millions of Chinese citizens on the Chinese government's whim (and any other strongly-authoritarian government).
And do you seriously think they didn’t check the legality of what they built? Really?
That's the catch. Nothing in the system design prevents them from adding hashes of other types of photographs to that database.
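To make the point concrete: in any hash-matching design, the scanning code is the same regardless of what the database holds; only the contents of the hash set decide what gets flagged. A minimal, hypothetical sketch (using SHA-256 as a stand-in; the real system uses a perceptual hash like NeuralHash, and the names here are invented for illustration):

```python
import hashlib

def photo_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash; real systems do not use SHA-256.
    return hashlib.sha256(data).hexdigest()

def scan(photos, flagged_hashes):
    """Return the photos whose hashes appear in the flagged set.

    Note that nothing in this function knows or cares what kind of
    material the hashes represent.
    """
    return [p for p in photos if photo_hash(p) in flagged_hashes]

# Whoever controls the database controls the matches:
db = {photo_hash(b"known-csam-image")}
db.add(photo_hash(b"politically-sensitive-image"))  # nothing in the code stops this

matches = scan([b"vacation-photo", b"politically-sensitive-image"], db)
```

The sketch is the whole argument: auditing the client code tells you nothing about what the opaque hash database will match.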
> And do you seriously think they didn’t check the legality of what they built?
IANAL, but the law clearly states that transmitting CSAM to any party other than NCMEC is a felony. Apple != NCMEC. More info: https://www.hackerfactor.com/blog/index.php?/archives/929-On...
The very same law you’re citing describes, in detail, the good-faith diligence process that requires the service provider to verify the suspected material before transmission to NCMEC. But no: some random blog and you, you two have a handle on legal analysis that the most litigiously sensitive entity on Earth must have missed while designing one of the most litigiously sensitive systems ever fielded by humans. How’d they miss felony criminal liability, right? It’s just too easy to overlook while designing a system whose sole purpose is to gather legally actionable evidence against other people.
As someone who’s built these systems for over a decade, it’s remarkable how one Apple press release can make everyone so hopelessly uninformed and confident that they know the score. Nobody used the acronym CSAM until a week ago except for people like me, those of us haunted by (actual) nightmares of this shit while HN distantly pontificates on the apparent sacrilege of MD5ing your photos of a family vacation to Barbados to see if you happen to be sharing images of children being raped.
Nobody commenting on this has ever seen child pornography. I’d take that to the bank. Did you know the organized outfits build well-polished sites modeled on PornHub, complete with React and a design palette? 35 thumbnails of different seven-year-olds right on the front page, filterable by sexual situation. Filterable by how many adults are involved. With a comment section, even, and God help you if you even begin to imagine what is said there. You’re right, though: let’s think about your privacy and the criminal liability for Apple for taking action on something that clearly doesn’t matter to anyone except those stuck with dealing with it.
Get real. Sometimes the lack of perspective among otherwise smart people really worries me, and this conversation about Apple’s motives for the last week or so has worried me the most yet.
People should really stop using this “conspiracy theory” as a reason for Apple not to scan for CSAM in a privacy-preserving fashion. There are way too many “hot takes” that don’t take into account any legal ramifications of their “what if” scenarios.
And the law clearly states that this doesn’t apply if an immediate effort is made to involve law enforcement. Besides, Apple is not transmitting it to Apple; the user is.