The very same law you’re citing describes, in detail, the good-faith diligence process that requires the service provider to verify the suspected material before transmitting it to NCMEC. But no, you and some random blog have a handle on the legal analysis that the most litigiously sensitive entity on Earth must have missed while designing one of the most litigiously sensitive systems ever fielded by humans. How’d they miss felony criminal liability, right? It’s just so easy to overlook while designing a system whose sole purpose is to gather legally actionable evidence against other people.
As someone who’s built these systems for over a decade, I find it remarkable how one Apple press release can make everyone so hopelessly uninformed and yet so confident that they know the score. Nobody used the acronym CSAM until a week ago except people like me, those of us haunted by (actual) nightmares of this shit while HN distantly pontificates on the apparent sacrilege of MD5ing your photos of a family vacation to Barbados to see if you happen to be sharing images of children being raped.
Nobody commenting on this has ever seen child pornography. I’d take that to the bank. Did you know the organized outfits build well-polished sites that look like PornHub, complete with React and a design palette? 35 thumbnails of different seven-year-olds right on the front page, filterable by sexual situation. Filterable by how many adults are involved. With a comment section, even, and God help you if you begin to imagine what gets said there. You’re right, though: let’s think about your privacy, and about Apple’s criminal liability for taking action on something that clearly doesn’t matter to anyone except the people stuck dealing with it.
Get real. Sometimes the lack of perspective among otherwise smart people genuinely worries me, and this past week’s conversation about Apple’s motives has worried me more than anything yet.