Still: they scan photos locally - those are not cloud photos, those are local photos. And they have deployed the technical capability. You can bet that once the capability exists, they will bend to government demands - there's ample precedent for that.
So, yes, Apple, unlike all the others, scans your photos locally: if they are going to be uploaded to the cloud, or if Apple is forced to.
They are cloud photos. I say that because:
1. The photos are in the process of being uploaded to the cloud when they are scanned
2. The result of the scan is attached to the photo only when it is uploaded to the cloud. If the photo is deleted from the cloud, or the upload is canceled, the scan result is discarded
Practically, the system works precisely the same whether the scanning happens on-device before the image reaches the cloud, or on the server after the image reaches the cloud.
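To make points 1 and 2 concrete, the flow could be sketched roughly like this. This is purely an illustrative Python sketch under my own assumptions; Apple's actual design uses NeuralHash, a blinded hash database, and private set intersection, none of which is modeled here, and every name below is made up:

```python
import hashlib
from dataclasses import dataclass
from typing import Optional, Set

@dataclass
class UploadPayload:
    photo_bytes: bytes
    # The match result travels only with the upload; it means nothing
    # server-side until the photo actually arrives in the cloud.
    safety_voucher: Optional[bytes]

def perceptual_hash(photo_bytes: bytes) -> str:
    # Stand-in only: a real system uses a perceptual hash that survives
    # resizing/re-encoding, not an exact cryptographic hash like SHA-256.
    return hashlib.sha256(photo_bytes).hexdigest()

def prepare_upload(photo_bytes: bytes, known_hashes: Set[str]) -> UploadPayload:
    """The scan runs on-device, but only as one step of building the upload."""
    digest = perceptual_hash(photo_bytes)
    voucher = digest.encode() if digest in known_hashes else None
    return UploadPayload(photo_bytes, voucher)

def cancel_upload(payload: UploadPayload) -> None:
    # Upload canceled (or photo deleted from the cloud): the scan result
    # is discarded along with it.
    payload.safety_voucher = None
```

The point of the sketch is that the scan result exists only as an attachment to an upload in flight; a photo that never leaves the device never produces a voucher the server can see.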
The only well-intentioned argument for why on-device vs. on-server scanning matters is the "slippery slope" argument, which presupposes that:
1. Apple putting this scanning code in iOS not only somehow makes it easier/more tempting to use it for non-CSAM content, but all but guarantees it will be used that way.
2. Apple does not already have the ability to run whatever code they want, on any of your devices, without you ever knowing
3. Apple folds very easily to government demands, especially when it comes to privacy, their core differentiator
I don't think any of these are true. You might think they are, but then I'm not sure there's any point in discussing this further.
> or if they are forced to.
I'm not sure what this implies. If someone forces you to upload a photo to the cloud, surely that will get scanned regardless of whether the scanning is performed on-device or on-server?
Therefore, the scanning is local. There's really nothing more to it: the distinction rests on where the input is read from, in addition to where it is processed. Both happen inside the phone while you hold it in your hand.
It is scanning images locally.
This is totally unacceptable, and should never become acceptable.
This is what I don't understand about the whole argument around this CSAM debacle. I've read quite a bit of the discussion, as someone who takes privacy fairly seriously, and it never really gets discussed. Could someone maybe point me in the direction of some literature about this? Is anyone doing extensive load and packet analysis? Don't they (Apple) upload at least some E2E data?...
My iPhone already does an insane amount of "indexing", including image classification. This is all under the hood and I have no idea what else it's doing; for all I know it's mining Monero. Additionally, all my iOS devices seem to send an inordinate amount of data to the cloud. I'm particularly sensitive to this because I don't have a strong internet connection, and I frequently have to turn off WiFi on my phone or iPad when playing online games to stabilize my ping.
I'm also skeptical that you can really ensure privacy from a Five Eyes country. Maybe I just read too many spy novels as a kid, but it doesn't take a lot of imagination for me to guess how any given decently large western company could be completely infiltrated by a multinational espionage coalition.
Idk, I tend to like that Apple is fighting against ad-tech, as that power dynamic is at least believable. I do think that playing around with deGoogled Android is fun, and in my experience it is much better suited to dropping off the cellphone grid. I have an Android running Lineage and microG, and with OSM, Kiwix (Wikipedia is indispensable, IMO), and a handful of other apps, it serves the majority of the purposes of a cellphone without the need for data. I still daily drive my iPhone, mostly because the UX is a lot better than deGoogled Android.
Now if Apple developed a special update that they sent to only a few choice targets, that might be able to go under the radar.
You can wrap intrusions in the form of "think about the kids" (as is used here), "think about security/terrorism", and so on. This playbook has been used ad nauseam; isn't it about time we learned?
If _Apple_ is forced to (e.g. by a judge), and they can't claim the request is technically impossible.