- All cloud providers scan for it. Facebook, Google, Amazon, Apple, Imgur ... There's a list of 144 companies at NCMEC. There must be a damn good reason for that consensus...
- Because they scan for it, they are obliged (coerced, if you will) to report anything they find. By law.
- Facebook (to pull an example out of the air) reported 20.3 million times last year. Google [1] reported 365,319 times for July->Dec and is coming up on 3 million reports in total. Apple reported 265 cases last year.
- Using e2e doesn't remove the tarnish of CSAM being on your service. All it does is give some hand-wavy deniability: "oh, we didn't know". Yes, but you chose not to know by enforcing e2e. That choice was the act, and kiddy-porn providers flocking to your service was the consequence. Once the wheels of justice turn a few times, and a trend emerges of <insert your e2e service> being where all the kiddy-porn is stored, there's no coming back.
The problem here is that there's no easy technical answer to a problem outside the technical sphere. It's not the technology that's the problem, it's the users, and you don't solve that by technological means. You take a stand and you defend it. To some, that will be their solution ("It's all e2e, we don't know or take any ownership, it's all bits to us"). To others, it'll be more like Apple's stance ("we will try our damnedest not to let this shit propagate or get on our service"). Neither side will easily compromise too much towards the other, because both of them have valid points.
You pays your money and you takes your choice. My gut feeling is that the people bemoaning this as if the end-times were here will still all (for reasonable definitions of "all") be using iCloud in a few months' time, and having their photos scanned (just like they have been for ages, but this time on upload to iCloud rather than on receipt by iCloud).
[1] https://transparencyreport.google.com/child-sexual-abuse-mat...