I think we have seen "think of the kids" used as an excuse for so many things over the years that the pendulum has now swung so far that some of the tech community has begun to think we should do absolutely nothing about this problem. I have even seen people on HN in the last week who are so upset by the privacy implications of this that they start arguing that these images of abuse should be legal, since trying to crack down on them is used as a motive to invade people's privacy.
I don't know what the way forward is here, but we really shouldn't lose sight of the fact that there are real kids being hurt in all this. That is incredibly motivating for a lot of people. Too often the tech community's response is that the intangible concept of privacy is more important than the tangible issue of child abuse. That isn't going to be a winning argument among mainstream audiences. We need something better, or it is only a matter of time until these types of systems are implemented everywhere.
I think the actual idea, whose nuance is often lost (and let's be honest, some people aren't really aware of it and are just jumping on the bandwagon), is that privacy is a right, and that resisting the erosion of rights is extremely important, because such erosion has been shown many times in the past to have far-reaching and poorly understood future effects.
It's unfortunate that some would choose to pair their opposition to the idea with arguing that the thing it's attempting to combat should be legal, when that thing is so abhorrent; but in general I think pushing back on any erosion of our rights is a good and useful way for people to dedicate time and effort.
While we shouldn't lose sight of the fact that there are real children in danger and that this should help, we also shouldn't lose sight of the plenty of precedent suggesting this could eventually lead to danger and problems for a great many other people, children included.
The encryption everywhere mindset of a lot of the tech community is changing the nature of the right to privacy. 50 years ago all these images would have been physical objects. They could have been found by the person developing the photos or the person making copies. They could have been found with a warrant. Now they are all hidden behind E2EE and a secure enclave. To a certain extent the proliferation of technology means people can have an even stronger degree of privacy today than was practical in the past. It is only natural for people to wonder if that shift has changed where the line should be.
Not only has communication become easier, so has surveillance. The only difference now is that it's easier for people not to be aware when their very personal privacy is invaded, and that it can be done to the whole populace at once.
Is it intangible? 18% of the world lives in China alone. That's more people than the "1/10 who are victims of child abuse*", and I'm sure that 18% will only grow as other authoritarian countries get more technologically advanced.
I think "Think of the kids" applies very well to the CREATORS of this pornography. Per Wikipedia, there isn't any conclusive causal relationship between viewing CP and assaulting children.
* Per a Google search, "A Bureau of Justice Statistics report shows 1.6% (sixteen out of one thousand) of children between the ages of 12-17 were victims of rape/sexual assault", which is a lot less than the 10% figure you're citing. Non-sexual abuse wouldn't really have any bearing here, right?
You didn't mention any tangible results here. How would this system by Apple make my life worse? Can you answer that without a slippery slope argument?
>I think "Think of the kids" applies very well to the CREATORS of this pornography. Per Wikipedia, there isn't any conclusive causal relationship between viewing CP and assaulting children.
Why does the causality matter? A correlation is enough that cracking down on this content will result in fewer abusers on the streets.
>* Per a Google search, "A Bureau of Justice Statistics report shows 1.6% (sixteen out of one thousand) of children between the ages of 12-17 were victims of rape/sexual assault", which is a lot less than the 10% figure you're citing. Non-sexual abuse wouldn't really have any bearing here, right?
I wasn't the one citing that, but you are also citing an incomplete number since it excludes younger children.
So far, every defense of this whole fiasco boils down to what you are implying here in part. When you say "the possibility of abusing this system is a slippery slope argument", you're suggesting that identifying a (possible) slippery-slope element in an argument somehow automatically makes it invalid.
Put the other way around: if all that can be said in defense is that the dangers are part of slippery-slope thinking, then you are effectively saying that the only defense is "trust them, let's wait and see, they might not do something bad with it" or "it sure doesn't affect me" (which sounds pretty similar to "I've got nothing to hide"). This might work for other areas, but not so much when it comes to backdooring your device/backups for arbitrary database checks.
And since "oh the children" or "but the terrorists" has become the vanilla excuse for many many things I'm unsure why we are supposed to believe in a truly noble intent down the road here. "No no this time it's REALLY about the kids, the people at work here mean it" just doesn't cut it anymore. So no, I'm not convinced the people at Apple working on this actually do it because they care.
When "but the children" becomes a favourite excuse to push whatever, the problem is very much the people abusing that excuse to this degree, not the ones becoming wary of it.
> some of the tech community has begun to think we should do absolutely nothing about this problem
I don't believe that people think that; I believe that people rather think that the ones in power simply aren't primarily interested in this problem. The trust is (rightfully) heavily damaged.
That's a weird goalpost.
>> Why does the causality matter? A correlation is enough that cracking down on this content will result in less abusers on the streets.
Obviously, of the people who look at CP, a higher percentage will be actual child abusers. The question for everybody is: does giving those people a fantasy outlet increase or reduce the number of kids who get assaulted? At the end of the day, that's what matters.
>> I wasn't the one citing that, but you are also citing an incomplete number since it excludes younger children.
[EDIT: Mistake] Well, you didn't cite anything at all, and were off by a shockingly large margin. Please cite something, or explain why you made up a number.
“With the first link, the chain is forged. The first speech censured, the first thought forbidden, the first freedom denied, chains us all irrevocably. The first time any man's freedom is trodden on we're all damaged...."
(And why do you think it's morally right, or the responsibility of a foreign private company to try and force anything into China against their laws? Another commenter in a previous thread said the idea was for Apple to refuse to do business there - but that still leads to the question, how would that help anyone?)
“Think of the Kids” damn well applies to the consumers of this content - by definition, there is a kid (or baby, in some instances) involved in the CP. As a society, the United States draws the line at age 18 as the age of consent [the line has to be drawn somewhere, and this is a fairly settled argument]. So by definition, in the United States, the children in these pictures are victims who cannot have consented.
Demand drives creation. Getting rid of it on one of the largest potential viewing and sharing platforms is a move in the right direction in addressing the problem.
What I haven’t seen from the tech community is the idea that this will be shut down if it goes too far or beyond this limited scope. Which I think it would be - people would get rid of iPhones if some of the other scenarios privacy advocates are describing occurred. And at that point Apple would have to scrap the program - so Apple is motivated to keep it limited in scope to something everyone can agree is abhorrent.
Yeah, that focus has worked really well in the "war on some drugs," hasn't it?
I don't pretend to have all the answers (or any good ones, for that matter), but we know interdiction doesn't work.
Those who are going to engage in non-consensual behavior (with anyone, not just children) are going to do so whether or not they can view and share records of their abuse.
The current legal regime (in the US at least) creates a gaping hole where even if you don't know what you have (e.g., if someone sends you a child abuse photo without your knowledge or consent) you are guilty, as possession of child abuse images is a felony.
That's wrong. I don't know what the right way is, but adding software to millions of devices searching locally for such stuff creates an environment where literally anyone can be thrown in jail for receiving an unsolicited email or text message. That's not the kind of world in which I want to live.
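To make that concrete: a client-side scanner of the kind being discussed is, at its core, a lookup of file hashes against an opaque database. A minimal sketch (the function names are mine, and I'm using plain SHA-256 where a real system would use a perceptual hash like NeuralHash or PhotoDNA):

```python
import hashlib

# Opaque database of flagged hashes, supplied by the vendor; the device
# owner cannot inspect what content the hashes correspond to.
FLAGGED_HASHES = {
    # SHA-256 of b"foo", standing in for a flagged image
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def file_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash of an image's contents."""
    return hashlib.sha256(data).hexdigest()

def scan(files: dict[str, bytes]) -> list[str]:
    """Return names of files whose hash appears in the flagged set.

    Note: the scanner has no notion of HOW a file arrived on the device.
    A photo the owner deliberately saved and one auto-downloaded from an
    unsolicited message look identical to it.
    """
    return [name for name, data in files.items()
            if file_hash(data) in FLAGGED_HASHES]

# An unsolicited attachment is flagged exactly like a deliberate save.
inbox = {"vacation.jpg": b"bar", "unsolicited_attachment.jpg": b"foo"}
print(scan(inbox))  # → ['unsolicited_attachment.jpg']
```

The point of the sketch is that "intent" never appears anywhere in the matching logic, which is why strict-liability possession laws plus automated reporting make unsolicited content so dangerous to receive.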
Many years ago, I was visiting my brother and was taking photos of his two sons, at that time aged ~4.5 and ~2.
I took an entire roll of my brother, his wife and their kids. In one photo, the two boys are sitting on a staircase, and the younger one (none of us noticed, as he wasn't potty trained and hated pants) wasn't wearing any pants.
I took the film to a processor and got my prints in a couple of days. We all had a good laugh looking at the photos and realizing that my nephew wasn't wearing any pants.
There wasn't, when the photos were taken, nor when they were viewed, any abuse or sexual motives involved.
Were that to happen today, I would be sitting in a jail cell, looking at a lengthy prison sentence. And when done "repaying my debt to society" I'd be forced to register as a sex offender for the rest of my life.
Which is ridiculous on its face.
Unless and until we reform these insane and inane laws, I can't support such programs.
N.B.: I strongly believe that consent is never optional and those under the age of consent cannot do so. As such, there should absolutely be accountability and consequences for those who abuse others, including children.
Why not police these groups with.. you know.. regular policing? Infiltrate, gather evidence, arrest. This strategy has worked for centuries to combat all manner of organized crime. I don't see why it's any different here.
Here's a relevant extract sourced from Wikipedia:
"Most sexual abuse offenders are acquainted with their victims; approximately 30% are relatives of the child, most often brothers, sisters, fathers, mothers, uncles or cousins; around 60% are other acquaintances such as friends of the family, babysitters, or neighbours; strangers are the offenders in approximately 10% of child sexual abuse cases.[53] In over one-third of cases, the perpetrator is also a minor.[56]"
Content warning: The WP article contains pictures of abused children.
I don't think full-scale surveillance is the way to go in free, democratic societies. It is, however, the easiest way, even more so if surveillance can be outsourced to private, global tech companies. It saves the pain of passing laws.
Speaking of laws, they, along with legal proceedings, should be brought up to speed. If investigators, based on existing or potential new laws, convince a court to issue a warrant to surveil or search any of my devices, fine: then I have legal options to react to that. Having that surveillance incorporated into some opaque EULA from a company in the US, a company that can now enforce its standards on stuff I do, is nothing I would have thought would be acceptable. Not that I am shocked it is; I just wonder why that stuff still surprises me when it happens.
Going one step further, if FAANG manages to block right-to-repair laws, it would be easy to block running a non-approved OS on their devices, which would then be enforceable by law. Welcome to the Stasi's wet dream.
We’re talking about a life-long impact on the children abused in this material…