I don't want to see dick pic spam. Actually I don't want anyone to send me naked pictures. Previously it was up to every messaging app to figure this out themselves. Now they can use the Sensitive Content Analysis framework. It also means they don't need to give PTSD to humans building a model to classify crap like CSAM images.
All it does is detection. It is up to the app whether to prohibit sending/receiving the message, how to notify the user, etc. The API documentation says this:
> Apple provides the SensitiveContentAnalysis framework to prevent people from viewing unwanted content, not as a way for an app to report on someone’s behavior. To protect user privacy, don’t transmit any information off the user’s device about whether the SensitiveContentAnalysis framework has identified an image or video as containing nudity. For more information, see the Developer Program License Agreement.
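For the curious, the API surface is small. This is a minimal sketch (not a full implementation) of what client code might look like on iOS 17+/macOS 14+, assuming you have a file URL for the received image; note that the analyzer reports nothing unless the user has enabled Sensitive Content Warnings or a Communication Safety parental policy:

```swift
import SensitiveContentAnalysis

// Returns true if the framework flags the image as containing nudity.
// Everything runs on-device; what to do with the result (blur, warn,
// block) is entirely up to the app.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the user hasn't opted in, the policy is .disabled and
    // analysis won't flag anything, so skip the work entirely.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // Analysis failed (unreadable file, etc.); treat as not sensitive.
        return false
    }
}
```

Per the license terms quoted above, the result is meant to stay on the device; the app uses it to decide whether to show an intervention, not to report anything anywhere.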
If you’re seeing so many dick pics that this is a positive trade-off for you, you need to reflect on how you’re interacting with the web.
I suggest you expand your social circle and talk to some women that are moderately active on social media.
Women are nowadays confronted with their face generated on fake porn by AI, hell some are confronted with generated sexual abuse as a form of threat and harassment if they piss off the wrong person, and you’re out here talking about dick pics.
What’s worse is that you just couldn’t resist adding a dash of misogynistic victim blaming as a cherry on top.
How is on-device image classification "ceding more privacy"? Because an OS vendor provided it?
> If you’re seeing so many [...] you need to reflect on how you’re interacting with the web
Are you seriously claiming not to understand the concept of SPAM and info leaks?
Of course they are OK with a machine being an obnoxious prude.
If any kind of metrics leave the phone's chassis as a result, then it's quite creepy, but I was operating under the assumption that they do not.
Of course, there is also a real issue with the fact that, as closed source software on a locked down platform, we can't know what happens next. But that is just part of the deal with iPhones; there is already a lot of data like that (e.g., I'd expect the US uses iPhone GPS data from targets to hunt them down).
[0] Not sure what the feature actually does because nobody has posted details here, so there is some guesswork involved.
If it detects cat pictures, what evil thing is it going to do? Label it as a photo of a pet (I don't even know, I don't use a scanning phone)?
If it detects nudity, what kind of unwanted behavior might it exhibit then? Report to legal guardians? Not the picture itself, but even just the fact that the device is being used for that.
I can see how this scanning+warning is more creepy than scanning+labeling cat pictures, even if the information screen tells you it was just used for this warning screen.
You touch on an interesting element. A lot of iPhone users are kids. Even young kids. Logged in with their parents' account.
Not trying to justify what’s going on. Just adding context.
Given the internet as it is, and as it has been even back when FOSS discussions included hating GIF because of patent enforcement, kids shouldn't be on the (general) internet at all.
Smartphones are even worse, given the deliberate attempts to make content more addictive.
I'm not sure how to square that particular circle with the likelihood of social exclusion from not being online; it's not like me putting (general) in brackets in the first paragraph will convince the right people that there's money to be made in a genuinely safe subset, despite the existence of YouTube Kids and whatever Netflix's thing is called.
A lot of kitchen users are kids; it's the parents' responsibility to ensure they're supervised or that the sharp knives are put away out of reach.
Is this something that iOS (or some other client) does, or is it just hypothetical? I don't keep up with these things aside from when they reach HN.
It is optional and disabled by default, just like the child-oriented Communication Safety feature set. They call the adult-oriented version "Sensitive Content Warnings".
Extrapolating from that, one day we may reach hiding queer or black people, or mixed couples, which will be an interesting situation.
This feature is entirely opt-in for adults.
Are we going to dedicate a post to every optional feature that we enable and act shocked about?
As someone who's used earlier versions of iOS for some years now, and who knows a bunch more people who also have, that's not a problem I'm aware of any of us ever having experienced. I realise that anecdotes are not data, but it doesn't seem like it should be a common issue at all...
Oh yes.
https://www.psypost.org/2020/08/new-research-uncovers-womens....
I would consider that surprisingly high lol. 1 in 4 women is happy about receiving unsolicited dick pics, really?