> "Parliament needs to have the opportunity to debate whether nude and sexually explicit images generated digitally without consent should be outlawed, and I believe if this were to happen the law would change."
And it's about restricting the tools, not just what's done with them:
> "If software providers develop this technology, they are complicit in a very serious crime and should be required to design their products to stop this happening."
So we should outlaw any piece of technology that can be misused?
This is what happens when technologically inept people make laws to regulate technology. By this standard, cameras should be outlawed in case they're used to take sexually explicit pictures without consent. Computers should be outlawed in case they're used to distribute anything illicit.
Outlawing the production (by any means) and/or distribution (by any means) of non-consensual, sexually explicit material should be enough. If someone can show evidence that someone has produced or shared such content, then that's enough to demonstrate the offence.
No need to go stopping the rest of us from using technology just because some disgusting excuse for a human decided to demonstrate some of the worst of humanity with said technology.
How?
How are you going to stop people from sharing images of what your software produced?
There might be more context to this quote, but it is insane how out of touch lawmakers often appear to be with respect to technology. Just look at all of the cookie banners plastered with dark patterns which completely nullify the idea behind them.
How do we create working legislation for technology?
So, if I were to run DeepSukebe on an analog AI chip the generated pictures would be totally fine to distribute?
It seems to me that this should instead be 'images generated without consent'; that is, without specifying how this can be generated.
> A person (“A”) commits an offence if—
> (a) A discloses, or threatens to disclose, a photograph or film which shows, or *appears to show*, another person (“B”) in an intimate situation,
> (b) by doing so, A intends to cause B fear, alarm or distress or A is reckless as to whether B will be caused fear, alarm or distress, and
> (c) the photograph or film has not previously been disclosed to the public at large, or any section of the public, by B or with B's consent
(my highlighting: "appears to show" would cover realistic fakes).
It’s a shame the headline communicates the former when it seems the proposal is the latter.
Because a headline blaring about AI-X-ray-specs grabs attention, making people more likely to click the link.
You can see the effect in action right here: the link made it to the top of the front page in less than an hour of being submitted.
Besides that though, the article does contain the typical misunderstanding politicians have about how software works ("software developers should be forced to make $SOFTWARE so that it can't do $MY_SPECIAL_EXCEPTION_CASE"), which is problematic per se, unrelated to any machine learning aspect.
Especially if "this image is similar to me" is factored in, what degree of similarity makes an image a representation of a real person?
We're not far from this being less about privacy and dignity and more about whether the idea of nudity is permitted at all.
https://en.m.wikipedia.org/wiki/Personality_rights
It seems right to me that you should not be able to publish something that looks like a recording of person X doing/saying/participating in something, without either getting the consent of person X, or making it very, very clear in the published material that it is a fake. (I'm not claiming this is how the law currently works; I'm not an expert on that)
For nudity and pornography I think there is an added twist that even if the viewer knows that it's fake, there is an element of violation of the real person there. I'm not quite sure where I would draw the line about that.
I think your second sentence trips back into the distinction the parent comment is making. When people post themselves nude, no one cares (barring some specific contexts); but if someone posts a picture of you nude without your consent, you probably care and consider it a violation of your privacy. Whether nudity is permitted is not really in question. What is in question is whether computer programs that generate nude images, for the express purpose of making the real or generated image indistinguishable from a self-posted one, violate consent and privacy laws.
This isn't talking about 100% generated content. It is talking about instances where apps (either automated examples like the one named or photo editing suites) are used to manipulate an image of a real person to make it look like they are naked.
Would you want a convincing image of yourself in your birthday suit out there?
Or, to take it a bit further, what if the image made it look like you attended a party with Trump & Epstein in such a state?
Even if it were possible to argue no privacy had been invaded because that isn't really the person's naked form, there is the potential reputational damage to consider, both professional and personal.
Obviously fake nudes have been a thing for a long time, even convincing ones, but the issue was limited by an amount of time and/or skill being required. The newer tech available in certain apps today makes it very easy to make truly convincing images.
One of the reasons having nudes released is damaging is that it's a rare and noteworthy event. If, because of this tool, everyone has nudes floating around, it would become a normal thing and would actually remove most of the damage from real nudes leaking by providing plausible deniability (assuming anyone even cares at that point; if the world is drowning in nudes of everyone, the real thing will probably go unnoticed anyway).
Outlawing the tool wouldn't actually stop malicious use of it, but because only criminals would use it, its (rarer) use would be more damaging than if anyone could legally use such a tool and nudes stopped being a noteworthy event.
But under reigning American First Amendment law, it gets a lot harder to explain why a law like the one being proposed here would be acceptable. The Supreme Court has, for example, held that the distribution of animal-cruelty videos cannot be forbidden. And it’s not clear to me how one could proscribe the distribution of an imaginary visual depiction of an adult who was nude. You could call it defamatory, I suppose, but if it’s concededly fictional… I don’t know.
There is a three-part test (the Miller test). The SCOTUS actually heard appeals on the "obscenity" of material on a case-by-case basis for a while, decades back.
More specific to this case is the PROTECT Act [1]. I don't know whether it's ever been ruled against or whether SCOTUS has accepted that all depictions of minors are obscene...
[1] https://en.wikipedia.org/wiki/Legal_status_of_fictional_porn...
(see the US under Grey Area.)
(granted that was in the context of links to a terrorist organisation, but it shows how different the landscape is to the US)
The ongoing "fake news" crisis proves that we are, in general, bad at spotting fakes - that is the purpose of a fake - and will also vehemently disagree about what is fake. Especially when it can be used for political purposes. Expect the first world leader brought down by a deepfake in the next decade.
I'm reminded of https://www.theverge.com/2017/7/12/15961354/pakistan-calibri...
I remember when deepfakes were first released, there was a group who would deepfake coworkers, Facebook friends, etc. for a really low cost (like $100) as long as the target had a few hundred public photos.
This is without consent as well, but it’s also not real. It seems like the equivalent of imagining people nude. Kind of creepy if I know it’s happening but not truly a violation of my privacy.
It's not quite the equivalent of imagining people nude as there is an artefact that can potentially cause its own pain when distributed.
Maybe the better analogy for imagining someone nude would be to just make an image and not show it to anyone. So the issue is commercialization or distribution.
In the future when we have digital consciousnesses running on Google or Amazon or whatnot, will we be prevented from imagining people nude without their consent because the mind will be replicated across multiple availability zones?
The "AI" aspect will amplify the offense because of how life-like the end result can be.
But something about nudity makes it different.
OR
the Streisand effect is intentional, making this so common that it no longer draws attention.
Edit: That being said... this naked-faking app can cause a lot of problems in the workplace and home.
Reality is, unless most countries ban it, it's gonna be on the internet.
Most people don’t grok computers, so a commercial ban would probably cover most of them, yet the Bush-bin-Laden photoshops I saw back in the early noughties would still get made and shared.
https://law.lis.virginia.gov/vacode/title18.2/chapter8/secti...
Where's the outcry over Photoshop?