Society didn't collapse after Photoshop. "Responsibility to society" is such a catch-all excuse.
They did this to stop bad PR, because some people are convinced that an AI making pictures is somehow dangerous to society. It is not. We already have deepfakes. We've had Photoshop for decades. There is no danger, and even if there were, the cat's out of the bag already.
Reasonable people already know to distrust uncorroborated photographic evidence nowadays. The ones who don't would believe the claim without the photo anyway.
We've been through this many times: with books, with movies, with video games, with the Internet. If something *can* be used for porn / violence etc., it will be, but that won't be the main use case and it won't cause some societal upheaval. Kids aren't running around pulling cops out of cars GTA-style, the Internet is not ALL PORN, deepfake porn exists but nobody really cares, and so on. There are so many ways to feed those dark urges that censorship does nothing except prevent normal use cases that happen to overlap with the words "violence" or "sex" or "politics" or whatever the boogeyman du jour is.
Cheap and plentiful is substantively different from merely "possible". See, for example, OxyContin.
Do not be deluded that our own governments are not manufacturing the narrative too. The US has committed just as many war crimes as Russia. Of course, people feel differently about blowing up hospitals in Afghanistan than in Ukraine. What the Afghan people think about that is not given much consideration.
This AI has the potential to fully automate what used to take hours of Photoshop work, leading to an even worse state of things. So, yes, "responsibility to society" is absolutely a thing.
But notice how none of these deepfaking technologies were actually necessary for that.
People believe what they want to believe. Regardless of quality of provided evidence.
The scaremongering idea of deepfakes and what they might do has been weaponized in this information war far more than the actual technology has.
I think this technology should develop unrestricted so society can learn what can and can't be done, and build an understanding of what other factors should be taken into account when assessing the veracity of images and recordings (multiple angles, quality of the recording, sync with sound, neural fake-detection algorithms) for the cases where it actually matters what words someone said or what actions he was recorded doing. Which matters less and less these days: nobody cared what Trump was doing and saying, nobody cares about Biden's mishaps, and nobody cares what comes out of Putin's mouth or how he chooses his greenscreen backgrounds.
> People believe what they want to believe. Regardless of quality of provided evidence.
That is a terrible oversimplification of the mechanics of propaganda. The entire reason for the movements that are popping up is bad actors flooding people with so much information that they question absolutely everything, including the truth. This is state-sponsored destabilisation on a massive scale, and it's the result of nothing more than shitty news sites and text posts on Twitter. People already don't double-check any of that. There will not be an "understanding of assessing veracity". There is already none for things that are easy to check. You could post that the US elite actively rapes children in a pizza place and people will actually fucking believe you.
So, no. Having this technology for _literally any purpose_ would be terribly destructive for society. You can find violence and Joe Biden hentai without needing to generate it automatically through an AI.