On the other hand, somebody could say "if we don't, then others will". So regulate and ban the technology, then.
At least putting a ban on it will make people think, I hope.
I'm not sure how I feel about "ban it" -- I can argue that either way -- but I do think that arguments that banning something is pointless because people will do it anyway are misguided.
Regulations don't eliminate the effects of bad actors, but they do reduce the number and severity of them.
When has that ever worked? What banned technology is conceived of but not developed because some government entity said not to?
I think the "if we don't, then others will" is part of the natural progression of technology. Whatever the next logical step of development is, that's where development efforts will flow. Some might not want to go there, but some will. Someone banning it will likely only fan the flames and drive more interest into the space - ala the "Streisand Effect".
When the US government (et al) labeled certain numbers "illegal" because they could be used to break DRM or certain encryption schemes, academia and hackers alike openly mocked the notion. T-shirts, stickers, and websites sprang up, further spreading this "illegal" knowledge. People who had no idea how a number could be so "dangerous" suddenly wanted to know. Telling people they can't know or do something will absolutely drive them toward that knowledge.
The hacker mentality often answers the question "why?" with "because I can". Saying "you cannot" only encourages more people to jump in.
I was thinking of saying "ban it" as throwing a wrench into the machine to make it stutter, like breaking the chain of obedience in the Milgram experiment, or like the woman who walked up to Zimbardo and stopped the Stanford Prison Experiment (see his TED talk).
Because, as a hacker and programmer, I believe that we have ethical obligations, and this "we're doing something amazing, we need no permission" stance in these communities genuinely worries me.
Technology is not only technology; it affects people's lives. Anything that can damage those lives beyond a certain point by exploiting human nature is in the same category for me.
That's like banning the Iliad because it describes the Trojan horse.
That works too, just as we've accepted the algorithms that tie people to screens and are used to spread misinformation and manipulate the masses.
> We all know what this is used for and its tasteless at best.
Yeah, like interview fraud, misinformation campaigns, and so on. These are tasteless, but not harmless.
Except for criminals and the CIA.