Please let me know if you know how to fix this unfortunate typo.
You can click the edit button. But you’re probably trying to make a snarky point.
If we are going to say that censorship is justified because you want to prevent violence or insurrections, we open the floodgates to some seriously bad things. Here’s a scenario:
Claiming that people are being systematically discriminated against could lead to violence and death. Saying that police officers kill minorities could lead to another CHAZ/CHOP, which was also an insurrection where people died. As a result, we need to silence ACAB and anyone who delegitimizes our police.
If it were as simple as "the censors want to prevent the bad things", I think I might feel it would be justifiable. But this type of thinking also robs the broader public of its agency to reason. The broader public does not consist of petulant children, regardless of what the media outlets would tell you and what the behavior of Twitter might show (very few of those people display that sort of behavior in real life, where consequences happen).
This "moral highgrounding", the idea that functional adults need to be protected from the corrupting verbiage that may occur on one platform or another, is ridiculous. We are not so weak that we must be shielded from ideas that "must not be named". Words are not magic, and people cannot be enchanted by their mere utterance. They can, however, be enticed when those words are outlawed, by the verboten mystique that makes the forbidden seem so savory.
People may be uneducated, but it is impossible to educate them through silence and the silencing of ideas. These verboten ideas will get spread around, major platform or no. They'll spread without any resistance or discussion, and no disinfecting light of truth will be shone on them through debate, because they will be pushed down into the nooks and crannies, off the popular platforms where the light of day could show them to be the BS that they are. They'll find their place in small groups and factions, I fear, and grow and split people apart. They'll make people stop talking to each other, and they'll make us a weaker country and a less educated one: a country that runs away from hard questions and conversations because it's easier to tell someone or some group to shut up and not talk about that, rather than address the topic with an open and honest conversation built on facts and dialogue.
I don't think so.
> If we are going to say that censorship is justified because you want to prevent violence
Nope. Just agreeing with my parent comment that there is a significant difference between harm-causing misinformation and this very good election business.
While there is certainly a case for criminalizing direct incitement to imminent violence, attaching consequentialism to ideas is a very slippery slope, given enough creative hermeneutics by whoever currently wields state power. (I recall no shortage of conservatives circa 2004 who genuinely believed Michael Moore was guilty of treason.)
You tell me: is this comedy sketch satire, or an explicit call to violence? https://www.youtube.com/watch?v=qhWCk2f2alI
The key question is this: who decides what is misinformation? Who decides which information or disinformation is "dangerous"? The right consequentialist narrative can frame any idea as dangerous.
Obviously it's dangerous to advise people to drink bleach. But you can find millions who'd make the same claim about vaccines (whether "the jab" or pre-COVID childhood vaccinations), or about ivermectin (while claims of its COVID efficacy are dubious, neither is it categorically unsafe for humans).
I'm not going to lose much sleep over YouTube attaching "The more you know..." links to controversial subjects, paternalism and "the backfire effect" notwithstanding. The real concern is algorithmically putting a thumb on the scale of discourse (with second-order self-censorship effects). Even if and when they're mostly right (COVID vaccines work; global warming is anthropogenic), that is a disturbing level of unaccountable power.