People take free speech as always benefiting society, as if good ideas are destined to win out in the end, when in reality it depends on whether the system encourages good ideas to spread while inhibiting bad ones. A system that rewards people for disseminating bad ideas produces a weak society.
I'm not going to attempt to unpack "inherent superiority", but on the surface this seems like a terribly false belief or conclusion. Free speech has social difficulties. It sucks when ideas we disagree with are distributed widely through media. Amazingly, the dissemination of bad ideas is not one of its problems.
> "good ideas are destined to win out in the end"
Free speech permits individuals to make _informed decisions_. Good information. Bad information. Evaluation must be separate from dissemination.
This is the Millennial parent's crisis: how to teach children growing up in the age of the internet (free speech) to make good decisions (evaluation) when they can find literally anything online (good, bad, lies, etc.).
I apologize if I'm being too strong in my reaction to what might just be an off-the-cuff _idea_. Any other time I would ignore the comment, but it's being conflated here with a very real problem of our time--the dissemination of deliberate misinformation and lies, and the difficulty and _high cost_ of navigating this ecosystem.
The latter is an issue of pollution.
(BTW, did you know the itemized costs of water service to your home are typically 3:1 or 4:1 sewage:fresh?)
The canonical case for me is creationists, who apply some of the most trivially bad reasoning I can conceive, yet it doesn't actually mess up their daily lives. It even benefits them, since it reinforces membership in their tribe. Sure, it cuts them off from certain careers -- more than they realize -- but most people don't directly apply evolution all that often. (Indeed, many who do, like armchair evolutionary psychologists, usually do it wrong.)
People can often compartmentalize their bad reasoning in ways that keep the negative effects distant enough that they get along just fine. It may bite them in the long term, but in the long term we're all dead anyway. The time frame in which "good ideas win in the end" may be several human lifetimes.
This was mentioned in the ComicCon@HOME video panel "Watchmen and the Cruelty of Masks", which I watched last night [1][2].
And it's been mentioned by Richard Dawkins in the context of "flat-Earthers" [3], and, of course, religion itself.
Right now I'm thinking a lot about memetics as having explanatory power for the long-term effects you describe.
[1]: https://www.comic-con.org/cciathome/2020/wednesday [2]: https://www.youtube.com/watch?v=H5R-9kcV0WY&feature=youtu.be [3]: https://www.sciencefocus.com/science/richard-dawkins-flat-ea...
Evaluation is inevitably linked to dissemination. We cannot evaluate what we have not received, yet we are now receiving too much to effectively evaluate.
And then there is the matter of bias: once an idea has been repeated enough times it takes hold, regardless of its rationality.
>the dissemination of deliberate misinformation and lies, and the difficulty and _high cost_ of navigating this ecosystem.
Isn't this distinction arbitrary? To an observer, the intention of the messenger is opaque. Deliberate lies and innocent misinformation arrive together for us to evaluate. Dealing with them at the source introduces a variety of problems in our current model, and judging intent with regard to speech is problematic.