Moderation is the only known effective solution. Sometimes.
FAANGs and others abandoned human moderation because it "doesn't scale".
So what? When did we decide that scale was more important than civility?
--
I also have questions.
What is the feedback loop for each medium, each style of communication?
How do we break those loops into discrete steps? And where can we add friction, to slow the negative psychological and sociological pathologies?
Defuse or disable the dopamine hits we all get from these social interactions. Maybe even figure out how to turn them into a positive feedback loop.
We used to have curated content indexes with human moderators guiding them - in print, Yahoo/DMOZ, AOL keywords, web rings, LiveJournal. Those required significant human labor compared to algorithmic text searching, and were not able to scale as rapidly as content creation tools did.
Was that lack of scaling a problem? No, not necessarily. Some percentage of people enjoy curation (thus Pinterest) and we could have celebrated their efforts and given them top billing in search.
Instead, we “democratized” search by harvesting all of those human rankings and feeding them into a machine algorithm that produces seemingly better-than-human results. Unfortunately, in doing so, the search engines never credited the curators whose judgements surfaced the content, and so curation became less popular over time.
Yet that curation is precisely what made PageRank so valuable. Without it, spam, liars, and malicious actors have infected every “search” and “ranking” system. Without human curation distinguishing “valuable” from “unevaluated”, search does not scale either.
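The dependence on curation is visible in the algorithm itself. Here is a minimal, illustrative sketch of PageRank's power iteration: every edge in the link graph is a human editorial judgement ("this page is worth pointing at"), and that is the only signal the math has to work with. The graph, page names, and damping factor below are invented for illustration, not real data.

```python
# Minimal PageRank sketch (power iteration). Each outbound link is a
# human curation signal: an endorsement of the target page.
# Graph and constants are hypothetical, for illustration only.

DAMPING = 0.85  # standard-ish damping factor; chosen here as an assumption

# Hypothetical link graph: page -> pages it links to (i.e., curates).
links = {
    "curated_index": ["good_article", "good_tutorial"],
    "good_article": ["good_tutorial"],
    "good_tutorial": ["curated_index"],
    "spam_page": ["spam_page"],  # links only to itself: no outside endorsement
}

def pagerank(links, damping=DAMPING, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iterations):
        # everyone gets the "random surfer" teleport share...
        new = {p: (1.0 - damping) / n for p in pages}
        # ...plus a share of rank from every page that links to them
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        rank = new
    return rank

ranks = pagerank(links)
# Pages endorsed by other pages (human curation) accumulate rank;
# the self-linking spam page gets no boost beyond its own recycled rank.
```

The point of the sketch: remove the human-made links and the algorithm has nothing left to rank with. Spam that manufactures fake endorsements attacks exactly this assumption, which is why degraded curation degrades search.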
We did society great harm when we sidelined curation, and no amount of machine-learning algorithms will heal that wound.
Recommenders, like bureaucracies, are misanthropic (anti-human).
The rules are meant to remove human judgement, while denying the baked-in bias and dysfunction of the imposed ruleset.
Per Goodhart's Law, they arbitrarily declare some things worthwhile and meaningful, and everything else not; the measure then becomes the target.
They are black boxes that thwart inspection, transparency, accountability, and explanation.
--
Forgive me for flogging this horse; I do have a point.
Also missing from online social networks are the concepts of fair and impartial adjudication.
Curation, adjudication, transparency, accountability... I'm sure we're omitting many other missing features, because we're too close to the problem.
Going meta-meta here: the common trait of all these "regulation arbitrage" unicorns is that they profit from the destruction of our society's laws, checks & balances, social norms, and so forth.
By "positive feedback loop" I mean it in contrast to a negative feedback loop. Socially, "negative" or "positive" denotes impact: vicious versus virtuous cycles.
Suggestions for better phrases?