And they aren't totally wrong? I have a qanon-er in my family; she's carefully curated her media sources to guarantee a pure, steady stream of nutjob garbage. The thing is, censoring twitter/facebook/youtube makes no difference - these people will find their fringe no matter how hard they have to look. At least on youtube there's a chance the algorithm will show some non-bullshit.
I don't know what "the answer" to any of this is, but I suspect we all need to be a bit more tolerant of online stupidity.
Also: If your social media feeds are showing you crappy content, you are to blame. It's incredibly easy to like, dislike, or hide content. My feeds are fantastic. I suspect the people complaining about their feeds actually like garbage, and even more so like to complain.
it's much more likely that folks are concerned because this populism seeks to destroy pluralism.
I'd imagine most people have the reasonable assumption that YT will only show you more stuff that you like (as in, press the like button), not that any engagement, even downvotes, will keep putting more shit into your stream.
Ever wondered why YouTubers ask for comments and don't shy from saying "if you didn't like it, downvote and tell me in the comments"? That still drives engagement, and engagement is all the algorithm cares about.
Once you get it, it's easy to keep the shit you don't want out of your stream, but not everyone is well versed in how the algorithm works, so it's not that hard to fall into an algorithmic hole.
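The dynamic described above can be sketched as a toy model. This is a hypothetical illustration, not YouTube's actual system: an engagement-maximising ranker where every interaction, including dislikes and argumentative comments, raises a video's score, and only silent non-interaction lowers it. All names and weights here are made up.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    likes: int = 0
    dislikes: int = 0
    comments: int = 0
    impressions: int = 1  # times shown, interacted with or not

def engagement_score(v: Video) -> float:
    # Hypothetical weights: the sign of the reaction doesn't matter,
    # only that the viewer did *something*. Comments count double.
    interactions = v.likes + v.dislikes + 2 * v.comments
    return interactions / max(1, v.impressions)

videos = [
    Video("calm explainer", likes=50, impressions=1000),
    Video("incendiary rant", likes=30, dislikes=60, comments=40,
          impressions=1000),
]
ranked = sorted(videos, key=engagement_score, reverse=True)
# The rant outranks the explainer despite twice as many dislikes
# as likes: 170 interactions vs. 50.
```

Under a model like this, downvoting and leaving an angry comment is, from the ranker's point of view, an enthusiastic endorsement.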
This is likely true. Most of the discussion around "the algorithm" centers on whether there's something about the way the major sharing platforms select what to present that acts as a funnel towards QAnon and other radical theories, not whether people already dedicated to finding such material can find it.
I had to click do not recommend this content on a ton of videos.
Hypothetically: all sorts of people are interested in the history of World War II... it's a big war, it defined politics for over half a century, and it's taught every year in the history classes of at least a hundred nations. So you're looking at a topic with hundreds of "also likes" connections but no strong winner.
Perhaps there is a 0.000001% chance that if you watch a WWII video, your next video will be one on modern Nazism (including recruitment videos). I wonder if YouTube's algorithm boosts that small fluctuation in a sea of noise into a single recommendation result (or even grabs the top 5, but the signal is so noisy that this one still shows up in that top 5)?
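That hypothetical can be made concrete with a small simulation. This is a toy sketch, not YouTube's method: when hundreds of "also watched" candidates have essentially the same true co-watch rate (the "no strong winner" situation above), sampling noise alone decides which five get surfaced, so any candidate, including a fringe one, can land in the top 5 purely by chance. Topic names and rates are invented.

```python
import random

random.seed(42)

# 300 mainstream WWII-adjacent topics plus one fringe topic,
# all with the same true co-watch rate: no strong winner.
topics = [f"ww2_topic_{i}" for i in range(300)] + ["fringe_topic"]
TRUE_RATE = 0.004

def noisy_cowatch_count(viewers: int = 5000) -> int:
    # Observed co-watch counts from a finite audience are noisy
    # samples around the (identical) true rate.
    return sum(random.random() < TRUE_RATE for _ in range(viewers))

counts = {t: noisy_cowatch_count() for t in topics}
top5 = sorted(counts, key=counts.get, reverse=True)[:5]
# Rerunning with a different seed reshuffles which topics "win";
# the top 5 reflects noise, not any real preference signal.
```

A recommender that always surfaces the top-k of such a noisy ranking turns a meaningless fluctuation into a confident-looking recommendation.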
Yes and no. Youtube radicalisation / algorithmic extremism is a thing, because the engagement-maximising algorithm has a strong bias for rabbit-holes, kooks, con-artists and incendiary nonsense. The "funnel towards QAnon" that the sibling comment mentioned.
Unless you lean hard against it - which first requires awareness of the problem and the skills or knowledge to spot it - it will be in your feed. "you are to blame" is to an extent blaming the victims.
It took until 2021 for people to figure out that quote-tweeting politicians' hot takes to dunk on them actually helped those politicians' reach, because Twitter counted it as engagement.
The dislike button does nothing.
The "hide because I don't like this video" option does nothing useful: it hides that one video and keeps recommending competing channels on the same topic.
The "don't recommend channel" button blocks the channel, so that three more competing channels on the same topic take its place.
What I mean is, I don't care if someone reads a politician I hate. I care if they read a politician who enables the next pizzagate or worse. Or if they read state sponsored misinformation while thinking it's genuine opinions. And yeah, I do acknowledge there's some overlapping grey area.
As do I, and you. I was pretty happy when all of those ISIS accounts that were spewing violence and hate got shut down. I'm sure you have your boundaries on what you'd want society to allow and restrict on public mediums.
The specific details of what those boundaries are will depend on your personal notion of what constitutes communication which can be considered abusive. As it will for any other person.
Copyright violations. Beheading videos. "Pornography" or pseudo-"pornography" involving minors. Direct threats of violence towards individuals or groups. Deepfakes generated without consent. Etc. Etc.
I guarantee you if we have a back and forth discussion we will discover where your boundaries lie across the myriad issues where people typically want to control public discourse.
The printing press was seen as dangerous. Then the phone, then TV, and then the internet. And now it's social media.
Each time the arguments are the same. People worry that the "wrong people" will have the ability to share ideas to a broad audience.
And yet each time we've shown that more speech is correlated with an expansion of civil rights and liberal governance.
It was definitely a step forward for humanity but it wasn't all unicorns and rainbows.
doesn't the FCC heavily limit who gets to broadcast on television and radio and what behavior is tolerated? like, very heavily, no?
and aren’t there limitations on what you can and can’t do with a telephone?
Path dependence, it's a funny thing.