The legislation is based on the principle that everyone has the right to say what they want, but no right to be heard by others. This phrasing was used by several speakers during the recent committee hearings, including Facebook employees. So it will effectively enshrine in law practices like shadow banning, delisting from Google search, and so on.
Also enshrined in the legislation is the principle of banning content which is 'legal but harmful to adults'. 'Harmful', amongst many other terms, is not defined.
Also enshrined in law will be the principle that certain types of 'journalistic' content will be protected from this censorship if it meets criteria to be decided upon by the new regulator, Ofcom.
This will be an enabling act. It is full, from front to back, of opportunity for scope creep through secondary legislation.
As I said, the UK is leading the charge with this, but the EU has been keeping pace [1]. I haven't seen similar in the US as of yet.
[0] - https://www.gov.uk/government/publications/draft-online-safe... [1] - https://ec.europa.eu/info/digital-services-act-ensuring-safe...
The root cause of this is that our society has forgotten how to argue for free speech as a moral case. We've taken it for granted for so long, we can't remember why it's good. Meanwhile the enemies of it have spent years steadily attacking it, often with lies (e.g. the numerous fake claims of Twitter bots controlling elections).
I try to listen to podcasts, occasionally, that are contrary to my own beliefs, not because of free speech but rather to make sure that I don’t end up in a “bubble.”
Coincidentally, I listened to something by The Hill on YouTube last week and it was reasonably good because they had three people speculating on the future economy with slightly different points of view. Not great stuff, but worth having on while I was cooking dinner. Also, as a liberal, there is a conservative think tank at Stanford University that I occasionally listen to; not always comfortable for me, but worth a little time every month.
Free speech, in supporting views different from our own, is so very important.
Actions like this have turned this into political red meat for some. These companies really need to learn how to read the room. Ratcheting down the ability of people to freely use these platforms just feeds into whatever persecution complex they have, and justifies it. You don't get rid of bad ideas by preventing them from being spoken, no matter how well-meaning you are.
They've already gone far enough for the market to start to solve this. Rumble and Gettr and whatever else are gaining users. But it's producing an ideological split which is pretty dangerous. The other platforms don't set out to attract only the right, but when the censorship by the incumbents is one-sided, it's the censored side that wants to leave for somewhere else. Which only makes the incumbents lean further to the left.
What we need is some way to get left-leaning people to move to platforms without excessive censorship, which would both keep the balance and deter the incumbents from continuing to be heavy handed.
A lot of this is cultural. The thing where people cheer because their ideological opponent got silenced is cancer and needs to go away. Everyone of all stripes should quit platforms that do this, and then they wouldn't.
This is kind of like the thing where violence is impermissible except in self-defense. We need to cancel anyone who tries to cancel anyone.
There's no shortage of people trying to make these platforms but the problem is free-speech oriented sites get deplatformed. "Build your own hosting company!", "Build your own mobile operating system!", "Build your own payment processor!", etc, etc. A small handful of tech and payment companies can veto any website or app.
It's like the old days of broadcast television where if your show was too controversial for ABC, CBS, or NBC it didn't get on TV, period.
I agree, but Section 230 also contributes to this. It eliminates the legal consequences for that hamfisted behavior and enables these sites to grow far beyond their ability to properly manage the site.
It's often been said that clumsy AI moderation is necessary because there's no other way to moderate sites with so many users. If true, I see that as an argument against enabling sites to grow "too big to moderate" rather than an argument for excusing hamfisted behavior.
The far right despises technology companies, and there will be a huge push by their members to bring them all to heel.
That risk is likely one reason why Biden's state of the union speech echoed Trump policies: strengthening the border, moving jobs (like chip manufacture) to America, improving police funding, etc.
YouTube isn't a news network. It's a video platform catering to a global audience, many of whom are really, really bad at critical thinking. Same with Facebook. We all know what happens when content isn't curated, and it's really, really bad.
I'm glad that you're an adult and can make your own grown up decisions, and actually get news from a wide range of sources, and curate it yourself. I obviously wish everyone had that skill, but they don't, and it's actually really really dangerous to take that privilege you've earned for granted.
Who cares if the "misinformation" is later found to be true? The most important thing is to obey and follow the will of the collective, set by whatever the media overlords and political masters judge as "correct thinking". Any attempt to deviate from the same is "radicalization" and must be suppressed by any and all means.
Who are you or YouTube to make this judgement?
More often than not, the "I'm smarter than the average person" types are the ones making the worst decisions.
Taking away their ability to hear different viewpoints especially those they believe to be true just drives these people to radicalize even further. I’m not even touching the matter of dismissing almost half of a country. Very shortsighted perspective.
Not true because law enforcement is still in place to take care of crime. Some amount of crime and confusion are also the price you pay to get the benefits of freedom.
Curation is about selection, not exclusion. As in selecting art for an exhibition on the basis of its value or quality. Curating is not about excluding or banning content via clumsy and vague censorship.
> "and it's really, really bad."
When trigger-happy censorship is the norm, really really really really bad things can happen.
> "become radicalized and violent when stupid bullshit like this starts gaining traction".
Bullshit like what? They (The Hill) were discussing and analyzing news, not propagandizing vulnerable youtube viewers.
It's disappointing you imply we need mandatory shielding from information, including analysis of news and events, with penalties for those not falling in line. Putin would agree with you. He threatens and shuts down media outlets for mentioning the word "war".
Mindless youtube viewers are not the tipping point of societal meltdown.
I've watched a few 'The Hill' clips on youtube before, and I noticed they disagree with each other and provide alternate views. It's not a propaganda machine, so shouldn't be treated like one.
You may want your news and analysis to be grounded in loyalty to a monolithic narrative and distributed via sanctioned memes. Others want more than that: counter-points, alternate views, robust debate, transparency and discussion. We learn more that way.
In my country, Australia, we have a national "Q&A" TV panel show on the public broadcaster, covering news and politics. The idea is discussion, yet on a recent episode a young Russian-Australian man was booted out of the studio audience because he mentioned he supported Putin's action in Ukraine. Instead of challenging the man's views, he was booted out live on air. [1]
[1] https://www.youtube.com/watch?v=NqPIxtJd2uU&t=304s
This isn't how civil discourse should operate.
All that being said: I think that censorship does way more harm than whatever nonsense these channels put out.
Network news runs on the same model.
Sometimes the filler (news or other content) is good but not always.
How do you solve that? Education? Too late - they’re adults.
I also encourage them to form their own opinions.
If you form your own opinions through a mediator, you will be deceived. If you form it via direct connection, you will be overwhelmed. Nothing is new or old, here. It is just difficult to have good opinions while also not letting your platform be abused.
Yeah, "just go build your own platform," they will say, as if it had not been attempted numerous times only to be censored off every hosting/payment/app platform they could use or hacked to have all the personal information of their users doxxed because they had "bad opinions." Even the ones that actually exist just become hiveminds of their own because they become populated by people who have been censored everywhere else, as if having every community devolve into hive minded safe spaces is actually desirable.
Was this really something that stopped any of the services?
>or hacked to have all the personal information of their users doxxed because they had "bad opinions."
Come on, this is not fair. This was caused by these services choosing to hire low-skilled developers. The guy building Parler didn't even understand the proper way to set up his authentication strategy, nor did he really understand his database and its advantages/limitations. If anything, this should teach the other side to respect the effort that goes into developing these platforms (this disrespect for the skillset is something I see A LOT among the right-wing youtubers I follow). They are so freaking proud not to be one of the "elites", eschewing the desire for knowledge and a love of learning, so much so that it bites them in the ass when they are booted off an existing platform and then realize this tech stuff is hard.
>Even the ones that actually exist just become hiveminds of their own because they become populated by people who have been censored everywhere else, as if having every community devolve into hive minded safe spaces is actually desirable.
These platforms need to welcome the other side then, the ones doing the censoring.
If I go to a restaurant, eat some food, and complain that the food tasted awful, I am not saying that the restaurant should be forced by the government to make good food, nor am I saying that everyone is forced to eat at that restaurant. I'm simply saying that the product served to me was not good.
Similarly when someone complains about Youtube engaging in censorship, you should not interpret that to mean that Youtube is infringing on first amendment rights, or that everyone is forced to use Youtube... rather it's an argument that the product Youtube provides is not as good as it could otherwise be. People are welcome to have an opinion and discuss whether Youtube's policies improve its service or are detrimental to its service without it devolving into a discussion about legal rights and government enforcement.
You can disagree with a position about censorship and make good arguments that Youtube censoring certain content or being the arbiter of truth makes for a better product, just as you can disagree with someone about whether a restaurant serves good food... but don't change someone's argument about the quality of a product into an argument about someone being forced into something or having their rights violated since no one ever made any such claim.
But the issue that people are starting to react to is: are some platforms a natural monopoly?
Is it useful to have all user-generated public video clips be aggregated by one central provider? The answer seems like it’s yes.
If there's only gonna be one platform, or it's socially optimal to have one platform, is it right to give control of that to one entity in perpetuity?
That's a political question.
And see, there's always a fine balance between private property rights and other rights. For example: you can't prevent people from a given ethnicity from shopping in your store, even though it is a private property and in many cases you can refuse service to a particular customer - just not for this reason. And this is perfectly okay, because private property rights aren't absolute.
I must have missed the covenant not to enforce their patents that YouTube extends to parties they've deplatformed, and the indemnity they offer against the enforcement of third-party patent rights that they've obtained through cross-licensing.
This isn't even a call to violence or targeted harassment. It's the opinion of a man about election results.
Nothing to do with youtube’s first amendment right, only Section 230 privileges
Let's strip them down to shreds in the public square for not censoring fast enough! Woohoo
If you disagree with certain ideas, you have to fight them in the open. That's how we get a big tent with general consensus across key issues, and avoid parallel universes of polar opposite information. The end result of all this is just going to be a truly Balkanised, multipolar world, but instead of geographical boundaries, it's all going to be in the mind. Your neighbour and you could have totally different ideas about certain issues, because you are consuming and contributing to totally different sets of information. And that will lead to immense physical conflict in the future.
Very worrying times right now.
What if the dangerous outcome is what's happening right here in this thread, and the thousands of others like it here on HN and elsewhere? Right in front of our eyes every day but we cannot see it, because it is normal.
[1] https://twitter.com/medvedevrussiae/status/14986197503347507...
The social media platforms can't prevent Russia from going to war with NATO. Censoring the Russian official won't do anything.
You can’t force people or businesses to listen to you or broadcast your speech. That’s American liberty. That’s freedom.
You can’t claim that a business is “the public square” and then force them.
If you do want a digital public square look for something run by the government with a mandate to publish everything.
I’m pretty sure people wouldn’t be too happy with that either… too many libtards, too many nazis, too many racists, too many whatevers…
Why not? We place some pretty serious obligations on a few industries - one example is electricity providers of last resort.
Electricity is deemed a common utility and a regulated, granted natural monopoly - one considered a necessary requirement for all citizens and organizations.
Video streaming is not a monopoly, and no number of bad-faith arguments about popularity or awareness in consumer headspace changes the fact that there are a number of easily accessible competitors.
Or you could call your congress people and ask them to increase funding for PBS or NPR or whatever. Or start a tax funded service.
Or start your own service. I’m sure you’ll welcome everyone’s suggestions on how to run it.
Or is it that you want someone else to run it, you don’t want to pay anything for it, don’t want to pay taxes for it, but you do want a say in how it’s run?
(Not you personally but y’all)
It's amazing watching progressives grovel at the feet of corporations because you agree with their censorship, and free-market conservatives turn on them because the corporations they rallied behind their whole lives turned on them.
I want Twitter and YouTube treated like my ISP, or my electric company.
You can't take away the public square and then demand people only air grievances in the public square.
The line between corporations and government is so blurred online that we must blur the lines between public and private spaces online.
I was basically addicted to YouTube a few years back; the first thing every day was to go there, and I couldn't stop going back many times a day, even when I felt it was too much. Now I maybe actively go to YouTube 2-3 times a week; otherwise I just watch embedded videos on websites or Twitter.
I’m not surprised tiktok has taken over.
I think YouTube has basically neutered itself by trying to "do the right thing" politically.
You could automate moderation with ML, automatic recognition, and user-flagging, but the sheer number of false positives, and alleged false positives, means you basically need a full human moderation team to have real accuracy.
You could just use human moderators, but then you have to pay for them all. Ads based websites can't afford that.
You could charge users money and not be purely ads-based, but then you won't be a top web destination because people aren't willing to pay for the internet with money.
So the only remaining option is to automate, ignore the false positive problem entirely and rake in money while abusing your content creators and users. This is 'The Google Strategy'.
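To make the tradeoff above concrete, here's a hypothetical triage sketch; the thresholds and names are mine, not anything any real platform publishes. The idea is that the classifier acts alone only at the confident extremes, and the ambiguous middle gets routed to humans, which is exactly the expensive part that ad-funded platforms try to avoid:

```python
# Hypothetical moderation triage: automate the confident extremes,
# send the ambiguous middle band to human reviewers.
def triage(ml_confidence, auto_remove=0.99, auto_keep=0.20):
    """ml_confidence: classifier's estimated probability that the
    content violates policy, in [0, 1]."""
    if ml_confidence >= auto_remove:
        return "remove"          # cheap, but every error here is a false positive
    if ml_confidence <= auto_keep:
        return "keep"
    return "human_review"        # accurate, but costs real money per item

print(triage(0.995))  # remove
print(triage(0.05))   # keep
print(triage(0.60))   # human_review
```

Narrowing the human-review band (raising `auto_keep`, lowering `auto_remove`) is the "Google Strategy" in this framing: the review queue shrinks to near zero and the false positives are simply accepted.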
First off, it obviously needs to be relative reputation, and not centralized. I think some sort of cross-platform web of trust, where you publicly endorse some friends, some content creators, some investigative journalists, etc. You also publicly repudiate sources of information that you don't find credible. This means that if anyone goes off the rails there's a visible trail of distrust, and that information is de-prioritized in your network.
This also has a bit of sybil resistance built in, because upvotes or downvotes of thousands of bots isn't relevant at all to your trust graph unless friend of friends actually endorse some of these bots or something. Still though, it would probably also be good to have some burnable staking mechanic. Everyone needs to put in a dollar, or 10 dollars, or something to participate (which could be removed at any time), and people who clearly violate terms of use would have their stake burned, and all of their endorsements or repudiations would be invalidated.
I feel like eventually someone is going to have to build something like this. Incentives are hard to get right, but if by some miracle it works, and the trust graph takes off, then lots of these problems just go away. Imagine something like DNS that you could just query to see how much you should trust some chunk of information. It would be a game changer.
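As a minimal sketch of what querying such a trust graph might look like, assuming a simple model where endorsements and repudiations are signed edges and a source's score relative to *you* decays with distance (all names, the decay scheme, and the depth limit are hypothetical choices, not an established protocol):

```python
# Toy web-of-trust query: walk outward from "me" through positive
# endorsements, decaying trust at each hop; repudiations are negative
# edges that are recorded but never propagated through.
from collections import deque

def trust_score(edges, me, target, decay=0.5, max_depth=3):
    """edges: dict mapping user -> {other: +1 (endorse) or -1 (repudiate)}."""
    best = {me: 1.0}
    queue = deque([(me, 1.0, 0)])
    while queue:
        node, weight, depth = queue.popleft()
        if depth == max_depth:
            continue
        for neighbor, sign in edges.get(node, {}).items():
            score = weight * decay * sign
            # Keep the strongest-magnitude opinion reachable so far.
            if abs(score) > abs(best.get(neighbor, 0.0)):
                best[neighbor] = score
                # Only propagate through people we positively trust.
                if score > 0:
                    queue.append((neighbor, score, depth + 1))
    return best.get(target, 0.0)

edges = {
    "me": {"alice": 1, "spammy_bot": -1},
    "alice": {"journalist": 1},
}
print(trust_score(edges, "me", "journalist"))  # 0.25: friend-of-friend
print(trust_score(edges, "me", "spammy_bot"))  # -0.5: direct repudiation
```

Note the sybil resistance mentioned above falls out for free: a thousand bots endorsing each other score 0.0 for you unless someone in your endorsement chain actually vouches for one of them.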
I'm being quite reductive, but at the same time your points could be interpreted to resemble the concept. Food for thought.
Actually, I see this as both not the only remaining option and also as not an option.
If you can't moderate, then you're being irresponsible on your platform. Reduce the load (disable comments, limit uploads, etc), hire, or shut down.
"We're too big to moderate" is an absurd and irresponsible statement by some of the biggest companies in the world.
If these idiots were taking just themselves off the cliff, then sure. But they are taking villagers who can't tell the difference between what's legit and what's not off the cliff too.
Even worse, those idiots are turning these misguided villagers against the rest, who know better than to listen to them.
This is a failing of the education system here, but no one is talking about how to really distinguish propaganda from reality. Sure, these misguided people have freedom of speech, but they are growing like a virus, and a virus needs to be contained. Social media literally has the term "viral" for crying out loud.
The other, more important thing to contain would be the medium on which this infection is spreading, which means reducing the impact of the feed algorithm. Access to the feed should be limited. But Facebook and YouTube don't want to give up the feed; that is what keeps the humans hooked.
If one part of the body is cancerous, you cut it off. You don't stand around arguing about whether that part of the body has a right to free speech.
"Like a boil that can never be cured as long as it is covered up but must be opened with all its pus-flowing ugliness to the natural medicines of air and light, injustice must likewise be exposed, with all of the tension its exposing creates, to the light of human conscience and the air of national opinion before it can be cured."
I'm not arguing either way. I lean more toward keeping information uncensored, but let's recognize that the ready ingestion of poison isn't in the majority's interest. And society is always willing to say "we'll censor our taboos"; we never really meant uncensored anyway.
Most of this stuff is really just memes being shared around. And memes, as much as they sometimes look like utterly uneducated trash (complete with errors of spelling / grammar / logic) are usually created by one person then shared by many. I suspect that one person is something of a mastermind; the many, of course, may be morons.
The discussion always seems to go toward censoring the many. I don't support that, even if I vehemently disagree with them. If my aunt, who I don't want to call a moron but does share some utterly insane memes, continues posting things, that's fine. I vehemently disagree with her and hate her viewpoint. She should get to share it anyway.
But I feel like social networks could do a lot more in locating and halting the mastermind that's 50 shares back in the chain, whispering in their ears. It's like the story from a couple weeks back about a 3-person media shop being responsible for a huge amount of misinformation on Facebook. That was exposed by journalists. Facebook, I feel, would let them go on forever and invests basically nothing in finding out where some of the more virulent content comes from.
[0] https://thehill.com/homenews/administration/595709-white-hou...
[1] https://twitter.com/ryangrim/status/1499545712668905473
[2] I'm speaking of course of LumenDatabase, which used to be called "chillingeffects.org" back when Google was ostensibly in accord with its philosophy.
In my opinion all elections are rigged; why on earth would you risk losing power to the whim of the 'unwashed'?
Both parties are two sides of the same coin and trade power back and forth to give everyone the illusion of choice. But when people mass together and demand social change (BLM etc.), it's allowed to happen only until it starts working; then the authorities crack down with an iron fist.
It's well documented that the USA rigs elections in other countries. So if the technology exists and the appetite exists, what's stopping them? Laws? Morals?
Do we really think that the intelligence community is going to let a random popular citizen have command of the biggest military in the world and a finger on the nuclear button, just because a couple of extra people put their name on a piece of paper?
I think that is absurd: all elections are rigged, and to suggest otherwise is naive, holding on to a romantic notion that we choose our leaders.
Then you could respond with how things are effectively systemically rigged through gerrymandering/news media/misinformation/propaganda/mind control/harassment/selective enforcement but that’s something else.
Iceland maybe, but that would be the exception to the rule.
Much better alternatives exist. Support decentralized internet ideas, save the internet for future generations.
particularly appropriate quote:
> What casual observers might not understand, however, is just how far the policy goes. Not only does YouTube punish channels that spread misinformation, but in many cases, it also punishes channels that report on the spread of misinformation. The platform makes no distinction between the speaker and the content creator. If a channel produces a straight-news video that merely shows Trump making an unfounded election-related claim—perhaps during a speech, in an interview, or at a rally—YouTube would punish the channel as if the channel had made the claim, even if no one affiliated with the channel endorsed Trump's lies.
[1]: https://reason.com/2022/03/03/youtube-rising-the-hill-electi...
The Uber strategy, services with no possible path to profitability being propped up indefinitely by billionaires that refuse to stop believing the dream of infinite growth, is incredibly damaging to any ecosystem it infests. Youtube only exists because Google wants data. Video hosting is just too expensive for any realistic business model to support storing and distributing every vacation video ever made, for free, forever. This cradle-to-grave life support preempts all competition and creates monopolies so large they begin to creak and splinter under their own weight.
The strategy for Youtube moderation is maximum efficiency per dollar: deal with as much muck as possible with the smallest budget possible, and just let the errors happen because it's too expensive to prevent them (but if the public outcry is big enough, they can reverse the decision.) They can't make things better. There's just no money for it, because Youtube is just too big and it doesn't, cannot, and will never make any money.
I was in a chat with someone who worked with Google once and they said, paraphrased, "Our company suggested we could have a team for providing support for non-Partner users, and Google flat-out said 'It doesn't matter how cheap you go, it cannot be done.'"
I'm really glad Patreon exists. I think the only solution to all of this is to break the notion that web hosting (and, on another topic, creative work) is free, or that it can be supported by ads. Things cost money. People should be expected to contribute to support services they use.
(And of course, that brings up the issue of those who don't have money being excluded. There is no societal problem more than six steps away from poverty/income inequality. Probably no more than four steps, even.)
EDIT: all these talking points are the same ones I remember the alt-right getting in a tizzy about when Google started adding information about the Holocaust under holocaust-denial sewage.
I think we need a better word than censorship.
More like inept penny pinchers.
With this shift in society, it makes sense for our AI banning tools to also conflate use/mention.
It is less a programmatic problem and more a real problem. Since even if you don’t use but mention you are considered to have signal-amplified and therefore used. That is, mentioning is now using.
A disclaimer is a way to demonstrate transparently that a reasonable consideration was made before magnifying false claims.
There may be better ways, but it at least accounts for the cost of signal amplification in some way.
Welcome to our new algorithmic future, brought to us by VCs and software engineering.
Suppression of misinformation is an important function if you want a marketplace of ideas to actually work. However building massively scaled platforms for information dissemination ensures that it will be done by robots with no judgement (either the machine kind, or poorly paid people following a script).
The article makes it seem as if a human made the decision, not an algorithm. At the very least, humans came up with the policy and wrote the algorithms, and could override them if they wanted to.
This isn't a sophomoric forum like reddit. This is meant to be a place where adults discuss. Adults can see that there are democratic implications of the scenario you're describing worthy of discussion. 'Another platform' may take years to develop, and nothing stops Youtube or another hostile player from simply buying it out and kneecapping it, and how many elections are influenced while that takes place?
Please don't just repeat talking points, it adds nothing to the conversation.
If anyone wants to give a reason why these arguments are always about freedom of speech vs. censorship, instead of being about private property and free markets, please do; I'm open to this.
But this actually is a potentially thinly veiled way to spread disinformation, right? If you wanted to push a dishonest narrative, all you'd need do is show clips of someone else speaking the words and call it "reporting".
And this is all within the context of a massive surge in dis/misinfo across these platforms. They're already having trouble keeping up. I think that context is important, because it warrants a different level of "aggression" in policing. If someone is spouting lies, is it reasonable to, say, block the original producer of those lies but let thousands of other accounts repeat them and generate millions of impressions for the lies?
Because the lies would still be getting out there, and these platforms are trying to avoid being the conduit for them no matter where they come from. Seems reasonable.
This happens all over the place; policies around abuse end up getting applied to people that report that they are being abused, policies around misinformation get applied to people who report that misinformation is happening. It's kind of big news that Trump is still pushing this narrative in 2022. It actually is entirely legitimate to report that he's still saying this, he's likely to be the Republican candidate in 2024, people should know that he hasn't stopped making this claim.
There's a debate about disinformation censorship that's all about whether or not censorship is the right way to stop people from telling these lies, but even before we get to that debate, regardless of whether or not someone believes that censorship actually helps prevent the spread of misinformation -- the reality is that platforms like Youtube/Twitter are too big to effectively apply many of these rules, and often they end up affecting the opposite of their intended targets.
We have these high-level conversations about whether or not censorship on broad platforms that are nevertheless private is appropriate, and there's some value in having those conversations. But how often do we step back and ask, "are the platforms even capable of pulling off the censorship that they claim in the first place? Is Youtube even capable of censoring election misinformation without also censoring a bunch of other stuff that they didn't intend to?"
A conversation about censorship on tech platforms that doesn't take into account how clumsy these filters are in practice isn't really an accurate reflection of the reality. We have debates about whether or not private censorship on a broad scale is a good idea, but it's not even clear that any of these tech platforms are capable of doing censorship in a targeted way for any context-dependent topic.
It reminds me of how we used to have a bunch of conversations about the morality of self-driving cars making moral choices during accidents, but you zoom out and realize that even those debates are just incredibly optimistic, and in reality the cars just crash into guardrails because their sensors aren't good enough. We should not go into these conversations assuming that accurately targeted censorship of the kind that Youtube proposes even is possible at the global scale Youtube wants to operate on, we have a lot of data showing that large platforms aren't really capable of moderating with the same degree of precision as smaller platforms.
The state of our world is comically broken, and it’s getting worse by the year. I cannot fathom any viable solution to turning things around. Roger Waters album “Amused to Death” was astonishingly prescient.
If it's not nonsense, then why censor it?
The discussion has to be one of nuance, but I think back to the days of TV (lol). TV and radio had (to my understanding) laws about what you could say on air, different from what you would encounter on the sidewalk. It seems to make sense. After all, piping instructions on drinking bleach into everyone's home seems bad. Inciting riots en masse seems bad. I don't know, but it seems logical to me.
The internet is obviously different, but not different in every way, in my view. It seems like we need to be honest with ourselves. The internet is different than just yelling things on street corners. What should be allowed? I don't know. But I don't think it has the same freedoms as yelling on the sidewalk.
Not comparable.
How to drink bleach: Unscrew cap from bottle. Pour bleach down your throat.
Was that so bad?
“The media’s the most powerful entity on earth. They have the power to make the innocent guilty and to make the guilty innocent, and that’s power. Because they control the minds of the masses. The press is so powerful in its image-making role, it can make the criminal look like he’s the victim and make the victim look like he’s the criminal. This is the press, an irresponsible press. It will make the criminal look like he’s the victim and make the victim look like he’s the criminal. If you aren’t careful, the newspapers will have you hating the people who are being oppressed and loving the people who are doing the oppressing.” ― Malcolm X
An argument doesn't have to make logical sense for it to convince millions of people. Especially not if they like the conclusion.
Because people believe the nonsense and then storm the Capitol, and/or support overturning democratic elections in order to allow would-be dictators to seize power.
Misinformation can have a serious and deeply negative impact on the world, and the general public is clearly not immune to it.
In this specific debate, if "misinformation" is forbidden, whoever gets to decide what is classified as "misinformation", will control society.
Thus they censor because they cannot claim to be impartial anymore.
But there used to be platforms on the internet.
But they got threatened by the left and the right for different reasons. Censor or be broken up.
We will see more ex-platforms this year but for different reasons.
Imagine in a totalitarian regime, you would like to inform people on system critical thoughts others have been executed for, so people can avoid them.
The problem of newspeak is that most of us don't see it, until it's too late. When we feed the machine with our fears, it will cut us off.
Most of us still like each other.
There are always enough people who would enjoy the killing, if only someone would employ them to do it. And most people would probably go along with it as long as they aren't the victims. It's really sad. Our societies are very vulnerable to falling into such messes.
We have the internet, where you can rent or run your own server and storage and post your conspiracy theory videos.
I don't believe any entity should ever have the power to frame what is and is not acceptable to believe to such a large amount of the world, be it Google, Facebook or a government.
It doesn't matter whether they are shaping it politically or profitably or via AI for clicks. It isn't neutral and drowns out or outright eliminates competing views.
Broadcasters are forced to do all kinds of stuff. In Canada they have to carry a certain percentage of Canadian content.
(I don’t necessarily think it’d be the best idea for the most part though.)
Between the trio of media, tech, and government there's enormous control over what we hear and learn about. The filters they apply are powerful, yet inconsistent, and often populist, political, or irrational.
These 3 groups have this control, yet are incapable of using it to optimise for citizenry well-being.
For the same reasons that AT&T can't terminate your phone service if they hear you talking about something that's opposed to the political preferences of AT&T or its leadership. These companies have become the public square and the law needs to be updated to reflect that. If they have to become public utilities fine.
There's a big difference between a small company being coerced by the government into some action, versus the same action on a very small handful of multi-billion dollar companies that have a collective monopoly on our entire public and political discourse.
What's really bizarre is how the left is supposed to be opposed to, e.g. Citizens United, and the establishment neoliberal take is ultra-pro corporate rights and autonomy and 0 government intervention. IMHO thinking a few companies should be free to make a cooperative agreement and shut whoever they want out of communicating on these popular platforms is a radical right-wing position.
If you get enough traction, you’ll need to host a whole operation that can be managed by a full staff and serve millions of views every day. There’s now in the picture one or more platform providers, and payment processing that you depend on for your income.
A bunch of entities are now allowing you to have your soapbox, that can be swept away at any turn if you’re too much of a problem for any of them. No need to arrest you or kidnap you, “violate your rights”, they just cut your supplies.
If banning people from that counts as "censorship", then by that definition the whole world was censored two decades ago.
It's difficult when the backbone of the internet isn't available to you. For instance, AWS can deny you service, cutting off vital internet infrastructure we take for granted. Payment processors like Visa can cut off access too. Even banks could cut you off, and even seize your funds, if they got the directive.
"You can spin up your own service" is a nice fiction we tell ourselves, but with financing cut off it becomes impossible to build a service that reaches a lot of people without a lot of money and a disregard for profit.
In China maybe someone would come knocking on the door to take you away to jail. But in the US, we don't need to go that far. All it would take is the press secretary telling big tech they need to help fight "misinformation" and pointing them in the right direction.
Yeah, right up until we don't. Wishful thinking doesn't scale.
Note all the attempts in the article to paint them as "armed" and "extremists". There was no violence, no destruction of property. It was a legal demonstration until it was made illegal by invoking the Emergencies Act. And this use of the law was, in my opinion, an abuse, as the act is clearly meant for terrorist threats, not Canadian working-class people peacefully demonstrating.
EDIT>> My mistake, censorship falls clearly under Orwell
By the way, not that I would support this at all (since I believe in free speech), but any idea if any speeches from Putin or Xi, or official government accounts from either country, have been banned from Twitter or YouTube et al. like our own US president's social media accounts were? I'm guessing not likely, given that they've really done nothing about ISIS accounts in the past, or posts from those on the "left" side of the culture wars calling for and supporting outright violence in US cities or against individuals.
This is inaccurate. The precedent for removing extremist content was set in the late 2000s when Facebook, Google, and Twitter started massive campaigns to investigate, report and remove Islamic extremist content on their platforms. A lot of legitimate non-extremist Islamic content and users got caught up in the censorship. I didn't hear anything about it from free speech absolutists, though.
Search for the HN submissions reporting on G Suite for even more ideas.
https://twitter.com/glen_mcgregor/status/1495146891646013443
This is the path all this censorship will lead to if they don't stop it. It will be as bad for the modern-day oligarchs as it was for the French nobility during the revolution, if they can't control it. But the problem is that over the last year more and more trust has been eroded. All week on Twitter, Pfizer and vaccine side effects have been trending. Today, "They lied" is trending. You can see how alarming this is to some at the top.
Hypernormalization
Looked it up and found a Wikipedia article https://en.wikipedia.org/wiki/HyperNormalisation
And the video https://www.youtube.com/watch?v=thLgkQBFTPw
It's one thing to have people making false claims, but covering the former president tying his election claims to the current Ukraine crisis should not fall into that category.
When did YouTube become a Russian censor?
Perhaps the right answer is to “divide” it up and set requirements for each category of video, similar to how “YouTube kids” is a thing now. News and podcasts would go under its own section and have some level of sanity checks while home videos can be as random as they want to be.
I think we need to focus on how to make it easier to consume content from multiple platforms together. Small starts are things like web notifications (which are unfortunately abused), so that you can be notified about new videos from an author you just discovered, even if they aren't on your regular platforms. Of course RSS is an awesome version of this but needs some UX built for the average user.
The harder thing is getting curation and recommendation across platforms. But this definitely isn't impossible.
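To make the RSS point concrete: here is a minimal sketch of what the consuming side could look like, assuming a standard RSS 2.0 feed. The feed content, URLs, and the `latest_items` helper are made up for illustration; a real client would fetch the feed over HTTP and handle Atom, namespaces, and malformed XML as well.

```python
import xml.etree.ElementTree as ET

# A tiny hypothetical RSS 2.0 feed, inlined so the example is self-contained.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Creator</title>
    <item>
      <title>New video: RSS basics</title>
      <link>https://example.com/videos/1</link>
    </item>
    <item>
      <title>New video: cross-platform feeds</title>
      <link>https://example.com/videos/2</link>
    </item>
  </channel>
</rss>"""

def latest_items(feed_xml: str) -> list[tuple[str, str]]:
    """Return (title, link) pairs for each <item> in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    items = []
    for item in root.iter("item"):
        # findtext returns the element's text, or the default if missing.
        title = item.findtext("title", default="")
        link = item.findtext("link", default="")
        items.append((title, link))
    return items

for title, link in latest_items(SAMPLE_FEED):
    print(f"{title} -> {link}")
```

Because every platform that exposes a feed uses the same `<item>` structure, a client like this can aggregate authors across platforms without any per-platform API integration, which is exactly the appeal being described.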
To your point, Youtube has a strong legal incentive to make Youtube Kids succeed, and they failed. I’m not sure we can expect better for other category splits that are no less controversial.
Once you start compromising on principles, the rest is just slippery slope that we are observing.
A decade ago that would be pure heresy on any tech forum. Now it's (currently) the top comment on one of the top stories on HN.
That said, I guarantee that a government-controlled or otherwise heavily regulated platform would have far, far more instances of benign content removed for violating obscure rules. Once the consequences rise to the level of regulatory violations, the companies will pull content without thinking because nobody wants to risk the fines or worse.
These companies are rushing to self-censor their own content as a way to get out ahead of potential future regulation. Politicians and journalists can, and will, seize upon even the slightest missteps or unpopular comment to lambaste tech companies as evil. This move was definitely a mistake, but it's part of a larger movement to aggressively err on the side of safety when it comes to any potentially controversial content.
https://rumble.com/vtyr34-democrats-are-pressuring-companies...
Due to network effects there is no legitimate competition to YouTube, so YouTube needs to be regulated as a monopoly (unless maybe TikTok has started hosting longform videos instead of just people doing stupid dances).
YouTube can play politics if they want. It isn't a good thing, but life has survived CNN and Fox News and it doesn't get worse than that for biased publishing.
I think of YouTube (and many others) as akin to state media these days.
The WEIRD world is becoming increasingly more authoritarian. I’m already detached from real conflict and scarcity, knowing nothing but peace and prosperity, and the TikTok generation behind me meming about WW3 doesn’t inspire much confidence.
On the other hand, the world is probably the most united it's been in a while because of Russia so there's that.
But now you hear about it. The world you planned for was always an illusion. It never existed outside a narrow slice of time in post-WW2 peace and prosperity, and only for the thin slice of humanity that got to enjoy it. Now the hard part is: what do you do about it now that you know you were fed pleasing lies?
The world is certainly dark but I find comfort in knowing that the majority of us simply want to live peaceful lives, and watch our families prosper.
Of course mistakes happen when you stop blindly accepting everything at face value.
But other changes are for the better. And you don't need war to make these changes. If you make them before hard times come, you'll be better prepared for them: becoming more stoic (less emotional, less offended at everything), becoming more serious, more resourceful, more self-reliant, lowering your expectations, being more humble, being more careful with your words and who you speak openly with, knowing your place instead of speaking for the whole of humanity, being better able to negotiate, being better able to compromise, being more tolerant of other opinions, appreciating the wisdom of our ancestors more, and focusing more on what really matters.
Perhaps people want a more authoritarian world (without realizing it). The ratio of noise to signal has been increasing year by year. People don't want to take (nor have) the time to balance alternate viewpoints and to independently determine fact from opinion. I think the current lack of direction creates turmoil: there are too many options, too many sources of information, and the quality of all of them has gone down while they compete to pull at our amygdalas for seconds of our time. Maybe there is a higher percentage of people who just want answers from someone they trust so they can get on with their lives.
I think I've been on HN long enough to know that (re)implementing limitations on the control of information would make most of you scream. But maybe it's for the best. It's already been proven many times over that people aren't going to research facts themselves. So having fewer organizations spoon-feed choice bits of information can't be worse. Note that I am not suggesting deliberately blocking information, but rather having organizations that are more selective (i.e. exactly like it was during the newspaper era).
I think it's incredibly unlikely for the USA to become a dictatorship. We have too many checks and balances, the decision making is cordoned off into many compartments, and the country is split 50/50 on who they vote for. An authoritarian-bent president would have to fight three branches of government, 50 governors (all of whom have their own state-level army, by the way), 100 senators, hundreds of congressmen, thousands of judges and lawyers, 150 million citizens, and so on and so forth. And on top of that, manage the country on a national and international level. I just don't see it happening. Trump did not bring us anywhere closer to that reality, by the way (name one legislative/judicial change that moved the needle in that direction).
All this to say, there's nothing to be scared about. The internet has been a fun experiment but it's given a microphone to too many people, and has crumpled the gatekeepers who had the job of filtering out the diamonds from the mud (and did that job extraordinarily well, by the way, because they had to, or die). It feels like our culture has been dying. The influence artists had up until the 90s was comparatively huge. If resources were pooled to support the most talented (be it artists, journalists, architects, musicians, corporate / government leaders, etc), I think we would experience another golden era of culture.
They keep removing videos to keep their advertisers or customers happy. If they are not allowed by the government to delete videos that are not illegal, they can just blame the government.
Of course, they should not be forced to keep every video. It has to be profitable for them.
This would eliminate most of the edge cases.
The problem is trusting the algorithm to make decisions and then pretending you aren’t responsible for them.
1. At what point will the information on the internet become so obfuscated that communication and "truth" is difficult to discern from "lies", if not presently today? Consider a person using the internet for the first time in order to find more information on a current event. How many sources (and what qualifies as one) does one have to traverse to weigh what one can consider to be "truthful enough"? Does a person have enough time to sift through all the available sources? To me, this is why media outlets filter their content, in order to protect their version of "truth". News outlets have always been biased, I don't see why YouTube or Twitter or FB cannot do the same thing.
2. Democracy will always be a thin line. There must be a point of "truth" of recording votes that the population is willing to accept. If the majority consistently challenges results or insists that the results are false, then democracy cannot exist, and another form of governmental control, possibly an evolution of authoritarianism or dictatorship, will take its place. Is that more acceptable than less-than-perfect democratic outcomes?
We are living in an era where everyone is on AOL. Leaving those comforts will bring a richer experience.
This is a good thing.
Basically every media outlet mentioned here has a long history of publishing really big lies about important stuff they know to be false for monetary gain.
The Hill had to quietly oust one of their own top propagandists recently because of how badly his obvious lies about Ukraine helping to steal the US election from Trump went down with people familiar with objective reality.
Here's their own internal review:
https://thehill.com/homenews/news/483600-the-hills-review-of...
We're seeing on a big scale where this leads right now, and it is no coincidence that all these people seem big fans of Putin and dislike democracy.
And we've got all these people on HN carefully avoiding mentioning that they believe, or worse, support, these lies, while taking the pure and noble stance that we should never censor lies. Which would be a bad stance even if that itself weren't just another obvious lie for political gain.
> When reached for comment, YouTube policy communications manager Ivy Choi confirmed that the channel had been suspended for posting content in violation of YouTube’s policies. “We removed content from and issued a strike to this channel for violating our election integrity policy, and as a result, this channel is suspended from publishing new videos or livestreams for seven days,” Choi told the Daily Caller News Foundation. “We do allow for content with sufficient educational, documentary, scientific or artistic context, which the removed content did not contain.”
Wow. If you want another Trump elected, Ms. Choi, then by all means, proceed. The message here is: if you're Republican, conservative or "libertarian" (riiiight), you need to be quiet and go sit in the corner while the rest of us decide what to do with you.
Removing the dislike count was YouTube's renunciation of democratic ideals.
Now it's a media outlet just like the others.
(This feels meta: HN links to a Tampa newspaper, which links to a Twitter thread, which embeds a clip from a video released by The Hill, which contains a clip from Fox News, where one half of the audio comes from a phone line to Donald Trump. Pointers to pointers to pointers. Oh, and now you get me, talking about it.)
The framing is interesting. (I have always found it hard to pin down The Hill.)
First, what Trump said: Mostly, he tried to take credit for cancelling Nord Stream 2. This is a little weird, because it didn't seem especially cancelled. Related to this, he told an unverifiable but plausible anecdote about giving Angela Merkel a white flag, to say that she was "surrendering" to Russia: I say "plausible" because he was pushing Germany (and Europe in general) to increase its defense spending (by threatening not to defend NATO, which you could say emboldened Putin, but you could say that about other things too, like Obama's "red line". "Mistakes were made."). And at the end of the Trump clip, a little unnecessarily, they leave in one sentence from him to the effect that "this would have never happened if I'd been re-elected", which of course he expresses in terms of a supposed "steal".
Could the Hill have cut the clip just before that last sentence? Yes. They would have lost none of their main message. They would have lost something to react parenthetically to, though.
Did they include that one sentence simply because it was adjacent and it gave them some shock value? Or was this their way of smuggling the message to their (not-so-Republican) audience? ("We are ostensibly laughing at Trump, but at least this gets you to listen to him"?) I don't know.
Their reaction is one of implicit mockery. They also imply that Trump typically would also insert some talking point about his supposed healthcare plan (which never happened). Which again they laugh at, because everybody knows there was nothing there.
They also refer to Trump's supposed "defense" of Putin: From context, it sounds like Trump said that Putin's use of the word "peacekeepers" was clever. But there is a weird moment where the host repeats the word "peacekeepers" several times, almost unnecessarily. A paranoiac could say he was trying to reinforce the message.
So what is The Hill's slant? With Rising they lean towards a kind of moderate populism that I have always associated with Russian propaganda. Though that has only ever half-made sense to me: sure, Russia would want to pull support away from the more interventionist centrists ("first choice Trump, second choice Bernie, last choice Hillary"), but you'd think they'd also want to push division, and The Hill's moderate populism is actually not so inflammatory (it does not shove wedges into cracks between identities like, honestly, CNN/BBC/(CIA?) do. Or did until Biden got them to moderate themselves, a little?)
And The Hill has tended to give time to writers like Matt Taibbi who emphasize that they think "Russiagate" is fake. (Maybe they're right though?)
So then how do we put it all together?
Maybe I should just take The Hill at face value: They're trying to do an "inside baseball" thing, and this really is the compromise political position they've decided they believe in (presumably because it looks like a way forward that seems "good" to them while also protecting their interests).
Anyway, I guess the lesson for The Hill is that, if you're using sarcasm to deal with Trump's claims, you're being too subtle. And if I can write a post this long wondering what their angle is, then they're being too subtle.
Which might be necessary and true, but would still be a little sad: "We can't have interesting things because the other people are too stupid."
We've only had nuclear weapons for 70 years. We used them to blow each other up on day 1. The odds are not in our favor that nuclear warfare will be contained.
I imagine many of these same people championed the rights of private businesses to decide who they did and didn't serve when it came to the infamous cakeshop [1] yet when Youtube or Twitter or Facebook makes exactly the same choice not to be a vehicle for disinformation, those same people lose their minds.
The point here is that the majority of people complaining about this aren't standing up for principle. They just want to get their way.
Live by the sword.
[1]: https://en.wikipedia.org/wiki/Masterpiece_Cakeshop_v._Colora...
https://reason.com/2022/03/03/youtube-rising-the-hill-electi...
> YouTube has taken the position that merely acknowledging an utterance of the false claim is the same thing as making the claim yourself unless you correct and disavow it elsewhere in the video. It is also sufficient to post a warning label in the video's description that a false election claim makes an appearance.
Not saying it’s a smart policy, but just to be clear, YouTube does allow you to play a video of Trump saying the election was rigged, provided you warn viewers about it.
Why not just stick it in every video you make regardless of content if the alternative is potentially the loss of your career?
This authoritarianism is powered by the people, not by direct Government action.
What should you be fighting for? Making these platforms public utilities so using them would confirm protection of the 1st amendment.
I'm a cord-cutter and yes, I use YouTube constantly.
It obviously has a lot more than kitten and puppy videos; it's not that cut and dried.
Less educated people need to learn to trust their instincts, and be skeptical at all times.
Politics is about really simple stuff because you can’t over complicate at such a high level of abstraction.
I think there is no relationship whatsoever between education level and common sense. If anything the correlation is negative. Highly educated people don’t need common sense because they deal with complex things.
Amazing how many people support tyranny by thinking they are on the side of righteousness.
Everyone has rights, or no one has rights. Remember that.