The thing is, truth does exert a pull on our beliefs. It's a slow force. It may take years for people to come around to it. Sometimes it even happens on a generational scale. But we are approaching the truth. Everything in history, and everything in our daily experience, tells us this. A couple of experiments where researchers manage to fool the people in their studies do not disprove this overall trend.
What scares me about this narrative is that people are using it to discredit democracy. "Look how stupid people are! We have to spoon-feed them the cherry-picked facts that lead them to the right beliefs. We have to decide everything for them."
The point isn't that evidence has no power; it's that it has dramatically less power than most people think. However, there are strategies for getting us out of the personal-bias quagmire, such as the scientific method and the approach described in the article of providing an account or explanation of your position and the reasons for it. The debating rule of first explaining your opponent's position in your own words, in a form they accept as accurate, before trying to rebut it is also hugely powerful. These do seem to work and help lead us to better outcomes, so this is valuable and actually useful work.
I presented a case for why I think it is wrong. And I presented a case for why people have reason to push it despite it being wrong: it gives them more power. Considering people's motivations is important, and when someone stands to gain we should be suspicious and go over everything extra carefully.
OP: I dislike the claim because I reject the claim.
Your version of OP: I reject the claim because I dislike the claim.
I don't think that is the real "narrative" here, although reading this article may make it seem like that. The "narrative", or rather the modern scientific understanding which this article tries to present to a lay audience, is that real rational thinking is not our default, even though it seems to us that way.
But we can think more rationally. It just takes a lot more work. We can do all of the following...
* Subject our thinking to a rigorous framework such as the scientific method, in which we have to declare what evidence would falsify our argument (hypothesis) as we make it.
* Study cognitive biases to become more aware of their effects on our thinking and hopefully "immunize" our mind against some of their effects.
* Train our capacity for meta-cognition with mindfulness practices to become more aware of why we think what we think as we think it.
As for using the limitations of human rationality as an argument against democracy: I don't think that's a logical conclusion at all, since leaders and lawmakers are subject to these limitations no matter how they come into power. But it is an argument that we still need to improve the processes by which policy is decided, and that we need to watch out for and guard against those who would abuse the specific ways humans can be tricked because of these factors (such as the Cambridge Analytica crowd).
I’m not sure this works. In fact, Daniel Kahneman has said in the intro to “Thinking, Fast and Slow” that all his study of cognitive biases has led him to believe he’s powerless to stop them in himself, and is still only able to recognize them in others.
To mirror simonh's comment, it's rather ironic that you're responding to a claim of fact with an opinion. Whether it's true or not is a question of science, not wishful thinking, and you've not given a solid reason to reject the findings of this research.
It's like the way the theory of evolution remains true whether or not some nasty elements of the far right try to use it to justify an atrocious ideology like 'social Darwinism'.
> people are using it to discredit democracy
Who does this? I don't see researchers like Dan Ariely [0] lurching to the far right when they make discoveries about our psychology. (It's odd that neither Ariely nor the field of behavioural economics [1] are mentioned in the article.)
Nothing about this research indicates that non-democratic systems of government are the best way to run things after all.
> truth does exert a pull on our beliefs
Broadly speaking mankind seems to get less ignorant over time, but sometimes the pull on our beliefs can act in the opposite direction. [2]
[0] https://en.wikipedia.org/wiki/Dan_Ariely
[1] https://en.wikipedia.org/wiki/Behavioral_economics
[2] https://en.wikipedia.org/wiki/Confirmation_bias#backfire_eff...
Are they? "But we are approaching the truth. Everything in history, and everything in our daily experience tells us this." sounds like a claim of fact. Do you disagree with it? Have we not, collectively, changed our minds on a great many things?
Evolution, heliocentrism, the importance of doctors washing their hands, the non-determinism of quantum physics - all of these are a result of people changing their minds when presented with new facts. Even the importance of car safety belts and harmfulness of smoking. You are literally surrounded by evidence of people changing their minds when presented with new facts, but choose to instead focus on a few experiments where some people didn't change their minds when presented with some evidence on certain topics.
What did the article call it... confirmation bias?
It's like how boring headlines don't get voted up on Hacker News. It has to be something interesting. Facts change people's minds. Facts don't change people's minds. Both are true. But only one is interesting.
I do think too many people get blinded by that hysteresis and the desire to be 'right'.
I do agree that, over generations, the correct and truthful views tend to gain the upper hand. This arises from each generation downloading a new set of facts and learning in school, when they are young and their minds haven't yet formed their belief systems. However, if we allowed all children to attend school at their place of worship from 5 to 18, we'd find college students remarkably unwilling to learn many more facts.
So, more broadly, why does it bother you that facts don't change our minds and we're all irrational? We are Homo Sapiens, a mammalian primate who made the jump from the jungle to the Savannah and learned to work together to gather food and hunt game. We haven't left behind our animal software, it is still active in and exploited by our modern society.
I think this needs substantiation. You present this like a fact, but it looks entirely like an opinion: your interpretation of history.
It presents a sense of inevitability that I find extremely dangerous.
The only thing that keeps us from losing what we have today - as many civilizations have done in the past - is our actions. Presenting it as historical inevitability not only cheapens the meaning of our actions but also discourages people from seeing just how important active effort is.
Truths in other fields, like medicine, psychiatry, nutrition, etc., are really, really conflicting. And people build cultures and tribes around their truths, each backed by science. Get some keto people, vegetarians, and run-of-the-mill nutrition experts to sit around debating and your head will spin.
There's such complexity there: conflicting studies, poorly done studies. Finding a "truth" in how we should eat, how often, etc. is near impossible.
And that's just that subject.
If you didn't believe in ghosts before and then suddenly switched to believing in them, your entire worldview changes: the soul exists and can exert itself into reality, the afterlife is real, maybe emotions exert some real phenomenon too, and so maybe wishing makes things happen too!
That, in and of itself, would be irrational: to change your entire worldview based on single pieces of evidence, even if they do appear to be true.
1. The person learning the fact trusts the source
2. The fact can be easily proven if the person doesn’t trust the source
3. There are no other facts which provide context that are missing
These are all critical in how the general public receives “facts”.
You can't trust any major media publication, because they'll play both sides of a story. For instance, a major newspaper reports that someone is predicting a recession. If there's a recession, the newspaper will crow about how smart they are, but if there's not a recession, the newspaper will run reports about why the recession predictions were wrong, and then crow about how smart they are. What were the facts?
You can't trust any major political figure, because they have an agenda by default of existing in the political system. If there exists a fact that damages their agenda, they would lose their career if they admitted it.
A lot of the time, the "facts" that people are most angry about are actually either predictions of the future (ex: it's a 'fact' that global warming will lead to +2 deg C by 2100) or summaries of statistical models (ex: it's a 'fact' that X subgroup is n% more or less likely to earn less/more).
Even things we know are "facts" don't necessarily lend real understanding to the person believing them. Every high school physics student knows the fact that light is a particle and a wave. Does that mean they actually understand light?
How many "facts" were known to citizens in the past that we now laugh at?
You are doing a remarkably effective job at demonstrating this phenomenon.
What is the nature of this force? Where does it come from and how does it influence our minds?
I'm more cynical than that. I believe that facts do change people's minds, but most people harbor hidden agendas that they try to adjust convenient facts to while ignoring inconvenient ones.
I think it makes a lot of sense, when one is trying to identify patterns in information, that it's easy to over- or undervalue novel information. We don't necessarily know what a new fact means, so ignoring it is one common error while paying too much attention to it is another.
We also rarely even know if a "new fact" is actually true. So many studies don't replicate that it makes sense to hold off on updating core beliefs whenever "new facts" seem unlikely or in contradiction with previously known (and reliable) facts.
SSC had a nice article (now gone) that discussed this for a scientific theory that had literally hundreds of confirming studies done for it. All wrong. The "new facts" were bullshit. So even with tons of studies, it's reasonable to be skeptical in some situations.
It's also great that, eventually, science was able to figure out the "new facts" were bullshit. Yay, science. But it also means that people aren't being irrational when they don't immediately alter their fundamental beliefs while the ink is still dry, especially "new facts" that seem in contradiction with everything else we know…
I guess we just need to tune our relaxation factors [0] or, perhaps better, recalibrate our Kalman filters.
[0] https://en.wikipedia.org/wiki/Successive_over-relaxation
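The tuning intuition above can be sketched as a one-dimensional update rule, where a single gain parameter plays the role of a relaxation factor (or a fixed Kalman gain). Everything here is illustrative; `alpha` and the numbers are invented, not from the thread:

```python
# Sketch: belief updating with a tunable gain. alpha near 0 models a
# skeptic who barely moves on "new facts"; alpha near 1 models someone
# who adopts each new claim wholesale.
def update(belief, new_fact, alpha):
    """Move belief toward new_fact by fraction alpha (0 = ignore, 1 = adopt)."""
    return belief + alpha * (new_fact - belief)

belief = 0.0
facts = [10.0, 10.0, 10.0]  # repeated, consistent evidence

# A moderate updater converges toward the evidence, but slowly enough
# that a single bogus "fact" can't drag the belief all the way over.
for fact in facts:
    belief = update(belief, fact, alpha=0.5)
print(round(belief, 3))  # 8.75 after three updates
```

The trade-off is exactly the one described above: a low gain resists unreplicated noise but is slow to accept genuinely new results; a high gain does the opposite.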
1. if I have a uniform/undefined prior (how the fuck should I know how risky/conservative firefighters are?)
2. and then I'm given an anchor
3. and then told the anchor is bunk
4. the anchor still affects me
But I suspect this hinges very heavily on the fact that our initial prior is basically non-existent. By contrast, if you:
1. picked a topic where I actually have some prior belief (What country is colder: Sweden or Germany?)
2. gave me some information "Germany is actually colder on average than Sweden because of a weird atmospheric thing that affects the nordics"
3. told me that 2 was BS
I highly doubt you'd be able to replicate 4.
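The two lists above can be made concrete with a toy Gaussian update, where the weight a new claim receives depends on how confident the prior is. All numbers are invented for illustration:

```python
# Precision-weighted Bayesian update for a scalar quantity under a
# Gaussian prior and a Gaussian observation model (a standard result).
def posterior_mean(prior_mean, prior_var, obs, obs_var):
    """Blend prior and observation; weight on obs grows with prior variance."""
    w = prior_var / (prior_var + obs_var)  # weight given to the observation
    return prior_mean + w * (obs - prior_mean)

anchor = 100.0  # the claim/anchor we're handed

# Vague prior (huge variance): the anchor dominates the posterior.
vague = posterior_mean(prior_mean=50.0, prior_var=1e6, obs=anchor, obs_var=1.0)

# Strong prior (tiny variance): the anchor barely moves us.
strong = posterior_mean(prior_mean=50.0, prior_var=0.01, obs=anchor, obs_var=1.0)

print(round(vague, 2), round(strong, 2))  # ~100.0 vs ~50.5
```

With no real prior (firefighter riskiness), the anchor sticks even after being debunked; with a firm prior (Sweden vs Germany), the same trick has almost nothing to grab onto.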
Then I thought back to that Bezmenov interview with what he said about "demoralization". When a population is demoralized, they cannot discern true information when it is staring them in the face.
I think ignoring facts has less to do with some kind of esoteric psychological process and more to do with raising multiple generations to believe that they've been lied to and the whole "system" is evil.
I agree that the public is lied to. But that is usually through editorialization of headline news that omits or emphasizes convenient information for the sake of a narrative. What I’m talking about is being presented with raw information and considering it.
So why were the participants expected to change their minds? Nothing they could verify disproved their initial position.
For me all this proves is what I already knew: "garbage in, garbage out".
edit: as the comment below pointed out, this might not be a problem with the studies but with how the article tries to use them to prove its point.
On the invented studies, bear in mind that the point wasn't to measure changing the participant's mind, only for them to rate the value of a study that either supported or contradicted their initial position. Their only basis for evaluating the value of either study was their own pre-existing bias, so objectively they had no reason to evaluate them differently.
That's quite different from expecting them to change their minds, as the reasons for them holding their position might not even have been addressed by the study. For example, someone who disagrees with capital punishment on moral grounds may not care whether it is an effective deterrent, so may not have any reason to doubt a study finding that it is.
Many theories that are ridiculed deserve it.
Many ideas that are violently opposed should never see the light of day again.
Very, very few of those that reach either the first or the second stage ever make it to the third, and it is a classic logical fallacy to argue that being ridiculed implies that an idea will be proven true in the end.
Aaaaany minute....
Part of the reason “facts” don’t change our minds is that a lot of “facts” aren’t really facts like those of physics, but are rather the result of statistical games.
Finally, and I think the biggest issue, is that a lot of facts rely on trust, since they are practically impossible for the average person to fully verify. And I think, for a variety of reasons, trust has been lost. Think about vaccines. Say back in the 1950s, you probably knew or heard of someone who died from polio. Your mom might have had a sibling that died from one of the other vaccine-preventable illnesses. The doctor recommending the vaccines was seen as a trusted friend. He (it was usually a he back then) probably spent his whole life in your town. He knew your grandparents. Maybe he delivered your parents. He would spend hours at the bedside of a sick child or a dying grandparent. Maybe he delivered your children as well. Now when he says that he recommends you give your child this vaccine, you are going to listen.
Now forward to modern times. You book your appointment. You go to the office where you wait for hours. The pediatrician comes in and rushes through a 15 minute visit. Says your kid should get vaccinated. On the way home you listen to an investigative report of how doctors are paid by big pharma to prescribe drugs. By the way, you have never heard of anyone you know getting one of these vaccine preventable illnesses.
Now the gap between the educated elites and regular people in this country is widening. They don't interact much socially. They don't even live near each other. In the United States, the non-college-educated have seen a steady decline in their real wages and well-being. Of course they are going to distrust “facts” put out by an elite who are seen as out of touch.
I say this as someone who totally believes in vaccines and has persuaded many of my friends that they should have their children vaccinated. The growing gap between the rich and poor in this country is at the root of many issues.
Facts are closely related to statistics. It’s possible for a statistic to be both true and a complete lie at the same time.
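A concrete way a statistic can be true and misleading at once is Simpson's paradox. The numbers below are the classic kidney-stone treatment data (success counts out of patients treated), shown here only as an illustration:

```python
# Simpson's paradox: treatment A wins in EVERY subgroup, yet B wins overall,
# because A was disproportionately given the harder (severe) cases.
groups = {
    "mild":   {"A": (81, 87),   "B": (234, 270)},  # (successes, patients)
    "severe": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, patients):
    return successes / patients

for name, arms in groups.items():
    assert rate(*arms["A"]) > rate(*arms["B"])  # A better in each subgroup

# Pool the subgroups: sum successes and patients across groups per arm.
tot_a = tuple(map(sum, zip(*(arms["A"] for arms in groups.values()))))
tot_b = tuple(map(sum, zip(*(arms["B"] for arms in groups.values()))))
print(rate(*tot_a) < rate(*tot_b))  # True: the aggregate flips the conclusion
```

Both the subgroup claim ("A is better") and the aggregate claim ("B is better") are arithmetically true; which one gets quoted is the statistical game.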
Abusive people will often use “facts” to control victims. You learn to be very mistrustful after a while.
According to the article they have many times, yes, it describes many examples of similar experiments along these lines.
This evolutionary function of reason, and the resulting flaws in our implementation of it, supports my belief that in the grand scheme of things we are actually only just barely sentient. That is, we're at the very lowermost bound of the set of possible intelligences capable of technological civilisation. I think this because, well, we only just recently evolved enough intelligence to actually do it. If we'd become intelligent enough earlier, we'd have done it earlier.
If that's true then sure, it would be natural to expect that our reasoning powers are still impaired by flaws and fallacious tendencies. The scientific method then is a procedural set of rules we've invented to prevent our naturally somewhat irrational tendencies to mess up our ability to determine accurate actionable information. Yay us!
That can explain non-movement of opinion when presented with contrary fact, but not movement away from the fact. The article here notes the experiment when students were presented with dueling articles on capital punishment: the ambiguous data acted to bolster their original position no matter the original stance.
A lack of trust in authority is one thing, but to use the authority's agreement with your pre-existing opinion to determine trust in that same evaluation is inherently circular -- even if it is human.
> Thousands of subsequent experiments have confirmed (and elaborated on) this finding.
That’s a lot to unpack.
Yes, vaccines have very low concentration of compounds of those metals.
Yes, those metals at the concentrations in vaccines can cause minor side effects, though there is no evidence of serious side effects.
But, more to the point, vaccines more broadly than those ingredients occasionally cause serious injury. This is a rare but known risk.
> I not allowed to sue about that
It is true that one cannot sue in the US over vaccine injuries, but presented alone in this context that is misleading to the point of dishonesty, since there is an alternative compensation program (one where, unlike in regular court, you can be awarded costs and fees even if you aren't eligible for compensation for actual harms; in court, even if you win you are still out legal costs unless you prove particularly egregious conduct).
https://www.hrsa.gov/vaccine-compensation/index.html
That's because the “medical establishment” doesn't view vaccines as perfectly safe; rather, it views them as safe in the same sense as other prescription medicines. Further, the public health establishment (the relevant part of the government) feels there is sufficient public health benefit from vaccination that even harms which would not be compensable for other approved drugs are compensable on a no-fault basis for vaccines, to encourage their use.
> and no one in the medical establishment would acknowledge that the only cause of my daughters sickness could have been the vaccines.
If no experts would agree with your claims of causation, then allowing you to sue would just be allowing you to incur a bunch of costs to no end. The most likely explanation for no expert agreeing that the one thing you've focussed on is the cause is that there is no evidence for the claim of causation.
> There is propaganda about vaccines being harmless
There may be somewhere, but it's not coming from the “medical establishment” or the government, both of which acknowledge that there are both the common minor and less common severe harms from vaccines.
I don't think it's the only way to change people's minds, and I hesitate to dive into "just employ emotional reasoning", as that seems dangerous.
From personal experience, another effective way to change people's minds is by giving them "skin in the game".
I've tried, over the years, to convince friends of the solution to the Monty Hall [2] problem. After explaining the solution and them either not believing it or not understanding it, I then play the game with them with 100 doors and revealing 98 after the first pick. Once this game is played a couple times, they understand the solution much more readily.
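The 100-door version of the game can also be checked with a quick simulation (a hypothetical `play` helper, written just for this sketch). The key observation: after the host opens 98 goat doors, switching wins exactly when your first pick was wrong:

```python
import random

def play(n_doors=100, switch=True):
    """One round of the generalized Monty Hall game: the host opens all
    losing doors except one after the first pick."""
    car = random.randrange(n_doors)
    pick = random.randrange(n_doors)
    if not switch:
        return pick == car
    # The host leaves closed: your pick plus one other door, which holds
    # the car whenever your pick missed. So switching wins iff you missed.
    return pick != car

trials = 100_000
wins = sum(play(switch=True) for _ in range(trials))
print(wins / trials)  # ~0.99 with 100 doors
```

Seeing the win rate sit at 99% tends to do what verbal arguments don't.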
My take on this is that they suddenly have a personal stake in the game, even if it's weak. There's a personal cost that takes the form as social shame or loss aversion, even for a game that's played between friends with no money involved, that gives them a stake. Once they start wanting to actively avoid losing, they're much more willing to listen to reason.
The article points out that our anti-rational behavior is at odds with survival, but I would bet there's a level of abstraction below which our survival-minded rationality kicks in, and above which we don't have enough of a stake in the answer to use our rationality to good effect.
I've thought about this too on my own strong feelings. The more I know about something, the more I understand its nuances, pros and cons, etc, the less I feel strongly about it. Now when I spot myself with a strong feeling about something I try to remind myself that I'm most likely missing something.
We see this constantly in the dev world. Younger devs feel very strongly about languages, libraries, frameworks, etc, probably because they have a shallower understanding of the thing.
Mostly people want to validate their intuition and gut feelings and don’t want to experience the discomfort of finding out that their intuition is not magically correct.
Why they didn't at the time: https://news.ycombinator.com/item?id=13810764
The study found that facts do indeed change people's minds, just not as much as we'd like, because the initial impression sets expectations. Cialdini talks about this in some of his books on persuasion.
"When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration."
And:
"(They can now count on their side—sort of—Donald Trump, who has said that, although he and his wife had their son, Barron, vaccinated, they refused to do so on the timetable recommended by pediatricians.)"
The thing with studies like this is that they're used by people on the losing side of elections to start complaining about "low information voters", with the subtext being "If only everyone was as clever as me and all my friends who think the same, then [thing I disagree with] would never win elections." Ironically, this also lets them avoid any introspection as to whether they may have lost because there are defects in their policy positions.
It's pretty well backed up by evidence (and, honestly, by attending a Trump rally) that the average Trump voter is less educated, much more prone to misinformation, and simply holds a ton of trivially wrong beliefs about the state of the world.
That's without making a value judgement about the voters or saying they shouldn't have their vote (which of course they should, since there's no qualification requirement for voting in a democracy), but it seems silly to pretend that such a thing as an uninformed group of voters does not exist, or even cannot exist because it would be offensive.
Autocrats and corrupt leaders have banked on them throughout all of history, and measured, intelligent, truthful discourse is not always found in the majority. If we're concerned with truth, then "they keep losing elections" or might-makes-right style arguments hold no value; in fact, they're quite dangerous.
This is just as true of your "average" Democrat. The "average" person is woefully misinformed about most things. It's probably safe to say that nearly everyone, myself and the majority of the HN crowd included, is misinformed about many things that aren't critical to our day to day life.
It wasn't necessary, however it gave the authors the opportunity to test in just one line if the summary was true, and I guess it worked.
I also don't want politics injected into scientific topics, but the role of politicians is to rule for people's good and to speak with extreme caution and responsibility because of the trust people give them. When a high-profile politician says "this is good", a lot of people will follow the advice blindly, so when a politician puts people's lives at risk by claiming, for example, that Hydroxychloroquine works as a cure for the Coronavirus (to date, at least one dead and one intoxicated after following that advice), it's politics actually harming lives with dangerous information, which makes it everyone's duty to inject common sense back into the debate. If only because scientists don't have the same exposure, and it becomes so hard or even impossible for them to undo the damage done by clueless politicians who talk about things they don't know squat about.
BTW. I would have the same exact opinion even in the case it was Obama or Clinton doing what Trump did.
In what way did it work?
> I also don't want politics injected into scientific topics, but the role of politicians is to rule for people's good and to speak with extreme caution and responsibility because of the trust people give them. When a high-profile politician says "this is good", a lot of people will follow the advice blindly, so when a politician puts people's lives at risk by claiming, for example, that Hydroxychloroquine works as a cure for the Coronavirus (to date, at least one dead and one intoxicated after following that advice), it's politics actually harming lives with dangerous information, which makes it everyone's duty to inject common sense back into the debate.
I haven't encountered many officials who have consistently spoken with "extreme caution and responsibility" on Hydroxychloroquine, or anything related to this pandemic really. As far as I can tell, it is unknown whether Hydroxychloroquine is or is not effective in treating covid patients (there are severe limits on our ability to know many things), but the vast majority of reporting I've been exposed to the matter has a very strong propaganda odour to it.
https://edition.cnn.com/2020/07/02/health/hydroxychloroquine...
https://i.redd.it/cppndepg1s851.jpg
Personally, I now start from the default epistemic position that anything said in the media is untrue, but the degree and manner in which that is the case is unknown, and that is the part of the claim that should receive significant mental attention (which parts are objectively untrue, misrepresented (cherry picked, deliberately framed), and what noteworthy "facts" are suspiciously absent). Rare is the news story these days where nothing sets off my suspicion.
Because it's the New Yorker. Facts are optional, bias is required.
Given that it is discussing the results of actual psychological studies (that I have seen talked about in a number of other places), it is vanishingly unlikely that it is in some way intended to study anyone's gullibility.