I have noticed this. I've also noticed many areas where I agree with the vast majority (e.g. "smoking is unhealthy"). If you look carefully, you've committed a subtle form of selection bias: "many people with opposing views" limits the opinions to controversial ones, which people tend to hold for various "irrational" reasons (such as group belonging or ethics).
I may not have been clear, and I think we mostly agree: I don't think people choose their opinions through pure data, logic, and reason. But those do play a bigger part than the article implies, and the article greatly overstates the strength of those experiments. If you have time, let's look at them closely, starting with the one about the death penalty:
> Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.
There are so many confounding factors it's hard to choose where to begin:
1: The topic should immediately raise suspicion: it's highly emotionally charged, instead of something boring like the optimal tire pressure for road safety. That means we probably can't generalize the result to more boring topics. But it's exactly the topic I'd pick if I wanted a surprising result.
2: Despite what the sentence carefully implies but does not state, deterrence is probably not even close to the main reason someone would favor or oppose capital punishment, so someone is unlikely to change their opinion based on it. But intellectual laziness can result in someone not examining why they truly hold their opinion and picking an easy reason instead, as long as they think the data supports it. This perhaps supports "we hold some opinions for irrational reasons" or "we're not honest with ourselves about why we hold some opinions", but it has little bearing on whether we change our minds in the face of new evidence.
3: The students didn't enter the study as blank slates, perfectly naive and willing to believe whatever some study told them. They could easily have been exposed to many prior studies and plenty of word of mouth claiming capital punishment does/doesn't work as a deterrent. And like the perfectly rational agents versed in Bayesian statistics that they are, they examined this new study in light of their prior data and accepted it, or discarded it as an outlier. If the study had claimed regular baths in bleach improve skin health, would we expect them to start bathing in bleach? Then why are we surprised when they are equally skeptical of studies on capital punishment? Yes, it's confirmation bias, but so is dismissing bleach-bath studies. And after all, confirmation bias requires there to be some facts that confirm our beliefs; everything after that is the difficult task of judging who is credible.
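To make the Bayesian point concrete, here's a toy sketch (my own numbers, not from the study or the article) of how a single contrary study barely moves an agent with a strong prior, as long as the agent knows studies are sometimes wrong:

```python
# Toy Bayesian update: an agent with a strong prior that the deterrent
# effect is real sees one study claiming the opposite. All probabilities
# below are made up for illustration.

def posterior(prior, p_contrary_given_true, p_contrary_given_false):
    """P(hypothesis | one contrary study) via Bayes' rule."""
    num = prior * p_contrary_given_true
    return num / (num + (1 - prior) * p_contrary_given_false)

prior = 0.9  # strong prior from earlier studies and word of mouth
# Assume a contrary study still appears 30% of the time when the
# hypothesis is true (noise, bad methodology), vs. 80% when it's false.
updated = posterior(prior, 0.3, 0.8)
print(round(updated, 2))  # -> 0.77: belief drops modestly, not to ~0
```

Under these (made-up) assumptions, rationally updating on one contrary study moves the belief from 0.90 to about 0.77, which from the outside can look exactly like "ignoring the evidence".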
The study shows motivated reasoning in an instance where we hold a belief for different or irrational reasons, but not the implied immunity to facts.
The firefighter study is also interesting. This time, instead of choosing a controversial topic, they gave the participants barely any data to work with. It would have been so much simpler if participants were first shown study A, which says risk-taking firefighters save 50% more lives, and then told "that was made up; study B is the real one", with study B saying risk-taking firefighters save 50% fewer lives. But that's not what they did. Instead, they gave a single data point, Frank, who was or was not put "on report" for unspecified reasons, and it's not even clear that being "on report" means he's less successful, rather than being like the stereotypical detective who has to turn in his gun and badge because the commissioner is upset he's digging into powerful people. Then, after participants have had time to think up some plausible reason why risk-taking is/isn't good, even that single data point is taken away. The participants, left with no data, simply kept their old beliefs. Facts didn't change their minds because they had no facts. Yes, the correct thing to do would be to revert to "I don't know", but 1) we don't know if that was even an option in the study, and 2) aversion to agnosticism falls far short of "facts don't change our minds". Let me also mention this sneaky wording:
> Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,”
But the evidence hasn't been "refuted". No new, more credible evidence, from which we would draw the opposite conclusion, was presented; the evidence was simply removed. Just one of the many ways in which the article tries to overstate its case.
The suicide note study is very similar. There's an elaborate song and dance, but in the end the students are again left with no data and asked to make some guess about that data.
The last study, with the reasoning problem, is also overstated. Let's start with "fewer than fifteen per cent changed their minds in step two": let's go with 14%, despite the author's attempt to imply it was lower. "About half the participants realized what was going on. Among the other half, suddenly people became a lot more critical. Nearly sixty per cent now rejected the responses that they'd earlier been satisfied with." So, "about half" of "nearly sixty per cent"; let's go with 58/2 = 29%. In other words, 71% either reasoned so consistently that they could identify the deception, or came to the same conclusion again despite being told their past self came to the opposite conclusion. This study does show that people scrutinize someone else's argument more closely than their own, but it doesn't show how big the effect is, beyond the coarse bound of 29%-58% of people switching sides in some apparently somewhat ambiguous 'reasoning' task.
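For anyone who wants to check the arithmetic above, here's a back-of-the-envelope version, taking the article's figures at face value ("about half" read as 50%, "nearly sixty per cent" as 58%):

```python
# Back-of-the-envelope check of the numbers quoted above.
did_not_notice = 0.50       # "about half" didn't spot the swap
rejected_own_answer = 0.58  # "nearly sixty per cent" of those switched

switched = did_not_notice * rejected_own_answer
print(round(switched, 2))      # -> 0.29: at most ~29% switched sides
print(round(1 - switched, 2))  # -> 0.71: ~71% stayed consistent or caught it
```

So even reading the article's own numbers generously, the "critical of your own past answer" effect tops out at under a third of participants.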
Let's recap. The capital punishment study shows people rationalizing opinions probably held on ethical grounds. The firefighter and suicide studies show people keeping old opinions in the absence of any concurring or opposing evidence. The last study is hard to draw any straightforward conclusion from, but yes, it shows people will engage in motivated reasoning, though at most about half the time; at least half the time they are consistent. And every study except the last had to throw lots of emotion and ambiguity into the mix to squeeze out an irrational result. I honestly think not pointing this out borders on deception.
So I don't "disagree" with any of the studies, or think that people run on pure logic. But the studies are far more limited than the article wants to admit, and even "why facts often don't change our minds" way overstates it.