Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted.
From a story in the New Yorker.
Why Facts Don’t Change Our Minds
New discoveries about the human mind show the limitations of reason.
Excerpts:
“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”
Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted. In this case, the failure was “particularly impressive,” since two data points would never have been enough information to generalize from.
The Stanford studies became famous. Coming from a group of academics in the nineteen-seventies, the contention that people can’t think straight was shocking. It isn’t any longer. Thousands of subsequent experiments have confirmed (and elaborated on) this finding. As everyone who’s followed the research—or even occasionally picked up a copy of Psychology Today—knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does right now. Still, an essential puzzle remains: How did we come to be this way?
In a new book, “The Enigma of Reason” (Harvard), the cognitive scientists Hugo Mercier and Dan Sperber take a stab at answering this question. Mercier, who works at a French research institute in Lyon, and Sperber, now based at the Central European University, in Budapest, point out that reason is an evolved trait, like bipedalism or three-color vision. It emerged on the savannas of Africa, and has to be understood in that context.
…
Consider what’s become known as “confirmation bias,” the tendency people have to embrace information that supports their beliefs and reject information that contradicts them. Of the many forms of faulty thinking that have been identified, confirmation bias is among the best catalogued; it’s the subject of entire textbooks’ worth of experiments. One of the most famous of these was conducted, again, at Stanford. For this experiment, researchers rounded up a group of students who had opposing opinions about capital punishment. Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.
The students were asked to respond to two studies. One provided data in support of the deterrence argument, and the other provided data that called it into question. Both studies—you guessed it—were made up, and had been designed to present what were, objectively speaking, equally compelling statistics. The students who had originally supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing; the students who’d originally opposed capital punishment did the reverse. At the end of the experiment, the students were asked once again about their views. Those who’d started out pro-capital punishment were now even more in favor of it; those who’d opposed it were even more hostile.
If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias. Imagine, Mercier and Sperber suggest, a mouse that thinks the way we do. Such a mouse, “bent on confirming its belief that there are no cats around,” would soon be dinner. To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats—the human equivalent of the cat around the corner—it’s a trait that should have been selected against. The fact that both we and it survive, Mercier and Sperber argue, proves that it must have some adaptive function, and that function, they maintain, is related to our “hypersociability.”
Mercier and Sperber prefer the term “myside bias.” Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.
…
Steven Sloman, a professor at Brown, and Philip Fernbach, a professor at the University of Colorado, are also cognitive scientists. They, too, believe sociability is the key to how the human mind functions or, perhaps more pertinently, malfunctions. They begin their book, “The Knowledge Illusion: Why We Never Think Alone” (Riverhead), with a look at toilets.
Virtually everyone in the United States, and indeed throughout the developed world, is familiar with toilets. A typical flush toilet has a ceramic bowl filled with water. When the handle is depressed, or the button pushed, the water—and everything that’s been deposited in it—gets sucked into a pipe and from there into the sewage system. But how does this actually happen?
In a study conducted at Yale, graduate students were asked to rate their understanding of everyday devices, including toilets, zippers, and cylinder locks. They were then asked to write detailed, step-by-step explanations of how the devices work, and to rate their understanding again. Apparently, the effort revealed to the students their own ignorance, because their self-assessments dropped. (Toilets, it turns out, are more complicated than they appear.)
Sloman and Fernbach see this effect, which they call the “illusion of explanatory depth,” just about everywhere. People believe that they know way more than they actually do. What allows us to persist in this belief is other people. In the case of my toilet, someone else designed it so that I can operate it easily. This is something humans are very good at. We’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history. So well do we collaborate, Sloman and Fernbach argue, that we can hardly tell where our own understanding ends and others’ begins.
“One implication of the naturalness with which we divide cognitive labor,” they write, is that there’s “no sharp boundary between one person’s ideas and knowledge” and “those of other members” of the group.
This borderlessness, or, if you prefer, confusion, is also crucial to what we consider progress. As people invented new tools for new ways of living, they simultaneously created new realms of ignorance; if everyone had insisted on, say, mastering the principles of metalworking before picking up a knife, the Bronze Age wouldn’t have amounted to much. When it comes to new technologies, incomplete understanding is empowering.
Where it gets us into trouble, according to Sloman and Fernbach, is in the political domain. It’s one thing for me to flush a toilet without knowing how it operates, and another for me to favor (or oppose) an immigration ban without knowing what I’m talking about. Sloman and Fernbach cite a survey conducted in 2014, not long after Russia annexed the Ukrainian territory of Crimea. Respondents were asked how they thought the U.S. should react, and also whether they could identify Ukraine on a map. The farther off base they were about the geography, the more likely they were to favor military intervention. (Respondents were so unsure of Ukraine’s location that the median guess was wrong by eighteen hundred miles, roughly the distance from Kiev to Madrid.)
Surveys on many other issues have yielded similarly dismaying results. “As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write. And here our dependence on other minds reinforces the problem. If your position on, say, the Affordable Care Act is baseless and I rely on it, then my opinion is also baseless. When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration.
“This is how a community of knowledge can become dangerous,” Sloman and Fernbach observe. The two have performed their own version of the toilet experiment, substituting public policy for household gadgets. In a study conducted in 2012, they asked people for their stance on questions like: Should there be a single-payer health-care system? Or merit-based pay for teachers? Participants were asked to rate their positions depending on how strongly they agreed or disagreed with the proposals. Next, they were instructed to explain, in as much detail as they could, the impacts of implementing each one. Most people at this point ran into trouble. Asked once again to rate their views, they ratcheted down the intensity, so that they either agreed or disagreed less vehemently.
Sloman and Fernbach see in this result a little candle for a dark world. If we—or our friends or the pundits on CNN—spent less time pontificating and more trying to work through the implications of policy proposals, we’d realize how clueless we are and moderate our views. This, they write, “may be the only form of thinking that will shatter the illusion of explanatory depth and change people’s attitudes.”
One way to look at science is as a system that corrects for people’s natural inclinations. In a well-run laboratory, there’s no room for myside bias; the results have to be reproducible in other laboratories, by researchers who have no motive to confirm them. And this, it could be argued, is why the system has proved so successful. At any given moment, a field may be dominated by squabbles, but, in the end, the methodology prevails. Science moves forward, even as we remain stuck in place.
Full story here
One of the most significant findings related to human development is the discovery of neuroplasticity by brain scientists. Whenever we receive new information, the brain makes new connections to process that information. If the information is useful or provides a better way of doing things, it will become the dominant pathway and old patterns will fade away through disuse.
“A man hears what he wants to hear and disregards the rest” is the biggest impediment to using the unlimited growth the brain is capable of. Confirmation bias is a natural function of the primitive brain, which seeks understanding of the world through its infinitesimal range of personal experience. The correlations it makes to create this understanding become the foundation for circular reasoning. Reality then becomes a projection based on limited experience rather than an objective witnessing of how cause and effect brings everything into being. Most people accept what they hear without critical thought, so their realities can easily be programmed by others. Alarmist scientists don’t understand that they are stuck in a closed loop of circular reasoning.
Byron Katie has developed a methodology for questioning assumptions and getting past the projections of mind to clearer states of consciousness. A key component of “The Work” is the turnaround. After questioning beliefs to see if they can be proven true, you try believing the opposite to get a different perspective.
https://thework.com
Most commenters here already possess critical thinking skills and have trained their brains to question assumptions. That is what separates fantasy from reality.
As many here understand, what it’s all about as much as anything is control – control of others’ lives and property. And forcing them to behave as they should – for their own good, of course.
And of course, the rules apply only to the peasants, not to the elite rulers whose opinion of the peons is always quite revealing…
Royal messenger: Sire, the people are revolting!
King: You said it, they stink on o r!
They stink on ice…dang autocorrect!
So, I guess this little article that I just wrote will have no effect on a lot of people, most notably my would-be target audience:
https://hubpages.com/politics/Climate-Change-Reality-Review-for-City-State-and-National-Leaders
Facts may not change our minds, but they change history. Climate alarmists try to ignore the essential practicality of science and its close empirical connection to its subject: reality. Alarmists publish modeling study after modeling study and never do controlled experiments. In terms of social evolution, alarmism is a dead end, because it is a waste of time: it ignores facts, or corrupts them one-sidedly to manufacture a scare.
No pseudoscience has staying power. I think it’s more likely alarmists will move on to some other scam. Consider that 30 years ago the Linear No-Threshold model reigned supreme as dose-response theory. Today it is shot to pieces, and its Nobel-winning creator stands accused of scientific fraud, with few brave enough to defend him!
From following this blog in its fight against climate science bullshit, I have gained the necessary skepticism to see through political bullshit. I wonder how many other readers of this blog are Trump supporters, as I am.
Limitations to reason?
Kind of like confirmation bias?
https://tambonthongchai.com/2018/08/03/confirmationbias/
Whilst it is true that my explanation of the reason for a belief that Jesus is the son of God was a bit off topic, the point that I was trying to make was that this 2000-year-old “Faith” is a perfect example of Confirmation Bias.
I hope that I did not upset too many people about what I myself “Believe”.
I was only 12 years of age when I started to wonder why people went to church.
MJE VK5ELL
Re. Meteorologist in research, and Christ floating up to heaven. I would suggest that you bring up the book “Christ died in Kashmir”. The book is badly written but thought-provoking.
MJE VK5ELL
I’m still waiting for some evidence. I’ll drop my denialism as soon as reproducible evidence supporting falsifiable conjectures is forthcoming.
The alarmist scientists need to act like real scientists to become believable and “taken seriously”. Few do.
None of the CAGW/CC predictions have come true so far…not even close.
We’ve only had to suffer from a slightly milder climate. Weather events haven’t gotten worse. Warming is almost non-existent…aside from the recent recovery from the LIA.
I think Mercier and Sperber just found the results they wanted to find, and their conclusions thus are false. Nothing can dissuade me from this conclusion, either.
My impression: anyone over 60, like me, knows our grandchildren have been dumbed down by a global public education curriculum (creating ideologically assimilated students), by planned search engine results tailored for them, and by social media profiling that plans what they see in their search results and feeds… while their parents, our own children, working to put food on the table, weren’t controlling their own children’s access to the media-delivered propaganda. Anyone in my age group likely knows this was a simple strategy of intentional profiling to launch a sneak attack upon our grandchildren, encouraging them to disrespect the lessons of the past (even their parents and grandparents) for a Utopian quick-fix stupidity of climate koolaid, which they have drunk oh so willingly to fit in on their peer bandwagon… (age-typically) predictable tactics of propagandists. Our grandchildren march to the drum of irrationality because they have no interest in knowing history, even of 50 years ago, even ours, their grandparents’. (I posted this above, as a comment to another post, so it is duplicative.)
Once upon a time, long ago, we in school were taught History. I myself loved it as it was fascinating to hear what had occurred before we ourselves were ever born.
But one of the big things about history is that it does tend to repeat itself, so if one really studies it, it can be very rewarding.
My mind always did tend to jump around somewhat, and it still does. So, for example, a study of the transportation of goods leads first to pack animals, then to horse- or cattle-drawn wagons, then to canal building and the one horse pulling the barge. This was followed by railways, at first just the tracks and horse-drawn wagons; then, when steam engines became smaller, the locomotive changed everything.
This is one of the big things I find when using a computer: one thing leads to another, and so on. One can learn a vast amount just by moving through any subject.
Remember the old saying “Use it or lose it”. This applies to one’s physical condition, like walking instead of driving a car for a short distance, but it’s most important in regard to using the brain.
As one gets older, the usual aches and pains come with it. Yes, we hate them, but as long as the brain still works, be thankful.
MJE VK5ELL
This article is such a pile of garbage that it is hard to know where to start to refute it, and why would one want to refute a pile of garbage? It is what it is. Anyway, personal experience tells me that not only can I change my mind when the facts don’t fit my theory, but I have done so repeatedly in my life. A pertinent example is Anthropogenic Global Warming – I was once a true believer. Then, in spite of all the dire predictions, it never showed up. Then people in my fields of study started publishing handwaving garbage that exposed their, and the reviewers’ and editors’, ignorance of history and logic. Then came Climate Gate and, although I’d already left the CAGW bus, I finally understood what was going on. Maybe undergraduates at Yale don’t know how a toilet functions – I didn’t when I was an undergraduate, but I knew how to stuff one up. I now know the basics and do minor repairs, but when it gets serious I still call a plumber: Australian toilets are complicated and I am not under the misapprehension that I am a plumber. This article is all about social junk science. The good news is that anyone who believes it will have a hard time using this misinformation to control my life, so let them continue to feel superior to the masses and think we are too dumb to see through them.