Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted.
From a story in the New Yorker.
Why Facts Don’t Change Our Minds
New discoveries about the human mind show the limitations of reason.
Excerpts:
“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”
Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted. In this case, the failure was “particularly impressive,” since two data points would never have been enough information to generalize from.
The Stanford studies became famous. Coming from a group of academics in the nineteen-seventies, the contention that people can’t think straight was shocking. It isn’t any longer. Thousands of subsequent experiments have confirmed (and elaborated on) this finding. As everyone who’s followed the research—or even occasionally picked up a copy of Psychology Today—knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does right now. Still, an essential puzzle remains: How did we come to be this way?
In a new book, “The Enigma of Reason” (Harvard), the cognitive scientists Hugo Mercier and Dan Sperber take a stab at answering this question. Mercier, who works at a French research institute in Lyon, and Sperber, now based at the Central European University, in Budapest, point out that reason is an evolved trait, like bipedalism or three-color vision. It emerged on the savannas of Africa, and has to be understood in that context.
…
Consider what’s become known as “confirmation bias,” the tendency people have to embrace information that supports their beliefs and reject information that contradicts them. Of the many forms of faulty thinking that have been identified, confirmation bias is among the best catalogued; it’s the subject of entire textbooks’ worth of experiments. One of the most famous of these was conducted, again, at Stanford. For this experiment, researchers rounded up a group of students who had opposing opinions about capital punishment. Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.
The students were asked to respond to two studies. One provided data in support of the deterrence argument, and the other provided data that called it into question. Both studies—you guessed it—were made up, and had been designed to present what were, objectively speaking, equally compelling statistics. The students who had originally supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing; the students who’d originally opposed capital punishment did the reverse. At the end of the experiment, the students were asked once again about their views. Those who’d started out pro-capital punishment were now even more in favor of it; those who’d opposed it were even more hostile.
If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias. Imagine, Mercier and Sperber suggest, a mouse that thinks the way we do. Such a mouse, “bent on confirming its belief that there are no cats around,” would soon be dinner. To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats—the human equivalent of the cat around the corner—it’s a trait that should have been selected against. The fact that both we and it survive, Mercier and Sperber argue, proves that it must have some adaptive function, and that function, they maintain, is related to our “hypersociability.”
Mercier and Sperber prefer the term “myside bias.” Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.
…
Steven Sloman, a professor at Brown, and Philip Fernbach, a professor at the University of Colorado, are also cognitive scientists. They, too, believe sociability is the key to how the human mind functions or, perhaps more pertinently, malfunctions. They begin their book, “The Knowledge Illusion: Why We Never Think Alone” (Riverhead), with a look at toilets.
Virtually everyone in the United States, and indeed throughout the developed world, is familiar with toilets. A typical flush toilet has a ceramic bowl filled with water. When the handle is depressed, or the button pushed, the water—and everything that’s been deposited in it—gets sucked into a pipe and from there into the sewage system. But how does this actually happen?
In a study conducted at Yale, graduate students were asked to rate their understanding of everyday devices, including toilets, zippers, and cylinder locks. They were then asked to write detailed, step-by-step explanations of how the devices work, and to rate their understanding again. Apparently, the effort revealed to the students their own ignorance, because their self-assessments dropped. (Toilets, it turns out, are more complicated than they appear.)
Sloman and Fernbach see this effect, which they call the “illusion of explanatory depth,” just about everywhere. People believe that they know way more than they actually do. What allows us to persist in this belief is other people. In the case of my toilet, someone else designed it so that I can operate it easily. This is something humans are very good at. We’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history. So well do we collaborate, Sloman and Fernbach argue, that we can hardly tell where our own understanding ends and others’ begins.
“One implication of the naturalness with which we divide cognitive labor,” they write, is that there’s “no sharp boundary between one person’s ideas and knowledge” and “those of other members” of the group.
This borderlessness, or, if you prefer, confusion, is also crucial to what we consider progress. As people invented new tools for new ways of living, they simultaneously created new realms of ignorance; if everyone had insisted on, say, mastering the principles of metalworking before picking up a knife, the Bronze Age wouldn’t have amounted to much. When it comes to new technologies, incomplete understanding is empowering.
Where it gets us into trouble, according to Sloman and Fernbach, is in the political domain. It’s one thing for me to flush a toilet without knowing how it operates, and another for me to favor (or oppose) an immigration ban without knowing what I’m talking about. Sloman and Fernbach cite a survey conducted in 2014, not long after Russia annexed the Ukrainian territory of Crimea. Respondents were asked how they thought the U.S. should react, and also whether they could identify Ukraine on a map. The farther off base they were about the geography, the more likely they were to favor military intervention. (Respondents were so unsure of Ukraine’s location that the median guess was wrong by eighteen hundred miles, roughly the distance from Kiev to Madrid.)
Surveys on many other issues have yielded similarly dismaying results. “As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write. And here our dependence on other minds reinforces the problem. If your position on, say, the Affordable Care Act is baseless and I rely on it, then my opinion is also baseless. When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration.
“This is how a community of knowledge can become dangerous,” Sloman and Fernbach observe. The two have performed their own version of the toilet experiment, substituting public policy for household gadgets. In a study conducted in 2012, they asked people for their stance on questions like: Should there be a single-payer health-care system? Or merit-based pay for teachers? Participants were asked to rate their positions depending on how strongly they agreed or disagreed with the proposals. Next, they were instructed to explain, in as much detail as they could, the impacts of implementing each one. Most people at this point ran into trouble. Asked once again to rate their views, they ratcheted down the intensity, so that they either agreed or disagreed less vehemently.
Sloman and Fernbach see in this result a little candle for a dark world. If we—or our friends or the pundits on CNN—spent less time pontificating and more trying to work through the implications of policy proposals, we’d realize how clueless we are and moderate our views. This, they write, “may be the only form of thinking that will shatter the illusion of explanatory depth and change people’s attitudes.”
One way to look at science is as a system that corrects for people’s natural inclinations. In a well-run laboratory, there’s no room for myside bias; the results have to be reproducible in other laboratories, by researchers who have no motive to confirm them. And this, it could be argued, is why the system has proved so successful. At any given moment, a field may be dominated by squabbles, but, in the end, the methodology prevails. Science moves forward, even as we remain stuck in place.
Full story here
I noted the comment about Trump, as well as others. Sad; the author is a victim of her own subject. Maybe she needs to reconsider whether Trump is really a plus or a minus, and why. The main point, however, is that the author does not mention another, possibly far more cogent, reason for confirmation bias in our mental makeup: it reduces the information-processing load. Without it we would be constantly reviewing and re-analysing past situations and issues, drowning in a morass of data. By employing it we can dismiss these already-decided issues, reduce the amount of data we have to process, and move on. In most cases, whether right or wrong, leaving the original decision standing does little obvious harm. In rare cases it does visible harm, particularly personal harm, and when that becomes obvious people do indeed start to reassess their earlier stance. Confirmation bias may seem to be a negative, but paradoxically it is not; it is a way of maximising our ability to interact with the world around us.
I’ve found that beyond showing someone that they are wrong, it also helps to show them that they’ve been misled by deliberately false and deceptive information. This allows them to shift the blame for their incorrect knowledge to someone else, which makes it easier for them to accept the truth: they are relieved that they weren’t wrong, they had just been given bad information. Doesn’t always work, but sometimes worth a try.
My mind WAS changed by the January 24, 2012 post by Robert Brown on this site regarding Stable Thermal Equilibrium Lapse Rates.
https://wattsupwiththat.com/2012/01/24/refutation-of-stable-thermal-equilibrium-lapse-rates/
Overlooked in the study is the fact that the world is complex. No matter which side of an issue we’re on, we’re pretty good at spotting flaws in arguments coming from the opposing side.
Where is Robert Brown, by the way? He used to comment occasionally.
Yep, here’s how well confirmation bias worked with me: in late 2007 I believed manmade climate change was real, and went online looking for the actual facts about it in order to be able to argue factually. I was actually looking for the evidence for what I already believed was true. It totally backfired, I’m glad to say, but it was shocking at the time.
Now that they have convinced themselves that they can’t change the minds of those who disagree with them, the next step is to just mandate it and ignore the complaints of those being squashed.
Indeed, there are signs of stonewalling, deaf ears, and talk of regulating, defunding and censoring.
That’s the last resort when opposition needs to be squashed so that political goal-setting can proceed.
IMO there are several reasons that minds can’t be changed in certain people. Number ONE of those reasons is that the opposing evidence comes from a source that the individual has already judged to have an agenda they view as negative. Usually that agenda is fed to people via propaganda, and most people are easily deceived. The problem is that there are large swaths of the population that lack critical thinking skills and came up short on the “logic” stick. Otherwise, if someone gives you data that you can see for yourself is false, you are less likely to believe anything they say in the future.
The KEY to persuading another person is to be perceived as lacking an agenda. In order to be perceived as lacking an agenda, you have to gain the individual’s trust, so that you can disprove, in that individual’s mind, the opposing party’s attempts to define your agenda.
Take Climate Change, for example. The Alarmists have successfully, but falsely, framed the skeptics as having an agenda for the fossil fuel industry. The Alarmists put out reports that so-and-so received money from Big Oil, and thus their research is tainted. Likewise, many on the skeptical side have convinced their minions that the Alarmists’ research is connected to an alternative agenda of Socialism and Globalism. THUS, the minions of both sides, who lack critical thinking skills and have no desire to discover the correct position for themselves, are entrenched in their positions based on their perception of the other side’s propaganda and agenda.
The logical person doesn’t believe either side’s propaganda and thus goes to the source documents and the data, becomes knowledgeable about the subject, and comes to their own conclusions. This is the case for most climate skeptics: they do not take the word of any “tangent authority” on climate, but rather go to the source information and assess the actual body of evidence. Unlike the Alarmists, the Skeptics do not run from a debate, have no desire to change the peer review process, invite opposing views for discussion, and acknowledge that all information is relevant. As such, they are much more informed and not easily deceived by the propaganda of the Alarmists; rather, they raise important questions about the body of evidence, like: why did you adjust that data, and was it warranted? How can you put forth a model that fails to perform as evidence? Why would you propose X action item when your own data show it will have no effect? And more.

It is the recognition of provable inconsistencies that makes the case for the skeptics to the logical mind. BUT, unfortunately, the skeptics are not connected to a group with any kind of mechanism to spread the truth of an issue, and by default they are lumped in with the “deniers” as having an agenda for fossil fuels, even though in truth they have no agenda other than to reach the truth of a matter.
I think the #1 reason is “Never argue with a man whose job depends on not being convinced.”
Something fishy in the air:
Cognitive science shows that humans are smarter as a group than they are on their own
By Philip Fernbach, April 18, 2017
Cognitive scientist and professor of marketing, University of Colorado’s Leeds School of Business
https://qz.com/960175/cognitive-science-shows-that-humans-are-smarter-as-a-group-than-they-are-on-their-own/
So stuffy groupthink is their game?
It gets even smellier: the NYT Editor’s pick:
THE KNOWLEDGE ILLUSION: Why We Never Think Alone, by Steven Sloman and Philip Fernbach. (Riverhead, $28.) Two cognitive scientists argue that not only rationality but the very idea of individual thinking is a myth, and that humans think in groups. That’s not necessarily bad news, writes Yuval Harari in our review: “Our reliance on groupthink has made us masters of the world.… From an evolutionary perspective, trusting in the knowledge of others has worked extremely well for humans.”
A good flush is needed here.
“Cognitive science shows that humans are smarter as a group than they are on their own”: this concept came up in a management class I took. NASA ran several experiments on how people in groups come to decisions and how to get the best decision, using problems like the Challenger disaster and the O-rings.
The bottom line was that an experienced expert in the problem area virtually always arrived at the best solution, and was the fastest. A well-organized but less knowledgeable team mostly came up with acceptable or good solutions, but took considerably longer and only rarely found the optimum solution.
From the article: “When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration.”
The “information” Trump’s supporters ignored was the obvious lies told about Trump by the Left and the Leftwing Media, because they saw them as lies, and it turns out they were absolutely right. The correct position for a conservative to have is to never take anything said by the Left at face value. Proof is required in all instances. Leftwing assertions are not good enough.
It’s the Left and the Leftwing News Media who are ignoring all information that contradicts their delusion that Trump is really Adolf Hitler and needs to be resisted at all costs. Talk about Groupthink! You can’t call it confirmation bias because there is nothing to confirm their bias. It’s mass psychosis on the part of the Left.
LOL, this article is so typical of the smug left wing. They start with an argument that has some merit and then twist it into a piece of disharmonious nonsense. Yet they seem to have no idea that their original argument describes themselves better than those they are trying to denigrate. In other words it all boils down to, “I know you are, but what am I?”
When I was young I looked at controversies and often picked a side. As I got older I saw evidence that the side I had picked was correct, or sometimes switched sides when I decided I was wrong. At first I thought that many political positions were equally valid but were the result of differing backgrounds or preferences. I am almost ashamed to say that it took me until I was over 50 to realize that some political positions were just plain WRONG and created bad or evil results.
When global warming was first suggested, I looked into it and quickly discovered that it was a flawed theory. As information about it came out I looked at the papers and found that the pro-warming papers were either seriously flawed or were being misrepresented (they didn’t demonstrate the proof of the theory at all). Now I only take a quick look to see if there’s anything new (there never is).
Call it “confirmation bias” if you want but I’m not going to throw away a couple of decades or more of actual examination and thinking about a topic on the basis of one stray claim.
You only need to convince the people who matter, those who allocate resources and make decisions. If needed, persuade them with lawsuits. That’s how the believers do it.
“As a rule, strong feelings about issues do not emerge from deep understanding.” That one statement was my take-away from the article, and it confirms what I’ve experienced: it is hard to tell someone they are stupid without evoking anger.
As far as the Trump dig goes, take it with a grain of salt; I’ll always remember Obama’s hit on Sarah Palin about $2 gas…
The concept of depth of understanding is something you can use in a discussion with most people, provided you yourself are prepared. People love an invitation to explain things to you and if they begin to question their depth of knowledge, that makes them more willing to productively engage.
Why limit changing of minds to the climate? In World War 2, both Germany and Japan kept fighting in a hopeless situation. Fortunately for Japan, Emperor Hirohito had brains. For a more recent example, look at Venezuela and North Korea.
Excerpts from published article:
The silly claim by Hugo Mercier and Dan Sperber that “reason is an evolved trait” is asinine and delusional, and is therefore proof positive that what they are referring to as “reason” is, in actuality, a nurtured trait: a subject-dependent mental trait nurtured by one’s environment.
“DUH”, faulty thinking … HUH.
“YUP”, just as “ugliness is in the eye of the beholder”, so too “faulty thinking is in the eye of the beholder”.
And for everyone who identifies the “faulty thinking” of another person, that other person identifies the “faulty thinking” of their accuser. And there ya have it: “confirmation bias” fighting “confirmation bias”, and the winner is the one with the microphone.
The above published article is nothing more than lefty liberal anti-Trump agitprop that cites and quotes the opinions of several “psychobabblers” in order to convince the Democrat partisans of its legitimacy.
To wit, the response “confirmation bias” version:
“If your position on, say, the Russian collusion between Trump’s 2016 presidential campaign team and Moscow is baseless and I rely on it, then my opinion is also baseless. When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Congressional Democrats and their partisan proponents.”
Similarly, some years ago I read a BBC article titled “Why do people vote for things/people that are against their own interests?” It was a simply breathtaking display of BBC arrogance: the assumption that the world is full of plebs who need to be told what their best interests actually are, presumably with the BBC doing the telling about who you should vote for.
Seems like an oversimplification. Consciously or not, the interviewee almost certainly makes extra assessments about how the topic will affect them personally, adding extra layers of complexity to the problem and making the answers very context-dependent.
This seems to be borne out by the questions about Ukraine: the further away it was thought to be, the less likely it seemed that military intervention would have any immediate consequences for the individual.
There is clearly some logic in this and some idiocy. The explanations of why we are this way probably have some merit but, like most psychological research, the conclusions go way beyond the data, which is itself another symptom of the phenomenon the authors discuss. I cover this tendency to think by belief rather than by logic in talks I have given on the scientific process and the mistakes made by faulty reasoning. I fully agree that most people, most of the time, are thinking by reflex, not analysis. Using reason and objective testing of hypotheses is the rare exception that allowed modern society to evolve as it has.

The conclusion that the Trump administration is a result of faulty reasoning is the most egregious claim in this piece and shows that the authors allowed their own political convictions to color what should be a purely reasoned discussion. While accessible, effective, equitable and affordable care for all may be a very worthwhile goal, the Affordable Care Act is not a sustainable or highly successful answer to that goal, and no US administration to date has actually come up with a solution that reaches it.
Almost by definition, psychological science goes beyond the data. Physics, as at the Large Hadron Collider, looks for data with “nine 9’s” significance; psychologists are blown over if they can get a 20% confidence statistic. At that level there really isn’t much to discuss. All the data is at the “well, maybe?” level.
A must read: How to Sell a Pseudoscience by Anthony Pratkanis. https://tinyurl.com/yaznu9zb
It’s short and outlines how people become true believers. The first few steps:
1) Create a phantom
2) Set a rationalization trap
3) Manufacture source credibility and sincerity
4) Establish a granfalloon
5) Use self-generated persuasion
wow, that is an almost perfect step-by-step description of AGW
They write: “If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration.”
…and with that rather unscientific statement, we can see that the ENTIRE story is nothing but anti-Trump propaganda. By inference they seek to smear one side of politics as deficient. They never stop trying, these people.
Socialism is the ideology of deceit. Everything they say is designed to deceive. They’re not hard to spot though.
All this fancy analysis sounds good, but you can get the same information from taking an entry-level sales course. I have said it here many times: it’s Sales 101. You don’t sell the steak, you sell the sizzle. People make decisions on emotion and “what’s in it for me”. You can start with the facts, but until you get them to the emotional level of “wow, that’s good for me” you cannot make the sale. Any salesperson knows that you have to find the hot button that connects with the buyer. It is how the left wing has operated for decades, and it works. The general population, who are mostly ignorant of the science of climate, give themselves a big pat on the back for helping the cause, or at least for being concerned about it. The emotional satisfaction they get is their hot button. And the use of images of “helpless” animals is far more effective than charts and graphs as a connection to that emotion.
OK, so in your or my own mind, we may make rational decisions but to sell those ideas to others, FIRE UP THAT STEAK and let it SIZZLE. Lucky for me, my sizzle (Liberty) is supported by much of the data so I can be both emotional and rational.
It’s how all successful politicians Right and Left have operated since forever. Pushing buttons is far easier and more effective than “rational” persuasion.
There is tons of room in science for myside bias. As an example, consider the measurement of the charge of the electron. Millikan came up with his number, and then over time the accepted value drifted ever higher until it reached its current value. No one wanted to come up with a number that was too far off from what was expected, and so the measurement of the charge of the electron took far longer to settle than it needed to, as each researcher biased their findings toward the expected value.
This is rich: the author, Elizabeth Kolbert, engages in exactly the behavior she gives as the reason facts don’t matter…
We are talking about the New Yorker ….
77% of the readership is left of center ….
She puts in a dig at Trump … because that is what 77% of her readership already believes.
No facts needed ….
Hey Elizabeth … I’d suggest the reason facts don’t matter to your readership is that you never give them any. Try it: facts may just have a bigger impact than you thought.
Interesting story. Scientists lie to people, show them bogus studies and the people don’t fall for it. So the scientists conclude people are irrational for not believing them. That wasn’t my conclusion.
When you have a strongly religious community of people, they will always label unbelievers as “bad people”.
AGW is a religion. Anyone that dares to disagree or challenge the Canon is a “bad person”.
This is simply human behavior. People who have never been educated in critical thinking just follow their instincts. Faith cannot be questioned as long as a strong community of zealots supports, monitors, and polices its members.
“A man convinced against his will,
Is of the same opinion still.”
― Dale Carnegie, How to Win Friends and Influence People
I have seen this repeatedly as I present the science to staunch believers in the global warming religion. They can’t dispute the evidence, but stubbornly cling to their fundamentalist faith.
Is there anything more entertaining than reading a psychologist on confirmation bias and watching him demonstrate it in sentence after sentence?
“If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration.”
Or climate scientists who dismiss observation, and insist their models are valid?
If the whole human race suffers from the malady, it’s disingenuous to blame one side of a debate for having it.
Thank you Anthony, for this post.
“If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias. Imagine, Mercier and Sperber suggest, a mouse that thinks the way we do. Such a mouse, “bent on confirming its belief that there are no cats around,” would soon be dinner.”
This is an ignorant idea! That is not the way we think. Confirmation bias does not apply equally to all human experience. It is primarily associated with negative experience and is a perfect mechanism for survival. Our cognitive ability is built on pattern recognition, and the patterns built from negative experiences are far stronger and more prevalent in our thinking than the patterns built on positive experiences.
For example, if we walk down a dark alley 10 times in a row without incident, we may develop a thought pattern that such activity is fairly safe. But if we are mugged the eleventh time we do it, we may never feel safe in an alley again, no matter how many times we safely walk that path afterwards. The thought pattern will extend in our minds and we will be leery of any dark place where someone could be hiding. If the mugger was wearing a ski mask, the sight of anyone in a ski mask may cause us great anxiety for decades to come. The same thing does not happen with positive experiences, because positive experiences are not key to our survival.
In other words, confirmation bias is far stronger when there was trauma or even the threat of trauma.
That is why the threat of hobgoblins (H. L. Mencken) is a far more powerful political tool than the promise of efficient government. Trump derangement syndrome is a prime example of this confirmation bias. The left has created a story of impending trauma if Trump remains President (a powerful hobgoblin), even though the reality is quite the opposite. Everything that Trump says or does is just a confirmation that he is a threat to those with this story.
Similarly, many have been brought up on the idea that the greatest threat to the Earth is mankind. Consequently, anything that humans do is seen through this lens and becomes confirmation bias that supports the claim. The threat of a climate change crisis is born entirely of this manipulation of the human mind to build confirmation bias around negative experience. To minds afflicted with this unsubstantiated paradigm, the positive things that humans do are inconsequential at best and completely invisible at worst. The benefits of constant, cheap electricity are simply lost on a mind that believes fossil fuels will destroy the planet. Facts indicating that man-made climate change and increasing CO2 are beneficial cannot be comprehended by a mind so conditioned; they do not fit the trauma-induced paradigm.
As a student of history, my paradigm is that governments with unrestrained power are the most dangerous thing to my health and well-being. There is ample evidence to support this paradigm: it has happened over and over again throughout human history, and continues to happen to this day. Yet I recognize that my libertarian leanings are not always the most logical. I also recognize that confirmation bias is just as strong in those with a different paradigm, even if they do not have observational evidence to support their story. As Sloman and Fernbach wrote: “As a rule, strong feelings about issues do not emerge from deep understanding…” Correct! They emerge from trauma or the threat of trauma.
Pattern recognition is perhaps the greatest strength of the human mind, but the emphasis given to the patterns generated by negative experience, or even the threat of negative experience, leaves us susceptible to manipulation. The Precautionary Principle, which is neither precautionary nor a principle, is actually an unsupported confirmation bias that humanity is the existential threat to itself and the world. It is used as a tool of manipulation, and has no rational foundation to support it.
Our confirmation bias around traumatic events (real or anticipated) is of key importance to our survival, but it needs to be understood for what it is and subjected to constant, rational review. Otherwise, it can and will be used against us. There is plenty of confirmation bias happening here at WUWT, but there is also a constant, rational review of the climate crisis story that I have not found anywhere else. WUWT is a light in the darkness of the traumatized mind.
Thank you, James. A succinct response which confirms my own beliefs in this matter… (oops!) Negative reinforcement inducing fear is an extreme emotive power, possibly strong enough to overcome instinctive behaviour. Yet we see that in animal training (dogs, horses, etc.) positive reinforcement can have remarkable results. Possibly, in our modern “civilised” form of Homo sapiens, we have lost most of the inborn fears of being without food, clothing, shelter, warmth and societal structure, so those commodities have lost their reward factor for modern man: we no longer have to seek them out and get the feel-good factor when we find them. That leaves us only with the negative reinforcement which seems to have taken over so much of our lives.