“Return your sword to its place, for all who take up the sword will die by the sword.” – Matthew 26:52
Fans of Stephan Lewandowsky have learnt that an awful lot can be ‘explained’ simply by invoking the cognitive bias of your preference. To demonstrate the power of such a strategy, I recently shared with you my Climate Skeptic’s Guide to Cognitive Biases. Although at times tongue-in-cheek, the guide nevertheless had a serious point to make – whichever side of a debate you may choose to take, you can always appeal to cognitive bias to put your opponents in their place. Unfortunately, the fact that my guide included over sixty biases meant that I couldn’t go into too much detail for any single one. So I have decided to return to the subject, but this time to concentrate upon a single bias to further illustrate the point.
The bias I have selected for this purpose is the Backfire Effect, which I have chosen for two main reasons. Firstly, the effect is deemed to be the quintessential vice of the climate sceptic. Secondly, accusations of the Backfire Effect appear to be Lewandowsky’s favourite barb, since they form the pretext for his notorious Debunking Handbook. I hope you’ll agree it would be remiss of me to write about this effect without engaging in a bit of Lewandowsky baiting.
The Backfire Effect Redux
The first thing to remind you about the Backfire Effect is that it is easy to summarise: When people are confronted with information that refutes their previously held beliefs, they respond by strengthening those beliefs rather than relinquishing them. This is a surprising outcome, and so one should expect it to be difficult to explain. However, explaining the effect turns out to be very easy; the challenge is getting any two psychologists to agree upon the explanation. After just the briefest of internet searches I was able to find the following offerings.
People respond to refutation by strengthening their beliefs because they:
- Feel they are being persecuted
- Are inappropriately self-confident due to consensus fallacy
- Are emboldened by a supportive availability cascade
- Are demonstrating a reactive confirmation bias
- Are over-enthusiastic in their self-affirmation
- Are reacting to a perceived threat to their self-interest
- Exhibit bravado in the face of embarrassment
- Suffer from a cognitive deficit1
- Suffer cognitive dissonance and so fall back on belief bias
- Reject the refutation as being part of the conspiracy
- Succumb to cognitive laziness
- Suffer from biased assimilation
- Are desperate to avoid the identity crisis that an abandonment of their belief would entail
- Are reacting negatively to refutation overload
- Subconsciously respond to a perverse form of the availability cascade simply by hearing their own myths repeated during the refutation
With so many explanations on offer, one has to wonder whether the pundits are all talking about the same thing. The Backfire Effect is beginning to look like the catch-all explanation for any situation in which a debate didn’t go the way someone expected. Moreover, one should keep in mind that the Backfire Effect was confirmed in controlled experiments that were conducted by a discipline that has a meagre 39% success rate when it comes to reproducibility.2 That doesn’t mean that I rule it out as the potential cause of an individual’s perverse intransigence, but my level of trust in the psychological explanation is such that I find it perfectly plausible that sometimes the real explanation might be one that isn’t actually on the psychologists’ list. Maybe the backfire happens simply because the refutation isn’t actually a refutation.
The Truth, the Whole Truth, and Nothing But
Whenever examples of the Backfire Effect are discussed, the two ‘wacko’ groups that are invariably cited are the anti-vaccination campaigners and the climate change ‘deniers’. So this is the deal: Any belief that a climate sceptic may have can be taken, a priori, as incorrect. This, of course, means that any counter-argument to their belief is, necessarily, a refutation. Any resultant strengthening of the belief, therefore, has to be a perfect example of the Backfire Effect. The sceptic is debunked but mindlessly carries on in a state of delusion.3
At least, this is how psychologists will see it. The problem, of course, is that the Backfire Effect only applies when dealing with an axiomatic truth, but when it comes to climate science, one wonders how psychologists (who, let’s face it, are no more expert on climate science than I am) are able to identify such truths. This is a vitally important point, because if you are in the business of peddling axiomatic truths, then you had better be certain of your facts. Which brings me back to Professor Lewandowsky.
The Debunking Handbook
Lewandowsky thinks he knows why he can say that the climate science uncertainties are bogus, and he thinks he knows why climate sceptics are therefore deniers in disguise. This much is evident from reading his Debunking Handbook, which carefully explains the reasons for the Backfire Effect and the best strategies for overcoming it. The reasons, incidentally, are the last three mentioned in my list given above. The strategies are basically: When dealing with your denier, provide the “core facts” that debunk the myth before referring to the myth you are debunking; immediately prior to mentioning the myth, make sure you explicitly warn that you are about to reveal a falsehood; and make sure you leave the denier with the correct belief, in order to fill the gaping hole you have just created by your debunking.
Helpfully, Lewandowsky provides examples, one of which is the debunking of the ‘myth’ that there are still fundamental uncertainties that are preventing a meaningful consensus within the climate science community. In keeping with his debunker’s strategy, he opens with his “core fact”:
“97 out of 100 climate experts agree humans are causing global warming.”
Of course, no citations are offered to support this statement, although it might as well be said that none of the candidate papers that come to mind are above criticism, particularly those produced by the Debunking Handbook’s co-author, John Cook. It should also be pointed out that Lewandowsky’s preoccupation with consensus figures betrays a basic misunderstanding of how science works. However, most importantly, this statement of “core fact” fails to recognise that it is not causation but the degree of attribution that troubles most sceptics. Nor should we overlook that Lewandowsky blithely ignores the sociological factors that seriously undermine the validity of the consensus. This all means that the supposed core fact is far from factual and anything but core. So anyone who thinks it serves as an axiomatic truth capable of debunking a myth is being seriously optimistic.
The Backfire Effect – A Case Study
So here is how the Backfire Effect works on this occasion:
You read a document that professes to demonstrate the best way of debunking your own views. It provides examples. However, in the very first example you look at, you come across a statement that is supposed to serve as a straightforward refutation but is, in reality, highly contentious. Concerned, you look into the author’s other works and you come across a paper titled, “Conspiracist Ideation as a Predictor of Climate-Science Rejection”. You then discover that its findings are authoritatively disputed.4 You don’t presume that the disputation is valid but it looks pretty damning. Then you discover that the same author has produced a paper titled, “Recursive fury: Conspiracist ideation in the blogosphere in response to research on conspiracist ideation”. This paper takes the supposedly misplaced criticisms of one of his previous papers5 as further evidence that the accusations he had made in that paper were valid. You look into it and quickly discover that this argument blatantly begs the question, and so you wonder how such pseudo-scientific nonsense got through peer review. This just adds to your misgivings. Nor does it help to reflect upon the high profile that the author has amongst the pro-CAGW pundits, or indeed amongst his own profession. As a result of this experience, you are left even more sceptical than you were at the outset.
Bingo! It’s the Backfire Effect! You’re a cognitively challenged conspiracy theorist.
I’m sure I am as guilty as anyone of succumbing to the Backfire Effect. But next to Stephan Lewandowsky I am an amateur. Lewandowsky has taken a humble and commonplace cognitive bias and out of it created a thing of beauty. The more you get frustrated by his ill-formed arguments and motivated reasoning, the more this strengthens his belief. He has created for himself a meme that includes the idea that frustration with the meme provides evidence of its validity. This is a recursive delusion. Lewandowsky purports to be an expert on the psychological pathology that lies behind the Backfire Effect, and yet in his own hands he has elevated the effect to the status of an all-encompassing but ultimately sterile logic.6 So, if anyone can be said to have shot himself in the foot, then it has to be everyone’s favourite psycho-warrior, Stephan Lewandowsky.
1 I think in this instance ‘cognitive deficit’ is being used as a euphemism for stupidity.
2 A ‘Reproducibility Project’, undertaken by the journal Science, found that only 39% of the results of experimental and correlation studies published in three prominent psychology journals could be replicated.
3 And before you start, I’m not using the term in the narrow sense used by psychiatrists. There is no presupposition of mental illness; just a false belief that is held strongly enough to serve as a motivation.
4 Dixon R., Jones J., “Conspiracist Ideation as a Predictor of Climate-Science Rejection – An Alternative Analysis”, Psychological Science, March 26, 2015.
6 Expressed in doxastic logic’s terminology, Lewandowsky is a Conceited Reasoner regarding matters of climate sceptic argumentation, including his assumption that climate sceptics thrive upon the capacity to be Peculiar Reasoners. A Conceited Reasoner is defined as one who believes that his beliefs are never mistaken; writing Bp for ‘the reasoner believes p’, he believes Bp → p for every proposition p. And a Peculiar Reasoner is one for whom, for some proposition p, both Bp and B¬Bp hold: he believes p while simultaneously believing that he does not believe p.