Guest essay by John Ridgway
I don’t know about you, but I am getting pretty fed up with psychologists proclaiming the irrationality of climate change scepticism. They eagerly seize every opportunity to hurl accusations of cognitive bias which, strangely enough, only ever seems to afflict those who take issue with the consensus view.
Well, I think it is high time that someone redressed the imbalance. So, I offer here my own commentary on the common cognitive biases and how they relate to the climate change controversy. In so doing, I hope to demonstrate how easy it is to conjecture upon a group’s psychological state and how easy it is to turn the tables and place the advocates of the Catastrophic Anthropogenic Global Warming (CAGW) hypothesis under scrutiny. The result may be so much psychological flimflam but I consider it no less worthy than the dubious speculation emanating from the supposed experts and the IPCC.
For my reference point, I have taken Wikipedia’s list of cognitive biases. The psychologists have had a field day inventing (sorry, discovering) biases, and I believe I have identified no fewer than 61 that the CAGW hypothesis supporters should be wary of. As a result, this guide is a bit longer than your average WUWT article, but I am hoping that even a partial reading will be sufficient to pick up the general theme.
For simplicity, the list is presented below in alphabetical order:
Ambiguity Effect
When there is insufficient information to determine objective probabilities, individuals assume the risk is higher, even though there is no basis for doing so. This is the thinking behind the application of the precautionary principle, where risks with so-called ‘unknown probabilities’ are used to justify policy. The experts who warn of CAGW are clearly ambiguity averse since they readily invoke the precautionary principle. And yet IPCC’s AR5, chapter 2, tendentiously claims that ambiguity aversion doesn’t affect experts.
Anchoring
When evaluating information, prior to making a decision, the first information encountered has an undue influence upon one’s deliberations. This will be the case irrespective of the position one finally arrives at regarding CAGW. CAGW’s believers cite anchoring as an explanation for the sceptics’ irrational denial of CAGW’s emerging evidence. Meanwhile, they fail to appreciate how the same irrationality would lie behind their own intransigence.
Some (but by no means all) of the sentiment that fuels environmental concern is engendered by the Mother Earth trope. Eco-psychologists condemn sceptics for having a disconnect with such thinking, but actually it is the eco-psychologists that are embracing a false paradigm. Having long ago made peace with the absurdity of human existence, I have little room for such sentiment (frowny face).
Authority Bias
This is simply the appeal to authority. If there were to be no such thing as Authority Bias then the IPCC would be out of a job. In fact, so much of the advocacy behind CAGW is built upon Authority Bias, it is a wonder that anyone needs actual evidence. Nevertheless, respect for authority can go spectacularly wrong, as it did recently when the world’s greatest authority on all things, Professor Stephen Hawking, chose to share his expertise on planetary physics by predicting that anthropogenic CO2 emissions are destined to result in Venusian conditions. Those who knew no better were deeply alarmed. The rest of us squirmed with embarrassment.
Automation Bias
An over-reliance upon automated systems may result in erroneous automated information being accepted uncritically. This results in a tendency to overlook or downplay the ‘Garbage In, Garbage Out’ syndrome in relation to computerised systems. Take, for example, the evaluation of the output of computerised climate models. The public is expected to be impressed because these models are very clever, very complicated, and very computerised.
Availability Heuristic
When assessing likelihoods one tends to be unduly influenced by information or events that directly relate to current or recent personal experience. IPCC’s AR5, chapter 2, uses this to explain the sceptics’ irrational denial of CAGW evidence. Instead of following the science, we look out of the window and declare, “Look, it’s snowing. So much for global warming”. However, clearly this cuts both ways, and the CAGW hypothesis supporters never tire of using topical weather conditions to support their case. For example, every passing hurricane is well and truly milked for its availability.
Availability Cascade
This is the phenomenon by which collective belief in a proposition gains increasing plausibility simply through public repetition. Despite the best efforts of the BBC and the Guardian, this effect has not proven as effective as the psychologists say it should have been. This has led to much head-scratching and a great deal more funded research into the reasons why this should be so. Some CAGW hypothesis supporters claim that it is the sceptics’ own availability cascade that is doing the damage, but this explanation would require the sceptics to have a much higher public profile than they actually enjoy.
Backfire Effect
This is the reaction to discomforting evidence by strengthening one’s previous beliefs. Sceptics, apparently, do not respond well to inconvenient truths. Citing the Backfire Effect, CAGW hypothesis supporters claim that it is of no use providing sceptics with evidence for CAGW; the stronger it gets, then the more they will dispute it. Ultimately, irrefutable and self-evident proof will be met with abject denial. This may be the current assessment of the situation according to the CAGW faithful, but it is also psychological hogwash. No matter what one believes, one will always have a rationale for doing so.
Bandwagon Effect
I don’t think this needs any explanation. Both sides have equal rights to cite it in their defence, although the CAGW bandwagon does seem to enjoy better sponsorship.
Belief Bias
When assessing the logical strength of an argument one is invariably influenced by previously held beliefs. Some would say this is an awful indictment of our inbuilt prejudices. Others will simply point out that it is a natural and unavoidable feature of the learning process. Either way, Belief Bias cannot be cited as an explanation for irrational CAGW scepticism unless one also accepts that it underpins CAGW hypothesis advocacy.
Bias Blind Spot
This is the tendency to see bias in other people’s thinking but not in your own. It is, of course, the premise of this article. The psychologists seem to be looking only for biases that undermine the legitimacy of CAGW scepticism. This is itself a bias.
Bike-shedding
Bike-shedding is the informal but more colourful term for Parkinson’s Law of Triviality. It gets its meaning from the observation that committees, faced with the difficult job of agreeing on the design of a nuclear power station, are likely to spend an inordinate amount of time poring over the details of the bike sheds. Why? Because it’s the easy bit. Far be it from me to suggest that the details of the Paris Accord demonstrate that the world’s governments are currently engaged in a monumental Bike-shedding exercise.
Confirmation Bias
Once we have decided upon an initial position, we all concentrate upon seeking out confirmatory evidence and, when faced with ambiguous evidence, choose to interpret it in a way that reinforces our preconceptions. This is a universal cognitive bias and there is no point in trying to suggest that CAGW sceptics are unusually prone to it. On the contrary, sceptics are by nature distrustful of belief systems – even their own.
Congruence Bias
When assessing evidence it is always important to appreciate that more than one hypothesis may be supported by it. In such circumstances, the only way to determine the correct hypothesis is to search for evidence that discounts one or the other. The problem, however, is the temptation to be satisfied simply because one already has ‘sufficient’ evidence to support the chosen hypothesis. You may then be blissfully ignorant that your hypothesis is actually wrong. Confidence in your hypothesis does not come from merely passing a test. What matters is how hard you tried to make your hypothesis fail the test and how hard you searched for better alternatives. So this is the question: Just how hard is the IPCC trying?
Conservatism (Belief Revision)
We all update our beliefs conservatively in the light of new evidence. This means that the degree of modification is less than would be expected if one accurately applied Bayes’ theorem. The suggestion is that the effect may be due to Anchoring, although I think this is not so much an explanation as, perhaps, a re-stating of the phenomenon.
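The shortfall is easy to put numbers on. Here is a minimal sketch, with a hypothetical prior and likelihoods, of the posterior that an accurate application of Bayes’ theorem would demand; the conservative belief reviser settles somewhere short of it:

```python
# Illustrative sketch only; the prior and likelihoods are hypothetical.

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) via Bayes' theorem."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Evidence three times likelier under the hypothesis should move a 50%
# prior to 75%; a conservative belief reviser stops short of this.
posterior = bayes_update(prior=0.5, p_e_given_h=0.6, p_e_given_not_h=0.2)
print(round(posterior, 2))  # 0.75
```

Conservatism, in this vocabulary, is simply landing somewhere between the prior and the computed posterior.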
Although conservative belief revision is a universal bias, it is worth mentioning that CAGW scepticism is more prevalent within politically conservative individuals than with left-wing thinkers. This should come as no surprise. The whole point of conservatism is to favour the known present over the unknown future. As a result, the conservative mind demands a higher quality of evidence before committing to change.
Continued Influence Effect
Even after information has been discredited, it tends to have an influential effect on one’s thinking. Basically, mud sticks. I would like to suggest that the continued, pernicious influence of the discredited Hockey Stick graph is a good example but, in truth, it isn’t. Many of those who continue to cite the Hockey Stick are simply unaware that it has been discredited. The remainder continue in their efforts to corroborate the Hockey Stick because its removal of the global Medieval Warm Period was just too good a result to abandon. Such efforts fall within the province of Belief Bias, with the discrediting of the Hockey Stick having been readily dismissed in the manner of the Backfire Effect.
Courtesy Bias
This is the tendency to express an opinion that is socially acceptable rather than one that reflects one’s true beliefs. I don’t think any CAGW sceptic can be accused of this but, given the pressures placed upon climate scientists to conform, one has to wonder to what extent Courtesy Bias explains the much vaunted 97% consensus. As far as climate science is concerned, perhaps Courtesy Bias should be referred to as the Career Survival Bias.
Dunning-Kruger Effect
Dunning and Kruger gained fame by pointing out that some people are just too stupid to know that they are stupid. The jibe was aimed at the unskilled but I see no reason to presuppose that experts are immune to unknown unknowns. It is important to keep this in mind because climate science may be a hotbed of unknown unknowns just waiting to prick expert hubris.
Egocentric Bias
This occurs when someone claims more responsibility for the results of a joint action than an outside observer would credit them with. It’s a bit like when someone claims to be a Nobel Laureate simply because they have done some work for the Nobel Prize winning IPCC. Not that such a thing could ever happen (cough! Dr Mann).
Experimenter’s Bias
To quote directly from Wikipedia, Experimenter’s Bias is, “The tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations”.
Scientists would be the last ones to admit that they might succumb to such a cognitive bias and there are some (cough! Dr Mann) who would not hesitate to sue if you suggested that they did. However, the current Reproducibility Crisis in science would seem to suggest that Experimenter’s Bias is widespread. I would not expect climate science to be exempt. After all, “one can’t make a cherry pie without picking cherries”.
Extrinsic Incentives Bias
We tend to believe other people do things for extrinsic benefits, such as monetary reward, whilst we do things for intrinsic benefits, such as the desire to maintain our integrity. This bias can be seen in the assumption that sceptical scientists are in it for the Big Oil payoff but the consensus scientists are in it for the love of knowledge.
False Consensus Effect
There is a tendency in everyone to overestimate the extent to which the population as a whole agrees with them. Predictably, psychologists have convinced themselves that this cognitive bias is stronger in climate sceptics than it is in the remainder (i.e. right-thinking) sector of society. This view seems to be based upon a much-publicised Australian study, which found that, when asked to estimate the percentage who shared their view on climate change, individuals of all persuasions overestimated the value. However, the overestimation was comparatively greater within the group holding the most sceptical views.
But then, the most sceptical group also happened to be the smallest and, surely, the smaller the group, the greater the scope for overestimation. This is just a statistical effect, not proof of psychological inferiority. Sigh!
Focusing Effect
This is the predisposition to place too much importance on one aspect of an event. For example, the recent warming has coincided (to a disputed extent) with increased anthropogenic emissions of CO2. As a result, the greenhouse effect of CO2 has become the dominant area of interest to climate scientists seeking to explain recent climatic trends. Whether or not this is an undue focus is something the lay public has to take on trust, based upon the climate science consensus. However, given the alarming extent to which sociological effects have been instrumental in forming this consensus, it is only natural that such trust may be undermined. As a result, the lay jury is still deliberating upon whether the CO2 fixation is fully justified or simply a pathological Focusing Effect.
Framing Effect
Different conclusions can be drawn from the same information depending upon how the information is presented. The obvious example of this is the question of agreement between observations and climate model output. Simply changing the baseline upon which temperature anomalies are predicated can make a significant difference to the analysis. Framing effects are insidious and should never be underestimated.
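A toy calculation makes the point; the temperatures below are made up for illustration, not anybody’s observations:

```python
# Hypothetical global-mean temperatures (°C); the numbers are made up.
temps = [14.1, 14.3, 14.2, 14.6, 14.8]

baseline_early = sum(temps[:3]) / 3        # base period: first three values
baseline_full = sum(temps) / len(temps)    # base period: the whole series

anoms_early = [t - baseline_early for t in temps]
anoms_full = [t - baseline_full for t in temps]

# Identical data; every anomaly simply shifts by the baseline difference,
# yet a chart of anoms_early reads uniformly "warmer" than one of anoms_full.
shift = baseline_full - baseline_early
print(round(shift, 2))  # 0.2
```

Nothing about the underlying series changes; only the frame does, and with it the impression the chart leaves.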
On a related subject, IPCC’s AR5, chapter 2, places great store in the potential that framing effects may have in persuading the undecided masses to support climate change policies. I believe Goebbels was also quite adept at exploiting framing effects.
Halo Effect
This is often presented as a tendency to find physically attractive people more credible. However, more generally it is the assumption that because someone is accomplished in one field they can be trusted to comment upon loosely related fields. The BBC use this all the time. For example, amongst their CAGW hypothesis advocates they have naturalist Sir David Attenborough (national treasure – if you can’t trust him, who can you trust?) and physicist Brian Cox (isn’t he dishy, isn’t he clever).
We have Lord Lawson. Oh dear.
Hard-Easy Effect
People tend to think easy tasks are harder than they are, and think hard tasks are easier than they are. I happen to think that predicting the Earth’s climate for centuries into the future is a hard task, and so I am unsurprised to find that its difficulty has been underestimated (as in, ‘the science is settled, isn’t it?’).
Hindsight Bias
Anyone who engages in post hoc tuning of climate models, simply to achieve correspondence with previous observations, is being wise after the event and so is guilty of Hindsight Bias. I don’t know if this practice is common amongst modellers, but I do know that the IPCC has deemed it necessary to warn against it.
This bias is closely related to the Texas sharpshooter fallacy, in which a small subset of data is focused upon because it happens to support a retrospectively formulated hypothesis.
Hostile Attribution Bias
Stand up Naomi Oreskes.
Hyperbolic Discounting
People prefer options that provide immediate payoff to those that provide deferred payoff, even when the later payoff is likely to be greater. Psychologists delight in pointing out that this short-sightedness lies at the heart of the average CAGW sceptic’s thinking. We are too keen to reap the short-term benefits of fossil fuels and too disingenuous to admit it, even to ourselves. We must think of the children but, unfortunately, most of us are just embittered, old, selfish, alt right conservatives who have no stake in the future and so couldn’t give a damn. I have only one thing to say to that:
Who are you calling old?
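Joking aside, the bias itself is well defined. Here is a minimal sketch, with an exaggerated and entirely hypothetical discount parameter, of the preference reversal that sets hyperbolic discounting apart from consistent exponential discounting:

```python
def hyperbolic_value(amount, delay_years, k=2.0):
    """Perceived present value under hyperbolic discounting (k is hypothetical)."""
    return amount / (1 + k * delay_years)

# Offered today, $50 now beats $100 in a year...
assert hyperbolic_value(50, 0) > hyperbolic_value(100, 1)

# ...but push both options five years out and the preference reverses,
# something a consistent exponential discounter would never do.
assert hyperbolic_value(100, 6) > hyperbolic_value(50, 5)
```

It is the reversal, not the discounting itself, that psychologists treat as the irrational part.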
Identifiable Victim Effect
It is natural to be more concerned regarding risk to a single, identifiable person than to a risk affecting an anonymous mass. This effect is particularly strong when the single, identifiable person is yourself. CAGW sceptics are supposed to be particularly guilty of this bias, since they are insufficiently concerned for future generations who may have to suffer the effects of their selfish indulgences. You may share this supposition, but my experience is that CAGW sceptics are more than able to express concern for the anonymous masses of the future, who will have to deal with the economic and environmental legacy left behind if we insist on implementing the crackpot renewable energy policies currently being proposed.
IKEA Effect
This is a tendency to place a disproportionately high value on something constructed by oneself, regardless of the quality of the end result. I don’t know how prevalent this effect is within climatology but I do note that Dr Mann is still very proud of his Hockey Stick.
Illusion of Asymmetric Insight
This is the widespread illusion that we know more about others and their motives than they know about us. Psychologists are themselves guilty of this every time they piously lecture upon the psychological failings of CAGW sceptics, whilst failing to appreciate that the sceptics know exactly what such psychologists are up to and exactly where they are going wrong.
Illusion of Control
With respect to the Earth’s climate, there are some that might say that the world’s governments are guilty of such an illusion. I could hardly comment.
Illusion of Validity
Stand up any paleo-climatologists who use proxies in their studies. Oh, that would be all of you then.
Illusory Correlation
Some CAGW sceptics maintain that this cognitive bias can be seen in the insistence that CO2 concentrations drive temperature fluctuations. The causation is disputed, and even the level of correlation is questioned. I won’t get into this because it is too big a subject. Besides which, it would take me off topic.
Illusory Superiority
Illusory Superiority is yet another cognitive bias that is supposed to provide the perfect explanation for the climate scepticism phenomenon. Surely we must be deluded to think we know better than all of those clever climate scientists.
Except, that isn’t what Illusory Superiority is about. Instead, Illusory Superiority is the tendency to believe that we are above average within a group performing the same task or in the same situation. So in this instance, we need to be focused upon comparisons between those laypersons who are evaluating the significance of the politicisation of a science that is making predictions far into the future. Do sceptics think they are above average within this group? You bet! We are guilty as sin regarding Illusory Superiority. But do the non-sceptics also think they are above average at making such an evaluation? Of course they do. There is absolutely no reason to presume that Illusory Superiority is exclusive to sceptics.
Illusory Truth Effect
Simple ideas that are repeated many times are easier to accept than complicated, often counter-intuitive and possibly arcane ideas. So let me help you out here:
You’re all gonna fry!
Fry, I tell you! Fry!
Information Bias
There is an assumption that one cannot get too much information before drawing a conclusion. However, this assumption is incorrect; additional information can often be irrelevant and distracting. So we should accept that those amongst us who want more information before acting on climate change may be guilty of Information Bias. It all depends upon the information that is being solicited and the cost in terms of outlay and additional risk caused by delay. This is, of course, a subject of some dispute amongst climate scientists and policy makers. The IPCC is screaming that we already have enough information, get on with it! But there again, they were screaming that from the very outset.
In-group Bias
This is the tendency to look more favourably upon members of one’s own group. It is, of course, the basis for the pal review network. But I’m forgetting, scientists are immune to cognitive bias. Isn’t that what we learned when the Climategate scientists were exonerated?
Irrational Escalation
Also known as the sunk cost fallacy, this is the reluctance to pull the plug because so much has already been invested. This effect sucked the USA into the Vietnam War and kept its troops there long after defeat had become unavoidable. One fears that the war on climate change has already gone the same way; there are now too many careers and reputations at stake. I sincerely hope I am wrong.
Law of the Instrument
When the only tool you have is a hammer, everything starts to look like a nail. Since the only tool we have is the climate model, we are going to have to pretend that the Earth’s climate is easy to simulate.
The French also refer to déformation professionnelle. This is the temptation to view a problem narrowly from the perspective of one’s own profession, rather than taking a wider perspective. As a consequence, many experts believe their grasp on the truth is greater than it actually is.
Loss Aversion
The loss of a given level of utility is perceived as being greater than the equivalent gain. This may result in Irrational Escalation. Perversely, however, the IPCC blames this cognitive bias for the failure of many to take climate change action (see AR5, chapter 2). For the life of me, I can’t see how they have arrived at that conclusion.
Naïve Realism
According to Wikipedia, Naïve Realism is, “The belief that we see reality as it really is – objectively and without bias; that the facts are plain for all to see; that rational people will agree with us; and that those who don’t are either uninformed, lazy, irrational, or biased”.
Neglect of Probability
This is defined as the complete indifference to probability when making decisions under uncertainty. Typically, very small probabilities are not taken into account when assessing risk levels. This is usually because the relevant impacts are so high that even infinitesimal probabilities would be considered unacceptable. Basically, who cares what the probabilities are? I just don’t like the sound of it. Unfortunately, anyone can invent a horrible scenario and make it sound plausible. As a consequence, objections may be fomented based upon the flimsiest of pretexts. Take, for example, the fear that turning on the Large Hadron Collider would result in flocks of mini black holes marauding the Swiss countryside. To my knowledge, that never happened, but many feared it would. I blame the precautionary principle.
Normalcy Bias
Where I come from there is no such word as ‘normalcy’, so, if you don’t mind, from here on I will refer to ‘Normality Bias’.
Normality Bias is the refusal to plan for, or react to, a disaster which has never happened before. It is easy, therefore, to see why CAGW hypothesis supporters declare it to be the perfect explanation for climate scepticism. We stand accused of sleepwalking into disaster, simply because we lack the imagination and understanding required to appreciate the seriousness of the situation.
Expressed in this way, Normality Bias appears to be an irrational neglect. However, one has to keep in mind that basing one’s expectations upon the norm is actually quite rational – it’s called inductive reasoning. To expect the unexpected is irrational. The problem with inductive reasoning, however, is the tendency for the unexpected to happen. Unfortunately, knowing this fact is of limited value. Firstly, planning for each and every conceivable disaster is economically prohibitive; we would soon be bankrupted by our imagination. Secondly, when actually facing disaster in real time, you will still find yourself gripped by a psychological paralysis born of cognitive dissonance.
So are CAGW sceptics just ostriches with their heads in the sand, or even rabbits frozen in the headlights? Of course not, they are just pragmatists whose inductive reasoning, based upon their evaluation of the evidence, happens to lead them to a different position (i.e. that it is too early to draw a conclusion). If anything, they are less susceptible to Normality Bias because they expect the climate to change; they just don’t agree upon the degree of anthropogenic attribution and are therefore more concerned regarding the costs and associated risks of the disaster management currently being proposed by the CAGW campaigners.
Not Invented Here
Obviously, this is the aversion to using products, research, standards, or knowledge developed outside a group. I find this particularly the case with respect to the IPCC. For example, it is difficult to explain the IPCC’s preference for their home-spun standard for the characterisation of uncertainty, rather than the adoption of the standard previously developed by the International Bureau of Weights and Measures (BIPM). The cynic in me suspects that it is in the IPCC’s best interests to establish a professional disconnect with scientific bodies that might demand higher standards.
Observer-expectancy Effect
If an observer expects a certain effect, this is likely to influence the chances of the effect being perceived. In the world of experimentation, this takes the form of the Experimenter-expectancy Effect, in which there is a temptation that experimenters may subconsciously manipulate the experiment or the data in a manner that fulfils the expectation. As with Experimenter’s Bias, you will find that most scientists will claim that such a fallibility is very rare, but the evidence of the Reproducibility Crisis suggests otherwise.
Omission Bias
For some reason, harmful acts are considered more serious than harmful omissions. That’s why the precautionary principle is focused only upon the risks associated with actions (the onus is placed upon those taking actions to prove that the risks are acceptable). Omission Bias is another reason why one should be wary of the precautionary principle.
Optimism Bias
This is also known as wishful thinking, positive outcome bias or the valence effect. This bias is the tendency to overestimate favourable outcomes. Sceptics stand accused of being over-optimistic regarding global warming but they can counter by accusing their opponents of Pessimism Bias (see below). Of course, the matter needs to be settled by science rather than through the battle of the biases.
Ostrich Effect
This is another one thrown solely at CAGW sceptics, but, like the bird, this tactic does not fly. Instead, why don’t we talk about the Ostrich Effect as it applies to the refusal to see renewable energy impracticalities? The Ostrich Effect is just a cheap jibe (no pun intended).
Overconfidence Effect
Most people, when asked a question, will over-estimate the probability that they have given a correct answer. This is probably because there are more ways of getting an answer wrong than there are of getting it right, and Bayesian thinking doesn’t come naturally to the average Joe. The most entertaining commentary on this website can be found when the Overconfidence Effect encounters the Overconfidence Effect.
Pareidolia
Pareidolia is the human mind’s capacity to discern familiar patterns within a random stimulus. For example, one may see faces in clouds, canals on Mars, or trends in data when there are none. Of course, one shouldn’t just rely upon a visual inspection to discern a trend; there are statistical techniques available for the purpose. Even so, one should never underestimate the human capacity for self-deception when seeking a coveted trend. Keep in mind that, under torture, random data will tell you anything you want.
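The point about coveted trends can be made concrete. In the sketch below (arbitrary seed and sample sizes, purely illustrative), least-squares slopes fitted to simulated random walks, which contain no underlying trend whatsoever, still come out visibly nonzero most of the time:

```python
import random

random.seed(0)  # arbitrary seed; purely illustrative

def random_walk(n):
    """Cumulative sum of unit-variance Gaussian steps: no built-in trend."""
    x, path = 0.0, []
    for _ in range(n):
        x += random.gauss(0.0, 1.0)
        path.append(x)
    return path

def ols_slope(ys):
    """Ordinary least-squares slope of ys regressed against 0..n-1."""
    n = len(ys)
    mx, my = (n - 1) / 2, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in enumerate(ys))
    den = sum((x - mx) ** 2 for x in range(n))
    return num / den

# Fit a trend line to 200 trendless series of 100 steps each.
slopes = [ols_slope(random_walk(100)) for _ in range(200)]
apparent = sum(1 for s in slopes if abs(s) > 0.01)
# Most of the fitted slopes are visibly nonzero despite zero expected drift.
```

The lesson is not that trend-fitting is worthless, but that a fitted slope on its own proves nothing without an honest significance test.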
Pessimism Bias

This is simply the converse of Optimism Bias. There’s not much more to say really.
Pseudo-certainty Effect
In Kahneman and Tversky’s prospect theory, the Pseudo-certainty Effect is the tendency for people to understate the uncertainty associated with a decision. This happens when the decision involves a multi-step stratagem and the uncertainties entailed in the early steps are not propagated through to the final decision. Pseudo-certainty features in the evaluation of climate model ensembles when fidelity with observation is cited without regard to the plausibility of the parametric choices that led to such fidelity.
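The underlying arithmetic is simple to sketch. Assuming, for illustration only, that the steps of a stratagem are independent, confidence must be multiplied through every stage rather than quoted from the final step alone:

```python
# Hypothetical confidences for a three-step plan; illustration only.
step_confidences = [0.9, 0.9, 0.8]

overall = 1.0
for p in step_confidences:
    overall *= p  # uncertainty propagates multiplicatively (independence assumed)

# Quoting only the final step's 80% overstates the plan's certainty:
print(round(overall, 3))  # 0.648
```

Real parametric choices are rarely independent, which only makes the honest accounting harder, not optional.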
It is ironic that Daniel Kahneman should present the CAGW sceptics with the gift of Pseudo-certainty and then be amongst the most vociferous in denouncing their scepticism.
Reactance
CAGW sceptics are deemed to be guilty of Reactance, as the only reason they are averse to taking action is that they see it as constraining their freedom of choice. The reverse psychology trick is to try presenting climate change action as the empowering option. This tactic is founded upon the assumption that CAGW sceptics are gullible enough to be fooled by it, which is itself an example of Optimism Bias.
Reactive Devaluation
We all engage in the habit of denouncing ideas simply because they originate from an adversary. The politicisation of the climate science debate has caused it to become mired in Reactive Devaluation.
Status Quo Bias
Status Quo Bias is the cognitive bias most commonly laid at the feet of CAGW sceptics (usually predicated upon Loss Aversion or System Justification). Apparently, by refusing to change our habits, we are happy to continue ‘polluting’ the atmosphere with evil CO2 even though we know for a fact (but won’t admit it) that this will lead to catastrophe. The irony, however, is that there is more than a little of the Status Quo Bias in those who want to fix global temperatures at their current values, since they assume that we just happen to have been born in a period for which global temperatures are at their optimum. The fact is, we prefer the climate as it is because we have built our societies around the assumption that it will not change.
Stereotyping
This is one of the most common tactics for those engaged in the climate science debate; stereotype your opponents and then take issue with the stereotyped.
Surrogation
Surrogation is a psychological error in which the measurement of a construct comes to replace the construct itself; in other words, the map is misconstrued as the territory. Surrogation is commonplace within the business world, where the achievement of a metric target is automatically taken as achievement of the business objective that the metric was presumed to track. This may or may not be the case but no one bothers to check because the metric has been satisfied. In paleo-climatology, proxies are surrogates for direct measurement of the variable of interest, e.g. temperature. Ultimately, calibration is the only way of ensuring the validity of the surrogate, but calibration is itself fraught with problems. As a result, Surrogation remains a pernicious cognitive bias within climatology.
System Justification
As I said earlier, we prefer the climate as it is because we have built our societies around the assumption that it will not change.
System Justification lies at the heart of climate change concern. It even extends to the natural world, since we seem to believe that the existing eco-system has a greater validity than those that have preceded, or may follow, the current ecological equilibrium.
Zero Sum Bias
Regrettably, the climate change debate is dogged by zero sum bias, i.e. the presumption that one can only win if someone else loses. Zero sum thinking is particularly prevalent amongst the CAGW hypothesis proponents, who are too ready to denounce any colleague as a denier the moment they try to introduce a note of balance or restraint. There is no room for uncertainty or moderation in a zero sum game.
End of List
Thus ends my personal take on the role of cognitive biases within the climate science controversy. If your favourite bias is not featured, please do not take it to heart. It will not be because I thought the bias unimportant; I will simply have deemed it irrelevant. If you think I am mistaken, feel free to rectify the problem in the comments below.