by Judith Curry
“Is the road to scientific hell paved with good intentions?” – political psychologist Philip Tetlock (1994)
Part I in this series addressed logical fallacies. Part II addressed biases associated with a consensus-building process. Part III addresses the role of social conflicts and biases.
Additional biases are triggered by conflict between an individual’s responsibility for the responsible conduct of research and the larger ethical issues associated with the well-being of the public and the environment. Further, social biases are triggered by careerist goals, loyalty to one’s colleagues, and institutional loyalties.
Scientists have the responsibility of adhering to the principles of ethical research and professional standards. But what happens when other responsibilities get in the way of these professional standards? These might include responsibilities to their conscience, their colleagues, institutions, the public and/or the environment. One can imagine many different conflicts across this range of responsibilities that can bias the scientific process. As an example, scientists who have been heavily involved with the IPCC may be concerned with preserving the importance of the IPCC and its consensus, which has become central to their professional success, funding and influence.
Arguably the most important of these are conflicts between the responsible conduct of research and larger ethical issues associated with the well-being of the public and the environment. Fuller and Mosher’s book Climategate: The CRUtape Letters argued that ‘noble cause corruption’ was a primary motivation behind the Climategate deceits. Noble cause corruption occurs when the ends of protecting the climate (noble) are taken to justify the means of sabotaging one’s scientific opponents (ignoble).
Psychologist Brian Nosek of the University of Virginia claims that the most common and problematic bias in science is ‘motivated reasoning’. People who have a ‘dog in the fight’ (reputational, financial, ideological, political) interpret observations to fit the idea that supports their particular ‘dog.’ The term ‘motivated reasoning’ is usually reserved for political motivations, but preserving reputation or funding is also a strong motivator among scientists.
The embedding of political values into science occurs when value statements or ideological claims are wrongly treated as objective truth. Scientists have a range of attitudes about the environment; the problem occurs because there is the presumption that one set of attitudes is right and those who disagree are in denial. This results in conversion of a widely shared political ideology about climate change into ‘reality.’
Confirmation bias can become even stronger when people confront questions that trigger moral emotions and concerns about group identity. People’s beliefs become more extreme when they’re surrounded by like-minded colleagues. They come to assume that their opinions are not only the norm but also the truth – creating what social psychologist Jonathan Haidt calls a ‘tribal-moral community’ with its own sacred values about what’s worth studying and what’s taboo. Such biases can lead to widely accepted claims that reflect the scientific community’s blind spots more than they reflect justified scientific conclusions.
Psychologists Cusimano and Lombrozo found that when people face a dilemma between believing an impartial assessment of the evidence and believing what would better fulfill a moral obligation, they often believe in line with the latter. Cusimano and Lombrozo also found that morally good beliefs demand less evidence than morally bad beliefs, and that people sometimes treat the moral value of a belief as an independent justification for holding it.
Motivated biases become particularly problematic once they are institutionalized, through advocacy statements made by professional societies, editorials written by journal editors, and public statements by the IPCC leadership.