CO2 Sensitivity is Multi-Modal – All bets are off

Guest Post by Ira Glickstein

A multi-modal probability distribution, such as the graphic below [from Schmittner 2011], cries out “MULTIPLE POPULATIONS”. Equilibrium Climate Sensitivity (expected temperature increase due to a doubling of CO2 levels, all else being equal) is distinctly different for Land and Ocean, with two peaks for Land (L1 and L2) and five peaks for Ocean (O1, O2, O3, O4, and O5).

When a probability distribution includes more than one population, the mean may, quite literally, have no MEANing! All bets are off.

Example of a Multi-Modal Distribution

According to the basic tenets of System Science (my PhD area), probability distributions that inadvertently mix multiple populations often lead to unreliable conclusions. Here is an easy-to-understand example of how a multi-modal distribution can lead to ridiculous results.

Say we graphed the heights of a group of infants and their mothers. We would get a peak at, say, 25″, representing the average height of the infants, and another at, say, 65″, representing the mothers. The mean of that multi-modal distribution, 45″, would represent neither the mothers nor the infants – not a single baby or mother would be 45″ tall!

If some “alien scientist” re-measured the heights of that cohort of children and their mothers over a decade, the mean would increase rapidly, perhaps from 45″ to 60″. If that “alien scientist” did not understand that the multi-modal distribution represents two different populations, he or she might extrapolate and predict that, a decade hence, the mean would be 75″! Of course, actual measurements over a second decade, as the children reached their adult heights, would yield a mean that stabilized closer to 66″ (assuming about half the children were male). The “alien scientist’s” extrapolation would be as wrong as some IPCC predictions seem to be.
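For readers who want to try this at home, here is a minimal Python sketch of the mothers-and-infants example. All of the heights and the growth rate are illustrative assumptions chosen to match the numbers above, not measured data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed populations: 100 infants averaging 25" tall and their 100 mothers averaging 65"
infants = rng.normal(25, 2, 100)
mothers = rng.normal(65, 3, 100)

mean_now = np.concatenate([infants, mothers]).mean()
print(f"Mean of mixed cohort today: {mean_now:.0f} in")        # ~45" - describes nobody

# A decade later: the children have grown ~30" (assumed); the mothers are unchanged
children = infants + 30
mean_10yr = np.concatenate([children, mothers]).mean()
print(f"Mean of mixed cohort in 10 years: {mean_10yr:.0f} in") # ~60"

# The "alien scientist's" naive linear extrapolation of the mixed mean
rate = (mean_10yr - mean_now) / 10                             # ~1.5 in/year
print(f"Extrapolated mean in 20 years: {mean_10yr + 10 * rate:.0f} in")  # ~75" - absurd
# In reality growth stops at adult height, so the true mean stabilizes near 66"
```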

Implications of Multi-Modal CO2 Sensitivity

Schmittner says:

The [graph shown above], considering both land and ocean reconstructions, is multi-modal and displays a broad maximum with a double peak between 2 and 2.6 K [1 K = 1ºC], smaller local maxima around 2.8 K and 1.3 K and vanishing probabilities below 1 K and above 3.2 K. The distribution has its mean and median at 2.2 K and 2.3 K, respectively and its 66% and 90% cumulative probability intervals are 1.7–2.6 K, and 1.4–2.8 K, respectively. [my emphasis]

The caption for the graphic says:

Marginal posterior probability distributions for ECS2xC. Upper: estimated from land and ocean, land only, and ocean only temperature reconstructions using the standard assumptions (1 × dust, 0 × wind stress, 1 × sea level correction of ΔSSTSL = 0.32 K…). Lower: estimated under alternate assumptions about dust forcing, wind stress, and ΔSSTSL using land and ocean data.

So part of the multi-modality is due to the alternate assumptions about dust forcing, wind stress, and the sea-surface-temperature correction applied to the combined Ocean and Land data, and part is due to genuine differences between Ocean and Land. But that is only part of the story. Please read on to see how the Geographic Zones also seem to have different sensitivities.

Geographic Zones Have Different Sensitivities

Another Schmittner 2011 graphic, shown below, indicates how different the Arctic, North Temperate, Tropics, South Temperate, and Antarctic zones are. Indeed, there is a startling difference between the Arctic and Antarctic.

Zonally averaged surface temperature change between the LGM and modern. The black thick line denotes the climate reconstructions and grey shading the ±1, 2, and 3 K intervals around the observations. Modeled temperatures, averaged using only cells with reconstructions … are shown as colored lines labeled with the corresponding ECS2xC values.

The thick black line represents the “climate reconstruction” (change in temperature in ºC) between current conditions and those of about 20,000 years ago, during the Last Glacial Maximum. The LGM was the coldest period of the past 100,000 years. Note that the Tropics were about 2ºC cooler than they are now, the South Temperate zone about 3ºC cooler, the North Temperate zone about 4ºC cooler, and the Antarctic about 8ºC cooler. However, according to the climate reconstruction, the Arctic was about 1ºC WARMER than it is today!

The estimated CO2 level during the LGM is 185 ppm, quite a bit below the estimated Pre-Industrial level of about 280 ppm, and about half the current measured level of about 390 ppm. Thus, IF CO2 DOUBLING CAUSED ALL of the temperature increase from the LGM to the present, the sensitivity for the geographic zones would range from +8ºC (Antarctic) to +4ºC (North Temperate) to +3ºC (South Temperate) to +2ºC (Tropics) to -1ºC (Arctic).
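As a quick back-of-envelope check, note that going from 185 ppm to 390 ppm is actually slightly more than one doubling, so the implied per-doubling sensitivities come out a bit below the raw zone warmings. A short Python sketch, using the zone warmings read off the reconstruction above:

```python
import math

# Approximate zone warmings since the LGM, read off the Schmittner reconstruction (deg C)
zone_warming = {"Antarctic": 8, "North Temperate": 4, "South Temperate": 3,
                "Tropics": 2, "Arctic": -1}

doublings = math.log2(390 / 185)   # 185 ppm (LGM) -> 390 ppm (today) ~= 1.08 doublings
for zone, dT in zone_warming.items():
    print(f"{zone:15s} implied sensitivity: {dT / doublings:+.1f} C per doubling")
```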

Of course, based on the Ice Core temperature records for several ice ages over the past 400,000 years, the warming in the 20,000 years after a Glacial Maximum tends to be significant (several degrees). Thus, while increases in CO2, all else being equal, do cause some increase in mean temperatures, the Ice Core record, in which temperature changes lead CO2 changes by 800 to 1,200 years, makes it clear that something else initiates the temperature change, and the temperature change then causes CO2 to change. It would therefore be wrong, IMHO, to assign more than some small fraction of the warming since the LGM to CO2 increases.

The colored lines in the graphic above correspond to modeled temperatures based on different assumed CO2 sensitivities, ranging from 0.3ºC to 8.4ºC. The darker blue line, corresponding to a sensitivity of 2.3ºC, is the best match for the thick black climate-reconstruction line.

IPCC CO2 Sensitivities are Mono-Modal and have “Fat Tails”

So, how do the IPCC AR4 Figure 9.20 graphs of Equilibrium Climate Sensitivity compare to the Schmittner 2011 results? Not too well, as the graphic below indicates!

...Comparison between different estimates of the PDF (or relative likelihood) for ECS (°C). All PDFs/likelihoods have been scaled to integrate to unity between 0°C and 10°C ECS. ...

First of all, notice that NONE of the individual IPCC graphs is multi-modal! Yet, taken as a group, they show several distinct peaks, indicating that each of the researchers characterized only one of a number of multi-modal peaks and was inadvertently (or purposely?) blind to the other populations. Thus the IPCC curves, taken as a group, seem to support Schmittner’s finding of multi-modality.

For example, compare the green curve (Andronova 01) to the red curve (Forest 06). They hardly overlap, indicating that they have sampled different populations.
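That point is easy to demonstrate numerically: sum two mono-modal PDFs that barely overlap and the result is bimodal. The parameters below are illustrative stand-ins, not fits to the actual Andronova or Forest curves.

```python
import numpy as np
from scipy.stats import norm
from scipy.signal import find_peaks

x = np.linspace(0, 10, 1000)
pdf_a = norm.pdf(x, loc=2.0, scale=0.5)   # stand-in for a low-sensitivity study
pdf_b = norm.pdf(x, loc=5.5, scale=0.8)   # stand-in for a high-sensitivity study

mixture = 0.5 * (pdf_a + pdf_b)           # equal-weight mixture still integrates to 1
peaks, _ = find_peaks(mixture)
print("Mixture has modes at ECS =", np.round(x[peaks], 1), "C")   # two peaks, not one
```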

There is another, less obvious problem with the IPCC curves. Notice that each has a relatively “normal” tail on the left and what is called a “Fat Tail” on the right. What does that mean? Well, a “normal curve” has a single peak, representing both the mode and the mean, and two “normal” tails that approach zero at about +/- 3σ (the Greek letter sigma, representing the standard deviation). A mono-modal curve may skew a bit to the left or right, which puts the mode (peak) to one side of the mean.
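To put a number on what “fat” means, compare how much probability a normal curve leaves beyond three standard deviations above the mean with a classic fat-tailed distribution. The lognormal shape below is an illustrative assumption, not one of the published IPCC curves.

```python
from scipy.stats import lognorm, norm

# Normal curve: the right tail dies off fast (~0.13% of probability beyond mean + 3 sigma)
print(f"Normal:    P(X > mean + 3*sigma) = {norm.sf(3):.5f}")

# Lognormal (assumed shape): skewed right, with a fat right-hand tail
fat = lognorm(s=0.6, scale=3.0)           # median 3.0 C, shape chosen for illustration
mu, sigma = fat.mean(), fat.std()
print(f"Lognormal: P(X > mean + 3*sigma) = {fat.sf(mu + 3 * sigma):.5f}")  # ~10x larger
```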

The problem with the IPCC curves is that, in addition to the skew, the right-hand tail extends quite far to the right, out to 10ºC and beyond, before approaching zero. According to Schmittner 2011:

High sensitivity models (ECS2xC > 6.3 K) show a runaway effect resulting in a completely ice-covered planet. Once snow and ice cover reach a critical latitude, the positive ice-albedo feedback is larger than the negative feedback due to reduced longwave radiation (Planck feedback), triggering an irreversible transition … During the LGM Earth was covered by more ice and snow than it is today, but continental ice sheets did not extend equatorward of ~40°N/S, and the tropics and subtropics were ice free except at high altitudes. Our model thus suggests that large climate sensitivities (ECS2xC > 6 K) cannot be reconciled with paleoclimatic and geologic evidence, and hence should be assigned near-zero probability….[my emphasis]

Based on the above argument, I have annotated the IPCC figure to “X-out” the Fat Tails beyond 6°C. I did that because any sensitivity greater than 6°C would retrodict a “total snowball Earth” at the LGM, which contradicts clear evidence that the ice sheets did not extend equatorward beyond the middle of the USA or the corresponding latitudes in Europe, Asia, South America, or Africa. Indeed, if Schmittner is correct, the tails of the IPCC graphs that extend beyond 5°C (or perhaps even 4°C) should approach zero probability.
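Arithmetically, “X-ing out” the tail amounts to zeroing the PDF above 6°C and renormalizing what remains, which pulls the mean and the upper percentiles down. A sketch, using an assumed fat-tailed stand-in rather than any of the published IPCC PDFs:

```python
import numpy as np
from scipy.stats import lognorm

x = np.linspace(0.01, 15, 3000)
dx = x[1] - x[0]
pdf = lognorm(s=0.5, scale=3.0).pdf(x)    # fat-tailed stand-in for an IPCC-style ECS curve

def summarize(density):
    density = density / (density.sum() * dx)   # renormalize to integrate to 1
    mean = (x * density).sum() * dx
    p95 = x[np.searchsorted(np.cumsum(density) * dx, 0.95)]
    return mean, p95

truncated = np.where(x <= 6.0, pdf, 0.0)  # assign near-zero probability beyond 6 C
print("Before truncation: mean %.2f C, 95th percentile %.2f C" % summarize(pdf))
print("After truncation:  mean %.2f C, 95th percentile %.2f C" % summarize(truncated))
```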

Conclusions

Schmittner 2011 contradicts the IPCC climate sensitivity estimates and thus brings into question all IPCC temperature predictions due to human-caused CO2 increases.

It is clear from the several, widely-spaced peaks in the IPCC AR4 Figure 9.20 curves that Equilibrium Climate Sensitivity is indeed multi-modal. Yet, ALL the individual curves are mono-modal. Thus, the IPCC figure is, on its face, self-contradictory.

If Schmittner 2011 is correct that sensitivity beyond about 6°C is impossible based on the fact that Tropical and Sub-Tropical zones were not ice-covered during the LGM, the Fat Tails of all the IPCC Equilibrium Climate Sensitivity curves are wrong. That calls into question each and every one of those curves.

The multi-modal nature of CO2 sensitivity indicates that the effects of CO2 levels are quite different between geographic zones as well as between Ocean and Land. Thus, the very concept of a whole-Earth Equilibrium Climate Sensitivity based on a doubling of CO2 levels may be misplaced.

Finally, if CO2 is as strong a driver of surface temperatures as the IPCC would have us believe, how in the world can anyone explain the apparent fact that, given a doubling of CO2 levels, the modern Arctic is about 1°C COLDER than the LGM Arctic?

BOTTOM LINE: The Climate System is multi-faceted and extraordinarily complex. Even the most competent Climate Scientists, with the best and purest of intentions, are rather like the blind men trying to characterize and understand the elephant. (One happens upon the elephant’s leg and proclaims “the elephant is like a tree”. Another happens to grab the tail and says with equal certainty “the elephant is like a snake”. A third bumps into the side of the elephant and confidently shouts “No, the elephant is like a wall!”) Each in his or her own way is correct, but none can really understand all the aspects, nor characterize or predict the behavior of, the actual Climate System. And, sadly, not all Climate Scientists are competent, and some have impure intentions.

153 Comments
December 22, 2011 3:09 pm

Ira Glickstein (Dec. 22, 2011 at 11:07 am):
It sounds as though I gave you a bunch of details on a topic that was not of direct interest to you. Sorry about that. I’ll try to give you some more digestible meat on which to chew.
It is notable that under maximum likelihood estimation, the value assigned to the probability of observing statistical events of a particular description is x/n, where n is the count of events in the sample and x is the count of events of the particular description. Now, if you were to read the report of Working Group I in AR4, I believe you would conclude with me that there is no reference to observed events, to a sample, or to the underlying population. The idea of a statistical event is simply missing from Working Group I’s argument regarding the equilibrium climate sensitivity (TECS). The conclusion that this idea is missing is corroborated by the fact that the equilibrium temperature (the “steady-state” temperature, in engineering jargon) is not an observable feature of the real world. As it is not an observable, observed statistical events for which the equilibrium temperature is the outcome do not exist.
From the facts stated above, I conclude that the interpretation which should be placed upon the word “probability,” in reference to the function that maps TECS to its probability density, is not the frequentist interpretation. An alternate interpretation is that the word “probability” has the Bayesian interpretation of one’s “subjective degree of belief.” Under this interpretation, the probability of an event is a proportion in a statistical ensemble. The frequentists’ statistical population is replaced by the Bayesians’ statistical ensemble.
If, as seems to be the case, the probability is the proportion in a statistical ensemble, the contents of this ensemble are the numerical values that could possibly be taken on by TECS. The idea is that in Earth’s climate system, TECS has a specific value but this value is uncertain.
Note that to determine the value of the probability that the value of TECS lies between specified bounds, one integrates the probability density function within these bounds. The probability which is computed in this way represents Working Group I’s subjective degree of belief in the proposition that the value of TECS lies within these bounds. Working Group I’s contention cannot be tested in the absence of a statistical population. It follows from this lack of testability that the contention lies outside science. The IPCC makes a pretense of conducting a scientific investigation, but this is a sham.
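For concreteness, the integration described above looks like this in Python; the density used here is an assumed stand-in, since the actual Working Group I PDFs are not reproduced in this thread.

```python
from scipy.integrate import quad
from scipy.stats import lognorm

tecs_pdf = lognorm(s=0.4, scale=2.9).pdf   # assumed stand-in density for TECS

# Degree of belief that TECS lies between 2.0 K and 4.5 K = integral of the PDF
belief, _ = quad(tecs_pdf, 2.0, 4.5)
print(f"P(2.0 K < TECS < 4.5 K) = {belief:.2f}")
```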
By the way, your impression that (x+1)/(n+2) is the Bayesian assignment to a probability is incorrect. That is the assignment under the application of the Bayesian method that is called “Laplace’s law of succession.” It is not the assignment that is made to the probability that TECS lies within specified bounds. It is easy to see that this is true, for as there are no observed statistical events, neither the count n of these events nor the count x of the subset of those events that are of a particular description exists as a concept.

Bill Illis
December 22, 2011 5:34 pm

Ira,
I realized that I can plot the paleoclimate observation data on the same kind of chart that the article is based on.
Since the data are quite strange at the end of the day, they need to be split into three parts – one dealing with all the numbers going back 545 Mya, one dealing with just the last 50 Mya (which has many more datapoints), and one dealing with the last 1 million years (which is completely contaminated by the Ice Albedo effect of the ice ages).
Time on the Y-axis, Implied CO2 sensitivity per doubling on the X-axis.
Last 545 Myr (in the deep paleoclimate, 1.5C per doubling is the most common value).
http://img23.imageshack.us/img23/1391/co2sensitivitylast545my.png
In the last 50 Myr there is significant randomness; almost any number fits the observations. The range here is huge, +/- 40C per doubling (which is where my Null comment comes from).
http://img163.imageshack.us/img163/8312/co2sensitivitylast50my.png
In the last 1 million years, one cannot determine the CO2 sensitivity without properly accounting for Albedo. In my mind, this is extremely important. All of the previous CO2 sensitivity studies are based on “some type of data,” but there is no way any of them can determine the CO2 sensitivity without simply constructing data where none exists.
http://img221.imageshack.us/img221/2056/co2sensitivitylast1my.png
If one were to take the last 150 years, the lags in the climate system (ocean heat accumulation, for example) make this impossible unless one builds in several assumptions. But we are on track for something like 1.0C per doubling (if the lag is 7 years, which seems to be what the recent ocean data is saying) or as much as 1.5C per doubling (if the lags are as long as 35 years or more, which is where the theory is now moving).
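A toy one-box energy-balance sketch of that last point, showing how the assumed lag changes the equilibrium sensitivity implied by a fixed amount of observed warming. Every number here (the 0.8 C of assumed observed warming, the linear forcing ramp, the two time constants) is an illustrative assumption, and the outputs are not Bill Illis’s figures.

```python
import math
from scipy.optimize import brentq

F2X = 3.7                              # W/m^2 per CO2 doubling (standard value)
F_NOW = F2X * math.log2(390 / 280)     # forcing to date, ~1.8 W/m^2
DT_OBS = 0.8                           # assumed observed warming over the ramp (C)

def warming(sensitivity, tau, years=150):
    """Euler-integrate a one-box model, tau * dT/dt = sensitivity * F(t)/F2X - T,
    with the forcing ramping linearly from 0 to F_NOW over `years`."""
    T, dt = 0.0, 0.1
    for step in range(int(years / dt)):
        F = F_NOW * (step * dt) / years
        T += dt * (sensitivity * F / F2X - T) / tau
    return T

for tau in (7.0, 35.0):                # assumed lag time constants, in years
    S = brentq(lambda s: warming(s, tau) - DT_OBS, 0.1, 10.0)
    print(f"lag tau = {tau:4.0f} yr -> implied sensitivity {S:.1f} C per doubling")
```

The qualitative result matches the comment: the longer the assumed lag, the more warming is still “in the pipeline,” and the higher the implied equilibrium sensitivity.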
