CO2, Soot, Modeling and Climate Sensitivity

Warming Caused by Soot, Not CO2

From The Resilient Earth

Submitted by Doug L. Hoffman on Wed, 07/15/2009 – 13:19

A new paper in Science reports that a careful study of satellite data shows the assumed cooling effect of aerosols in the atmosphere to be significantly smaller than previously estimated. Unfortunately, the larger assumed cooling has been used in climate models for years. In such models, the global-mean warming is determined by the balance of the radiative forcings—warming by greenhouse gases balanced against cooling by aerosols. Because a greater cooling effect has been used in climate models, the result has been to credit CO2 with a larger warming effect than it really has.

This question is of great importance to climate modelers because they have to be able to simulate the effect of GHG warming in order to accurately predict future climate change. The amount of temperature increase set into a climate model for a doubling of atmospheric CO2 is called the model’s sensitivity. As Dr. David Evans explained in a recent paper: “Yes, every emitted molecule of carbon dioxide (CO2) causes some warming—but the crucial question is how much warming do the CO2 emissions cause? If atmospheric CO2 levels doubled, would the temperature rise by 0.1°, 1.0°, or by 10.0° C?”
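To put rough numbers on Evans’ question, here is a minimal Python sketch using the standard logarithmic forcing approximation (ΔF = 5.35 ln(C/C0) W/m², from Myhre et al., 1998); the 280 ppm preindustrial baseline and the ~387 ppm modern level are illustrative assumptions, not values from the paper discussed below:

```python
import math

# Minimal sketch: logarithmic approximation for CO2 radiative
# forcing (Myhre et al. 1998): dF = 5.35 * ln(C / C0) in W/m^2.
def co2_forcing(c_ppm, c0_ppm=280.0):
    return 5.35 * math.log(c_ppm / c0_ppm)

f_doubling = co2_forcing(2 * 280.0)  # ~3.7 W/m^2 per doubling of CO2

# Evans' three candidate sensitivities (degrees C per doubling),
# applied to the rise from an assumed 280 ppm preindustrial level
# to an assumed ~387 ppm modern level:
for sensitivity in (0.1, 1.0, 10.0):
    warming = sensitivity * co2_forcing(387.0) / f_doubling
    print(f"{sensitivity:4.1f} C per doubling -> {warming:.2f} C of warming")
```

Under the three sensitivities the same CO2 rise yields roughly 0.05°, 0.47°, or 4.7° C, which is why the sensitivity setting dominates everything else in a model’s output.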

Temperature sensitivity scenarios from IPCC AR4.

The absorption frequencies of CO2 are already saturated, meaning that the atmosphere already captures close to 100% of the radiation at those frequencies. Consequently, as the level of CO2 in the atmosphere increases, the rise in temperature for a given increase in CO2 becomes smaller. This severely limits the amount of warming that further increases in CO2 can engender. Because CO2 on its own cannot account for the observed temperature rise of the past century, climate modelers assume that linkages exist between CO2 and other climate influences, mainly water vapor (for a more detailed explanation of what determines the Global Warming Potential of a gas, see my comment “It’s not that simple”).
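This diminishing-returns behavior can be seen directly in the logarithmic forcing approximation used in the sketch above (again a sketch under that assumed formula, not a claim about any particular model):

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    # Forcing grows as ln(C/C0), so each added ppm of CO2
    # contributes less than the ppm before it.
    return 5.35 * math.log(c_ppm / c0_ppm)

# Forcing added by successive 50 ppm increments keeps shrinking:
levels = [280.0, 330.0, 380.0, 430.0, 480.0]
for low, high in zip(levels, levels[1:]):
    step = co2_forcing(high) - co2_forcing(low)
    print(f"{low:.0f} -> {high:.0f} ppm adds {step:.3f} W/m^2")
```

Each successive 50 ppm adds less forcing than the last (about 0.88, 0.75, 0.66, then 0.59 W/m² in this sketch), which is the saturation effect described above.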

To compensate for the missing “forcing,” models are tuned to include a certain amount of extra warming linked to carbon dioxide levels—extra warming that comes from unestablished feedback mechanisms whose existence is simply assumed. Aerosol cooling and climate sensitivity in the models must balance each other in order to match historical conditions. Since the climate warmed slightly last century, the amount of warming must have exceeded the amount of cooling. As Dr. Roy Spencer, meteorologist and former NASA scientist, puts it: “They program climate models so that they are sensitive enough to produce the warming in the last 50 years with increasing carbon dioxide concentrations. They then point to this as ‘proof’ that the CO2 caused the warming, but this is simply reasoning in a circle.”

A large aerosol cooling, therefore, implies a correspondingly large climate sensitivity. Conversely, reduced aerosol cooling implies lower GHG warming, which in turn implies lower model sensitivity. The upshot is that the sensitivity values used in models for the past quarter century have been set too high. Using elevated sensitivity settings has significant implications for model predictions of future global temperature increases. The low-end value of model sensitivity used by the IPCC is 2°C; using this value naturally results in the lowest predictions for future temperature increases. In the paper “Consistency Between Satellite-Derived and Modeled Estimates of the Direct Aerosol Effect,” published in Science on July 10, 2009, Gunnar Myhre states that previous values for aerosol cooling are too high—by as much as 40 percent—implying that the IPCC’s model sensitivity settings are too high as well. Here is the abstract of the paper:

In the Intergovernmental Panel on Climate Change Fourth Assessment Report, the direct aerosol effect is reported to have a radiative forcing estimate of –0.5 Watt per square meter (W m–2), offsetting the warming from CO2 by almost one-third. The uncertainty, however, ranges from –0.9 to –0.1 W m–2, which is largely due to differences between estimates from global aerosol models and observation-based estimates, with the latter tending to have stronger (more negative) radiative forcing. This study demonstrates consistency between a global aerosol model and adjustment to an observation-based method, producing a global and annual mean radiative forcing that is weaker than –0.5 W m–2, with a best estimate of –0.3 W m–2. The physical explanation for the earlier discrepancy is that the relative increase in anthropogenic black carbon (absorbing aerosols) is much larger than the overall increase in the anthropogenic abundance of aerosols.
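To see why a weaker aerosol cooling implies a lower sensitivity, consider a deliberately crude balance. The numbers below are assumptions for illustration (AR4’s ~1.66 W/m² CO2 forcing, ~0.7°C of twentieth-century warming), and the toy calculation ignores other greenhouse gases, the indirect aerosol effect, and ocean lag; it is meant only to show the direction of the effect:

```python
# Toy balance: observed warming = lambda * (GHG forcing + aerosol forcing).
# A weaker (less negative) aerosol term pushes lambda, and hence the
# implied warming per CO2 doubling, downward. All inputs are illustrative.
OBSERVED_WARMING_C = 0.7   # assumed 20th-century warming, deg C
F_CO2 = 1.66               # W/m^2, AR4 estimate for CO2 forcing
F_DOUBLING = 3.7           # W/m^2, forcing from a CO2 doubling

for f_aerosol in (-0.5, -0.3):   # old vs. revised direct aerosol effect
    lam = OBSERVED_WARMING_C / (F_CO2 + f_aerosol)  # C per (W/m^2)
    print(f"aerosol {f_aerosol:+.1f} W/m^2 -> "
          f"~{lam * F_DOUBLING:.1f} C per CO2 doubling")
```

With the old −0.5 W/m² the implied sensitivity in this toy balance is about 2.2°C per doubling; with Myhre’s −0.3 W/m² it drops to roughly 1.9°C, illustrating the direction of the adjustment described here.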

The complex influence of atmospheric aerosols on the climate system, and the influence of humans on aerosols, are among the key uncertainties in the understanding of recent climate change. Rated by the IPCC as one of the most significant yet poorly understood forcings, aerosols have been the subject of much recent research (see Airborne Bacteria Discredit Climate Modeling Dogma and African Dust Heats Up Atlantic Tropics). Some particles absorb sunlight, contributing to climate warming, while others reflect sunlight, leading to cooling. The main anthropogenic aerosols that cause cooling are sulfate, nitrate, and organic carbon, whereas black carbon absorbs solar radiation. The global mean effect of human-caused aerosols (in other words, pollution) is a cooling, but the relative contributions of the different types of aerosols determine the magnitude of this cooling. Readjusting that balance is what Myhre’s paper is all about.

Smoke from a forest fire.

Photo EUMETSAT.

Discrepancies between recent satellite observations and the values needed to make climate models work right have vexed modelers. “A reliable quantification of the aerosol radiative forcing is essential to understand climate change,” states Johannes Quaas of the Max Planck Institute for Meteorology in Hamburg, Germany. Writing in the same issue of Science, Dr. Quaas continued, “however, a large part of the discrepancy has remained unexplained.” With a systematic set of sensitivity studies, Myhre explains most of the remaining discrepancy. His paper shows that, with a consistent data set of anthropogenic aerosol distributions and properties, the data-based and model-based approaches converge.

Myhre argues that since preindustrial times, soot particle concentrations have increased much more than other aerosols. Unlike many other aerosols, which scatter sunlight, soot strongly absorbs solar radiation. At the top of the atmosphere, where the Earth’s energy balance is determined, scattering has a cooling effect, whereas absorption has a warming effect. If soot increases more than scattering aerosols, the overall aerosol cooling effect is smaller than it would be otherwise. According to Dr. Myhre’s work, the correct cooling value is some 40% less than that previously accepted by the IPCC.
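A back-of-the-envelope version of that mechanism (the component values below are invented for illustration, chosen only so that the pieces sum to the revised estimate):

```python
# Illustrative decomposition of the net direct aerosol forcing into
# a scattering (cooling) and an absorbing (warming) component.
# Both component values are assumptions for illustration only.
scattering = -0.64    # W/m^2, cooling from sulfate, nitrate, organics
black_carbon = +0.34  # W/m^2, warming from soot absorption

net = scattering + black_carbon
print(f"Net direct aerosol forcing: {net:+.2f} W/m^2")

# If black carbon grows faster than the scattering aerosols, the
# warming term enlarges and the net cooling weakens, e.g. moving
# the estimate from about -0.5 toward -0.3 W/m^2.
```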

Not that climate modelers are unaware of the problems with their creations. Numerous papers have been published detailing problems with correctly predicting ice cover, precipitation, and temperature. This is due to inadequate modeling of ENSO, aerosols, and the bane of climate modelers, cloud cover. Apologists for climate modeling will claim that the models are still correct, just not as accurate or as detailed as they might be. Can a model that is only partially correct be trusted? Quoting again from Roy Spencer’s recent blog post:

It is also important to understand that even if a climate model handled 95% of the processes in the climate system perfectly, this does not mean the model will be 95% accurate in its predictions. All it takes is one important process to be wrong for the models to be seriously in error.

Can such a seemingly simple mistake in a single model parameter really lead to invalid results? Consider the graph below, a representation of the predictions James Hansen made to the US Congress in 1988, plotted against how the climate actually behaved. It is pretty much what one would expect if the model’s sensitivity were set too high, yet we are still supposed to believe the model’s results. No wonder even the IPCC doesn’t call its model results predictions, preferring the more nebulous term “scenarios.”

Now that we know the models used by climate scientists were all tuned incorrectly, what does this imply for the warnings of impending ecological disaster? What impact does this discovery have on the predictions of melting icecaps, rising ocean levels, increased storm activity, and soaring global temperatures? Quite simply, they got it wrong, at least insofar as those predictions were based on model results. To quote again from David Evans’ paper:

None of the climate models in 2001 predicted that temperatures would not rise from 2001 to 2009—they were all wrong. All of the models wrongly predict a huge dominating tropical hotspot in the atmospheric warming pattern—no such hotspot has been observed, and if it was there we would have easily detected it.

Once again we see the shaky ground that climate models are built on. Once again a new paper in a peer-reviewed journal has brought to light significant flaws in the way models are configured—forced to match known historical results even when erroneous values are used for fundamental parameters. I have said many times that, with enough tweaking, a model can be made to fit any set of reference data—but such bogus validation does not mean the model will accurately predict the future. When will climate science realize that its reputation has been left in tatters by these false prophets made of computer code?

Be safe, enjoy the interglacial and stay skeptical.

==================================

ADDENDUM BY ANTHONY

I’d like to add this graph showing CO2’s temperature response to supplement the one Doug Hoffman cites from IPCC AR4. Here we see that we are indeed pretty close to saturation of the response.

CO2 temperature response curve, showing saturation of the response.

The “blue fuzz” represents measured global CO2 increases in our modern times.

==================================

Comments

Jim
July 18, 2009 9:31 am

Phil. (18:49:51) : These spectra appear to be taken at low pressure. It would be interesting to see a series at different pressures.

Pragmatic
July 18, 2009 10:20 am

Mark Miller (01:54:58) :
“You can’t do science on a computer or with a book, because [with] the computer–like a book, like a movie–you can make up anything. We can have an inverse cube law of gravity on here, and the computer doesn’t care. No language system that we have knows what reality is like. That’s why we have to go out and negotiate with reality by doing experiments.
…the correct use of models was to have the students construct them after making observations of actual phenomena, and that their models should reflect what they observed.”

This is an excellent comment, Mark. This is indeed the nature of the beast, and the thing there is so much resistance to. Offering computer model projections that fail to meet observations is not even pseudo-science. It is computer gaming.
(an old Kaypro fan)

July 18, 2009 11:10 am

Pragmatic:
“Offering computer model projections that fail to meet observations is not even pseudo-science. It is computer gaming.”
I remember Bob Carter saying this about a computer model projection, that it was “Playstation 4 stuff”.
I’ve been wondering though, since I watched Kay’s presentation, about the use of computer models for projections, and this seems to get into a different area. I was talking about a different context because it seems to me the AGW promoters have been using computer models in an attempt to carry out scientific investigation, which is deeply flawed.
Meteorologists use atmospheric models to help them make projections, but they do not rely on the models to make the projections for them. It used to be that economists used computer models of the economy the same way, but given what we learned in the latest economic collapse, it sounds like they have been used to make economic projections directly, which is scary, and this situation shows how wrong they were.
It seems to me that it depends on what kind of phenomenon you model. For example, if one were to construct a gravity simulator, one could use it with reasonable accuracy to project the paths of comets and asteroids, with the understanding that some error will be introduced by the model. We can think of the model as a scientific instrument for making projections, and as with all instruments, some error is introduced into observations. It seems to me the reason this can be done is that a gravitational system can be modeled in a linear deterministic fashion, despite the fact that there will be multiple forces acting on any one object. It’s always important, even in a model like this, to inspect its internals for what, if any, assumptions are being made.
The atmosphere (and an economy) cannot be modeled this way, because the element of chaos comes into it. We cannot project exactly the way a clump of molecules will interact, and the error grows, probably exponentially with time.
I think that if one is doing projections, one is not doing science. One is rather (hopefully) using what science has found in order to enhance our ability to respond to something. I refer to my earlier comments about how climate models have been constructed. Even if I were to assume the best of intentions behind their methodology, it would still be flawed: they assume that because they have tweaked a model to accurately reproduce recent climate, it will continue to do so in the future, even though the model does not contain a complete set of atmospheric influence factors. That’s wishful thinking.
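To make the gravity-simulator example above concrete, here is a minimal sketch of such a deterministic model (a semi-implicit Euler two-body integrator; the units, step size, and orbit are illustrative choices, not anything taken from the comment):

```python
import math

# Deterministic "gravity simulator": semi-implicit Euler integration
# of a body orbiting a central mass. Units chosen so GM = 4*pi^2
# (distances in AU, time in years, central mass of one solar mass).
GM = 4 * math.pi ** 2

def step(x, y, vx, vy, dt):
    r3 = (x * x + y * y) ** 1.5
    ax, ay = -GM * x / r3, -GM * y / r3
    vx, vy = vx + ax * dt, vy + ay * dt   # update velocity first
    return x + vx * dt, y + vy * dt, vx, vy

# Start on a circular 1 AU orbit: orbital speed = sqrt(GM/r) = 2*pi
x, y, vx, vy = 1.0, 0.0, 0.0, 2 * math.pi
dt = 0.001
for _ in range(int(1.0 / dt)):            # integrate one year
    x, y, vx, vy = step(x, y, vx, vy, dt)

# After one period the body should be back near (1, 0); the residual
# is numerical error, which shrinks as dt is reduced -- the model's
# error is quantifiable, unlike a structurally incomplete model.
print(f"after 1 year: x={x:.4f}, y={y:.4f}")
```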

July 18, 2009 12:40 pm

Jim (09:31:25) :
Phil. (18:49:51) : These spectra appear to be taken at low pressure. It would be interesting to see a series at different pressures.

Those spectra are at atmospheric pressure and 296K.

Fred2
July 18, 2009 2:51 pm

Do we remediate soot any differently from CO2? I guess we burn fossil fuels more efficiently, turning it into CO2.
Is the heating from soot more or less than that from CO2?

Jim
July 19, 2009 3:52 pm

Phil. (12:40:35) : I was basing that on the fact that they look so sharp.

July 19, 2009 8:59 pm

Jim (15:52:40) :
Phil. (12:40:35) : I was basing that on the fact that they look so sharp.

Just high resolution; you need that to see the overlap (or lack thereof). The water spectra are rather sparse, as is apparent when you look at high res.
The lines get narrower as you get higher in the atmosphere.
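The narrowing Phil describes is pressure (collision) broadening: the Lorentz half-width of an absorption line scales roughly linearly with pressure, so lines sharpen with altitude. A minimal sketch, with an illustrative sea-level line width and a simple exponential pressure profile (both assumptions):

```python
import math

GAMMA0 = 0.07       # cm^-1, assumed pressure-broadened half-width at 1 atm
SCALE_HEIGHT = 7.6  # km, approximate atmospheric pressure scale height

def half_width(z_km):
    # Lorentz half-width scales ~linearly with pressure, and pressure
    # falls off roughly exponentially with altitude.
    return GAMMA0 * math.exp(-z_km / SCALE_HEIGHT)

for z in (0, 5, 10, 20):
    print(f"{z:2d} km: half-width ~ {half_width(z):.4f} cm^-1")
```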

George E. Smith
July 20, 2009 11:07 am

“”” Phil. (19:06:58) :
George E. Smith (11:40:27) :
“”” Phil. (10:31:31) : “””
I’ll take your word for it Phil. I’m totally at the mercy of the people who publish these purported spectra. Given the billions of dollars that US taxpayers have given to “climate scientists”, I would think that the least we could have gotten out of the deal would be some real measured spectra of real world absorption by all these EPA poisons that the government is telling us they are going to tax us up the wazoo to get rid of.
I keep trying to answer this George but my posts aren’t being accepted, sorry. “””
Well, Phil, I digest everything that makes it through the system that bears your logo.
My “Infra-Red Handbook” has high resolution spectra for the atmosphere; but since the book is mainly for military applications such as weapons targeting and such, for some reason they measure through a horizontal layer of air, and not through the vertical atmosphere.
I can’t tell whether computed spectra properly deal with all the real world isotopic species of atmospheric gases. Given that you have C12/13/14 and O16/18, as well as H and D, and all the possible combinations of those in the existing GHGs, I would think a computed spectrum could get very complicated.
And then when you throw in the emitted spectrum, well, it gets even crazier; and of course that includes the emitted spectrum from all levels of the atmosphere, as well as the surface emission.
But although it would be nice to know, I’m still convinced it is totally irrelevant, since I believe that the cloud negative feedback simply wipes out any changes due to GHGs or any other perturbations; except major orbital changes of course.
When the history of all of this is finally written 100 years from now, if we survive on this planet that long, people are going to puzzle over why we thought it was all so complicated.
Perhaps if more climatologists were physicists instead of statisticians, they would get to the answers sooner. Planet Earth knows how to calculate the effects, and does so with complete accuracy; we should attempt to do likewise, instead of writing ever more arcane computer video games.
George

July 20, 2009 7:42 pm

George E. Smith (11:07:46) :
“”” Phil. (19:06:58) :
George E. Smith (11:40:27) :
“”” Phil. (10:31:31) : “””
I’ll take your word for it Phil. I’m totally at the mercy of the people who publish these purported spectra. Given the billions of dollars that US taxpayers have given to “climate scientists”, I would think that the least we could have gotten out of the deal would be some real measured spectra of real world absorption by all these EPA poisons that the government is telling us they are going to tax us up the wazoo to get rid of.
I keep trying to answer this George but my posts aren’t being accepted, sorry. “””
Well, Phil, I digest everything that makes it through the system that bears your logo.

Eventually it got through, for some reason my spectra caused problems?
My “Infra-Red Handbook” has high resolution spectra for the atmosphere; but since the book is mainly for military applications such as weapons targeting and such, for some reason they measure through a horizontal layer of air, and not through the vertical atmosphere.
Yes, I recall doing some consultancy work for the RAF on IR transmission through clouds; the cloud lab was situated underground near a runway (I forget why). Partway through the experiment there was a huge roar above: it was the nuclear bombers on standby warming up their engines!
I can’t tell whether computed spectra properly deal with all the real world isotopic species of atmospheric gases. Given that you have C12/13/14 and O16/18, as well as H and D, and all the possible combinations of those in the existing GHGs, I would think a computed spectrum could get very complicated.
The spectra I showed have all the possible isotopologues in their appropriate proportions, e.g.:
Molecule No.  Name            Formula  Isotopologue No.  Formula          Abundance
1             water           H2O      1                 H(16)OH          0.997317
1             water           H2O      2                 H(18)OH          0.00199983
1             water           H2O      3                 H(17)OH          0.000371884
1             water           H2O      4                 HD(16)O          0.000310693
1             water           H2O      5                 HD(18)O          6.23003e-07
1             water           H2O      6                 HD(17)O          1.15853e-07
2             carbon dioxide  CO2      1                 (16)O(12)C(16)O  0.984204
2             carbon dioxide  CO2      2                 (16)O(13)C(16)O  0.0110574
2             carbon dioxide  CO2      3                 (16)O(12)C(18)O  0.00394707
2             carbon dioxide  CO2      4                 (16)O(12)C(17)O  0.000733989
2             carbon dioxide  CO2      5                 (16)O(13)C(18)O  4.43446e-05
2             carbon dioxide  CO2      6                 (16)O(13)C(17)O  8.24623e-06
2             carbon dioxide  CO2      7                 (18)O(12)C(18)O  3.95734e-06
2             carbon dioxide  CO2      8                 (17)O(12)C(18)O  1.4718e-06
2             carbon dioxide  CO2      9                 (18)O(13)C(18)O  4.446e-08
