CO2, Soot, Modeling and Climate Sensitivity

Warming Caused by Soot, Not CO2

From the Resilient Earth

Submitted by Doug L. Hoffman on Wed, 07/15/2009 – 13:19

A new paper in Science reports that a careful study of satellite data shows the assumed cooling effect of aerosols in the atmosphere to be significantly less than previously estimated. Unfortunately, the assumed greater cooling has been used in climate models for years. In such models, the global-mean warming is determined by the balance of the radiative forcings—warming by greenhouse gases balanced against cooling by aerosols. Since a greater cooling effect has been used in climate models, the result has been to credit CO2 with a larger warming effect than it really has.

This question is of great importance to climate modelers because they have to be able to simulate the effect of GHG warming in order to accurately predict future climate change. The amount of temperature increase set into a climate model for a doubling of atmospheric CO2 is called the model’s sensitivity. As Dr. David Evans explained in a recent paper: “Yes, every emitted molecule of carbon dioxide (CO2) causes some warming—but the crucial question is how much warming do the CO2 emissions cause? If atmospheric CO2 levels doubled, would the temperature rise by 0.1°, 1.0°, or by 10.0° C?”

Temperature sensitivity scenarios from IPCC AR4.

The absorption frequencies of CO2 are already saturated, meaning that the atmosphere already captures close to 100% of the radiation at those frequencies. Consequently, as the level of CO2 in the atmosphere increases, the rise in temperature for a given increase in CO2 becomes smaller. This sorely limits the amount of warming further increases in CO2 can engender. Because CO2 on its own cannot account for the observed temperature rise in the past century, climate modelers assume that linkages exist between CO2 and other climate influences, mainly water vapor (for a more detailed explanation of what determines the Global Warming Potential of a gas see my comment “It’s not that simple”).
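The diminishing return described above is usually captured with the simplified logarithmic forcing expression ΔF = 5.35 ln(C/C0) W/m² — a commonly used approximation (the coefficient comes from Myhre et al. 1998, not from this post). A few lines of Python illustrate why each extra increment of CO2 buys less forcing than the last:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified logarithmic CO2 forcing in W/m^2 (standard 5.35 coefficient)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each successive doubling adds the same ~3.7 W/m^2, so the forcing
# gained per additional ppm shrinks as the concentration rises.
for c in (280, 560, 1120):
    print(f"{c} ppm -> {co2_forcing(c):.2f} W/m^2")
```

Going from 280 to 560 ppm adds about 3.7 W/m²; going from 560 to 1120 ppm adds only the same amount again, despite twice as much added CO2.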

To compensate for the missing “forcing,” models are tuned to include a certain amount of extra warming linked to carbon dioxide levels—extra warming that comes from unestablished feedback mechanisms whose existence is simply assumed. Aerosol cooling and climate sensitivity in the models must balance each other in order to match historical conditions. Since the climate warmed slightly last century the amount of warming must have exceeded the amount of cooling. As Dr. Roy Spencer, meteorologist and former NASA scientist, puts it: “They program climate models so that they are sensitive enough to produce the warming in the last 50 years with increasing carbon dioxide concentrations. They then point to this as ‘proof’ that the CO2 caused the warming, but this is simply reasoning in a circle.”

A large aerosol cooling, therefore, implies a correspondingly large climate sensitivity. Conversely, reduced aerosol cooling implies lower GHG warming, which in turn implies lower model sensitivity. The upshot of this is that sensitivity values used in models for the past quarter of a century have been set too high. Using elevated sensitivity settings has significant implications for model predictions of future global temperature increases. The low-end value of model sensitivity used by the IPCC is 2°C. Using this value results, naturally, in the lowest predictions for future temperature increases. In the paper “Consistency Between Satellite-Derived and Modeled Estimates of the Direct Aerosol Effect,” published in Science on July 10, 2009, Gunnar Myhre argues that previous values for aerosol cooling are too high—by as much as 40 percent—implying that the IPCC’s model sensitivity settings are too high as well. Here is the abstract of the paper:

In the Intergovernmental Panel on Climate Change Fourth Assessment Report, the direct aerosol effect is reported to have a radiative forcing estimate of –0.5 Watt per square meter (W m–2), offsetting the warming from CO2 by almost one-third. The uncertainty, however, ranges from –0.9 to –0.1 W m–2, which is largely due to differences between estimates from global aerosol models and observation-based estimates, with the latter tending to have stronger (more negative) radiative forcing. This study demonstrates consistency between a global aerosol model and adjustment to an observation-based method, producing a global and annual mean radiative forcing that is weaker than –0.5 W m–2, with a best estimate of –0.3 W m–2. The physical explanation for the earlier discrepancy is that the relative increase in anthropogenic black carbon (absorbing aerosols) is much larger than the overall increase in the anthropogenic abundance of aerosols.
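The 40 percent figure can be checked directly from the forcing numbers in the abstract. The AR4 direct CO2 forcing of about +1.66 W/m² used below is my assumption for the comparison, not a number from this post:

```python
co2_forcing_wm2 = 1.66   # AR4 direct CO2 forcing, W/m^2 (assumed here, not from the post)
aerosol_old = -0.5       # AR4 direct aerosol forcing estimate
aerosol_new = -0.3       # Myhre (2009) best estimate

# Fraction of the assumed aerosol cooling that disappears under the revision.
reduction = (aerosol_old - aerosol_new) / aerosol_old
print(f"aerosol cooling weakened by {reduction:.0%}")  # 40%

# How much of the CO2 warming each estimate offsets.
for f in (aerosol_old, aerosol_new):
    print(f"{f} W/m^2 offsets {abs(f) / co2_forcing_wm2:.0%} of CO2 warming")
```

With the old value the offset is about 30% ("almost one-third," as the abstract says); with the revised value it drops to roughly 18%.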

The complex influence of atmospheric aerosols on the climate system, and the influence of humans on aerosols, are among the key uncertainties in the understanding of recent climate change. Rated by the IPCC as one of the most significant yet poorly understood forcings, aerosols have been the subject of much research activity recently (see Airborne Bacteria Discredit Climate Modeling Dogma and African Dust Heats Up Atlantic Tropics). Some particles absorb sunlight, contributing to climate warming, while others reflect sunlight, leading to cooling. The main anthropogenic aerosols that cause cooling are sulfate, nitrate, and organic carbon, whereas black carbon absorbs solar radiation. The global mean effect of human-caused aerosols (in other words, pollution) is a cooling, but the relative contributions of the different types of aerosols determine the magnitude of this cooling. Readjusting that balance is what Myhre’s paper is all about.

Smoke from a forest fire.

Photo EUMETSAT.

Discrepancies between recent satellite observations and the values needed to make climate models work right have vexed modelers. “A reliable quantification of the aerosol radiative forcing is essential to understand climate change,” states Johannes Quaas of the Max Planck Institute for Meteorology in Hamburg, Germany. Writing in the same issue of Science Dr. Quaas continued, “however, a large part of the discrepancy has remained unexplained.” With a systematic set of sensitivity studies, Myhre explains most of the remainder of the discrepancy. His paper shows that with a consistent data set of anthropogenic aerosol distributions and properties, the data-based and model-based approaches converge.

Myhre argues that since preindustrial times, soot particle concentrations have increased much more than other aerosols. Unlike many other aerosols, which scatter sunlight, soot strongly absorbs solar radiation. At the top of the atmosphere, where the Earth’s energy balance is determined, scattering has a cooling effect, whereas absorption has a warming effect. If soot increases more than scattering aerosols, the overall aerosol cooling effect is smaller than it would be otherwise. According to Dr. Myhre’s work, the correct cooling value is some 40% less than that previously accepted by the IPCC.

Not that climate modelers are unaware of the problems with their creations. Numerous papers have been published that detail problems predicting ice cover, precipitation and temperature correctly. This is due to inadequate modeling of the ENSO, aerosols and the bane of climate modelers, cloud cover. Apologists for climate modeling will claim that the models are still correct, just not as accurate or as detailed as they might be. Can a model that is only partially correct be trusted? Quoting again from Roy Spencer’s recent blog post:

It is also important to understand that even if a climate model handled 95% of the processes in the climate system perfectly, this does not mean the model will be 95% accurate in its predictions. All it takes is one important process to be wrong for the models to be seriously in error.

Can such a seemingly simple mistake in a single model parameter really lead to invalid results? Consider the graph below, a representation of the predictions made by James Hansen to the US Congress in 1988, plotted against how the climate actually behaved. Pretty much what one would expect if the sensitivity of the model was set too high, yet we are still supposed to believe in the model’s results. No wonder even the IPCC doesn’t call their model results predictions, preferring the more nebulous term “scenarios.”

Now that we know the models used by climate scientists were all tuned incorrectly what does this imply for the warnings of impending ecological disaster? What impact does this discovery have on the predictions of melting icecaps, rising ocean levels, increased storm activity and soaring global temperatures? Quite simply they got it wrong, at least in as much as those predictions were based on model results. To again quote from David Evans’ paper:

None of the climate models in 2001 predicted that temperatures would not rise from 2001 to 2009—they were all wrong. All of the models wrongly predict a huge dominating tropical hotspot in the atmospheric warming pattern—no such hotspot has been observed, and if it was there we would have easily detected it.

Once again we see the shaky ground that climate models are built on. Once again a new paper in a peer reviewed journal has brought to light significant flaws in the ways models are configured—forced to match known historical results even when erroneous values are used for fundamental parameters. I have said many times that, with enough tweaking, a model can be made to fit any set of reference data—but such bogus validation does not mean the model will accurately predict the future. When will climate science realize that its reputation has been left in tatters by these false prophets made of computer code?

Be safe, enjoy the interglacial and stay skeptical.

==================================

ADDENDUM BY ANTHONY

I’d like to add this graph showing CO2’s temperature response to supplement the one Doug Hoffman cites from IPCC AR4. Here we see that we are indeed pretty close to saturation of the response.

CO2 temperature response curve, showing saturation.

The “blue fuzz” represents measured global CO2 increases in our modern times.

July 16, 2009 2:07 pm

Hansen’s Scenario B yields a 2008 CO2 value of 379.6 ppmv and Scenario C a value of 368.4 ppmv…If my math’s right.

Nogw
July 16, 2009 2:07 pm

Those who as Adam ate the fruit from the tree of knowledge or the computer model from the processor hardware, got confused be it in words or in binary algorithms.

George E. Smith
July 16, 2009 2:09 pm

“”” MattN (12:43:36) :
George: “Besides, anything that absorbs some incoming solar energy results in cooling, because that energy doesn’t reach the ground; and the higher up in the atmosphere it gets absorbed, the quicker it can be radiated back into space.”
But that’s not how I’m under the impression CO2 works. I thought CO2 absorbed radiation coming back from the earth’s surface….. “””
Well you are right there; and I believe I did say, that CO2 is very little involved in incoming solar radiation absorption; water vapor on the other hand absorbs quite a lot of incoming solar radiation, and thereby can contribute to surface cooling.
But both water vapor and CO2 do absorb some of the earth surface emitted thermal (long wave) radiation, and water vapor absorbs far more than CO2 does; which is why it is water vapor that is the major green house gas in the atmosphere; not CO2.

Nick Stokes
July 16, 2009 2:13 pm

Dave M and all,
On Hansen’s prediction, you can argue about whether scenario B is exactly how it turned out, or maybe it was a bit like A as well. This was all hashed out when Pat Michaels did what the poster has done here in testimony to congress. Steve McIntyre argued for scenario A, but couldn’t defend this style of presentation. He said:

To clarify, I do not agree that it was appropriate for Michaels not to have illustrated Scenarios B or C, nor did I say that in this post. These scenarios should have been shown, as I’ve done in all my posts here. It was open to Michaels to take Scenario A as his base case provided that he justified this and analysed the differences to other scenarios as I’m doing.

And yet, it keeps happening.

July 16, 2009 2:17 pm

The real question is whether the corresponding RISE IN TEMPERATURE took place, or not.
That is why they used to call it AGW.

brazil84
July 16, 2009 2:21 pm

It’s fun to watch this whole global warming hoax slowly unravel. I’m pretty confident that the AGWer estimates of aerosol cooling will turn out to have been estimated by asking “what do we need to make our models work?”

Bill Illis
July 16, 2009 2:27 pm

In terms of Hansen’s Scenario ABC, here are the actual GHG concentrations that Hansen used.
http://www.realclimate.org/data/H88_scenarios.dat
Actual CO2 to date is just slightly lower than Scenario A and just slightly higher than Scenario B but Methane and CFCs are quite a bit lower than both.
The different total forcings in watts/m^2 are here.
http://www.realclimate.org/data/H88_scenarios_eff.dat
The total forcing of Scenario B is just slightly higher than the climate scientists/modelers estimate the forcing is at now, so Scenario B is the one to use.
Here are the actual temperature predictions for each Scenario.
http://www.realclimate.org/data/scen_ABC_temp.data
Scenario B was to be at +0.841C in 2008 but GissTemp was only +0.44C in 2008 so one could say the projections were off by about half.
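Bill Illis’s “off by about half” can be checked from the two numbers he quotes; this sketch simply uses his figures as given:

```python
# Sanity check on the numbers quoted in the comment above.
predicted = 0.841  # Hansen Scenario B anomaly for 2008, deg C (as quoted)
observed = 0.44    # GISTEMP anomaly for 2008, deg C (as quoted)
print(f"observed is {observed / predicted:.0%} of the projection")  # ~52%
```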

George E. Smith
July 16, 2009 2:33 pm

“”” Jim (10:57:56) :
George E. Smith (10:30:23) : More and more, I appreciate your comments. One difference between CO2 and H2O is the liquification point. CO2 is a gas even at the poles, so it is always in the atomosphere to add some heat. I have wondered if CO2 isn’t like a pilot light on a gas water heater whose function is mainly to keep the air hot enough to keep water vapor in play. I wonder what would happen if there were no green house gasses. Would water liquify, then the entire Earth turn into a Snowball? Maybe we could model it?? 🙂 “””
Jim,
Not to worry, there is always plenty of water vapor in the atmosphere; more than enough to stop us becoming an ice ball.
Suppose the whole earth surface were cooled down to zero deg C; except for those places which are now colder than zero; we’d let them stay where they are. Well the oceans are saline, so they won’t freeze at zero, so still plenty of liquid water out there.
The Handbook of Chemistry and Physics, gives the vapor pressure of water at zero deg C as 4.579 mm Hg. This is from the copy of the Handbook that I got off Noah’s ark with; so I’ll let you do the conversion to Pascals; in any case it’s about 0.6% of atmospheric pressure; so still vastly in excess of any CO2 vapor pressure.
Let’s drop down to -15 C; and please don’t ask me to explain why we would still have liquid water at -15 C, but we would still have 1.436 mm Hg vapor pressure; about 1/3 of the zero C value, and still way more than any CO2.
Even at -90C over ice at the coldest places on earth, there is 0.00007 mm Hg of water vapor pressure over the ice. Now CO2 is ahead.
The long and the short of it is that no matter what the earth temperature was, at the equilibrium black body temperature or higher, there is more than enough atmospheric H2O to start the water vapor positive feedback warming; even if there were no CO2 in the atmosphere at all. The notion that water vapor needs CO2 to trigger its positive feedback warming (without which CO2 becomes a non-entity) is quite fallacious; the water can do it all by itself.
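The mmHg-to-pascal conversion George leaves to the reader is straightforward; this sketch uses his handbook values as quoted (the conversion factor of 133.322 Pa per mmHg is standard):

```python
MMHG_TO_PA = 133.322  # standard conversion: 1 mmHg in pascals
ATM_PA = 101325.0     # standard atmosphere in pascals

# Saturation vapor pressure of water at the quoted temperatures,
# converted from the handbook's mmHg values.
for t_c, p_mmhg in [(0, 4.579), (-15, 1.436), (-90, 0.00007)]:
    p_pa = p_mmhg * MMHG_TO_PA
    print(f"{t_c:>4} C: {p_pa:.3g} Pa ({p_pa / ATM_PA:.2%} of 1 atm)")
```

The 0°C value works out to about 610 Pa, matching the “about 0.6% of atmospheric pressure” in the comment.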

July 16, 2009 2:33 pm

Hoffman’s article pointed to something Dr. John Christy talked about a couple years ago in “Global warming – What do the numbers show?” (you can see it at http://www.youtube.com/watch?v=-WWpH0lmcxA), that back in the 1990s the model predictions seemed to track right along the mean of the actual results. The problem was the modelers “knew the correct answer ahead of time”. In other words, they had looked at the actual data, saw where the trend was going and adjusted their models to fit the trend. Circular reasoning indeed. The models fit the results (until around 2001) because the modelers wanted them to fit the results.
The scientific approach would have been for the modelers to attempt to create models in an almost total data blackout. In other words they’d know about the mathematics, have their knowledge of the current scientific theories, the geography of the Earth and general properties of the atmosphere, but would be totally ignorant about past atmospheric data. Maybe they’d be able to make their own observations just to test out some theories for their models, but other than that they should’ve been shown nothing. Michael Crichton talked about this, that at the FDA researchers in different teams are not allowed to talk to each other about anything having to do with their work for fear of contaminating each other’s results.
People who reverse-engineer computer technologies are supposed to be “virgins”, ignorant of the technology they are trying to figure out. This is done for legal reasons. They are not allowed to look at the computer code. All they can do is poke and prod at their source technology to see what it does, and from that create their own work-alike model based on what they’ve observed. Sounds something like what the climate modelers should’ve been doing all along.
To tell you the truth, digital computers may be inadequate to the task, simply because their fundamental design is for discrete processes, things that happen in chunks. Analog computers would probably be better tools for this sort of research.

George E. Smith
July 16, 2009 2:44 pm

To Anthony.
It’s the latest print edition Anthony; I’ll try to remember to bring it in tomorrow. Maybe that is the August edition; seems too late to be July since I just got it yesterday.
The front cover article is about how the Neandertals got extinct; it wasn’t Anthony’s fault. The author is someone I never heard of; which doesn’t mean much; but it is a whole CYA description of how climate researchers are always fudging things, and “weaking” them and how that is all legitimate; and how we ordinary mortals just don’t understand the way it works;
If I were you, I would ask SA to give you a page or two to respond as to how we view this chicanery; well at least how you view it; you’re becoming a Banyan tree Anthony, and just putting down roots in all sorts of places; pretty neat I would say.
George
REPLY: I went out and bought the print copy and yep there it is. Funny, for years as a kid, I tried to get into the “Amateur Scientist” column they used to run and never succeeded with any of the projects I built. Now I get in the magazine and don’t even try or even know about it. – Anthony

George E. Smith
July 16, 2009 2:45 pm

And now I need to make a “weak” and add a (t)in front up there.

July 16, 2009 2:47 pm

I used Hansen’s CO2 scenarios from Appendix B of his 1988 paper to construct a chart of his projected annual atmospheric CO2 concentrations vs. the actual annual values from Mauna Loa. His scenario A matches the actual values almost exactly…
Hansen vs. Mauna Loa

July 16, 2009 2:49 pm

James Hansen made three very different predictions covering a wide range of future temperatures [from a low of ~0.5°, to a high estimate of ~1.5° — 300% more], and he used vague “what if” weasel words to explain the overlap.
Now, Hansen’s defenders pick the cherry they like the best: the chart that most closely matches the current climate: click
The difference between a temperature rise of 0.5° and 1.5° is enormous. Just about anyone could have predicted a temperature rise within those wide parameters.
It’s like predicting the July 2010 unemployment rate with three charts: one “scenario” predicting 9% unemployment, one predicting 11% unemployment, and one predicting 13% unemployment. Then, to show everyone how smart I was, next July I get to pick which chart made the best prediction.
I call shenanigans.

AlexB
July 16, 2009 2:54 pm

Skeptics are always quoting ‘research’ from dodgy backdoor journals like Science. 😛

rbateman
July 16, 2009 2:54 pm

Shrunken outer atmosphere = less time/distance for outgoing radiation to achieve escape.
Which might make room for the ocean heat to escape:
http://scienceandpublicpolicy.org/images/stories/papers/reprint/changes_in_the_ocean.pdf
Pielke, Loehle, Willis have done the math on the ocean heat (stored energy).
Has nothing to do with rising CO2 levels letting it escape, the water under the bridge is gone by. Maybe Hansen could re-invent himself as the “Oceans are Catastrophically Cooling” dude. Al Gore could make a new agenda out of saving the Suez and Panama Canals which are threatened by sea levels dropping faster than we thought.
If the Oceans drop faster than they can deepen those vital canals, Global Trade will take a pounding.

Jim
July 16, 2009 3:26 pm

George E. Smith (14:33:01) :
Thanks, George. In that light, it is difficult to see how a snowball Earth could come about. Unless, of course, the Sun is more variable than we know from our limited observation time of it. If the Sun did dim and cause a snowball Earth, it appears from your data that it would have a means to recover. I’m beginning to believe the Sun may be more variable than we think. I know there are proxies for solar output, but there also appears to be doubt about their fidelity.

July 16, 2009 3:34 pm

Nogw (13:29:55) :
Phil. (12:58:45) :
Niels Bohr said about the so called “Green House effect”:
the absorption of specific wavelengths of light didn’t cause gas atoms/molecules to become hotter. Instead, the absorption of specific wavelengths of light caused the electrons in an atom/molecule to move to a higher energy state. After absorption of light of a specific wavelength an atom couldn’t absorb additional radiation of that wavelength without first emitting light of that wavelength.
And modelling it is just gaming/gambling, not serious experimental science.

Well we know a bit more now than when Bohr made that statement and it’s not correct. For rovibrational transitions in say CO2 then if it absorbs one quantum of energy to be promoted from v=0 to v=1 it can absorb a second quantum to be promoted to v=2 without first emitting light. In fact in the case of CO2 in the lower atmosphere light is usually not emitted because so many collisions with other molecules occur before the molecule has time to emit.
What in fact happens in the atmosphere is that CO2 absorbs IR and is vibrationally excited and within nanoseconds this energy is collisionally exchanged with other molecules (heating them up) and dropping back to the ground state. In that case of course it does follow Bohr’s description (but not via radiation) but it doesn’t need to.

tallbloke
July 16, 2009 3:41 pm

rbateman (14:54:52) :
Shrunken outer atmosphere = less time/distance for outgoing radiation to achieve escape.
Which might make room for the ocean heat to escape:
http://scienceandpublicpolicy.org/images/stories/papers/reprint/changes_in_the_ocean.pdf

Thanks, I’ll take a look at that. Were you making an oblique reference to my post on the GISS for June thread you commented on? It seems relevant to the modeling debate, so I’ll repost it here.
tallbloke (14:23:17) :
I think I’ve cracked it. I calculated the other day that the ocean heat content must have risen 14×10^22J to account for the sea level rise seen by the satellite altimetry, less the melted chunks of Greenland and other ice melt. This is over twice the estimate of ocean heat content given by Levitus the Lead IPCC author in his 2009 paper. I’m sure he’s wrong. His figure matches the co2 radiative forcing, but that’s too convenient…
14×10^22J is equivalent to 4W/m^2
That matches the 4W/m^2 upswings in Outgoing Longwave Radiation from the Earth which happen in antiphase to the solar cycle. It’s the ocean emitting heat when the sun is quiet.
http://s630.photobucket.com/albums/uu21/stroller-2009/?action=view&current=ssn-olr-1974-2009.gif
Dunno where this is going to lead yet, but I think the implications are far reaching.
Open Source Climatology, for open minded climatologists.

July 16, 2009 3:44 pm

Smokey (14:49:51) :
James Hansen made three very different predictions covering a wide range of future temperatures [from a low of ~0.5°, to a high estimate of ~1.5° — 300% more], and he used vague “what if” weasel words to explain the overlap.
Now, Hansen’s defenders pick the cherry they like the best: the chart that most closely matches the current climate: click
The difference between a temperature rise of 0.5° and 1.5° is enormous. Just about anyone could have predicted a temperature rise within those wide parameters.
It’s like predicting the July 2010 unemployment rate with three charts: one “scenario” predicting 9% unemployment, one predicting 11% unemployment, and one predicting 13% unemployment. Then, to show everyone how smart I was, next July I get to pick which chart made the best prediction.
I call shenanigans.

So do I, by Smokey in this case, because Hansen identified at the time which of the scenarios would be the most likely. Hansen’s scenarios depended on whether certain policy decisions were made in the future which at the time were undecided, for example the Montreal Protocol.

tallbloke
July 16, 2009 3:47 pm

Figures relate to 1993-2003 for sea level rise and ocean heat content rise.

old construction worker
July 16, 2009 3:49 pm

George E. Smith (10:42:08) :
One question about soot, which the paper declares causes warming. Hey it may warm the soot by absorbing solar energy; but then that energy doesn’t reach the ground so it would cool the ground; And by catching the solar energy high in the atmosphere, those black particles can then radiate IR from the upper atmosphere, and send that energy back into space.
‘Hey it may warm the soot by absorbing solar energy; but then that energy doesn’t reach the ground so it would cool the ground;’
You mean to tell me soot (solid), which gets “heated” by incoming radiation, when it reradiates long wave radiation, that energy does not reach the ground, but CO2, high in the atmosphere, which is “heated” by long wave radiation from the ground, any energy it reradiates does reach the ground! All I can say is wow. I guess CO2 has some magical power I don’t know about.
‘And by catching the solar energy high in the atmosphere, those black particles can then radiate IR from the upper atmosphere, and send that energy back into space.’
Wouldn’t this cause “The Hot Spot”?
I’m stunned, I tell you, just stunned.

bill
July 16, 2009 3:51 pm

George E. Smith (10:57:50) :
The plot is a calculated spectrum.
The paper, if you read it, presents measured transmittance of gases under various temperatures and pressures.

MikeE
July 16, 2009 3:51 pm

Jim (15:26:35) :
As I understand it, the snowball earth hypothesis is based more around the increasing reflectivity of the Earth’s surface: no long wave radiation, no greenhouse. The ole tipping point: once enough of the Earth’s surface is reflective it will cause ever-increasing negative feedbacks. Earth would have an average temp of around -18C.
The hypothesis has come about due to glacial sediment at latitudes close to the equator… which can’t all be explained by continental drift. But it’s just a hypothesis.

Francis
July 16, 2009 4:25 pm

I don’t know the intervening formulas…But I worked as an estimator once…
“In the (IPCC) the direct aerosol effect is reported to have a radiative forcing estimate of -0.5 Watt per square meter…offsetting the warming from CO2 by almost one-third.”
At -0.3 Wm-2, the warming offset is reduced to (3/5 x 33%=) 20%.
So (33%-20%=) the model CO2 warming must be reduced by about 13%.
The resulting sensitivities haven’t been mentioned.
From an estimator’s seat-of-the-pants point of view…this isn’t the end of AGW.
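Francis’s seat-of-the-pants estimate can be reproduced directly from his own figures:

```python
offset_old = 1 / 3   # -0.5 W/m^2 offsets ~one-third of CO2 warming (per the abstract)
scale = 0.3 / 0.5    # revised aerosol estimate relative to the old one

# Scaling the offset by the ratio of the two forcing estimates.
offset_new = scale * offset_old
print(f"revised offset: {offset_new:.0%}")                                    # ~20%
print(f"implied cut in modeled CO2 warming: ~{offset_old - offset_new:.0%}")  # ~13%
```

The arithmetic checks out: 3/5 of one-third is 20%, and the difference from the original one-third offset is about 13 percentage points.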

Jim
July 16, 2009 4:48 pm

MikeE (15:51:51) : But according to George’s data, there would be enough water vapor in the air to prevent that. There would be enough water vapor to act as a greenhouse gas to re-heat the system. In fact, as long as the Sun’s output was mostly steady, a snowball Earth would not happen at all. If there has been a snowball Earth, and I’m not saying there was, one way to account for it would be for the Sun to dim enough to allow the Earth to freeze. I’m sure there are other ways. My hypothesis of a variable Sun to account for the temperature excursions alluded to by some of the proxies over the past millions of years, is only that. Some reliable proxy for Solar output would have to be found.
