Warming Caused by Soot, Not CO2
From the Resilient Earth
Submitted by Doug L. Hoffman on Wed, 07/15/2009 – 13:19
A new paper in Science reports that a careful study of satellite data shows the assumed cooling effect of aerosols in the atmosphere to be significantly less than previously estimated. Unfortunately, that larger assumed cooling has been used in climate models for years. In such models, the global-mean warming is determined by the balance of the radiative forcings—warming by greenhouse gases balanced against cooling by aerosols. Since a greater cooling effect has been used in climate models, the result has been to credit CO2 with a larger warming effect than it really has.
This question is of great importance to climate modelers because they have to be able to simulate the effect of GHG warming in order to accurately predict future climate change. The amount of temperature increase set into a climate model for a doubling of atmospheric CO2 is called the model’s sensitivity. As Dr. David Evans explained in a recent paper: “Yes, every emitted molecule of carbon dioxide (CO2) causes some warming—but the crucial question is how much warming do the CO2 emissions cause? If atmospheric CO2 levels doubled, would the temperature rise by 0.1°, 1.0°, or by 10.0° C?”
Temperature sensitivity scenarios from IPCC AR4.
The absorption frequencies of CO2 are already saturated, meaning that the atmosphere already captures close to 100% of the radiation at those frequencies. Consequently, as the level of CO2 in the atmosphere increases, the rise in temperature for a given increase in CO2 becomes smaller. This severely limits the amount of warming that further increases in CO2 can engender. Because CO2 on its own cannot account for the observed temperature rise of the past century, climate modelers assume that linkages exist between CO2 and other climate influences, mainly water vapor (for a more detailed explanation of what determines the Global Warming Potential of a gas, see my comment “It’s not that simple”).
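This diminishing-returns behavior is usually captured with a logarithmic forcing law. Here is a minimal Python sketch, assuming the widely cited simplified expression ΔF = 5.35 ln(C/C0) W m⁻² from Myhre et al. (1998); it illustrates the shape of the response, not a calculation from the new paper.

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing, delta_F = 5.35 * ln(C/C0) in W m^-2
    (the widely used logarithmic fit from Myhre et al., 1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each successive 40 ppm increment adds less forcing than the one before,
# which is the diminishing-returns behavior described above.
levels = [280, 320, 360, 400, 440, 480]
for lo, hi in zip(levels, levels[1:]):
    print(f"{lo} -> {hi} ppm: +{co2_forcing(hi) - co2_forcing(lo):.3f} W m^-2")
```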
To compensate for the missing “forcing,” models are tuned to include a certain amount of extra warming linked to carbon dioxide levels—extra warming that comes from unestablished feedback mechanisms whose existence is simply assumed. Aerosol cooling and climate sensitivity in the models must balance each other in order to match historical conditions. Since the climate warmed slightly last century, the amount of warming must have exceeded the amount of cooling. As Dr. Roy Spencer, meteorologist and former NASA scientist, puts it: “They program climate models so that they are sensitive enough to produce the warming in the last 50 years with increasing carbon dioxide concentrations. They then point to this as ‘proof’ that the CO2 caused the warming, but this is simply reasoning in a circle.”
A large aerosol cooling, therefore, implies a correspondingly large climate sensitivity. Conversely, reduced aerosol cooling implies lower GHG warming, which in turn implies lower model sensitivity. The upshot of this is that sensitivity values used in models for the past quarter of a century have been set too high. Using elevated sensitivity settings has significant implications for model predictions of future global temperature increases. The low-end value of model sensitivity used by the IPCC is 2°C. Using this value results, naturally, in the lowest predictions for future temperature increases. In the paper “Consistency Between Satellite-Derived and Modeled Estimates of the Direct Aerosol Effect,” published in Science on July 10, 2009, Gunnar Myhre states that previous values for aerosol cooling are too high—by as much as 40 percent—implying that the IPCC’s model sensitivity settings are too high as well. Here is the abstract of the paper:
In the Intergovernmental Panel on Climate Change Fourth Assessment Report, the direct aerosol effect is reported to have a radiative forcing estimate of –0.5 watt per square meter (W m⁻²), offsetting the warming from CO2 by almost one-third. The uncertainty, however, ranges from –0.9 to –0.1 W m⁻², which is largely due to differences between estimates from global aerosol models and observation-based estimates, with the latter tending to have stronger (more negative) radiative forcing. This study demonstrates consistency between a global aerosol model and adjustment to an observation-based method, producing a global and annual mean radiative forcing that is weaker than –0.5 W m⁻², with a best estimate of –0.3 W m⁻². The physical explanation for the earlier discrepancy is that the relative increase in anthropogenic black carbon (absorbing aerosols) is much larger than the overall increase in the anthropogenic abundance of aerosols.
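To see how the abstract’s numbers feed the sensitivity argument, here is a back-of-the-envelope Python sketch. The warming figure (0.7 K) and GHG forcing (1.6 W m⁻²) are round illustrative values of my choosing, not from the paper, and the steady-state assumption ignores ocean heat uptake entirely; the point is only the direction of the effect—a weaker aerosol offset implies a lower inferred sensitivity.

```python
def implied_sensitivity(dt_obs, f_ghg, f_aerosol):
    """Sensitivity parameter (K per W m^-2) implied by requiring the net
    forcing to reproduce the observed warming. A steady-state toy model:
    it ignores ocean heat uptake and all transient behavior."""
    return dt_obs / (f_ghg + f_aerosol)

DT_OBS = 0.7   # K, rough 20th-century warming (illustrative round number)
F_GHG = 1.6    # W m^-2, rough anthropogenic GHG forcing (illustrative)
F_2XCO2 = 3.7  # W m^-2, canonical forcing for a CO2 doubling

for f_aer in (-0.5, -0.3):  # older estimate vs. Myhre's revision
    lam = implied_sensitivity(DT_OBS, F_GHG, f_aer)
    print(f"aerosol forcing {f_aer:+.1f} W m^-2 -> "
          f"implied 2xCO2 warming ~ {lam * F_2XCO2:.1f} K")
```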
The complex influence of atmospheric aerosols on the climate system, and the influence of humans on aerosols, are among the key uncertainties in the understanding of recent climate change. Rated by the IPCC as one of the most significant yet poorly understood forcings, aerosols have been the subject of much recent research (see Airborne Bacteria Discredit Climate Modeling Dogma and African Dust Heats Up Atlantic Tropics). Some particles absorb sunlight, contributing to climate warming, while others reflect sunlight, leading to cooling. The main anthropogenic aerosols that cause cooling are sulfate, nitrate, and organic carbon, whereas black carbon absorbs solar radiation. The global mean effect of human-caused aerosols (in other words, pollution) is a cooling, but the relative contributions of the different types of aerosols determine the magnitude of this cooling. Readjusting that balance is what Myhre’s paper is all about.
Discrepancies between recent satellite observations and the values needed to make climate models work right have vexed modelers. “A reliable quantification of the aerosol radiative forcing is essential to understand climate change,” states Johannes Quaas of the Max Planck Institute for Meteorology in Hamburg, Germany. Writing in the same issue of Science, Dr. Quaas continued, “however, a large part of the discrepancy has remained unexplained.” With a systematic set of sensitivity studies, Myhre explains most of the remainder of the discrepancy. His paper shows that, with a consistent data set of anthropogenic aerosol distributions and properties, the data-based and model-based approaches converge.
Myhre argues that since preindustrial times, soot particle concentrations have increased much more than other aerosols. Unlike many other aerosols, which scatter sunlight, soot strongly absorbs solar radiation. At the top of the atmosphere, where the Earth’s energy balance is determined, scattering has a cooling effect, whereas absorption has a warming effect. If soot increases more than scattering aerosols, the overall aerosol cooling effect is smaller than it would be otherwise. According to Dr. Myhre’s work, the correct cooling value is some 40% less than that previously accepted by the IPCC.
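The arithmetic behind that 40% figure is simple to sketch. In the toy calculation below, the scattering and black-carbon terms are hypothetical numbers chosen only so the totals match the –0.5 and –0.3 W m⁻² estimates quoted above; the actual decomposition in the paper is more involved.

```python
def net_aerosol_forcing(scattering, black_carbon):
    """Net direct aerosol forcing (W m^-2): a negative scattering term
    plus a positive black-carbon absorption term."""
    return scattering + black_carbon

# Hypothetical component values, chosen only so the totals match the
# -0.5 and -0.3 W m^-2 estimates discussed above.
old = net_aerosol_forcing(scattering=-0.7, black_carbon=0.2)
new = net_aerosol_forcing(scattering=-0.7, black_carbon=0.4)
print(f"old: {old:+.1f} W m^-2, revised: {new:+.1f} W m^-2, "
      f"net cooling weaker by {(1 - new / old) * 100:.0f}%")
```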
Not that climate modelers are unaware of the problems with their creations. Numerous papers have been published detailing problems with predicting ice cover, precipitation, and temperature correctly, due to inadequate modeling of ENSO, aerosols, and the bane of climate modelers: cloud cover. Apologists for climate modeling will claim that the models are still correct, just not as accurate or as detailed as they might be. Can a model that is only partially correct be trusted? Quoting again from Roy Spencer’s recent blog post:
It is also important to understand that even if a climate model handled 95% of the processes in the climate system perfectly, this does not mean the model will be 95% accurate in its predictions. All it takes is one important process to be wrong for the models to be seriously in error.
Can such a seemingly simple mistake in a single model parameter really lead to invalid results? Consider the graph below, a representation of the predictions made by James Hansen to the US Congress in 1988, plotted against how the climate actually behaved. Pretty much what one would expect if the sensitivity of the model was set too high, yet we are still supposed to believe in the model’s results. No wonder even the IPCC doesn’t call their model results predictions, preferring the more nebulous term “scenarios.”

Now that we know the models used by climate scientists were all tuned incorrectly, what does this imply for the warnings of impending ecological disaster? What impact does this discovery have on the predictions of melting icecaps, rising ocean levels, increased storm activity and soaring global temperatures? Quite simply, they got it wrong, at least insofar as those predictions were based on model results. To again quote from David Evans’ paper:
None of the climate models in 2001 predicted that temperatures would not rise from 2001 to 2009—they were all wrong. All of the models wrongly predict a huge dominating tropical hotspot in the atmospheric warming pattern—no such hotspot has been observed, and if it was there we would have easily detected it.
Once again we see the shaky ground that climate models are built on. Once again a new paper in a peer-reviewed journal has brought to light significant flaws in the way models are configured—forced to match known historical results even when erroneous values are used for fundamental parameters. I have said many times that, with enough tweaking, a model can be made to fit any set of reference data—but such bogus validation does not mean the model will accurately predict the future. When will climate science realize that its reputation has been left in tatters by these false prophets made of computer code?
Be safe, enjoy the interglacial and stay skeptical.
==================================
ADDENDUM BY ANTHONY
I’d like to add this graph showing CO2’s temperature response to supplement the one Doug Hoffman cites from IPCC AR4. Here we see that we are indeed pretty close to saturation of the response.

The “blue fuzz” represents measured global CO2 increases in our modern times.


It just shows that the huge amount of money going into all the climate research is having one result: the more they find out, the more they realise they don’t know.
Welcome to real science, climatologists.
This sounds a bit like a football coach whose greatest asset is a withdrawn striker whose skill will set up a striking partner with pace and shooting accuracy for reams of goals.
Unfortunately, the manager bought a ‘traditional English centre forward’ who was big and strong, could head the ball beautifully but had a bit of a leaden first touch.
Normally in such cases, either the star departs, the manager realises his mistake and replaces the striker, or the board fires the manager.
Any chance of any of that happening in the climate change circles??!!
Lies, damn lies and climate modelling…
Nice graph, Anthony. Please can we have the equations for each of the three curves?
I do not buy this aerosol alchemy. Warmists still claim aerosols are responsible for the 1940–1978 temperature stagnation/decrease, even though the real reason—oceanic oscillations—has since been discovered. The only clear effect of aerosols, for me, is Pinatubo or El Chichon, causing a temperature drop for a few years afterwards.
“No wonder even the IPCC doesn’t call their model results predictions, preferring the more nebulous term “scenarios.””
Yeah, right!
“”All the world’s a stage, And all the men and women merely players, They have their exits and entrances, And one man in his time plays many parts, …””
We covered this point on June 28, 2009.
Science is SO last season. :^)
Regards to all, Allan
*********************************************
http://wattsupwiththat.com/2009/06/27/new-paper-global-dimming-and-brightening-a-review/#comments
28June2009
Leif Svalgaard (13:21:18) :
Allan M R MacRae (12:54:27) :
“There are actual measurements by Hoyt and others that show NO trends in atmospheric aerosols, but volcanic events are clearly evident.”
But increased atmospheric CO2 is NOT a significant driver of global warming – that much is obvious by now.
Leif: But what has that to do with aerosols?
**************************
Leif, I did say:
The sensitivity of global temperature to increased atmospheric CO2 is so small as to be inconsequential – much less than 1 degree C for a doubling of atmospheric CO2. CO2 feedbacks are negative, not positive. Climate model hindcasting fails unless false aerosol data is used to “cook” the model.
Connecting the dots, to answer your question:
The fabricated aerosol data allows climate model hindcasting to appear credible while assuming a false high sensitivity of global temperature to atmospheric CO2.
The false high sensitivity is then used to forecast catastrophic humanmade global warming (the results of the “cooked” climate models).
What happens if the fabricated (phony) aerosol data is not used?
No phony aerosol data → no credible model hindcasting → no artificially high climate sensitivity to CO2 → no model forecasting of catastrophic humanmade global warming.
Regards, Allan
Supporting P.S.:
Earth is cooling, not warming. Pass it on…
Anthony, would you mind plotting the logarithm of CO2 on the x-axis of your graph?
The nearly-saturated argument is a well-known red herring.
It is exactly because of this logarithmic relationship that climate sensitivity is expressed per CO2 doubling (see the sketch after this list):
No feedback: 1.2 K per 2×CO2
Strong positive feedback: 1.5–4.5 K (IPCC range)
Strong negative feedback: 0.3–0.6 K (e.g. Lindzen and Miskolczi)
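Rough numbers to make those per-doubling figures concrete: a minimal Python sketch assuming the standard logarithmic response ΔT = S·log2(C/C0). The 450 ppm level is my own illustrative choice, not from the comment.

```python
import math

def warming(c_ppm, sens_per_doubling, c0_ppm=280.0):
    """Warming for a given per-doubling sensitivity, assuming the standard
    logarithmic response: dT = S * log2(C / C0)."""
    return sens_per_doubling * math.log2(c_ppm / c0_ppm)

C = 450.0  # ppm, an illustrative future CO2 level
for label, lo, hi in [("no feedback", 1.2, 1.2),
                      ("strong positive feedback (IPCC)", 1.5, 4.5),
                      ("strong negative feedback", 0.3, 0.6)]:
    print(f"{label}: {warming(C, lo):.1f} to {warming(C, hi):.1f} K at {C:.0f} ppm")
```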
German (Dr. Heinz Hug) and American scientists were already reporting on the very small influence of CO2 back in the 1980s and 1990s. Now it is proving true! Why? I think our miserable politicians fear the possibility of colder times coming and are confronted with an ailing economy.
Posted earlier on June 28
Allan M R MacRae (10:41:55) :
Allan M R MacRae (03:23:07)
FABRICATION OF AEROSOL DATA USED FOR CLIMATE MODELS:
The pyrheliometric ratioing technique is very insensitive to any changes in calibration of the instruments and very sensitive to aerosol changes.
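To see why the ratioing technique is insensitive to calibration, consider an idealized Beer–Lambert model of a clear-sky direct-beam measurement, I(m) = k·S0·Tᵐ, where k is the unknown instrument calibration factor, T the broadband atmospheric transmission, and m the air mass; the ratio of readings at two air masses cancels k. A minimal Python sketch under that idealized single-band assumption (not the instruments' actual reduction procedure):

```python
def reading(cal, transmission, airmass, s0=1361.0):
    """Idealized clear-sky pyrheliometer reading under a single-band
    Beer-Lambert model: I = cal * S0 * T**airmass."""
    return cal * s0 * transmission ** airmass

def transmission_from_ratio(i1, m1, i2, m2):
    """Recover T from two readings at different air masses; the unknown
    calibration factor cancels in the ratio: i1/i2 = T**(m1 - m2)."""
    return (i1 / i2) ** (1.0 / (m1 - m2))

# Two instruments with very different (even drifting) calibrations still
# recover the same transmission, so the method tracks aerosols, not gain.
for cal in (0.95, 0.80):
    i1 = reading(cal, transmission=0.75, airmass=2.0)
    i2 = reading(cal, transmission=0.75, airmass=5.0)
    print(f"cal={cal}: recovered T = {transmission_from_ratio(i1, 2.0, i2, 5.0):.3f}")
```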
Here are three papers using the technique:
Hoyt, D. V. and C. Frohlich, 1983. Atmospheric transmission at Davos, Switzerland, 1909-1979. Climatic Change, 5, 61-72.
Hoyt, D. V., C. P. Turner, and R. D. Evans, 1980. Trends in atmospheric transmission at three locations in the United States from 1940 to 1977. Mon. Wea. Rev., 108, 1430-1439.
Hoyt, D. V., 1979. Pyrheliometric and circumsolar sky radiation measurements by the Smithsonian Astrophysical Observatory from 1923 to 1954. Tellus, 31, 217-229.
In none of these studies were any long-term trends found in aerosols, although volcanic events show up quite clearly. There are other studies from Belgium, Ireland, and Hawaii that reach the same conclusions. It is significant that Davos shows no trend, whereas the IPCC models show one in the area where the greatest changes in aerosols were occurring.
There are earlier aerosol studies by Hand and others in Monthly Weather Review going back to the 1880s, and these studies also show no trends.
___________________________
Repeating: “In none of these studies were any long-term trends found in aerosols, although volcanic events show up quite clearly.”
___________________________
Here is an email just received from Douglas Hoyt [my comments in square brackets]:
It [aerosol numbers used in climate models] comes from the modelling work of Charlson where total aerosol optical depth is modeled as being proportional to industrial activity.
[For example, the 1992 paper in Science by Charlson, Hansen et al]
http://www.sciencemag.org/cgi/content/abstract/255/5043/423
or [the 2000 letter report to James Baker from Hansen and Ramaswamy]
http://74.125.95.132/search?q=cache:DjVCJ3s0PeYJ:www-nacip.ucsd.edu/Ltr-Baker.pdf+%22aerosol+optical+depth%22+time+dependence&cd=4&hl=en&ct=clnk&gl=us
where it says [para 2 of covering letter] “aerosols are not measured with an accuracy that allows determination of even the sign of annual or decadal trends of aerosol climate forcing.”
Let’s turn the question on its head and ask to see the raw measurements of atmospheric transmission that support Charlson.
Hint: There aren’t any, as the statement quoted above confirms.
__________________________
IN SUMMARY
There are actual measurements by Hoyt and others that show NO trends in atmospheric aerosols, but volcanic events are clearly evident.
So Charlson, Hansen et al ignored these inconvenient aerosol measurements and “cooked up” (fabricated) aerosol data that forced their climate models to better conform to the global cooling that was observed pre~1975.
Voila! Their models could hindcast (model the past) better using this fabricated aerosol data, and therefore must predict the future with accuracy.
That is the evidence of fabrication of the aerosol data used in climate models that predict catastrophic humanmade global warming.
And we are going to spend trillions and cripple our Western economies based on this fabrication of false data, this model cooking, this nonsense?
*************************************************
Mr Hoffman clearly doesn’t understand the usage of “scenario.” It is not another word for prediction. It’s an input that is determined by human decisions, and can’t be predicted scientifically. In Hansen’s 1988 paper, it referred to future CO2 emissions. He said CO2 emissions might, depending on governments, increase exponentially (scenario A), linearly (B), or tail off (C). He calculated a projection for each scenario.
As it turned out, CO2 did not increase exponentially. It was close to linear (scenario B). It is thoroughly misleading to plot the scenario A case as if it were Hansen’s projection. The projection for scenario B, the CO2 emission path that did occur, was very good.
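The difference between the scenarios is just the assumed growth law. Here is a stylized Python sketch of exponential (A), linear (B), and tailing-off (C) trajectories; all parameters are made up for illustration and are not Hansen’s actual emission numbers.

```python
import math

T0 = 1988  # reference year; all numbers below are made up for illustration

def scenario_a(year, rate=0.015):
    """Exponential growth (stylized stand-in for Hansen's Scenario A)."""
    return math.exp(rate * (year - T0))

def scenario_b(year, slope=0.015):
    """Linear growth (stylized stand-in for Scenario B)."""
    return 1.0 + slope * (year - T0)

def scenario_c(year, freeze=2000):
    """Growth that tails off after a cutoff year (stand-in for Scenario C)."""
    return scenario_b(min(year, freeze))

for year in (1988, 2000, 2009, 2020):
    print(year, f"A={scenario_a(year):.2f}",
          f"B={scenario_b(year):.2f}", f"C={scenario_c(year):.2f}")
```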
His statement that the CO2 frequencies are already saturated pretty much confirms my belief that there isn’t much energy left for CO2 to absorb, no matter how much CO2 is added to the atmosphere.
Looking at the absorption bands
http://www.globalwarmingart.com/wiki/Image:Atmospheric_Absorption_Bands_png
it’s easy to see that the only band CO2 has that isn’t covered by the 100-times-more-plentiful water bands is the one at 4 microns. Going up to the blackbody curves, that band just doesn’t have much to give.
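That claim can be checked against the Planck curve: at a surface temperature near 288 K, thermal emission peaks around 10 microns, so the 4.3-micron CO2 band sits far out on the short-wavelength tail. A rough numerical sketch (my own illustration; the band limits are approximate):

```python
import math

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck, light speed, Boltzmann (SI)

def planck(wl, temp):
    """Spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
    return (2.0 * H * C**2 / wl**5) / (math.exp(H * C / (wl * KB * temp)) - 1.0)

def band_radiance(lo_um, hi_um, temp, steps=2000):
    """Trapezoidal integral of the Planck curve over a wavelength band."""
    lo, hi = lo_um * 1e-6, hi_um * 1e-6
    dx = (hi - lo) / steps
    total = 0.5 * (planck(lo, temp) + planck(hi, temp))
    total += sum(planck(lo + i * dx, temp) for i in range(1, steps))
    return total * dx

T_SURFACE = 288.0  # K, rough global-mean surface temperature
p4 = band_radiance(4.0, 4.6, T_SURFACE)    # around the 4.3 micron CO2 band
p15 = band_radiance(13.0, 17.0, T_SURFACE) # around the main 15 micron band
print(f"4.3 um band: {p4:.2f}, 15 um band: {p15:.2f} W m^-2 sr^-1, "
      f"ratio ~{p15 / p4:.0f}x")
```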
To tallbloke
Look here:-
http://brneurosci.org/co2.html
for your equations regarding CO2
Also, the water vapour feedback in the climate models is just plain wrong. There is a massive difference between a positive feedback and a negative feedback, at least 5°C. This is the difference between an ice age and now.
What would happen if one were to repeat the calculation using not the adjusted data for measured temperatures, but instead a set of honest, competent temperature observations?
Posted only half the absorption image, sorry.
http://www.globalwarmingart.com/wiki/Image:Atmospheric_Transmission_png
Tell me, does the term ‘aerosols’ include clouds?
When we studied climate models in the late 70s, it was a given that all models were imperfect. We knew we were only human, but it was fun and quite harmless. No individual thought that the models would be able to predict future trends with any accuracy. Then computers got bigger, NASA wanted more accurate results, people believed in computers, enter Jim, and the rest is history (and now, hopefully, so is the IPCC).
I am no Luddite, but people and machines should know their limits, or else we start to believe in our own divinity.
I would like to know whether the GCMs use a curve such as Anthony’s blue-fuzz graph for the system response to CO2, or whether they cheat and straight-line the response from the 0 ppm point to the current point, continuing straight up after that.
Hansen published 3 scenarios (http://www.pnas.org/content/103/39/14288.full). Why show just one, which is FAR too high in its CO2 emissions? This tactic has been used before (http://scienceblogs.com/deltoid/2008/01/steve_mcintyre_defends_pat_mic.php).
>The absorption frequencies of CO2 are already saturated, meaning that the atmosphere already captures close to 100% of the radiation at those frequencies.
This is wrong.
Harries, J.E., H.E. Brindley, P.J. Sagoo, and R.J. Bantges, 2001: Increases in greenhouse forcing inferred from the outgoing longwave radiation spectra of the Earth in 1970 and 1997. Nature, 410, 355–357.
http://www.nature.com/nature/journal/v410/n6826/abs/410355a0.html
Finally, the last number on the graph should be +0.62°C.
Given 2 of 3 comments have been gagged today – don’t anticipate this one to get through the fact filter.
REPLY: Given that you routinely use a BoM IP address to post comments, I’d suggest that if you want to speak on behalf of BoM use your full name there. See the policy page. – Anthony
Given there is no visible decadal or multidecadal reduction in the Earth’s outgoing longwave radiation, this admits the possibilities that:
1) The theoretical forcing due to CO2 of 1.7 W/m^2 is overstated, due to a failure to allow for other climate-influencing factors such as the positive phases of oceanic cycles, and the overestimation of the effect of aerosols as described here. This seems highly likely.
2) The Earth has compensatory mechanisms we had not hitherto considered which can mitigate any effect of a 30% change in the atmospheric concentration of a trace gas. For example, Stephen Wilde’s idea that the latitudinal poleward boundaries of the temperate-zone Hadley cells wouldn’t have to move very far to alter the radiative balance of Earth’s climate. This would reduce the scare story to “Oh no, the seasonal rains moved two hundred miles north.” Since the boundaries of the cells and the jet streams wander around anyway due to known cyclic and incalculable chaotic factors and semi-periodic inter-annual variabilities, we could probably live with that without too much upheaval and panic.
3) Given that there is no visible CO2 forcing signal in the observational OLR data, it would seem likely that (1) or (2) or both is the true situation.
Rhys,
We call that soccer. And we have no idea what you are talking about.
(:
Moderator: This ought to be snipped!
WARNING: Extremely old Swedish Pharmacist joke.
Customer: I vant a deodorant.
Pharmacist: Yes sir. Ball or Aerosol?
Customer: Neider, I vant it for my armpit.
[Shame on you! ~dbstealey, moderator]
“DJ (01:51:27) : Hansen published 3 scenarios (http://www.pnas.org/content/103/39/14288.full). Why show just one which is FAR too high in its CO2 emissions?”
You are wrong, DJ. Scenario A is not FAR too high in its CO2 emissions. In fact, emissions have exceeded the 1.5% growth used as an assumption. However, there are a couple of points that help your side. First, Hansen underestimated (or ignored) how increased CO2 levels would be mitigated by increased flora. Therefore, actual CO2 levels have been close to his scenario levels. Second, Scenario A (Business-as-Usual) had high levels of ozone-depleting gases, which definitely have exceeded reality. (Hansen emphasized Scenario A in his presentation even though the Montreal Protocol had already been implemented.)
DJ (01:51:27) :
“Given 2 of 3 comments have been gagged today – don’t anticipate this one to get through the fact filter.”
Oh don’t worry, none of what you said made it through the fact filter.
@DJ (01:51:27)
Have you read this post at CA? http://www.climateaudit.org/?p=3354
Scenario B, according to Hansen, is the most likely case, and Scenario C requires “draconian measures.”