Aerosol sat observations and climate models differ "by a factor of three to six"

Fig. 4. Shortwave indirect forcing from the true modeled PD and PI values of Nc (Top), from the PI Nc based on the regression between Nc and AOD (Middle), and from the PI Nc based on the regression between Nc and AI (Bottom). The satellite estimates of forcing include only the region from 60 °N to 60 °S. If the true model forcing is restricted to this region, the total forcing is −1.56 Wm−2.

From the University of Michigan, something I think Dr. Roy Spencer will be interested in, as it is yet another case where models and satellite observations differ significantly. See figure S1 at the end of this article. – Anthony

Aerosols affect climate more than satellite estimates predict

ANN ARBOR, Mich.—Aerosol particles, including soot and sulfur dioxide from burning fossil fuels, essentially mask the effects of greenhouse gases and are at the heart of the biggest uncertainty in climate change prediction. New research from the University of Michigan shows that satellite-based projections of aerosols’ effect on Earth’s climate significantly underestimate their impacts.

The findings will be published online the week of Aug. 1 in the early edition of the Proceedings of the National Academy of Sciences.

Aerosols are at the core of “cloud drops”—water particles suspended in air that coalesce to form precipitation. Increasing the number of aerosol particles causes an increase in the number of cloud drops, which results in brighter clouds that reflect more light and have a greater cooling effect on the planet.
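The brightening mechanism described above is usually called the Twomey effect. As a rough illustration (not taken from the paper), the standard susceptibility approximation says that, at fixed liquid water content, cloud albedo A responds to a relative change in droplet number Nc as ΔA ≈ A(1−A)/3 · Δln(Nc). A minimal sketch, with all numbers purely illustrative:

```python
# Illustrative Twomey-effect estimate: cloud albedo change from a change
# in droplet number at fixed liquid water content.
# dA ≈ A(1 - A)/3 * d(ln Nc)  (standard susceptibility approximation)
import math

def albedo_change(albedo, nc_old, nc_new):
    """Approximate cloud albedo change for a droplet-number change."""
    return albedo * (1.0 - albedo) / 3.0 * math.log(nc_new / nc_old)

# Example: a cloud with albedo 0.5 whose droplet number rises 30%
# (droplet counts per cm^3 are illustrative)
dA = albedo_change(0.5, 100.0, 130.0)
print(f"albedo increase: {dA:.4f}")  # ≈ 0.0219
```

Even a couple of percent more reflected sunlight over large ocean areas is a substantial forcing, which is why the droplet-number difference between the preindustrial and present-day atmosphere matters so much.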

As to the extent of their cooling effect, scientists offer different scenarios that would raise the global average surface temperature during the next century by anywhere from under 2 to over 3 degrees Celsius. That may not sound like a broad range, but it straddles the 2-degree tipping point beyond which scientists say the planet can expect more catastrophic climate change effects.

The satellite data that these findings poke holes in have been used to argue that climate models overestimate how hot the planet will get.

“The satellite estimates are way too small,” said Joyce Penner, the Ralph J. Cicerone Distinguished University Professor of Atmospheric Science. “There are things about the global model that should fit the satellite data but don’t, so I won’t argue that the models necessarily are correct. But we’ve explained why satellite estimates and the models are so different.”

Penner and her colleagues found faults in the techniques that satellite estimates use to find the difference between cloud drop concentrations today and before the Industrial Revolution.

“We found that using satellite data to try to infer how much radiation is reflected today compared to the amount reflected in the pollution-free pre-industrial atmosphere is very inaccurate,” Penner said. “If one uses the relationship between aerosol optical depth—essentially a measure of the thickness of the aerosols—and droplet number from satellites, then one can get the wrong answer by a factor of three to six.”
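The procedure Penner criticizes can be sketched in a few lines: fit a present-day regression of ln(Nc) against ln(AOD), then apply the fit to an assumed preindustrial AOD to back out a preindustrial droplet number. The toy example below uses synthetic data; every number in it is illustrative and not taken from the paper:

```python
# Toy version of the satellite method: regress present-day ln(Nc) on
# ln(AOD), then use the fit to infer a preindustrial droplet number.
import math
import random

random.seed(0)
# Synthetic "present-day" observations: Nc roughly follows a power law
# in AOD plus lognormal scatter (purely illustrative numbers).
aod_pd = [0.05 + 0.3 * random.random() for _ in range(200)]
nc_pd = [80.0 * (a / 0.1) ** 0.5 * math.exp(random.gauss(0, 0.2))
         for a in aod_pd]

# Ordinary least squares on (ln AOD, ln Nc)
xs = [math.log(a) for a in aod_pd]
ys = [math.log(n) for n in nc_pd]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

# Apply the present-day fit to an assumed preindustrial AOD
aod_pi = 0.05
nc_pi = math.exp(intercept + slope * math.log(aod_pi))
print(f"fitted slope {slope:.2f}, inferred preindustrial Nc {nc_pi:.1f} cm^-3")
```

The paper's point is that the real-world relationship between AOD (or AI) and droplet number is too loose and too regionally variable for this kind of extrapolation to recover the true preindustrial Nc, which is why the inferred forcing can be off by a factor of three to six.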

These findings are a step toward generating better models, and Penner said that will be the next phase of this research.

“If the large uncertainty in this forcing remains, then we will never reduce the range of projected changes in climate below the current range,” she said. “Our findings have shown that we need to be smarter. We simply cannot rely on data from satellites to tell us the effects of aerosols. I think we need to devise a strategy to use the models in conjunction with the satellite data to get the best answers.”

###

The paper is called “Satellite-methods underestimate indirect climate forcing by aerosols.” The research is funded by NASA.

PNAS Early Edition: http://www.pnas.org/content/early/recent

Joyce Penner: http://aoss.engin.umich.edu/people/penner

The University of Michigan College of Engineering is ranked among the top engineering schools in the country. At $180 million annually, its engineering research budget is one of the largest of any public university. Michigan Engineering is home to 11 academic departments, numerous research centers and expansive entrepreneurial programs. The College plays a leading role in the Michigan Memorial Phoenix Energy Institute and hosts the world-class Lurie Nanofabrication Facility. Michigan Engineering’s premier scholarship, international scale and multidisciplinary scope combine to create The Michigan Difference. Find out more at http://www.engin.umich.edu/.

===========================================================

You can read the full text of the paper here including the SI: http://www.pnas.org/content/early/2011/07/25/1018526108.full.pdf?with-ds=yes

This figure from the SI is quite interesting:

Fig. S1. The off-line aerosol indirect effect averaged over the 60 °S to 60 °N (GLB), Northern Hemisphere (NH), Southern Hemisphere (SH), land areas (LAD), ocean areas (OCN), and the regions defined in Table S1 using preindustrial values for Nc based on the regression of present-day values of aerosol optical depth (AOD), based on regression of present-day values of AI (Aerosol Index), and based on the true modeled preindustrial values. SPO, Southern Pacific Ocean; SAO, Southern Atlantic Ocean; SIO, Southern Indian Ocean; TPO, Tropical Pacific Ocean; TAO, Tropical Atlantic Ocean; TIO, Tropical Indian Ocean; NPO, North Pacific Ocean; NAO, North Atlantic Ocean; SAM, South America; AFR, Africa; OCE, Oceania; NAM, North America; EUR, Europe; ASI, Asia.

 

Darren Parker
August 2, 2011 12:14 am

The models will be the downfall – because they’ll either fall over completely or they’ll be continually adjusted until they actually do become useful but by then we’ll have realised (well THEY will) that the models were all too alarmist to start with and should never have been the basis for any type of policy or action

jorgekafkazar
August 2, 2011 12:43 am

Worse than we thought, robust, unprecedented, yatta-yatta.

August 2, 2011 12:47 am

I think the satellites do a better job than the models. Kaufmann, Kauppi, M.L. Mann and Stock (PNAS, July 2011) estimated sulfur dioxide emissions in the range 55-75 million tonnes p.a. over the period 1970-2000, and argue this sort of level has been enough since 1998 to explain the “hiatus” in global mean temperature, despite continuously increasing emissions of carbon dioxide at 3% p.a. since then, to the current level of over 30 billion tCO2 gross, of which around 14 GtCO2 remain aloft. There seems to be an order of magnitude difference; can it really be that 70 million tonnes of SO2 are enough to offset whatever warming potential there is in 14 billion tonnes of CO2?
I suspect the satellites have difficulty tracking the SO2 because there is so little of it being emitted outside the USA and China.
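The order-of-magnitude comparison above is easy to check from the figures quoted in the comment: 14 Gt of airborne CO2 against roughly 70 Mt of annual SO2 emissions is a mass ratio of about 200:1. (This is a back-of-envelope check only; it says nothing about the very different radiative effects per tonne of the two species.)

```python
# Back-of-envelope mass ratio from the figures quoted in the comment.
co2_aloft_t = 14e9    # tonnes of CO2 said to remain airborne
so2_emitted_t = 70e6  # tonnes of SO2 emitted per year (mid-range of 55-75)
ratio = co2_aloft_t / so2_emitted_t
print(f"CO2 : SO2 mass ratio = {ratio:.0f} : 1")  # → 200 : 1
```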

August 2, 2011 12:55 am

“based on the true modeled preindustrial values” Is that an oxymoron? Can a “true” value of something historical be obtained from a computer model? How are these values verified as “true” if there were no measurements of these in pre-industrial times? I am not implying the results are right or wrong, just curious about the “true”ness.

August 2, 2011 1:00 am

As I read this, some of the English is obscure. The opening paragraph fails to make clear whether it is the aerosols or the greenhouse gases whose effects are underestimated. The argument advanced seems to be that as the satellite data disagrees with the computer models, the data must be wrong. Does the old English phrase, “ass about face”, summarise the situation, or have I missed something?

Alan the Brit
August 2, 2011 1:18 am

At what point precisely, do academia modelists stop & look hard at the real world data, then back at their modelled output, & say, “Now, just hold on one cotton-pickin’ minute!” ? At which point do they put their hands up & admit they could be wrong. I suspect never whilst the financial tap is left well & truly open filling their trough!

TinyCO2
August 2, 2011 1:22 am

So will they be going back and recalculating the amount/effect of aerosols in historical temperatures? Will they be trying to count how many small volcanoes there were 100+ years ago? Or will they estimate them by using the climate models to tell them how many there should have been? Will they rethink the effect of desulphurisation in the 70s and 80s on the rapidly rising temperatures? If their assessment of how the climate worked in the noughties is wrong, then their assessment of how the climate worked in the past is wrong.
Those models that so ‘accurately’ matched the past are just sophisticated computer games where the ending is the only fixed point. It’s always worse than we thought.

anopheles
August 2, 2011 1:29 am

Aerosols, the epicycles of the climate world.

bruce
August 2, 2011 1:49 am

Special pleading to provide a case for further special pleading! Is this how science advances?

Another Ian
August 2, 2011 1:59 am

“Comment from: ianl8888 July 30th, 2011 at 9:42 am
@sean2829
“now its being cited as a source of light reflective aerosols that can explain cooling over the last 10 years.”
The base paper for this assertion is “Anthropogenic sulfur dioxide emissions:1850-2005″, Smith,S.J. et al, Atmos. Chem. Phys., 11, 1101-1116, 2011, wherein Smith et al promote a measurement of the sulphur content of coal burnt in Chinese power stations
Unhappily for this notion of SOx aerosols helping “cool” the planet, the guesstimate of sulphur content in raw/washed coals used in the paper is > 50% higher than laboratory-measured content. Since sulphur content in raw/washed coal is a make-or-break parameter for supply contracts, the widespread lab measurements are accurate and very carefully monitored
Yet another wishful Polyanna notion promoted to a gullible media (which are infested by scientific illiterates and mathematical innumerates)
And, as cementafriend’s post notes:
CSG (coal seam gas) = very, very BAD
LNG (liquified natural gas) = better, mo’ greenie friendly
yet both are methane CH4”
From http://jennifermarohasy.com/blog/2011/07/natural-gas-more-polluting-than-coal-only-according-to-the-ipcc-a-note-from-cementafriend/#comments

Katherine
August 2, 2011 2:05 am

“The satellite estimates are way too small,” said Joyce Penner, the Ralph J. Cicerone Distinguished University Professor of Atmospheric Science. “There are things about the global model that should fit the satellite data but don’t, so I won’t argue that the models necessarily are correct. But we’ve explained why satellite estimates and the models are so different.”
Penner and her colleagues found faults in the techniques that satellite estimates use to find the difference between cloud drop concentrations today and before the Industrial Revolution.
“We found that using satellite data to try to infer how much radiation is reflected today compared to the amount reflected in the pollution-free pre-industrial atmosphere is very inaccurate,” Penner said.

So the models are correct and there has to be something wrong with how the satellites estimate the impact? Did I get that right? And they’re saying that the atmosphere was pollution-free before the Industrial Revolution? Well, I suppose aerosols from volcanic eruptions, oceanic spray, continental dust, wildfires, and natural coal fires don’t count as pollution, per se.
“Our findings have shown that we need to be smarter. We simply cannot rely on data from satellites to tell us the effects of aerosols. I think we need to devise a strategy to use the models in conjunction with the satellite data to get the best answers.”
They don’t like what the satellites say, therefore they must be wrong. She says, “I won’t argue that the models necessarily are correct,” then proceeds to do just that. It is to laugh.

kadaka (KD Knoebel)
August 2, 2011 2:05 am

Svensmark’s GCR theory keeps advancing toward full confirmation, and Dr. Spencer has shown how a mere 1 to 2% decrease in global cloud cover can account for the “global warming signal.”
And now Climate Science™ will say: “No no no, we are still right, even though we were wrong! Our CO2->(C)AGW theory dependent on positive feedbacks is still sound! We merely figured in the aerosols, and how they alone affect clouds, wrong! The aerosols have merely masked the CO2 warming effect, it’s really still there! Send us more grant money so we can reprogram the models, and you’ll see our climate models match the observations with peer-reviewed unprecedented accuracy! Don’t believe those climate deniers, only we Climate Scientists™ have always known the only real truth!”
Does that sound like a fair summary?

Alexander K
August 2, 2011 2:21 am

The faith in models being superior to actual data is … nothing other than a religious faith.

Mac
August 2, 2011 2:34 am

‘Observations don’t match projections’ has become a much repeated phrase of late.
The models don’t work.

Peter Miller
August 2, 2011 2:35 am

So, on the one hand, soot, because it is dark, absorbs heat and, when it lands on glaciers, obviously helps melt them, which seems to be the case in the Himalayas, as opposed to ‘global warming’. How do you model the impact of burning cow dung and sticks as fuel for cooking in India?
On the other hand, sulphur dioxide helps in cloud formation, which reflects heat back out of the atmosphere.
So burning coal is both good and bad for our planet’s climate. I fail to see how this can possibly be modeled in the context of our planet’s climate, when you also have to deal with El Nino and La Nina cycles, UHI, long term climate cycles, volcanoes, plus general weather and climate ‘chaos’.
Perhaps in the aftermath of the ridiculous budget posturings on Capitol Hill over the past few days, someone will have the sense to say that pouring millions into this type of research is a complete waste of money.
Even if you can model it accurately, which I doubt, will that stop the Chinese, Indians etc. burning coal? The answer is No – perhaps in the countries with goofball energy policies like Britain, Denmark and Germany it might have an effect.

August 2, 2011 2:42 am

“That may not sound like a broad range, but it straddles the 2-degree tipping point beyond which scientists say the planet can expect more catastrophic climate change effects” – what “2-degree tipping point”, and which scientists claim that? The 2 degrees was simply an arbitrary target for policy makers suggested in 2009.
This paper sounds rather like another brick in the wall of “explain the decline” which is being built now that “ignore the decline” failed to work. Expect more of the same “big on words” but “thin on content” new wallpaper for the crumbling edifice.

KnR
August 2, 2011 2:49 am

First rule of climate science: if the values of the models and reality differ, it’s reality which is in error.

August 2, 2011 2:51 am

“We found that using satellite data to try to infer how much radiation is reflected today compared to the amount reflected in the pollution-free pre-industrial atmosphere is very inaccurate,” Penner said.
A pollution-free pre-industrial atmosphere? Never heard of natural forest fires? Never heard of natural emissions of volatile organics (like terpenes) from trees, leading to secondary organic aerosols (SOA), causing the haze of the “blue” mountains? These are measured in the free troposphere in a ratio of 2:1 to larger than 10:1 compared to SOx, between 0.5 and 5.5 km altitude. See Heald et al.:
http://acmg.seas.harvard.edu/publications/heald_2005.pdf

David A
August 2, 2011 2:57 am

kadaka (KD Knoebel) says:
August 2, 2011 at 2:05 am
Exactly!!!

Antonia
August 2, 2011 3:39 am

Just last year, nonagenarian James Lovelock, green guru and father of the Gaia hypothesis, cheerfully admitted that climate scientists might all be wrong. That clouds and aerosols might be driving the climate. That the physics aren’t settled yet. And that the best thing to do was to enjoy life. I especially enjoyed his summation of the Stern Review: “If you mix up some science that’s incomplete with some economics which is almost as bad, you’re going to get an absolutely dreadful progeny.” Priceless. Now why does Australian economist Ross Garnaut spring to mind?

Roger Carr
August 2, 2011 3:44 am

Peter Miller says: (August 2, 2011 at 2:35 am)
Even if you can model it accurately, which I doubt, will that stop the Chinese, Indians etc. burning coal? The answer is No – perhaps in the countries with goofball energy policies like Britain, Denmark and Germany it might have an effect.
With regret, Peter, may I ask you to add Australia to the goofball list?

Steve Keohane
August 2, 2011 3:49 am

Apparently, overestimating the effects of CO2 leads to overestimating the amount of aerosols to counteract the first fantasy.

JaneHM
August 2, 2011 4:10 am

The problem with this sort of ‘aerosol’ paper is that it considers only the negative feedback effect which aerosols have on lower troposphere temperature, not both the positive and negative feedback effects.

August 2, 2011 4:14 am

Please, officer, don’t arrest me! I’m not really speeding. I know you clocked me at 90, but my REAL speed is 45. I know this because I’ve calculated it from digital simulations. My REAL speed of 45 was masked by the rally stripes on my car’s hood, which fooled you into thinking I was going fast.
Wouldn’t work with a cop, shouldn’t work with scientists. Unfortunately scientists and Experts are much dumber than cops.

August 2, 2011 4:15 am

If the satellite estimates are way too small, then this does not bode well for those who insist that the models hindcasted rather well. — John M Reynolds

John Law
August 2, 2011 4:44 am

Another Ian says:
August 2, 2011 at 1:59 am
And, as cementafriend’s post notes:
“CSG (coal seam gas) = very, very BAD
LNG (liquified natural gas) = better, mo’ greenie friendly
yet both are methane CH4″
Simple really
Good methane (LNG): fairly scarce, controlled by big oil, and linked to the oil price.
Bad methane: shale and coal bed methane, plentiful, spread throughout the world and more amenable to market forces.

Dave Springer
August 2, 2011 4:52 am

So the aerosol modeling is all eff’d up, they go back and forth at their convenience on whether more clouds cool the surface or warm it, then tell us the science is settled. Someone get a rope…

steven
August 2, 2011 4:52 am

It will be interesting to see how much of the warming from 1975 – 2000 will have to be accredited to decreasing aerosols using their model.

AndyG55
August 2, 2011 4:53 am

““now its being cited as a source of light reflective aerosols that can explain cooling over the last 10 years.””
So suddenly 10 years ago, even though technology was probably gradually reducing the ratio of aerosols to CO2, aerosols started having a cooling effect… but aerosols had no effect before that, when there was some apparent warming ???
(even though that warming was probably mostly from removing colder temp records).
Something wrong with their logic there, I reckon.

Dave Springer
August 2, 2011 4:55 am

Climate boffins need to learn the first rule of holes: when you’ve dug yourself into a hole the first thing to do is stop digging.

bikermailman
August 2, 2011 4:58 am

This would be H I larious if it weren’t so serious. One word for you (one acronym anyway), GIGO.

Cliff Maurer
August 2, 2011 5:08 am

I thought field studies from 2004 showed that seasonal atmospheric brown clouds caused warming of up to half the amount attributed to carbon dioxide, at least on the synoptic scale. So aerosols cause tropospheric warming in addition to stratospheric effects that offset warming. Which effect is the more significant?

Mike Davis
August 2, 2011 5:10 am

The University of Michigan just took a hit on their reputation with the publication of this research as did NASA for funding it!

MikeN
August 2, 2011 5:15 am

If they are understating the impact of aerosols currently, which are a cooling factor, that means they are providing more cooling than previously thought, and thus blocking more warming than previously thought, meaning future warming will be higher.

Steve Fitzpatrick
August 2, 2011 5:18 am

Seems like nothing but wild-eyed speculation. The paper is just another arm-wave to explain why the projected warming of 0.2C per decade is not happening. The paper seems to admit that the models don’t have it right, but still (surprise!) claims very high climate sensitivity by postulating even greater albedo increases due to sulfate emissions than the modelers have used. So long as people rely on speculation about aerosol effects instead of data on aerosol effects, there will be little progress confirming climate sensitivity. A good start would be considering Occam’s razor: the simplest explanation is that the climate sensitivity is just much lower than the models say… like about 1.4 – 1.6C per doubling….. no arm waves about extreme aerosol effects are then needed.
I believe that divergence of reality from warming projections will eventually drag climate science (kicking and screaming) to accept objective reality instead of the very-high-sensitivity paradigm that has been dominant for 25+ years… this paper seems to be just another in that long series of kicks and screams. I expect many more like it. As Thomas Kuhn noted, defenders of an existing paradigm are extremely resistant to contrary data; they have too much personal stake in the paradigm to be convinced by contrary information, no matter how solid that information may be. But data always trumps the dominant paradigm in the end, so the high-climate-sensitivity paradigm will ultimately fall. The only important question is if the paradigm leads to wasteful and counterproductive public policy before it is replaced.

Peter Plail
August 2, 2011 5:18 am

This reliance on and belief in models seems to have got completely out of hand.
My understanding is that models are attempts to simulate real world processes so that, eventually, accurate predictions may be made. The process of developing complex models I would expect to be a long, tedious process, with continual refinements based on observations of real world situations followed by testing of each refinement. Only after showing that a refinement actually improves the model would it be “hard wired” into the model, otherwise it should be junked.
Given the chaotic and immensely complex nature of the earth’s climate system, I am astounded that climate modellers have the arrogance to believe that they have got it all right so quickly.
If I were a modeller (I’m not, the only models I make fly, although most eventually crash), I would be pleased with the opportunity to refine my model as new information comes to light, as with the study here, which is why I find it refreshing that Penner said “These findings are a step toward generating better models”.
But this raises the question – at what point should we have confidence that the models are right? I would suggest there is a high risk in using current models to drive economic and environmental policies given the discrepancies that seem to be highlighted almost daily.

Sandy Rham
August 2, 2011 5:19 am

An ‘academic’ who says that measurements should be discounted in favour of theory is a disgrace to all scientific academics. For honest academics not to comment on this foolery is not “academic courtesy”, it is cowardice… and it is costing the rest of academia their previously honourable reputation.

Slabadang
August 2, 2011 5:27 am

Well they present a postmodern method!
“If the large uncertainty in this forcing remains, then we will never reduce the range of projected changes in climate below the current range,” she said. “Our findings have shown that we need to be smarter. We simply cannot rely on data from satellites to tell us the effects of aerosols. I think we need to devise a strategy to use the models in conjunction with the satellite data to get the best answers”
If theory doesn’t fit measurements, fudge them with aerosols to fit. That’s what they say!
Be aware that they are using two great fudge factors.
1. The history of preindustrial aerosols (uncertainty is enormous).
2. The feedback of the aerosols (uncertainty is enormous).
Those two in combination with models can give you any answer you wish. Now they want to use these to make their CO2 hypothesis and “models” fit, and explain why measurement is wrong!!!!!!!
Welcome to the postmodern pseudo-science era! They went down to the deep oceans trying to find the missing heat; that didn’t work, so now they are using the biggest fudge factor in the models instead. Well, that’s how “robust” the CAGW pseudo-science is. Whatever you do, don’t question the dogma or the models.
They really, really are in deep shit because this study has some very easily detectable contradictions to answer for. They have gone to the master level of fudge management, and they have put themselves in a position where they have fudge-managed so many strings that they made a big tangle of them all. The number of contradictions is increasing like an avalanche.
What’s more interesting is real science based on measurement and the scientific method.
They will not be able to answer for the measured difference from the models shown by Spencer.
They haven’t taken the Svensmark theory into account. They haven’t understood the feedbacks from clouds, and they have almost criminally underestimated and neglected the sun’s influence.

A. C. Osborn
August 2, 2011 5:31 am

How strange that they go back to the pre-industrial age; why not go back to the 1930s, one of the hottest periods ever? Wasn’t most of the western world burning COAL in millions of open fires in that period? Don’t they remember SMOG?

August 2, 2011 5:38 am

Steve Keohane says:
August 2, 2011 at 3:49 am
Apparently, overestimating the effects of CO2 leads to overestimating the amount of aerosols to counteract the first fantasy.
===========================================================
Yep, nailed it. They all agreed on some virtual reality, believed it was reality and now have to adjust other objects to fit their view of reality. It is fascinating. My hope is we can preserve some of them for observation and study at some point in the future.

JohnWho
August 2, 2011 5:47 am

Who you gonna believe:
the models
or your lying eyes?

Gary Pearse
August 2, 2011 5:55 am

Could someone write a brief, non AGW polluted paragraph without double negatives to simply say what was found? I find this report essentially unintelligible. Please say:
1) what was believed before, regarding aerosols’ effect
2) what was actually found and its implication.

Rob Potter
August 2, 2011 6:03 am

A fine example of the legal approach to science – if there is inconvenient data which you can’t ignore (i.e. the other side have already put forward) then try to obscure it with a complex scenario bringing in new theories to cast doubt on the data. About the same as casting doubt on eye witness testimony because the witness once had an eye infection.
“The data doesn’t fit the models, therefore the data must be wrong” – priceless!

August 2, 2011 6:06 am

No one is an expert, but everyone is playing “expert theoretician” with their favorite ideas, and refusing to heed the definitive evidence that makes a mockery of all of their theories. The Venus/Earth comparison I have been trying to get everyone to face and accept as fact, shows that the temperature-vs.-pressure profiles of Venus and Earth, when their different distances from the Sun (and only that difference) are taken into account, are essentially the same over the range of Earth tropospheric pressures (1,000 millibars to 200 mb). Not only does this mean there is no increase in atmospheric temperature with an increase in atmospheric CO2 (Earth has just 0.04%, to Venus’s 96.5%), i.e., no greenhouse effect, but the only effect of Venus’s planet-wide cloud cover is a slight cooling, of just a few degrees, and ONLY WITHIN the clouds themselves, not of the rest of the atmosphere outside of the clouds. So those thick clouds covering Venus, while reflecting much of the visible light from the Sun, do not affect the overall temperature profile, or the “climate” of the planet.
My analysis is extremely simple, but no one seems to be able to see its unavoidable consequences for the pretensions of every theory of atmospheric warming today. Scientists have simply been fundamentally miseducated about the real thermodynamics of planetary atmospheres, and they are miseducating the world with their ignorant, and incompetent, theories.

Bystander
August 2, 2011 6:10 am

Somehow missed from the abstract – “Nevertheless, the AIE using ln(Nc) versus ln(AI) may be substantially incorrect on a regional basis and may underestimate or overestimate the global average forcing by 25 to 35%.”

John
August 2, 2011 6:19 am

To Tim Curtin (12:47 AM):
China emits a great deal of SO2 now. But the US and Western Europe emit comparable (smaller) amounts. Russia and Eastern Europe also emit enough SO2 to matter, but they emit far, far less than they did when the USSR was in business. When the USSR collapsed, within 2 years there was a very large drop in SO2 emissions from Eastern Europe and the former USSR, because all the extremely inefficient and thus uneconomical enterprises went out of business.
The peak emissions of SO2 in the US was around 1974, at around 27 million tons per year. After the first oil embargo, we lost a lot of industry as well, and that has kept up ever since. In addition, the Clean Air Act amendments of 1977 and 1990 required utilities to reduce their SO2 emissions, which are now down to about 7.5 million tons per year, from a peak of over 18 million tons per year.
Western Europe had the same concern about acid rain as in the US, and has cut emissions by about the same amounts and now emits about the same as the US, give or take.
China is head and shoulders above everyone else, and the US is essentially in a three way tie for second with Western Europe, Eastern Europe and Russia, and up and coming India.

Katherine
August 2, 2011 6:32 am

Ferdinand Engelbeen says:
A pollution-free pre-industrial atmosphere? Never heard of natural forest fires? Never heard of natural emissions of volatile organics (like terpenes) from trees, leading to secondary organic aerosols (SOA), causing the haze of the “blue” mountains? These are measured in the free troposphere in a ratio of 2:1 to larger than 10:1 compared to SOx, between 0.5 and 5.5 km altitude.
Maybe it’s pollution only if it’s man-made? And what do they consider “pre-industrial”? In the 1300s, pollution from coal burning was already a concern in medieval London. Supposedly by 1600, London’s coal consumption was over 50,000 tons a year. Earlier, there’s atmospheric Pb pollution from pre-medieval (Roman) mining that blew all the way to western Ireland. Then there’s copper smelting pollution during Roman and medieval times, “especially in Europe and China,” recorded in Greenland ice. What pollution-free pre-industrial atmosphere?

Don K
August 2, 2011 6:36 am

anopheles says:
August 2, 2011 at 1:29 am
Aerosols, the epicycles of the climate world.
=====
A slur on epicycles, which were really a quite clever way to synthesize an inexplicable waveform using pre-seventeenth century math — which lacked almost all the tools we take for granted today. You could navigate using headings predicted from cycles and epicycles and end up where you wanted to go. On the other hand, using climate models in their current state for practical purposes seems likely to be something those without a predilection for adventure might do well to avoid.

Alexander Duranko
August 2, 2011 6:40 am

The claim that [thicker] clouds with smaller droplets ‘reflect’ more solar radiation is one of the key errors in climate science. You can easily prove it, because when cloud droplets coarsen prior to rain, albedo increases – the opposite of what the aerosol optical physics in the models predicts as you reduce optical depth.
Ask any glider pilot and he/she’ll confirm this second optical effect can cause temporary blindness. Sagan and Chandrasekhar missed it, so their maths is wrong.
The other error in the models is that ‘back radiation’ is (1) the artefact of a mathematical mistake made in 1922 and (2) is confused with Prevost exchange radiation, which does no work.
The fact that oxymoronic ‘climate science’ believes this junk science shows how bad is the level of basic physics in the subject!
So, this paper is also bunkum.

John
August 2, 2011 6:42 am

To Steven (4:52 AM):
Yes, it is indeed a good point that if increasing aerosols disguise alleged warming, then decreasing aerosols should increase it. There was a huge drop in worldwide SO2 emissions in the 1989-1991 period, with the collapse of the former USSR. Prior to that time, worldwide emissions had been slowly increasing. There are some articles in the press from a few years back, with titles such as “global brightening” and “global dimming” which do suggest that the warming post-Pinatubo may have had a lot to do with the reduction in sulfate formation from SO2 reductions, and, similarly, that the drop in warming, or steady temps, from after WW II through the mid 1970s had to do with the industrial ramp up in most developed countries after WW II.
I can’t judge whether they are right or not. But they back up your comment. Here is the Wikipedia link on global dimming (no, I don’t trust Wikipedia as a source of science, but this link shows that the subject has been discussed for a while):
http://en.wikipedia.org/wiki/Global_dimming
And here is a member of the warmist team arguing that Christopher Monckton’s suggestion that global brightening post 1990 was the cause for warming, not greenhouse gases, is incorrect:
http://www.skepticalscience.com/Could-global-brightening-be-causing-global-warming.html
In other words, the warmist camp, when it is in their interest, argues against the sulfate effect. In this case, if the sulfate effect is true and important, then the rapid warming of the 1990s was less to do with greenhouse gases and more to do with sulfate reductions.
Can’t have it both ways. We need science, uncorrupted science. And maybe the world wide suspicion of the games the hockey team, and the Climategate folks, and their acolytes such as Joe Romm and Gavin Schmidt have been playing will slow things down enough so that real scientists can be heard — whatever their ultimate findings may be.

Richard S Courtney
August 2, 2011 6:46 am

Friends:
The article quotes Penner saying:
“The satellite estimates are way too small,” said Joyce Penner, the Ralph J. Cicerone Distinguished University Professor of Atmospheric Science. “There are things about the global model that should fit the satellite data but don’t, so I won’t argue that the models necessarily are correct. But we’ve explained why satellite estimates and the models are so different.”
Hmmm. Let us consider what we know about how the models incorporate climate sensitivity and aerosol effects.
None of the models – not one of them – could match the change in mean global temperature over the past century if it did not utilise a unique value of assumed cooling from aerosols. So, inputting actual values of the cooling effect (such as the determination by Penner et al.) would make every climate model provide a mismatch of the global warming it hindcasts and the observed global warming for the twentieth century.
This mismatch would occur because all the global climate models and energy balance models are known to provide indications which are based on
1.
the assumed degree of forcings resulting from human activity that produce warming
and
2.
the assumed degree of anthropogenic aerosol cooling input to each model as a ‘fiddle factor’ to obtain agreement between past average global temperature and the model’s indications of average global temperature.
More than a decade ago I published a peer-reviewed paper that showed the UK’s Hadley Centre general circulation model (GCM) could not model climate and only obtained agreement between past average global temperature and the model’s indications of average global temperature by forcing the agreement with an input of assumed anthropogenic aerosol cooling.
And my paper demonstrated that the assumption of aerosol effects being responsible for the model’s failure was incorrect.
(ref. Courtney RS An assessment of validation experiments conducted on computer models of global climate using the general circulation model of the UK’s Hadley Centre Energy & Environment, Volume 10, Number 5, pp. 491-502, September 1999).
More recently, in 2007, Kiehl published a paper that assessed 9 GCMs and two energy balance models.
(ref. Kiehl JT, Twentieth century climate model response and climate sensitivity. GRL vol. 34, L22710, doi:10.1029/2007GL031383, 2007).
Kiehl found the same as my paper except that each model he assessed used a different aerosol ‘fix’ from every other model.
He says in his paper:
”One curious aspect of this result is that it is also well known [Houghton et al., 2001] that the same models that agree in simulating the anomaly in surface air temperature differ significantly in their predicted climate sensitivity. The cited range in climate sensitivity from a wide collection of models is usually 1.5 to 4.5 deg C for a doubling of CO2, where most global climate models used for climate change studies vary by at least a factor of two in equilibrium sensitivity.
The question is: if climate models differ by a factor of 2 to 3 in their climate sensitivity, how can they all simulate the global temperature record with a reasonable degree of accuracy. Kerr [2007] and S. E. Schwartz et al. (Quantifying climate change–too rosy a picture?, available at http://www.nature.com/reports/climatechange, 2007) recently pointed out the importance of understanding the answer to this question. Indeed, Kerr [2007] referred to the present work and the current paper provides the ‘‘widely circulated analysis’’ referred to by Kerr [2007]. This report investigates the most probable explanation for such an agreement. It uses published results from a wide variety of model simulations to understand this apparent paradox between model climate responses for the 20th century, but diverse climate model sensitivity.”
And, importantly, Kiehl’s paper says:
”These results explain to a large degree why models with such diverse climate sensitivities can all simulate the global anomaly in surface temperature. The magnitude of applied anthropogenic total forcing compensates for the model sensitivity.”
And the “magnitude of applied anthropogenic total forcing” is fixed in each model by the input value of aerosol forcing.
Thanks to Bill Illis, Kiehl’s Figure 2 can be seen at
http://img36.imageshack.us/img36/8167/kiehl2007figure2.png
Please note that the Figure is for 9 GCMs and 2 energy balance models, and its title is:
”Figure 2. Total anthropogenic forcing (Wm2) versus aerosol forcing (Wm2) from nine fully coupled climate models and two energy balance models used to simulate the 20th century.”
It shows that
(a) each model uses a different value for “Total anthropogenic forcing” that is in the range 0.80 W/m^2 to 2.02 W/m^2
but
(b) each model is forced to agree with the rate of past warming by using a different value for “Aerosol forcing” that is in the range -1.42 W/m^2 to -0.60 W/m^2.
In other words, the models use values of “Total anthropogenic forcing” that differ by a factor of more than 2.5, and they are ‘adjusted’ by using values of assumed “Aerosol forcing” that differ by a factor of 2.4.
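Kiehl’s compensation mechanism can be sketched with a toy energy-balance calculation. All numbers below are hypothetical round values (not taken from Kiehl’s or Penner’s papers); the point is only to show why a model with higher sensitivity must assume a more negative aerosol forcing to reproduce the same observed warming:

```python
# Toy sketch of Kiehl's compensation (hypothetical round numbers, not
# values from any actual model): a fixed observed warming forces an
# anti-correlation between sensitivity and assumed aerosol forcing.

DT_OBS = 0.8   # observed 20th-century warming, K (approximate)
F_GHG = 2.5    # assumed greenhouse-gas forcing, W/m^2 (hypothetical)

# Sensitivity parameter in K per (W/m^2); a factor-of-2.4 spread,
# mirroring the spread among the models in Kiehl's Figure 2.
for lam in (0.5, 0.8, 1.2):
    f_net = DT_OBS / lam        # net forcing needed to reproduce DT_OBS
    f_aerosol = f_net - F_GHG   # aerosol forcing the model must assume
    print(f"sensitivity {lam:.1f}: net {f_net:.2f}, aerosol {f_aerosol:.2f} W/m^2")
```

Higher sensitivity requires a more negative aerosol term (here -0.90, -1.50 and -1.83 W/m^2), which is exactly the ‘fiddle factor’ pattern described above.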
In summation, all the model projections of future climate change are blown out of the water by the findings of Penner et al.
Richard

Matt
August 2, 2011 6:51 am

Bizarre – the University of Michigan is receiving research grants for that? The result is poor – the conclusions funny at best. If reality says things are different from what was thought before and predicted in “the models” – sad for reality, because the models are definitively right – certainly with a very high “confidence level”.
Nice phrase:
Steve Keohane says:
August 2, 2011 at 3:49 am
Apparently, overestimating the effects of CO2 leads to overestimating the amount of aerosols to counteract the first fantasy.
Best
Matt from Chile

Bill Illis
August 2, 2011 6:55 am

This paper is not much different than most climate science papers. It is hard to determine what they actually did or where the data came from but it concludes 3.0C per doubling is probably correct.
Having said that, the results are not much different than what the IPCC and the climate models are using.
Note this paper is about the first aerosol indirect effect, or the aerosols’ impact on the reflectance of sunlight by clouds. There is also a second effect regarding the lifetime of clouds, but this is usually treated as an add-on to the first effect. These impacts come from both sulfate aerosols and soot aerosols (although soot can also have a positive impact).
The first aerosol indirect effect is typically about 40% of the total aerosol impact.
The average IPCC 1st Aerosol-Indirect-Effect is -0.5 W/m2 (GISS uses -0.7 W/m2) and this paper used six models/methods to come up with a range of -0.27 W/m2 to -1.69 W/m2. Lots of other papers have come up with a range like this so there is nothing ground-breaking here.
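As a quick check on the spread Bill quotes (using only the numbers in this comment), the six-method range works out to roughly the “factor of three to six” in the headline:

```python
# Range of the 1st aerosol indirect effect quoted above, W/m^2
# (negative = cooling); the other values are also from this comment.
low, high = -0.27, -1.69   # weakest and strongest of the six estimates
ipcc_avg = -0.5            # average IPCC value
giss = -0.7                # GISS value

spread = high / low        # ratio of strongest to weakest magnitude
print(f"spread across methods: factor of {spread:.1f}")  # ~6.3
print(abs(high) > abs(giss) > abs(ipcc_avg) > abs(low))  # both sit inside the range
```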
Since there is so much uncertainty and so many direct and indirect impacts, and ones that are probably not additive, I’m assuming this paper will have no impact, direct or indirect.

Steve from Rockwood
August 2, 2011 7:13 am

First we were reaching the CO2 tipping point. Now we are reaching the aerosol tipping point. What happens when all these tipping points collide?

John
August 2, 2011 7:22 am

To Richard Courtney (6:46 AM):
Bravo! A must read for everyone on this thread.

timetochooseagain
August 2, 2011 7:33 am

Sounds to me like “blah blah blah we don’t know” again when it comes to aerosols. And yet this largely unknown factor is crucial to estimating the “forcing” determining recent climate. Without a decent estimate one can claim any sensitivity is consistent with the data. Clearly we know less than we need to about aerosols, and this paper is trying to say something about this factor but it is a little muddled, since it flip-flops, I get the impression, between whether to believe observation based estimates or model estimates. It’s not clear why one would prefer modeling estimates to observational ones, unless your observations are really bad and your models really good. But hey, this is climate science, so par for the course.

Hal
August 2, 2011 7:42 am

After Climategate, after the de-pantsing of the IPCC, and after the “consensus” went up in flames, how do all of these AGW jokesters and hucksters keep getting research funds?

ferd berple
August 2, 2011 7:42 am

“We simply cannot rely on data from satellites to tell us the effects of aerosols. ”
That really says it all. Climate Science is turning the scientific method on its head. Since the models and satellites don’t agree, the models (theory) are right and the satellites (observations) are wrong.
Scientific method – when theory doesn’t match observation, the theory is wrong.
Climate science – when theory doesn’t match observation, the observation is wrong

ferd berple
August 2, 2011 7:54 am

Richard S Courtney says:
August 2, 2011 at 6:46 am
”One curious aspect of this result is that it is also well known [Houghton et al., 2001] that the same models that agree in simulating the anomaly in surface air temperature differ significantly in their predicted climate sensitivity. .. The question is: if climate models differ by a factor of 2 to 3 in their climate sensitivity, how can they all simulate the global temperature record with a reasonable degree of accuracy. Kerr [2007] and S. E. Schwartz et al.”
An excellent point. How, if the models have different estimates of climate sensitivity, can the model all simulate past global temperatures?
The answer is simple. The models are using “parametrized curve fitting” to fit the historical data. What do we know about parametrized curve fitting? That it has no predictive value beyond chance. None whatsoever. One of the parameter sets may in fact be correct, and this will have predictive value; however, there is no way to know in advance which set of parameters is correct.
It is also possible that none of the parameter sets is correct, in which case none of the models will have predictive value. A parameter set may appear to track for a short period of time, then go wildly off course. There is no way to know in advance.
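ferd’s point about parametrized curve fitting can be shown with a toy example (nothing to do with any real climate model): two different ‘parameter sets’ that reproduce the same historical sample points exactly, yet disagree once you step outside them:

```python
import math

def model_a(t):  # parameter set A: a plain linear trend
    return t

def model_b(t):  # parameter set B: the same trend plus a tuned wiggle
    return t + math.sin(math.pi * t)

# Both "hindcast" the historical record (sampled at t = 0..9) perfectly...
assert all(abs(model_a(t) - model_b(t)) < 1e-9 for t in range(10))

# ...but their forecasts off the sample grid differ:
print(model_a(10.5), model_b(10.5))  # 10.5 vs ~11.5
```

Agreement on the historical sample tells you nothing about which parameter set, if either, is right off-sample.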

Latitude
August 2, 2011 8:23 am

Aren’t computer programmers wonderful……..
…and they still can’t model tomorrow’s forecast

Steven Hales
August 2, 2011 8:24 am

Sahara Sand

Mark
August 2, 2011 8:51 am

Mac says (2:34 am) ‘Observations don’t match projections’ has becoming a much repeated phrase of late. The models don’t work.”
Having lived through a couple of product recalls (class-two medical devices) where the models said the response (accuracy) of a device in the field (attribute observations) would be within the prescribed bias (errors), only to have the number/frequency of observations be 3 to 10 times the expected quantity, with poor accuracy, has led me (and the FDA) to the conclusion that models are ONLY an attempt to estimate causal relationships, and they are useful only to the extent that they match real-world results. If they don’t match the real-world observations, the question becomes: are they good enough for what you are trying to explain/sell? There are corrective feedback mechanisms in the real world when models fail to explain or predict observations: product recalls, big fines from the FDA, no one buying your models of the weather patterns for next year, etc. That corrective feedback loop seems to be missing in a lot of climate-science models.
In the real world the gold standard is ALWAYS the observations and if they fall within the predicted output from a model great- if not the regulating bodies in the pharmaceutical, medical device, automotive, etc world let you know if you have botched the primacy of real world results vs a theoretical models estimate.

Richard S Courtney
August 2, 2011 8:52 am

ferd berple:
At August 2, 2011 at 7:54 am you comment on my above post (at August 2, 2011 at 6:46 am ) by saying:
“The models are using “parametrized curve fitting” to fit the historical data. What do we know about parametrized curve fitting? That it has no predictive value beyond chance. None whatsoever. One of the parameter sets may in fact be correct, and this will have predictive value, however there is no way to know in advance which set of parameters is correct.”
But it is much, much worse than you say.
Firstly, we know for a certain fact that all except at most one of the models are wrong, because each model uses a different set of parametrisations. And even if one of them were right, we have no way of knowing which one it is.
Secondly, as my above post tried to explain, the paper by Penner et al. proves that every model is using a wrong set of parametrisations, so the paper provides direct empirical evidence that all the models are wrong.
Importantly, as I tried to explain, if you input the empirical data on aerosols obtained by Penner et al. then none of the models – not one of them – could emulate global temperature change over the past century.
A model that cannot hindcast the past cannot forecast the future.
Richard

Tim Huls
August 2, 2011 9:14 am

Is SO2 the new pixie dust? Sprinkle enough on the models and they can fly. Just have good (non-denier) thoughts! Come on everyone, do you believe?

TomRude
August 2, 2011 9:27 am

Aerosols…
The latest fad in global warming ‘science’!

Brian H
August 2, 2011 9:58 am

Richard SC;
Yes, a dog’s breakfast of offsetting fudges. You pays your money and you makes your choice — but with the caveat that NONE of them can possibly be correct, given the wild divergences each experiences from its own trend lines and observations.
We fools and our money were soon parted by these charlatans.

Richard S Courtney
August 2, 2011 10:02 am

TomRude:
Especially in the light of your name, your post at August 2, 2011 at 9:27 am reminded me of the old joke; i.e.
A man walked into a pharmacist’s shop and asked for some deodorant.
The assistant asked, “Aerosol or ball?”
And the man replied, “No, it’s for under my armpits”.
So, are lamestream climatologists aerosols?
Richard

Theo Goodwin
August 2, 2011 10:30 am

Peter Plail says:
August 2, 2011 at 5:18 am
“My understanding is that models are attempts to simulate real world processes so that, eventually, accurate predictions may be made. The process of developing complex models I would expect to be a long, tedious process, with continual refinements based on observations of real world situations followed by testing of each refinement. Only after showing that a refinement actually improves the model would it be “hard wired” into the model, otherwise it should be junked.”
You are being far too generous. What you describe is the use of hypotheses in physical science to build ever more complete and better confirmed theories. The theories are “about” the physical world. Models are not “about” anything; that is, they have no cognitive content whatsoever. Each model run produces a longish string of numbers that is interpreted as representing a “state of the planet” or some such thing. If the numbers disagree with reality then the modelers tweak the entire model as they are unable to identify particular numbers with particular pieces of the computer code.
As the final absurdity, Gaia Modelers model only heat exchanges caused by radiation. They do not model other natural processes such as La Nina but treat them as emergent properties.

Theo Goodwin
August 2, 2011 10:32 am

Rob Potter says:
August 2, 2011 at 6:03 am
Yep, that is the method. And it is so juvenile! How can they bring themselves to write this stuff? How can the funding agencies keep dishing out for this stuff?

Theo Goodwin
August 2, 2011 10:41 am

Richard S Courtney says:
August 2, 2011 at 6:46 am
Many thanks to you. Your post is a must read.

Don E
August 2, 2011 11:03 am

Just as I suspected all along; burning fossil fuel causes global cooling after it causes global warming.

Gary Hladik
August 2, 2011 11:12 am

Richard S Courtney says (August 2, 2011 at 8:52 am): “Importantly, as I tried to explain, if you input the empirical data on aerosols obtained by Penner et al. then none of the models – not one of them – could emulate global temperature change over the past century.”
Or, to paraphrase Samuel Johnson, “‘Aerosolism’ is the last refuge of a (climate) scoundrel!” 🙂

August 2, 2011 11:57 am

Why am I not surprised by this? Real-world data conflicts with the models, so it must be the data that is wrong.

Matt G
August 2, 2011 12:57 pm

If satellites don’t see this supposed 3-to-6-times-greater effect from aerosols, then why don’t the surface-based or ocean-based data sets see it either? Maybe because none of the observed data shows it, because it doesn’t really exist? If the effect is 3 to 6 times greater, why haven’t volcanoes had a correspondingly bigger effect on any ocean, surface or satellite data set? Why did aerosols do so little to prevent warming of the planet during the 1980s and 1990s?
How did this paper even get published when nothing can be shown to back it up as evidence, while a recent publication using satellite data that actually was supported by observed evidence got so much stick, with little or no actual scientific reasoning and plain pettiness? If this paper had been given an equally high standard of review, it would never have been published. The standard of science, and the bias, in climate is poor, and no doubt this is done on purpose with the intent of stagnation.
Obviously the missing heat must be in the ocean too, combined with that from CO2. (sarc/off) The correct conclusion from this paper is that the model findings differ from satellite observations by 3 to 6 times (the ocean- and surface-based data sets are not too different). If these were real scientists, they would be looking into why that is the case, not assuming the observed data is that wrong.

Matt G
August 2, 2011 1:15 pm

Richard S Courtney says:
August 2, 2011 at 6:46 am
Exactly – the weighting for aerosol cooling during the 20th century was way too high because they guessed that was the cause. Now that there is a longer, more reliable data set with more accurate satellite measurements, it confirms the earlier levels were greatly overestimated: carrying the same factor through the rest of the period simply doesn’t fit, by a very large error.

D. J. Hawkins
August 2, 2011 2:59 pm

MikeN says:
August 2, 2011 at 5:15 am
If they are understating the impact of aerosols currently, which are a cooling factor, that means they are providing more cooling than previously thought, and thus blocking more warming than previously thought, meaning future warming will be higher.

Mmmm, not quite. The thesis presented by the paper is that the satellite-based forcings, which are negative, are a factor of three to six too small in magnitude; the models are using the larger forcings NOW. If it were to turn out that the satellites are correct, the models would, all other things being equal, wind up showing less heating. To preserve the current CAGW hysteria, it is necessary to convince us that the modeled forcings are somehow superior to the actual data. Sort of like modeling g to be 180 ft/sec/sec and trying to find out where the engineers went wrong when they keep getting 32.2 ft/sec/sec, plus or minus, when they drop things out the window.

August 2, 2011 3:19 pm

I have to agree with others that this appears as though reality does not match the models, so reality must be wrong. Now it is not as simple as that, because we are comparing an estimate based on reality to models based on … something. Still, that estimate must be wrong, because the models must be right!

RDCII
August 2, 2011 4:23 pm

I’m a skeptic, so don’t rain all over me, but the way I read this is different than the way many people here are reading this.
What Penner seems to be saying is not that the satellite data is wrong, or is being measured incorrectly, but rather that the equations used to derive the amount of radiation reflected by aerosols are incorrect.
So far, no one has addressed the SCIENCE of what Penner is actually saying, and as someone interested in the science, I wonder if someone who is qualified to do so could analyze whether or not Penner has a valid observation?

Ursus Augustus
August 2, 2011 4:38 pm

Well there is one thing that can be said with absolute certainty and both sides implicitly agree – the science is NOT settled. Policy makers, please take note.

R. Gates
August 2, 2011 4:54 pm

Don E says:
August 2, 2011 at 11:03 am
Just as I suspected all along; burning fossil fuel causes global cooling after it causes global warming.
______
Actually getting closer to the truth, but just the opposite. Burning things like dirty coal creates cooling before it creates warming, as CO2 will stay around far longer than black carbon or sulfates. But this is hard for some people to grasp…that things can have opposite effects over differing time spans due to the complex nature of climate. Volcanoes cool in the short term, but can warm in the longer term: once the aerosols are washed from the atmosphere, you still have the CO2. Good thing this is true, by the way, or the earth could still be locked in the “snowball earth” mode, where even repeated Milankovitch cycles couldn’t shake it out, and it took massive volcanic activity to create enough CO2 to bust it out of that mode.

Bill Illis
August 2, 2011 6:21 pm

R. Gates says:
August 2, 2011 at 4:54 pm
Good thing this is true, by the way, or the earth could still be locked in the “snowball earth” mode, where even repeated Milankovitch cycles couldn’t shake it out, and it took massive volcanic activity to create enough CO2 to bust it out of that mode.
—————————–
Before Snowball Earth, CO2 was about 10,000 ppm. How did all that ice build up? When it ended, CO2 was only 12,000 ppm – about 7% of the level needed to end the Snowball.
It was the super-continent Pannotia covering the South Pole which resulted in the last Snowball. CO2 had nothing to do with it. This is the continental alignment right after the Snowball ended. Think about it: Antarctica times 20 – albedo rises to close to 50%. When continental drift splits the super-continent up and spreads it out away from the South Pole, the Snowball ends.
http://astro.wsu.edu/worthey/earth/html/im-paleozoic/tectonics-cambrian-600.jpg

Richard S Courtney
August 2, 2011 6:34 pm

R. Gates:
Your post at August 2, 2011 at 4:54 pm says in total;
**************
“Don E says:
August 2, 2011 at 11:03 am
Just as I suspected all along; burning fossil fuel causes global cooling after it causes global warming.
______
Actually getting closer to the truth, but just opposite. Burning things like dirty coal creates cooling before it creates warming, as CO2 will stay around far longer than black carbon or sulfates. But this is hard for some people to grasp…that things can have opposite effects over differing time spans due to the complex nature of climate. Volcanoes cool in the short term, but can warm in the longer term as once the aerosols are washed from the atmosphere, you still have the CO2. Good thing this is true, by the way, or the earth could still be locked in the “snowball earth” mode, where even repeated Milankovitch cycles couldn’t shake it out, and it took massive volcanic activity to create enough CO2 to bust it out of that mode.”
*************
That is so wrong it is hard to know which parts to refute. A complete rebuttal would require several pages. So, instead, and as a start, I ask you to justify one of your assertions. In the improbable case that you can, then we can learn from you about another of them.
As a starter, please explain in a quantitative manner how volcanic CO2 emissions could have increased global temperature from any of the postulated ‘snowball Earth’ episodes when throughout each of those times of ‘snowball Earth’ – and before the asserted increased volcanism – atmospheric CO2 was tens of times higher than it is now.
Richard

Editor
August 2, 2011 6:54 pm

Katherine says: August 2, 2011 at 6:32 am
Yeah, I noticed that, too. The idea that the pre-industrial era was “pollution-free” suits the perspective that all environmental ills are the result of rapacious modern capitalism. I think it is rather telling that Dr. Joyce Penner is the “Ralph J. Cicerone Distinguished University Professor of Atmospheric Science” – apparently history was not part of her training. The French author Jean Gimpel wrote a lovely book in 1976 called The Medieval Machine which documented the pollution problems of the Middle Ages. To say that she is “disingenuous” is a kindness.

Uber
August 2, 2011 7:46 pm

Frankly this article makes no sense. Am I missing some inscrutable logic, or does the author need to go back to school to study English?

SteveSadlov
August 2, 2011 8:16 pm

Between this and the many “mitigation” efforts being undertaken to “cool the Earth’s fever”, I would not be surprised if certain SOBs hastened the end of the interglacial.

Rhoda Ramirez
August 2, 2011 8:46 pm

SteveS, that has been a minor worry of mine also.

Editor
August 2, 2011 8:47 pm

SteveSadlov says: August 2, 2011 at 8:16 pm
You might, if you haven’t already, really, really, really want to read Jerry Pournelle’s (et al.) “Fallen Angels” (http://www.webscription.net/p-137-fallen-angels.aspx) and (http://en.wikipedia.org/wiki/Fallen_Angels_(science_fiction_novel))

Richard S Courtney
August 2, 2011 8:50 pm

Uber:
At August 2, 2011 at 7:46 pm you ask:
“Frankly this article makes no sense. Am I missing some inscrutable logic, or does the author need to go back to school to study English?”
In my opinion, the article is deliberately written in inscrutable language because the findings of Penner et al. are so damning to the cause of AGW. The article is a press release on a paper that reports a measured value of the cooling effect of aerosols which Penner et al. obtained using satellites. This measured value is very different from what the climate models assume. This can only mean that the measurements are wrong or the models are wrong.
Findings like those in the reported paper have to be presented in a manner that seems to support AGW or they have little chance of getting published. So, the paper goes to great lengths to cast doubt on its own findings, because a report that plainly said the models are wrong would be unlikely to obtain publication. Hence, in the press release Penner tries to say her work is good but her findings must be wrong: she has to say that because it is what her paper tries to say. In this circumstance it would be strange if the press release were clear.
The implications of the findings of Penner et al. are explained in this thread by several posts.
These explanatory posts include those from
myself at August 2, 2011 at 6:46 am and August 2, 2011 at 8:52 am
Matt G at August 2, 2011 at 1:15 pm
D. J. Hawkins at August 2, 2011 at 2:59 pm
I hope this helps.
Richard

David Falkner
August 2, 2011 9:03 pm

So the climate is more sensitive to aerosols than the models assume? Then the other study blaming aerosols for the lack of cooling was wrong. That means that the climate is actually more sensitive, and that the increase in ‘missing heat’ should pretty much disprove the CO2 hypothesis by making the math untenable.

David Falkner
August 2, 2011 9:08 pm

Ha, I see Richard Courtney beat me to it with a much more surgical explanation. Still, how can you just gloss that over? It’s mind boggling.

August 3, 2011 12:19 am

re John says:
August 2, 2011 at 6:19 am
Yes of course, John, but my main point was how trivial total SO2 emissions are, at around 65 million tonnes p.a., relative to the annual increase in atmospheric CO2 of around 14 billion tonnes. It is characteristic of climate scientists like Penner that they have little feel or common sense when it comes to numbers. I repeat my question: is it really possible, as claimed by Kaufmann et al. (PNAS 2011), that the 65 Mt SO2 could offset the warming attributed by them to 14 Gt CO2? What is the underlying science to justify this claim? What lab experiments, like those of Tyndall in 1861, have Kaufmann et al. and Penner done to verify their beliefs?
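The tonnage comparison above is easy to make explicit. A minimal sketch using only the figures quoted in the comment; note it compares mass only, which by itself says nothing about radiative effect per tonne:

```python
SO2_EMISSIONS = 65e6   # ~65 million tonnes SO2 per year (figure quoted above)
CO2_INCREASE = 14e9    # ~14 billion tonnes CO2 per year (figure quoted above)

ratio = SO2_EMISSIONS / CO2_INCREASE
print(f"SO2 emissions are {ratio:.2%} of the annual CO2 increase")  # ~0.46%
```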

son of mulder
August 3, 2011 1:01 am

What is the current level of temperature suppression by aerosols?
As global sulphate aerosol levels are lower than 40 years ago and distributed differently around the globe, the temperature must have been suppressed less by them. So what net effect have aerosols had on the temperature record over the past 40 years? And how have they caused flat temperatures for the past 10 years as CO2 has continued to grow?

Syl
August 3, 2011 2:07 am

RDCII says:
August 2, 2011 at 4:23 pm
I’m a skeptic, so don’t rain all over me, but the way I read this is different than the way many people here are reading this.
What Penner seems to be saying is not that the Satellite data is wrong, or is being measured incorrectly, but rather that the equations that are used to derive the amount of radiation reflected by aerosols is incorrect……
——————————————————————
There is a total amount of reflected radiation leaving the earth that is measurable. If there’s more or less coming from aerosols the difference will be subtracted/added to albedo. Just a matter of distribution. EXCEPT the specific allocation will matter to the climate models. The modelers seem to feel safe saying they may get clouds wrong, but not aerosols. So those equations do matter and point taken.

Bill Illis
August 3, 2011 6:16 am

The satellite measurements do not show any significant change in the total solar shortwave reflected, the total Albedo of the Earth (including clouds, dust, atmospheric molecules etc.) which is the important measure (rather than some small component of it).
There was a short-term increase in solar shortwave reflected (up to about 3.0 W/m2) during the two major volcanoes and then a small drop from 1993 to 1999 (ozone depletion?) which has since gone back up. The change over the whole period from 1983 to 2010 is basically less than 1.0 W/m2. Now we are just getting small wiggles from the ENSO’s influence on total cloud cover.
So all this recent talk about aerosols cooling us off (and in this paper which says the aerosol impact on clouds might be under-estimated), is meaningless.
The total Albedo has hardly changed in 30 years. There is some evidence that solar radiation reaching the surface has increased a little but this either contradicts the reflected shortwave measurements from the satellites or it says that the Sun’s radiance has increased slightly (also contradicted by the satellites) or it says that direct absorption in the atmosphere has declined slightly (ozone depletion from the volcanoes). On balance, we have to say that there has been a slight increase in the total solar radiation making it to the surface, a small positive number rather than a negative one.

Julian Braggins
August 3, 2011 6:18 am

I have been led to believe that aerosols tend to remain in their respective hemispheres unlike CO2,
http://www.barrettbellamyclimate.com/page29.htm
compares the satellite record of temperatures graphically latitude by latitude from 1976 to 2006. The southern hemisphere seems to have warmed ~0.1°C in that time; the northern hemisphere ~0.6°C.
So why hasn’t the southern hemisphere warmed from the CO2, while it is the northern hemisphere that is being shielded from warming by the aerosols?

Theo Goodwin
August 3, 2011 8:12 am

Richard S Courtney says:
August 2, 2011 at 8:52 am
“Importantly, as I tried to explain, if you input the empirical data on aerosols obtained by Penner et al. then none of the models – not one of them – could emulate global temprature change over the past century.”
I am not commenting on Richard’s fine work but using it to add a point of my own.
I want to make the point that models are incapable of serving as a replacement for scientific hypotheses and that the idea that models can be revised and improved in the same way as a physical theory is a fiction. I want to make this point in the most fundamental way, right down at the level of basic logic.
Looking at Richard’s words above, I think it is clear that the empirical results obtained by Penner, the satellite data, require a revision in all of the models to bring them in line with past history as it has been modified by Penner. Penner refers to “the equations for aerosols” and the need to revise them. In doing so, she is treating them as if they were physical hypotheses that are “about” something and that imply a specific range of data. That is simply wrong.
Take the simplest case of a model, a linear programming model that is used to study distribution patterns for some large organization, such as the Pentagon, and that serves the practical purpose of assisting planners as they create new distribution sites. The equations in that model are not “about” the distribution patterns, the map of the world, or anything. They have no cognitive content whatsoever. Of course, changes can be made to them in hopes of improving the model but the only way that those changes can be evaluated for improvement is through the resulting changes in the output of the model. In other words, you have to make several runs and compare simulation to simulation. A simulation is just a string of numbers that is treated as a “state of the world” or some such thing. There is no basis for identifying some subset of that string with some subset of the mathematical equations or some part of the code. (Of course, there are always programmer’s hunches.) The same holds for more ramified models such as Penner’s.
As Penner goes about improving her model, all she can do is compare simulation to simulation until she finds a string of numbers that better fits the impact that she believes that aerosols should have. However, the changes that she makes in the “equations for aerosols” will be propagated throughout the model. To fully evaluate those changes, she would have to compare all the output of a simulation with all the output of the other simulations. (Is this necessary in practical terms? Of course not, we have programmer’s hunches. But that is not science.) And here we have found the crucial difference between computer models and scientific hypotheses. Except at the level of high theory, think CERN, physical hypotheses do have cognitive content and can be defined in terms of the observational results that they specify by implying them. In physical theory, one can work on the equations for aerosols in isolation from the rest of theory and independently of competing theories. That is not possible with computer models. The only way to revise a computer model is to revise the entire model through comparison of complete simulations. So, it should be no surprise that new observational data about aerosols would invalidate all existing models and require changes in all of them, changes that affect the entire model, not just a specifiable bit of code.

Richard S Courtney
August 3, 2011 10:16 am

Julian Braggins:
At August 3, 2011 at 6:18 am you say:
“I have been led to believe that aerosols tend to remain in their respective hemispheres unlike CO2,”
That is true because aerosols wash out of the atmosphere in days so they lack time to spread from one hemisphere to the other.
And you ask:
“So why hasn’t the southern hemisphere warmed from the CO2, while it is the northern hemisphere that is being shielded from warming by the aerosols?”
That is because the aerosol excuse is nonsense. Reality does what it does and it has no regard for what climate models say it ‘should’ do.
The bottom line is that nobody understands how the climate system operates or behaves. In the absence of either of those understandings it is not possible to build a scientific model of the system or how it behaves.
However, it is possible to build a computer game that can be claimed to be a model of the climate system, and several people have built such computer games. They each differ from all the others, but there is only one climate system that it is falsely asserted they all emulate.
Richard

Douglas Hoyt
August 3, 2011 12:15 pm

This paper is hard to understand, but basically it seems to be saying that modelled aerosol cooling effects are 3 to 6 times greater than measured aerosol cooling effects. Therefore, they conclude, the observations are incorrect. If the observations are correct, the 1940-1976 cooling cannot be explained by aerosols. Furthermore, if the observations are correct, then the sensitivity of climate to CO2 must be much smaller than is presently modelled, just like Spencer is arguing.
Is this a correct summary?

Theo Goodwin
August 3, 2011 1:20 pm

Richard S. Courtney writes:
“The bottom line is that nobody understands how the climate system operates or behaves. In the absence of either of those understandings it is not possible to build a scientific model of the system or how it behaves.”
Well said. That is the bottom line. And no one will know until climate scientists put away their computer games, do research in the field, and create some physical hypotheses which describe the natural regularities that they discover. Of course, the physical hypotheses must prove to be reasonably well confirmed over the years.
What is most needed at this time is a reliable system of measurement. Unfortunately, there is no international or national body that, at this time, has either the good sense or the political credibility to plan such a system and the many subsystems that it would require.

Richard S Courtney
August 3, 2011 2:24 pm

Douglas Hoyt:
Yes. You do provide a correct summary.
The implications of the findings of Penner et al. are explained in this thread by several posts.
These explanatory posts include those from
myself at August 2, 2011 at 6:46 am and August 2, 2011 at 8:52 am
Matt G at August 2, 2011 at 1:15 pm
D. J. Hawkins at August 2, 2011 at 2:59 pm
Richard

Matt G
August 3, 2011 3:11 pm

Julian Braggins says:
August 3, 2011 at 6:18 am
“I have been led to believe that aerosols tend to remain in their respective hemispheres unlike CO2,
http://www.barrettbellamyclimate.com/page29.htm
compares the satellite record of temperatures graphically latitude by latitude from 1976 to 2006. The southern hemisphere seems to have warmed ~0.1°C in that time; the northern hemisphere ~0.6°C”
It has already been answered that the aerosol excuse is nonsense based on observed data. Looking at this a different way seems even more interesting. The northern hemisphere has warmed more than the southern hemisphere because its land surface area is much larger. The usual idea is that this occurs because the energy is supposed to warm the ocean (never shown with scientific evidence except at the skin surface), and because the ocean has a much higher heat capacity it takes longer to warm. But the missing energy is not in the ocean, and it can’t suddenly skip the top 700 m to reach deeper depths.
This demonstrates that at least some of the minor warming expected from CO2 in the northern hemisphere is found only over land. The comparatively little warming observed in the southern hemisphere is again attributable only to land. That’s why the land/ocean ratio between the two hemispheres shows this difference. This would give scientific evidence that the oceans can’t warm due to longwave radiation from CO2, and that what little effect occurred was the known physics of greenhouse gases, applying only when not over ocean. Hence, only the atmosphere over land can demonstrate a small change in the greenhouse effect, and with land covering only around 29% of the planet, this would at least partly explain why the planet hasn’t warmed as much as the models expected.

Douglas Hoyt
August 3, 2011 3:32 pm

Richard Courtney
Thanks for the confirmation. Penner has shown that the aerosol forcing is too weak to expalin the 1940-1976 cooling using the model aerosol changes claimed by Charlson (1991). The Charlson paper does not agree with observations (e.g., Hoyt, D. V. and C. Frohlich, 1983. Atmospheric transmission at Davos, Switzerland, 1909-1979. Climatic Change, 5, 61-72). In davos there was no trend in aerosol loading from 1909 to 1979 and yet the Charlson paper claims that maximum increase in aerosol loading directly over Davos. So it is now a doublw whammy for the aerosol-cooling explanation – weaker than modelled forcing and weaker than claim trends.
There are other papers supporting the trend conclusions above. For example, MacDonald’s (1938) Atlas Of Climatic Charts of the Oceans shows the same geographical distribution of aerosols as are now seen in the satellites. There is just one exception and that is an aerosol cloud coming off England into the North Sea. It didn’t get as far as Norway or Belgium. This manmade aerosol cloud disappeared in the 1950s along with London fogs caused by coal burning. So the trend there is actually in the opposite direction of what the modellers assume.
Withe the small to non-existent aerosol forcing, the only remaining way to get the models to agree with observations for 1940-1976 is have a very low climate sensitivity as Lindzen, Spencer, and others have deduced.

Richard S Courtney
August 4, 2011 2:14 am

Douglas Hoyt:
Your very fine post at August 3, 2011 at 3:32 pm includes this comment:
“With the small to non-existent aerosol forcing, the only remaining way to get the models to agree with observations for 1940-1976 is to have a very low climate sensitivity, as Lindzen, Spencer, and others have deduced.”
Yes! See
http://img36.imageshack.us/img36/8167/kiehl2007figure2.png
Richard

Dave Springer
August 4, 2011 11:43 am

Matt G says:
August 3, 2011 at 3:11 pm

Julian Braggins says:
August 3, 2011 at 6:18 am
“I have been led to believe that aerosols tend to remain in their respective hemispheres unlike CO2,
http://www.barrettbellamyclimate.com/page29.htm
compares the satellite record of temperatures graphically latitude by latitude from 1976 to 2006. The southern hemisphere seems to have warmed ~0.1°C in that time; the northern hemisphere ~0.6°C”

The difference can also be accounted for by the fact that CO2 has essentially no greenhouse effect over the ocean. It only happens over land. The northern hemisphere has twice the land mass of the southern hemisphere. And since Antarctica doesn’t count (it radiates so little upwelling infrared that there is nothing for greenhouse gases to absorb and reemit as downwelling IR), it’s more like a 4:1 greater ratio of land in the northern hemisphere.
Nothing in climate science makes sense until it is understood that greenhouse gases don’t do any greenhousing over the ocean. Once that is understood all observations fall neatly in place.

Dave Springer
August 4, 2011 12:01 pm

The fact that greenhouse gases have no greenhouse effect over the ocean appears to be “the trade secret of climate science”. Not that climate boffins should be accused of keeping secrets, mind you (sarc sarc sarc). I’m reminded of the same thing happening in paleontology, as famously and candidly admitted by perhaps the 20th century’s most distinguished paleontologist, Harvard’s Stephen J. Gould. I blogged about Gould’s admission here:
http://www.uncommondescent.com/intelligent-design/texas-mandates-teaching-the-trade-secret-of-paleontology/

24 January 2009
Texas Mandates Teaching “The Trade Secret of Paleontology”
Dave S.
Stephen J. Gould, perhaps the most famous paleontologist of the 20th century, wrote:
The extreme rarity of transitional forms in the fossil record persists as the trade secret of paleontology. The evolutionary trees that adorn our textbooks have data only at the tips and nodes of their branches … in any local area, a species does not arise gradually by the gradual transformation of its ancestors; it appears all at once and fully formed.
Lest I be accused of quote mining you can find Gould discussing it in more detail in Gould’s book The Richness of Life, pages 263 and 264, found in its entirety on Google Books.

What are the odds that there’s a scientist as distinguished in climate science as Gould was in paleontology admitting the secret? Slim to none is my guess.

D. J. Hawkins
August 4, 2011 8:21 pm

Dave Springer says:
August 4, 2011 at 12:01 pm
…Stephen J. Gould, perhaps the most famous paleontologist of the 20th century, wrote:
The extreme rarity of transitional forms in the fossil record persists as the trade secret of paleontology. The evolutionary trees that adorn our textbooks have data only at the tips and nodes of their branches … in any local area, a species does not arise gradually by the gradual transformation of its ancestors; it appears all at once and fully formed.
Lest I be accused of quote mining you can find Gould discussing it in more detail in Gould’s book The Richness of Life, pages 263 and 264, found in its entirety on Google Books…

The first time I read about “punctuated equilibrium” I thought I had stumbled on someone’s April Fool’s prank. When push came to shove, Gould was in the same boat as the Intelligent Design crew. Just more, you know, scientific about it all. Although, come to think of it, I never heard his explanation of what was doing the “punctuating.”

Brian H
August 5, 2011 2:37 am

DJ H;
Since you seem to have a limited attention span, there’s a two-word answer: “population bottlenecks”. YCLIU, but you won’t.

D. J. Hawkins
August 7, 2011 10:35 pm

Brian H says:
August 5, 2011 at 2:37 am
DJ H;
Since you seem to have a limited attention span, there’s a two-word answer: “population bottlenecks”. YCLIU, but you won’t.

Two word reply: “not quite”.
Even Eldredge and Gould admitted that the sudden appearance of higher-order taxa was overreaching. The best you can hope for with population bottlenecks is the domination of previously random alleles because they were lucky enough to survive. It is sufficient to explain the tips of the branches. There’s no other mechanism elucidated for bringing up whole taxa. Wait, that sounds like another hypothesis we’ve been discussing…

Brian H
August 8, 2011 7:35 pm

“lucky enough” is only a part of it. Brutal winnowing of all but a few, with their valuable alleles, is entirely plausible. Not guaranteed; after all, only a few percent of one percent of species make it through the gauntlet(s).
There’s also, IMO, a form of “self-directed mutation” involved. Any of the “junk” DNA which usefully winnows the possible random mutation counts would be highly conserved. As much junk DNA is, contrary to all (purely statistical) expectation.