Shindell, Methane, and Uncertainty

Guest Post by Willis Eschenbach

http://www.gsfc.nasa.gov/gsfc/earth/pictures/hansen010302/methane.jpg
Image: NASA Goddard Spaceflight Center

A recent study by Shindell et al., entitled Improved Attribution of Climate Forcing to Emissions (Science Magazine, 30 October 2009, Vol. 326), reports on interactions between aerosols, methane, and other greenhouse gases. It has been discussed on Watts Up With That here <http://wattsupwiththat.com/2009/10/31/an-idea-i-can-get-behind-regulate-methane-first/>, as well as on other blogs. The Shindell study gives new values for the “radiative forcing” of various greenhouse gases. The “radiative forcing” is the increase in greenhouse radiation due to the increases in greenhouse gases since 1750.

UPDATE: The remainder of this article has been removed at the request of the guest author, Willis Eschenbach. During discussion, an error was discovered (see comments) and rather than leave this article with that error in place which may possibly mislead somebody in the future (if they didn’t read through comments) I’m honoring Willis’ request for removal. The comments remain intact. – Anthony


105 Comments
Frank
November 9, 2009 8:52 pm

Can I get an abstract or a summary in English please?
Pretty pictures, though.

WAG
November 9, 2009 9:01 pm

Obviously you know nothing about statistics or economics. You can’t draw conclusions from comparing uncertainty ranges of different studies. In any case, pointing to a study that found a HIGHER sensitivity to methane than past studies, and claiming that the disagreement proves there is ZERO sensitivity to methane, is a logical fallacy.
And, 73% of economists agree that uncertainty INCREASES the economic case for climate legislation.
http://akwag.blogspot.com/2009/11/dont-believe-global-warming-scientists.html
Who’s the alarmist now?

Reply to  WAG
November 9, 2009 9:05 pm

WAG:
From your link:

The law school’s Institute for Policy Integrity sent surveys to 289 economists who had published at least one article on climate change in a top-rated economics journal in the past 15 years.

No selection bias there. Nope. You might as well be surveying Real Climate moderators.

Ray
November 9, 2009 9:06 pm

Whether it be CO2, methane, ozone, or even N2O, it is totally irrelevant to talk about greenhouse gases and their radiative forcing in view of the previous story. Regardless of man-made or even natural emissions, the global temperature seems to be going down anyway.

eric anderson
November 9, 2009 9:16 pm

The IPCC Summary for Policy Makers states quite baldly on page 4 that the Level Of Scientific Understanding (LOSU) of the effects of aerosols, and changes in surface albedo from land use, is “medium-low.”
The report does not claim “high” LOSU for any Radiative Forcing Components except “Long-lived greenhouse gases” (includes methane and CO2).
Assuming their estimations of the level of scientific understanding are correct, then how in the heck do you get so certain about a combined level of anthropogenic forcings, when many of the forcings are not well understood scientifically?
This does not compute. In essence they are saying, “We are uncertain about the level of the forcings caused by various major factors. But we are certain about the combined effects.”
Lunacy.

a jones
November 9, 2009 9:26 pm

And we should remember that for all practical purposes methane levels stopped rising ten years ago.
And if that rise was, as the IPCC claims, due to human activity, what was it we stopped doing ten years ago?
Kindest Regards

WAG
November 9, 2009 9:26 pm

Economists have a well-known free market bias, so in the economist community, you would expect the authors with the strongest opinions on climate legislation–and hence those who published most frequently on it–to be those opposed to it. Remember, the study participants were publishing in economics journals, not climate journals, so if the peer review process produces any bias (assuming economics journals go through peer review), the bias would be against climate legislation, not for it.
Also, it’s worth repeating that the original post here makes no logical sense. Within the uncertainty ranges of the climate sensitivities presented, the warming from methane would still be massive. Just because studies disagree on the magnitude of warming does not mean the correct value is zero. Really basic stuff.

November 9, 2009 9:29 pm

WAG (21:01:05) :
In any case, pointing to a study that found a HIGHER sensitivity to methane than past studies, and claiming that the disagreement proves there is ZERO sensitivity to methane, is a logical fallacy.
He didn’t claim that. What he said was the claims of *certainty* were obviously false.
And, 73% of economists agree that uncertainty INCREASES the economic case for climate legislation.
You can’t get 73% of economists to agree on anything, unless they’re all Keynesians — and then they’ll only agree that they’re Keynesians.

David Archibald
November 9, 2009 9:35 pm

a jones (21:26:17) :
You beat me to it. Methane has a half life in the atmosphere of 7 years. The level has not risen for eleven years, so an equilibrium has been reached between methane generation and oxidation. There can be no more warming from methane. It does not matter whether interaction with aerosols makes the methane warming effect 0.1, 1.0, or 10.0 degrees. In fact, the higher the real number, the less the contribution that CO2 makes, and the less the contribution that CO2 can make from here.
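Archibald's equilibrium argument can be sketched with a toy one-box model (a hypothetical illustration: the ~7-year lifetime is taken from his comment, and the source strength is in arbitrary units). A constant source balanced against first-order removal converges to a fixed level, source/k, and then stops rising:

```python
import math

# Toy one-box model of atmospheric methane. Assumption: first-order
# removal with the ~7-year half-life quoted in the comment above.
HALF_LIFE_YEARS = 7.0
k = math.log(2) / HALF_LIFE_YEARS  # first-order removal rate, 1/yr

def steady_state(source):
    """Level at which removal (k * level) balances the source."""
    return source / k

def step(level, source, dt=0.1):
    """Advance the box model one time step of dt years (Euler)."""
    return level + (source - k * level) * dt

# With a constant source, the level converges to source / k and then
# stops rising -- the equilibrium Archibald describes.
level, source = 0.0, 10.0
for _ in range(1000):          # 100 simulated years
    level = step(level, source)
print(round(level, 2), round(steady_state(source), 2))
```

A higher source just means a higher plateau; once at the plateau, the methane burden (and hence its forcing) no longer grows.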

Willis Eschenbach
November 9, 2009 9:40 pm

WAG (21:01:05), thank you for your comments.

Obviously you know nothing about statistics or economics. You can’t draw conclusions from comparing uncertainty ranges of different studies. In any case, pointing to a study that found a HIGHER sensitivity to methane than past studies, and claiming that the disagreement proves there is ZERO sensitivity to methane, is a logical fallacy.

I fear that either my writing is not clear, or that your reading is not clear. Where did I say that there is zero sensitivity to methane? What I wrote is that the latest study says that the forcing from methane is way, way outside the range given by the IPCC.
Next, why can’t different studies be compared? If one study says variable A is 0.5 ±0.1 (99% CI), and another study says that variable A is 1.0 ± 0.3 (99% CI), what prevents us from comparing them?
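For what it's worth, Willis's two hypothetical intervals can be compared with an ordinary two-sample z-test, assuming the intervals are normal-theory CIs (so each half-width is z times a standard error):

```python
import math

def se_from_ci(half_width, confidence=0.99):
    # Assumption: normal-theory CI, so half-width = z * SE,
    # with z = 2.576 for a 99% interval.
    z = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}[confidence]
    return half_width / z

# Willis's hypothetical: A = 0.5 +/- 0.1, B = 1.0 +/- 0.3 (both 99% CI)
se_a, se_b = se_from_ci(0.1), se_from_ci(0.3)
z = (1.0 - 0.5) / math.sqrt(se_a**2 + se_b**2)
print(round(z, 2))  # well beyond 2.576, so the estimates genuinely disagree
```

A z-statistic this far past the 99% threshold is exactly what "comparing uncertainty ranges of different studies" means in practice.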

And, 73% of economists agree that uncertainty INCREASES the economic case for climate legislation.
http://akwag.blogspot.com/2009/11/dont-believe-global-warming-scientists.html

So economists believe in anthropogenic global warming? I’m shocked, I tell you, shocked … but why should that concern us in the slightest? Science is not decided by consensus, thank goodness … and in particular, science is not decided by a consensus of economists.

Who’s the alarmist now?

I would say that at this point in history, anyone who believes that “the science is settled” about AGW is an alarmist. And I would say that when the IPCC doesn’t know basic facts, such as how much forcing comes from methane, anyone wanting to spend billions of dollars based on their “consensus opinion” is an alarmist.

anna v
November 9, 2009 9:40 pm

Uncertainties the way the IPCC treats them are not scientifically calculated.
I keep giving this reference to AR4 chapter 8 , the physics case:
More complex metrics have also been developed based on multiple observables in present day climate, and have been shown to have the potential to narrow the uncertainty in climate sensitivity across a given model ensemble (Murphy et al., 2004; Piani et al., 2005). The above studies show promise that quantitative metrics for the likelihood of model projections may be developed, but because the development of robust metrics is still at an early stage, the model evaluations presented in this chapter are based primarily on experience and physical reasoning, as has been the norm in the past.

based primarily on experience and physical reasoning, as has been the norm in the past
They admit it out in the open, as well as in the last sentence of the quote above:
Note that a number of uncertainty ranges in the Working Group I TAR corresponded to 2-sigma (95%), often using expert judgement.
I suppose it justifies “the norm in the past”.
The spaghetti graphs are not error bars. They are drawn according to the taste of the modeler, clearly stated, so the modelers cannot be sued. It is the small print.
How are true statistical error bars given for a model-to-data fit? They are given by creating a likelihood function with the independent variables of the model and then maximizing it by varying the parameters within their errors. The output is a chi-square per degree of freedom, which should be near 1 for a many-variable problem, and the error bars on the output values sought (an example would be the masses and widths of resonances).
What are the model runs? The modelers vary the parameters used to fit the temperature curve according to “their experience and physical reasoning,” make runs, and give average values with uncertainties estimated by those same feelings of experience and physical reasoning. Then a lot of such models are put on a graph, and voilà, a seeming error band around the fitted temperature curve.
It means next to nothing.
WAG (21:01:05) :
Obviously you know nothing about statistics or economics. You can’t draw conclusions from comparing uncertainty ranges of different studies. In any case, pointing to a study that found a HIGHER sensitivity to methane than past studies, and claiming that the disagreement proves there is ZERO sensitivity to methane, is a logical fallacy.
I am sorry, but error bars are calculated precisely so that one can compare and draw conclusions from different studies. That is the function of error bars, and you are displaying your level of knowledge.
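The fitting procedure anna v describes (maximize a Gaussian likelihood, equivalently minimize a chi-square; check that chi-square per degree of freedom is near 1; read the error bars off the parameter covariance) can be sketched for a straight-line model. The model and the synthetic data below are illustrative only:

```python
import numpy as np

# Sketch of a proper chi-square fit: weighted least squares for
# y = a + b*x with known measurement error sigma.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 30)
sigma = 0.5                                # known measurement error
y = 2.0 + 0.3 * x + rng.normal(0.0, sigma, x.size)

# Design matrix, rows scaled by 1/sigma so the problem has unit variance
A = np.vstack([np.ones_like(x), x]).T / sigma
params, *_ = np.linalg.lstsq(A, y / sigma, rcond=None)
cov = np.linalg.inv(A.T @ A)               # parameter covariance matrix
errors = np.sqrt(np.diag(cov))             # 1-sigma error bars on a, b

resid = (y - (params[0] + params[1] * x)) / sigma
chi2_per_dof = np.sum(resid**2) / (x.size - 2)
print(params, errors, chi2_per_dof)        # chi2/dof should be near 1
```

If the quoted sigma were wrong (or the model inadequate), chi2/dof would drift away from 1 — that is the self-check a tuned "spaghetti graph" never provides.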

a jones
November 9, 2009 9:44 pm

Put three economists in a room and get five opinions.
Put a dozen ‘climate scientists’, as they grandly style themselves, in a room and get one opinion: they need more money to perform some elaborate mystical ritual to avert climate catastrophe.
Which is not to say there are not many honest scientists working to discover what is happening. There are.
But what is science against these charlatans and mountebanks who claim to control the weather, past and future, and for all time?
Provided you pay them of course. And they don’t come cheap.
Kindest Regards

November 9, 2009 9:49 pm

Willis,
Is your previous work on a new sensitivity calculation being reviewed? An update would be appreciated if you find a chance.

This is the same situation as the certainty represented by proxy studies. The scientists use the annual differences of filtered proxies to generate a year-by-year standard deviation as their measure of certainty. After the data have been filtered, the bandwidth is massively reduced and the real uncertainty is very difficult to correct for.
For proxy hockey sticks, in the end, one thing we can say for certain is that the ‘normal distribution’-based certainty is less certain than the true certainty of the signal.
Stats are funny things.
Today while playing around with sat data, I found accidentally that a ‘downslope’ in RSS satellite temperatures exceeded 95% statistical certainty for the past 8 years.
http://noconsensus.wordpress.com/2009/11/09/statistical-significance-in-satellite-data/
It wasn’t the point of the post but it is a strong downslope that doesn’t include 1998.
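The kind of trend-significance test Jeff Id refers to can be sketched as an OLS slope with a t-statistic. The monthly anomalies below are synthetic stand-ins, not actual RSS values, and a real monthly series would also need an autocorrelation correction to the standard error:

```python
import numpy as np

# OLS trend on a monthly anomaly series, with a t-test on the slope.
# Synthetic data: a small negative trend plus noise (illustrative only).
rng = np.random.default_rng(1)
months = np.arange(96)                     # 8 years of monthly data
anoms = -0.002 * months + rng.normal(0.0, 0.1, months.size)

n = months.size
slope, intercept = np.polyfit(months, anoms, 1)
resid = anoms - (slope * months + intercept)
s2 = np.sum(resid**2) / (n - 2)            # residual variance
se_slope = np.sqrt(s2 / np.sum((months - months.mean())**2))
t = slope / se_slope
# |t| above ~1.99 (two-sided t, 94 dof) means the slope is
# distinguishable from zero at the 95% level.
print(slope, t)
```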

Mike Bryant
November 9, 2009 9:58 pm

Uncertainty’s truth is unspoken
Real numbers still sleep unawoken
It’s settled, the science
Is all in compliance
But science, dear reader, is broken.
Reply: I am thankful that Hoboken did not appear ~ ctm

a jones
November 9, 2009 10:09 pm

I do like little ditties like that.
Short, sharp and to the point.
Kindest regards

Gene Nemetz
November 9, 2009 10:18 pm

Go and eat a cheeseburger guilt free.

Willis Eschenbach
November 9, 2009 10:23 pm

Jeff Id (21:49:45), you ask:

Willis,
Is your previous work on a new sensitivity calculation being reviewed? An update would be appreciated if you find a chance.

If you mean my work on tropical tropospheric amplification, I am rewriting it for submission to a journal. It’s slow because I hate doing that, it always feels like I have to lobotomize myself to do it, but such is science.

WAG
November 9, 2009 10:39 pm

Bill Tuttle –
Ah, you’re on to something. You’re right, the author never *claimed* that the sensitivity to methane was zero–he *implied* it. He pointed out that there were two different estimates of the climate’s sensitivity to methane, and that therefore claims of certainty are “false” and “not valid,” and that “much is not understood” about climate. Of course, the uncertainty in the graph is whether things will be bad or really bad, not over whether methane affects the climate. But through clever use of language and misleading use of statistics, the author creates the impression that we don’t know anything about climate, and should therefore do nothing. It’s just what Steve McIntyre did to the hockey stick – when the error he pointed out was corrected, the shape of the graph didn’t fundamentally change, but he’d accomplished his mission, which was to sow the seeds of doubt. Same with Briffa’s study.
Congratulations, you’ve just had your first lesson in denial 101: how to imply something without saying anything.
Of course, if you’re getting your science advice from a TV weatherman, you might as well be getting your medical advice from Tor the holistic healer:
http://akwag.blogspot.com/2009/11/youre-not-scientist-but-you-play-one-in.html

Richard111
November 9, 2009 10:58 pm

“Well, the Shindell study first recalculated the values for the radiative forcing of a variety of greenhouse gases using the standard methods.”
Can anyone post a link to these methods please.
Since learning about Avogadro’s Law and number and moles etc. I can calculate the precise number of molecules of any “greenhouse” gas in a cubic meter of air at any temperature and density but I cannot find how to calculate the amount of longwave radiation absorbed, re-radiated or converted to heat by conduction. (the greenhouse effect)
I do this purely as a hobby for when anyone raises the subject of “global warming” or “climate change” I talk knowingly of numbers like 6.022 times 10 to the power of 23 and radiation bandwidths in microns and they tend to go a bit blank 🙂
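The first half of Richard111's calculation, the molecule count, follows directly from the ideal gas law; a minimal sketch (the methane mixing ratio here is an illustrative round number near its modern value):

```python
# Molecules of a trace gas per cubic metre of air via the ideal gas law,
# the calculation Richard111 describes.
N_A = 6.022e23        # Avogadro's number, molecules/mol
R = 8.314             # gas constant, J/(mol*K)

def molecules_per_m3(mixing_ratio, pressure=101325.0, temperature=288.0):
    """Molecules of the trace gas per m^3 at the given P and T."""
    moles_air = pressure / (R * temperature)   # mol of air per m^3
    return moles_air * N_A * mixing_ratio

ch4 = molecules_per_m3(1.8e-6)   # ~1800 ppb methane, illustrative
print(f"{ch4:.3e}")              # on the order of 10^19 molecules/m^3
```

The second half of his question — how much longwave those molecules absorb and re-radiate — is the part that needs radiative transfer, which is where the discussion below picks up.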

WAG
November 9, 2009 11:00 pm

Also, I stand corrected here by Willis:
“Next, why can’t different studies be compared? If one study says variable A is 0.5 ±0.1 (99% CI), and another study says that variable A is 1.0 ± 0.3 (99% CI), what prevents us from comparing them?”
Must have been the cold medicine speaking when I wrote that. Thanks for pointing out the error.

Bruckner8
November 9, 2009 11:10 pm

eric anderson (21:16:10) :
This does not compute. In essense they are saying, “We are uncertain about the level of the forcings caused by various major factors. But we are certain about the combined effects.”

This sounds like “We can’t predict local weather but we’re certain we can predict Global Climate Change.”

tallbloke
November 9, 2009 11:14 pm

Frank (20:52:25) :
Can I get an abstract or a summary in English please?
Pretty pictures, though.

Crystal clear to me, thanks Willis.

par5
November 9, 2009 11:21 pm

You follow a preacher through the gates of heaven, and follow his children through the gates of hell. Everybody knows that the preachers’ kids are the worst of the bunch…

Mike G in Corvallis
November 9, 2009 11:26 pm

Shorter WAG: “You are responsible for my lack of reading comprehension! So you’re stupid!”

Willis Eschenbach
November 9, 2009 11:33 pm

WAG (22:39:12), you are absolutely wrong when you say:

Bill Tuttle –
Ah, you’re on to something. You’re right, the author never *claimed* that the sensitivity to methane was zero–he *implied* it. He pointed out that there were two different estimates of the climate’s sensitivity to methane, and that therefore claims of certainty are “false” and “not valid,” and that “much is not understood” about climate. Of course, the uncertainty in the graph is whether things will be bad or really bad, not over whether methane affects the climate.

I implied nothing of the sort at all. I did not say it, I did not hint at it, I did not whisper it.
My focus, as I clearly stated, is on the false claims of certainty by the IPCC. They said that they were 99% sure that the methane forcing was between 0.4 and 0.6. The recent study has shown that their claims of accuracy were totally unfounded. I bring this up as an example of the false certainty that the IPCC claims for its results.
Despite your attempt to torture my words into another meaning, my statements were very clear. I did not say that this meant warming would be less bad or more bad. In fact, the authors of the study say (as they always do) that this will make little difference in the estimates of future warming. This is because (quite coincidentally I’m sure) their estimate of total forcing is not far from the IPCC estimate. The Shindell paper says:

We calculated both the “abundance-based” RF owing to the net atmospheric composition response by species when all emissions are changed simultaneously and the “emissions-based” forcing attributable to the responses of all species to emissions of a single pollutant (Fig. 1). The sum of the forcings that take place via response of a particular species in the emissions-based analysis (each represented by a different color in Fig. 1) is approximately equal to the forcing due to that species in the abundance-based analysis. Likewise, the sums of all emissions-based and all abundance-based forcings are similar.

This “coincidental” result, however, arises directly from the nature of the models used to do their calculations. Kiehl has shown that because of the way the climate models are tuned to match the historical temperature record, there is a close relationship between total forcing and climate sensitivity. As one goes up, the other goes down.
In the Shindell case, they have run a model “in reverse” to calculate the forcings. But the nature of the models guarantees that they will end up with a total forcing which is similar to the IPCC forcings. When the models are run “forwards” to calculate temperatures, the IPCC forcings yield a given temperature pattern. So when they are run “in reverse” to calculate forcings, a given temperature pattern will of necessity give similar total forcings.
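The Kiehl trade-off invoked here is just algebra on ΔT ≈ λ·F: if every model is tuned to the same historical warming, a model carrying a larger total forcing must carry a smaller sensitivity. A minimal sketch with illustrative numbers (the warming figure and forcing spread below are assumptions for the example, not values from the paper):

```python
# Sketch of the forcing-sensitivity trade-off: models tuned to reproduce
# the same historical warming must pair larger forcing with smaller
# sensitivity. All numbers illustrative.
observed_warming = 0.8            # K, assumed 20th-century warming

def implied_sensitivity(total_forcing):
    """Sensitivity, K per (W/m^2), needed to match the observed warming."""
    return observed_warming / total_forcing

for forcing in (1.0, 1.6, 2.0):   # W/m^2, a plausible model-to-model spread
    print(forcing, round(implied_sensitivity(forcing), 2))
```

Doubling the assumed total forcing exactly halves the implied sensitivity, which is why a model run "in reverse" from the same temperature history cannot help but return a familiar total forcing.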
However, this is meaningless to me. This is because I do not think that the change in the calculated methane forcings makes any difference at all, for the reasons that I explain in detail in my Thunderstorm Thermostat Hypothesis paper (q.v.). So your accusation, that I am trying to minimize the threat, is nonsense. I don’t think there is a threat from methane, but my reasons have nothing to do with the subject under discussion.
In short, your ideas about what I am saying are very far from what I said. I have no axe to grind regarding whether methane forcing is larger, smaller, or zero. Please read what I said. You are more than welcome to comment on what I have written … but your comments about what you imagine I was trying to imply are very wide of the mark.

Willis Eschenbach
November 9, 2009 11:45 pm

Richard111 (22:58:47), you ask an interesting question:

“Well, the Shindell study first recalculated the values for the radiative forcing of a variety of greenhouse gases using the standard methods.”
Can anyone post a link to these methods please.
Since learning about Avogadro’s Law and number and moles etc. I can calculate the precise number of molecules of any “greenhouse” gas in a cubic meter of air at any temperature and density but I cannot find how to calculate the amount of longwave radiation absorbed, re-radiated or converted to heat by conduction. (the greenhouse effect)
I do this purely as a hobby for when anyone raises the subject of “global warming” or “climate change” I talk knowingly of numbers like 6.022 times 10 to the power of 23 and radiation bandwidths in microns and they tend to go a bit blank 🙂

None of these radiative forcing numbers can be calculated directly. You have to use a climate model to calculate them. Of course, since we don’t understand the climate, to me it is somewhat circular to build a model of a climate system we don’t understand and then use the model to try to understand the climate system we don’t understand … but I digress.
There is a handwaving description in the Shindell paper, viz:

We used the composition-climate model Goddard Institute for Space Studies (GISS) Model for Physical Understanding of Composition-Climate Interactions and Impacts (G-PUCCINI) (6) to calculate the response to removal of all anthropogenic methane, carbon monoxide (CO) plus volatile organic compounds (VOCs), NOx, SO2, and ammonia emissions. This model couples gas-phase, sulfate (7), and nitrate (8) aerosol chemistry within the GISS ModelE general circulation model (GCM). Anthropogenic emissions are from a 2000 inventory (9).

So the answer to your question is that the numbers you ask about are calculated by a random climate model of unknown accuracy which has never been subjected to the industry standard V&V and SQA testing (Verification and Validation, and Software Quality Assurance) … but then they claim that they know how accurate the estimates are, and report their claims to the nearest hundredth of a watt per square meter …
Welcome to the wonderful world of climate science. If you want more accuracy, just add a bunch more untested models, and average the results!
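As an aside: while the full "standard methods" are radiative transfer calculations inside a GCM, the IPCC also publishes simplified curve fits to those calculations. Below is a sketch of the methane expression from Myhre et al. (1998) as tabulated in the IPCC TAR; the coefficients are quoted from that table, so treat this as illustrative rather than authoritative:

```python
import math

# Simplified methane-forcing expression (Myhre et al. 1998, as given in
# the IPCC TAR) -- a curve fit to detailed radiative transfer results,
# not the line-by-line computation itself. M, N in ppb.
def overlap(M, N):
    """CH4/N2O band-overlap correction term."""
    return 0.47 * math.log(1 + 2.01e-5 * (M * N) ** 0.75
                             + 5.31e-15 * M * (M * N) ** 1.52)

def ch4_forcing(M, M0=700.0, N0=270.0):
    """Forcing (W/m^2) of methane rising from M0 to M ppb."""
    return (0.036 * (math.sqrt(M) - math.sqrt(M0))
            - (overlap(M, N0) - overlap(M0, N0)))

# Pre-industrial ~700 ppb to ~1774 ppb gives roughly 0.5 W/m^2, in line
# with the direct-CH4 number the thread attributes to the IPCC.
print(round(ch4_forcing(1774.0), 2))
```

Note the uncertainty the thread is arguing about lives in the detailed calculations behind this fit (and in the aerosol interactions Shindell adds), not in the fit itself.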
