Guest Post by Willis Eschenbach

A recent study by Shindell et al., entitled Improved Attribution of Climate Forcing to Emissions (Science Magazine, 30 October 2009, Vol. 326), reports on interactions between aerosols, methane, and other greenhouse gases. It has been discussed on Watts Up With That here <http://wattsupwiththat.com/2009/10/31/an-idea-i-can-get-behind-regulate-methane-first/>, as well as on other blogs. The Shindell study gives new values for the “radiative forcing” of various greenhouse gases. The “radiative forcing” is the increase in greenhouse radiation due to the increases in greenhouse gases since 1750.
UPDATE: The remainder of this article has been removed at the request of the guest author, Willis Eschenbach. During discussion, an error was discovered (see comments) and rather than leave this article with that error in place which may possibly mislead somebody in the future (if they didn’t read through comments) I’m honoring Willis’ request for removal. The comments remain intact. – Anthony
Uncertainty intervals depend upon all kinds of assumptions about the nature of the sample, how that sample relates to the underlying population of datapoints, the nature of that underlying population, etc. If we’re talking about uncertainty ranges of variables that are estimated from others (using some “model”, i.e., formula), we’ve introduced an entirely new class of issues: relationship uncertainties (linear/not linear, for one). Bottom line, there is some sense in talking about uncertainty ranges for direct measurements of phenomena, but even that is not easy to do, and it is not logically entailed by any means whatsoever — lots of assumptions are involved. Thus, to talk about uncertainty ranges for temperatures as a result of CH4 (or any kind of) forcing as though doing so provides some “certainty” as to what is possible is pure nonsense. You can do it, but it means little for poorly characterized relationships (such as chemical and radiative interactions in the environment). It’s amazing how bad so many scientists seem to be at understanding the fundamentals of inferential statistics. Sigh.
WAG,
Paraphrasing your first (rather tasteless) sentence: you obviously don’t know anything about grammar. Assuming that 73% is equivalent to unanimity is not a grammatical mistake; it’s a semantic one.
I have a metaphor as puerile as yours; when mechanic #1 says my car will go another 100,000 miles and mechanic #2 says it will go another 100 miles, they’re basically saying the same thing.
@WAG. About forty years ago someone said that if you laid all the world’s economists end-to-end no two of them would agree on anything. But it would be a good start.
Viewing the events in the world economy over the last three years has greatly reduced my confidence in the ability of economists to calculate anything other than a way out when the markets crash.
The “economy” is a chaotic system, the climate is a chaotic system, and neither is understood to the level of anything being decided.
BTW, your courtesy level calculates at minus 1.
Tenuc (05:41:41) :
The point is, does anyone really know the actual effect of increasing CO2 in the atmosphere?
I would be really surprised if this hasn’t already been done, but I propose a simple experiment using a commercial greenhouse. The temperature should rise with added CO2. It would be nice to know the shape of that function, wouldn’t it?
Then, a further experiment could be run to include a shallow pond, and to increase the humidity by sprays to simulate rain, to tell us whether Henry’s law actually works or not, as well as giving some idea of the real airborne lifetime of added CO2 molecules.
Then we could look at other stuff such as methane, decreasing O2, and other related conjectures.
And I take your point about scaling, but we CAN say that the measured experimental effects would be a lot less in the Real World … maybe even vanishingly small/negligible!
So I suggest that this kind of experiment would be relatively cheap to do, and very quick. It could be funded directly via small donations from skeptics, and would be just the thing for the MSM and the Blogosphere to follow in real-time.
The hell with computer modelling! Let’s do some proper experiments we can all see!
My thanks to all who are participating. Let me divide my responses up into several posts about different important issues.
CIVILITY
First, let me make a heart-felt plea for civility. There is a huge difference between saying “You are wrong”, and “You are wrong, you idiot.” Speculations about another poster’s mental abilities, proclivity for error, ancestry, level of education, and motives for posting have no place in a discussion of science. Please accord all posters the basic courtesies.
In particular, this applies to what we say to/about people like WAG. Unlike many people who only post on sites where they get agreement, he has come here to present an opposing point of view. This is both desirable and laudable, as it is the heart of what science is about. I toss out my ideas, and other scientists try to knock them down. That is how science advances.
So let us make everyone welcome in a collegial spirit, without trying to second-guess them, without questioning their fitness to espouse their ideas, without commenting on their command of English or their supposed mental state. I invite you all to focus on the ideas, and not on the people who put them forward.
Finally, please do not respond “in kind”. The fact that a poster may be discourteous to us does not give us the right to be discourteous to them.
We now return you to your regularly scheduled programming …
WAG above espouses a frequently-seen point of view about uncertainty:
While it is true that uncertainty does not mean we should do nothing, it also does not mean that we should do something. If your doctor says “We’re not certain what’s wrong with you, so we’re going to perform open-heart surgery”, you would be justified in refusing.
Much of this confusion stems from a misunderstanding of the Precautionary Principle, which is widely misinterpreted. It is not just a restatement of “better safe than sorry”, nor is it ordinary caution.
Let me start with an early and very clear statement of the “Precautionary Principle” (I’ll call it PP for short), which comes from the UN Rio Declaration on the Environment (1992). Here’s their original formulation:
“In order to protect the environment, the precautionary approach shall be widely applied by States according to their capability. Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.”
This is an excellent statement of the PP, as it distinguishes it from such things as wearing condoms, denying bank loans, approving the Kyoto Protocol, invading Afghanistan, or using seat belts.
The three key parts of the PP (emphasis mine) are:
1) A threat of serious or irreversible damage.
2) A lack of full scientific certainty (in other words, the existence of partial but not conclusive scientific evidence).
3) The availability of cost-effective measures.
Here are some examples of how these key parts of the PP work out in practice.
We have full scientific certainty that condoms and seat belts save lives. Thus, using them is not an example of the PP, it is simply acting reasonably on principles about which we are scientifically certain.
There are no scientific principles or evidence that we can apply to the question of invading Afghanistan, so we cannot apply the PP there either.
Bank loans are neither serious nor irreversible, nor is there partial scientific understanding of them, so they don’t qualify for the PP.
Finally, the Kyoto Protocol is so far from being cost-effective as to be laughable. The PP can be thought of as a kind of insurance policy. No one would pay $200,000 for an insurance policy if the payoff in case of an accident were only $20, yet this is the kind of ratio of cost to payoff that the Kyoto Protocol involves.
On the other side of the equation, a good example of when we might use the PP involves local extinction. We have fairly good scientific understanding that removing a top predator from a local ecosystem badly screws things up. Kill the mountain lions, and the deer go wild, then the plants are overgrazed, then the ground erodes, insect populations are unbalanced, and so on down the line.
Now, if we are looking at a novel ecosystem that has not been scientifically studied, we do not have full scientific certainty that removing the top predator will actually cause serious or irreversible damage to the ecosystem. However, if there is a cost-effective method to avoid removing the top predator, the PP says that we should do so. It fulfils the three requirements of the PP — there is a threat of serious damage, we have partial scientific certainty, and a cost-effective solution exists, so we should act.
Regarding the proposal that we pump CO2 into the ocean (or take any action regarding CO2), while there is at least a theoretical threat of serious or irreversible damage, there is no partial scientific certainty, nor are there any cost-effective solutions in hand, including pumping the oceans full of CO2. At the moment, therefore, no action regarding CO2 is justified by the Precautionary Principle.
So no, action on CO2 is not justified at this time. Instead, we should support what is called the “no-regrets” option.
This is to take actions that will be productive regardless of whether or not CO2 is the problem. What most people forget is that all of the threatened catastrophes which the models say will come from CO2 are with us today — droughts, floods, disease, famine, rising sea levels, the seven Biblical plagues — we have all of them right now.
So the no-regrets option is to research and advance the knowledge and train people around the world in how to avoid the poverty and pain and suffering which comes from the vagaries of the climate today. That will pay huge dividends whether or not CO2 is the boogeyman. Doing that is the best way of preparing for an unknown tomorrow.
And that is what we should be advocating. The “no-regrets” policy is the only policy that makes sense in the face of the uncertainty that WAG correctly points out above, which is very real. We are not certain about climate science … but that doesn’t mean we should waste billions of dollars on any possible solution.
Willis,
Correct me if I am wrong, but a model was used to determine the sensitivity of methane; it was not derived from first principles? This is circular reasoning, as you will always get the value assumed in the model. CC has forced me to dust off my “Radiative Heat Transfer” by Hottel. All the chem engineers out there will know this. When I compare CO2 and methane emissivities at the same value of partial pressure times path length, methane is about 1/4 of CO2. At equilibrium, emissivity equals absorptivity, the key parameter in describing the “greenhouse” effect. As the methane concentration is no more than 1/100 of CO2’s, methane contributes at most 1/400 of what CO2 does to atmospheric absorption, i.e. SFA. It is not hard to create a greenhouse model in Excel using the principles in the book. The atmosphere is just layers of absorbing/emitting media.
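The layers-of-absorbing/emitting-media picture above can be sketched in a few lines of Python as easily as in Excel. This is a toy gray-atmosphere model, not the Hottel treatment: my own simplifying assumptions are that each of N layers is opaque in the infrared, transparent to sunlight, and in radiative equilibrium, which gives a surface flux of (N + 1) times the absorbed solar flux.

```python
# Toy N-layer "gray" radiative-equilibrium atmosphere.
# Assumptions (illustrative, not from the book): each layer is opaque in
# the infrared, transparent to sunlight, and emits as a blackbody.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def surface_temperature(absorbed_solar_wm2, n_layers):
    """Surface temperature (K) under n fully absorbing layers.

    In equilibrium each extra opaque layer adds one unit of the absorbed
    solar flux to the surface emission: F_surface = (n + 1) * F_solar.
    """
    f_surface = (n_layers + 1) * absorbed_solar_wm2
    return (f_surface / SIGMA) ** 0.25

# Earth absorbs roughly 240 W/m^2 after albedo.
for n in range(3):
    print(n, "layers ->", round(surface_temperature(240.0, n), 1), "K")
```

With zero layers this gives the familiar ~255 K effective temperature; one opaque layer already lifts the surface above ~300 K, which is why real models need partially transparent layers.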
THE FALLACY OF THE “EXCLUDED MIDDLE”
This is a common mistake made by both sides in the debate. I’ll use WAG’s statement as an example, because it is to hand, but it is quite common from AGW opponents as well as AGW supporters:
The fallacy of the excluded middle is the assumption that we have listed all of the possible choices. In this case, they are listed as “pass legislation” or “do more of the same.”
In the real world, however, there are often a plethora of other options. In this case they might include
• put money into climate research to reduce the uncertainty.
• pursue a “no-regrets” path of actions that will have benefit whether or not CO2 is the secret global thermostat.
• put on a public education campaign to encourage the “three R’s” (reduce, re-use, recycle).
• fund research into alternate sources of energy.
and so on.
I find myself falling into this fallacy all too often, and to fight it I have my “rule of two”. This is that whenever I list only two outcomes (e.g. “Are my writing skills that bad, or are you just pretending you don’t understand”) I am wrong. In general there’s a host of options and explanations for any real-world situation, and I ignore them at my peril.
Coincidence? The leveling off of methane levels in the atmosphere seems to be in parallel with the (at least temporary) halt to temperature increase over the past decade. This makes sense, since the melting of permafrost is the most likely culprit in the past methane rise.
May we then conclude that, while we don’t know the effect of methane on global temperature, we can be certain that rising temperatures will increase methane levels? Somehow, I don’t think that’s the result AGWers were looking for….
Oliver – I admit, my initial remark was rather tasteless. Willis has been respectful, and I should not have said it.
But there’s still an interesting debate here over uncertainty and how much certainty we need before taking an action. Jim writes that “doing something carries a huge cost,” and therefore “The science MUST be certain!!”
Barring the point that there’s no such thing as 100% certainty (any businessman will tell you that decisions must be made in uncertainty), I can turn this argument around and point out that the costs of climate legislation are also uncertain (and also rely on models). The Congressional Budget Office estimates that with cap-and-trade, US GDP will grow about 0.03%-0.09% per year more slowly. The National Association of Manufacturers’ worst-case estimate is 0.15% slower growth. And like past cost analyses of environmental regulations, neither of these models include the effects of technology – breakthroughs spurred by higher carbon prices that allow us to produce the same level of output with less energy (e.g. efficient manufacturing processes, cheaper solar panels, etc.). So models don’t agree on the exact costs of cap-and-trade, but they do agree that such costs will be relatively modest. And the dimensions of uncertainty – namely, how technology will respond to higher carbon prices – gives us reason to believe that the costs will be smaller than predicted. Why should we trust economic models and not climate models?
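Those per-year figures compound, so a quick sketch shows the cumulative effect (the 40-year horizon here is purely illustrative, not from the CBO or NAM):

```python
# Back-of-envelope: cumulative GDP shortfall from a small annual growth drag.
# The 0.03%-0.15% per-year figures are the ranges quoted above; the
# 40-year horizon is an illustrative assumption.

def gdp_shortfall(annual_drag, years):
    """Fraction of GDP lost after `years` of growth slower by `annual_drag`."""
    return 1.0 - (1.0 - annual_drag) ** years

for drag in (0.0003, 0.0009, 0.0015):
    print(f"{drag:.2%}/yr over 40 yr -> {gdp_shortfall(drag, 40):.1%} lower GDP")
```

Even the worst-case 0.15%/yr drag compounds to only about a 6% lower GDP after four decades, which is the sense in which these costs are “relatively modest.”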
The costs of global warming are also uncertain. However, there’s much more variation in these estimates. They range from modest (e.g. money spent building sea walls) to catastrophic (permanent droughts). And the dimensions of uncertainty – namely, feedbacks – have generally pointed to greater-than-expected warming as more is learned (the original post referred to an estimate of greater-than-expected sensitivity to methane). If humans lack the foresight to predict how interventions in the market affect national wealth, why would anyone trust our ability to engineer the workings of Nature without unintended consequences?
The point is, while the costs of climate legislation fall within well-understood bounds (higher energy costs and short-term job losses), the costs of climate change range from modest to worse-than-expected to catastrophic to as-yet-unimagined. The economic crisis has taught us to ignore these long-tail probabilities at our own risk.
And I still haven’t seen a defense of the logic chain that I originally criticized:
“new study finds sensitivity to methane is higher than previously estimated –> scientists disagree over climate sensitivity –> the science is not settled –> we can’t be certain enough that we are causing climate change to justify action”
You don’t need to be certain it’s 25 or 30 degrees outside (F) to know you should put on a coat, and whether forcing due to methane is 0.5 or 1.0 W/m^2 does not reduce our certainty that adding more greenhouse gases will warm the planet.
Jörg Zimmermann (04:27:34), you raise a very interesting issue:
If there were no difference between the approaches, the paper would be meaningless. The difference is this:
According to the IPCC, the rise in methane since 1750 has been the cause of a temperature rise of 0.48 W/m^2 × 0.8°C/(W/m^2) ≈ 0.4°C.
According to the new study, the rise in methane since 1750 has been the cause of a temperature rise of 0.99 W/m^2 × 0.8°C/(W/m^2) ≈ 0.8°C.
This is a significant difference, which (if one believes that GHGs are governing the earth’s temperature) has significant policy implications. Or as the Shindell study says,
In other words, indeed there is a difference between the two approaches.
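The arithmetic above is simple enough to check in code. The 0.8°C per W/m^2 sensitivity is the value assumed in the comparison, not an established constant:

```python
# Check of the forcing-to-temperature arithmetic above.
CLIMATE_SENSITIVITY = 0.8  # deg C per (W/m^2), the value assumed above

def warming(forcing_wm2):
    """Temperature change implied by a radiative forcing."""
    return forcing_wm2 * CLIMATE_SENSITIVITY

print(round(warming(0.48), 2))  # IPCC methane forcing -> 0.38 C
print(round(warming(0.99), 2))  # Shindell methane forcing -> 0.79 C
```

The doubling of the attributed forcing doubles the attributed warming, which is the policy-relevant difference at issue.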
Willis Eschenbach, very good article, and even better follow-ups (civility, uncertainty, excluded middle); all good things to keep in mind. The bit about the precautionary principle is a keeper.
Uncertainty, to me, is the greatest weakness of the alarmist view. When dealing with something as complex as the earth’s climate system, a model that’s only partly right is wrong, period. Yet the IPCC uses an “ensemble” of models–only one of which at most can be “right”–to predict (excuse me, “project”) the state of an intricate system decades into the future. And these projections are used to justify doing certain harm now to prevent uncertain damage at some constantly-receding time in the future. If that’s not a good definition of insanity…
WAG (22:39:12) : Thank you for the lesson in denialism. I’ve always wondered about the kind of mindset that would reject science and the scientific method. Your stream of consciousness narrative gives me at least some insight into how a denialist such as yourself operates. This will be invaluable in trying to counteract denialists such as yourself and RC.
WAG,
I appreciate the civil tone of your post, however I disagree with most of what you’ve said.
As regards the logic chain that you find flawed, I would make the point that the credibility of an advocate is germane to an assessment of the advice given. In this case, putatively, not only were scientists wrong about CH4, they were therefore wrong about CO2 as well, unless, instead, they were wrong about the total forcings.
The wider context of this dispute must be familiar to you; we don’t all agree on any aspect but, of course, the smaller the problem, the less urgent the remedy. I suspect there is absolutely no problem at all, but time will tell.
In the meantime, it isn’t certain that adding GHG’s will warm the Earth, it isn’t certain that a warmer Earth is a catastrophe, but, perversely, it’s not certain that expending enormous amounts of wealth and energy on a non-problem would be an economic disaster. Most of the global economy is concerned with the pursuit of luxury to some degree, since all we really need to perpetuate our species is enough to eat and a secure place to hang out. Economic activity happens because people don’t have what they want, not what they need. If they want a warm fuzzy sense of saving a planet from themselves and they’re prepared to pay good coin for it, then let them fill their boots. As long as it’s not too much of my coin.
Sorry, Willis Eschenbach, you still missed the point. In the IPCC radiative forcings you see ozone and stratospheric water listed beside methane. In the approach of Shindell et al., these compounds are no longer viewed separately: methane is the primary pollutant, and ozone and stratospheric water are secondary pollutants. In the emissions-based approach, the radiative forcings due to the secondary pollutants are put on top of the radiative forcings of the primary pollutants, so you will not see ozone or stratospheric water listed separately. If you add the radiative forcings of methane and the larger parts of ozone and stratospheric water from the abundance-based approach, and calculate the error bars accordingly, you should come closer to the emissions-based approach. Simply put, you are comparing apples and oranges.
Jörg Zimmermann (04:27:34), a further thought on the question.
If the initial estimate of the uncertainty of the forcing of methane and the other forcings had been correct, the uncertainty intervals would have overlapped after the “effects of secondary greenhouse gases are redistributed to the primary greenhouse gases”.
The fact that they were not overlapping shows that the uncertainty estimates were too narrow.
The uncertainty, of course, did not allow for our lack of understanding of the climate.
In addition, the uncertainty (both the old and the new) is obtained in a most curious way — from the variation in the results of different climate models.
However, this is not a measure of the uncertainty of our estimates of the underlying variable. It is merely a measure of the spread of the models, which may simply represent a commonality of shared incorrect assumptions …
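That point can be illustrated with a toy simulation (all numbers here are hypothetical): if every model in an ensemble shares a common wrong assumption, the spread of their outputs is small while the actual error stays large.

```python
# Toy illustration: ensemble spread understates uncertainty when models
# share a common bias. All numbers are hypothetical.
import random
import statistics

random.seed(42)

TRUE_VALUE = 1.0     # the (unknown) real quantity
SHARED_BIAS = 0.5    # error common to every model, e.g. a shared assumption
MODEL_NOISE = 0.05   # independent scatter between models

models = [TRUE_VALUE + SHARED_BIAS + random.gauss(0, MODEL_NOISE)
          for _ in range(20)]

spread = statistics.stdev(models)                         # the reported "uncertainty"
actual_error = abs(statistics.mean(models) - TRUE_VALUE)  # what it misses

print(f"ensemble spread: {spread:.3f}")
print(f"actual error:    {actual_error:.3f}")
```

The spread comes out near 0.05 while the actual error stays near 0.5: inter-model agreement measures only the independent scatter, never the shared bias.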
so CO2 is 24.14 times as effective in warming the earth as is stratospheric water.
Is that stratospheric CO2 versus stratospheric water? And what is the local ambient temperature at that stratospheric zone; in other words, is the stratospheric H2O in gaseous, liquid, or solid form?
What on earth is the significance of stratospheric trace gases, since it is generally considered that the major GHG effect of, say, CO2 is in just the first few metres at ground level? Presumably the same goes for water vapor.
My last question would be: are those radiative forcing Watts/m^2 values the same all over the globe, or is there, say, a latitudinal variation? It just doesn’t seem rational that there is as much W/m^2 available to trap by CO2 (or methane or stratospheric water) at the poles as there is over, say, a tropical African or Middle East desert.
I hate to even ask how many stations there are all over the globe that have the necessary equipment to measure the local radiative forcing W/m^2 contributed by each of the major GHGs. Something tells me that without proper mapping of the local values, it would not be possible to compute a (credible) mean global value.
“”” WAG (22:39:12) :
Bill Tuttle –
Ah, you’re on to something. You’re right, the author never *claimed* that the sensitivity to methane was zero–he *implied* it. He pointed out that there were two different estimates of the climate’s sensitivity to methane, and that therefore claims of certainty are “false” and “not valid,” and that “much is not understood” about climate. “””
“””” You’re right, the author never *claimed* that the sensitivity to methane was zero–he *implied* it. “”””
Well not exactly; maybe YOU inferred it !
Since you alone raised the issue, statistics suggests that that is the more likely explanation.
Willis Eschenbach: I haven’t been through the other comments. Sorry if this is a duplicate comment or if it belongs on an earlier thread.
As I was reading your post, it occurred to me that Shindell et al. were trying to rearrange greenhouse gas attribution in an attempt to explain the recent flattening of global temperatures and OHC. Since methane was relatively flat from 1998 to 2006…
http://www.esrl.noaa.gov/gmd/aggi/aggi_2009.fig2.png
…it appears they’re trying to make it the dominant greenhouse gas.
Mr. Eschenbach,
I agree with the comments of Zimmerman that you are comparing apples to oranges. The values for “Abundance-based” and “emissions-based” are coming out of the same models; they are different from each other because they include different things: they are defined differently. The total RF from all forcings is about the same in the two cases; they are just distributed differently. In the “emissions-based” metric, methane gets credit for effects it has on atmospheric chemistry.
Note that tropospheric ozone does not have its own bar on the emissions-based chart. The point is that IPCC reports assign a RF value to tropospheric ozone, but ozone is not directly emitted by industry; it is formed by reactions involving CO, VOCs including methane, and NOx. Roughly speaking, the emissions-based view thus divides up the contribution of tropospheric ozone among the components that react to form it. Hence, methane gets a RF boost. This is to help guide policy, as perhaps the reader may not know that methane, CO, NOx and other VOC emissions lead to the production of low-level ozone. This is not a reflection of the uncertainty on the role of methane; nor does it signal any sort of big mistake in past RF values for methane.
If you want to compare apples with apples, then compare the FAR value with the ‘abundance-based’ value, as those are defined in the same way. You’ll see them to be similar.
Bob Tisdale (14:03:12), good to hear from you. You say:
Having been accused of having nefarious motives quite often (including in this thread), I prefer to avoid speculation on the motives of the scientists involved. While what you say may be true, it also may be very untrue, and unfair to the scientists involved. I try to ascribe good motives to everyone unless there is strong evidence otherwise.
George E. Smith (13:40:40) :
Since you didn’t ask, I believe that the answer is “none”. As far as I know, given our current measuring devices, it can’t be measured from the ground, or from the air, or anywhere. It can only be estimated from a computer model.
I’d be glad to be proven wrong on this one, however.
Does W/m^2 have any relation to atmospheric pressure? Would the value of a forcing fluctuate if atmospheric pressure changed?
Willis: The closest thing to a directly measurable radiative forcing is that due to changes in solar irradiance. Satellites can measure the change in incoming radiation; you then account for whatever fraction is reflected, and for the basic fact that the earth is round. Even here, you need a model to go into further detail.
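That accounting can be written out explicitly. The albedo and the geometric factor of 4 (a sphere’s surface area versus its cross-section) are the standard textbook values; the ΔTSI figure below is purely illustrative:

```python
# Converting a change in total solar irradiance (TSI) into a global-mean
# radiative forcing: scale by (1 - albedo) for the reflected fraction,
# divide by 4 because the Earth intercepts sunlight on a disk but
# presents a spherical surface.

ALBEDO = 0.3  # standard planetary albedo

def solar_forcing(delta_tsi_wm2):
    """Global-mean forcing (W/m^2) from a change in TSI."""
    return delta_tsi_wm2 * (1.0 - ALBEDO) / 4.0

# Illustrative: a ~1 W/m^2 solar-cycle swing in TSI
print(round(solar_forcing(1.0), 3))  # -> 0.175
```

Even this "direct" case leans on a modeled albedo, which is the sense in which a model is still needed for the detail.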
Bob Tisdale: I think your hunch is on the wrong track. If a modeler wanted to crank down the amount of warming in the model, he’d reduce the total forcing somehow. That’s not what’s happening here. The total forcing is about the same; the amount due to CO2 is about the same. The authors are just relating the different forcings to emissions; forcings are normally defined in terms of the composition of the atmosphere, not the composition of the actual emissions. Due to chemical reactions of the emitted compounds in the atmosphere, these are not the same thing. Hence the paper.