Sense and Sensitivity II – the sequel

By Christopher Monckton of Brenchley

Joel Shore, who has been questioning my climate-sensitivity calculations just as a good skeptic should, has kindly provided at my request a reference to a paper by Dr. Andrew Lacis and others at the Goddard Institute for Space Studies. The paper supports his assertion that CO2 exercises about 75% of the radiative forcing from all greenhouse gases: water vapor, the most significant greenhouse gas because of its high concentration in the atmosphere, condenses out rapidly, while the non-condensing gases, such as CO2, linger for years.

Dr. Lacis writes in a commentary on his paper: “While the non-condensing greenhouse gases account for only 25% of the total greenhouse effect, it is these non-condensing GHGs that actually control the strength of the terrestrial greenhouse effect, since the water vapor and cloud feedback contributions are not self-sustaining and, as such, only provide amplification.”

Dr. Lacis’ argument, then, is that the radiative forcing from water vapor should be treated as a feedback, because if all greenhouse gases were removed from the atmosphere most of the water vapor now in the atmosphere would condense or precipitate out within ten years, and within 50 years global temperatures would be some 21 K colder than the present.

I have many concerns about this paper, which – for instance – takes no account of the fact that evaporation from the surface occurs at thrice the rate imagined by computer models (Wentz et al., 2007). So there would be a good deal more water vapor in the atmosphere even without greenhouse gases than the models assume.

The paper also says the atmospheric residence time of CO2 is “measured in thousands of years”. Even the IPCC, prone to exaggeration as it is, puts the residence time at 50-200 years. On notice I can cite three dozen papers dating back to Revelle in the 1950s that find the CO2 residence time to be just seven years, though Professor Lindzen says that for various reasons 40 years is a good central estimate.

Furthermore, it is questionable whether the nakedly political paragraph with which the paper ends should have been included in what is supposed to be an impartial scientific analysis. To assert without evidence that beyond 300-350 ppmv CO2 concentration “dangerous anthropogenic interference in the climate system would exceed the 25% risk tolerance for impending degradation of land and ocean ecosystems, sea-level rise [at just 2 inches per century over the past eight years, according to Envisat], and inevitable disruption of socioeconomic and food-producing infrastructure” is not merely unsupported and accordingly unscientific: it is rankly political.

One realizes that many of the scientists at GISS belong to a particular political faction, and that at least one of them used to make regular and substantial donations to Al Gore’s re-election campaigns, but learned journals are not the place for über-Left politics.

My chief concern, though, is that the central argument in the paper is in effect a petitio principii – a circular and accordingly invalid argument in which one of the premises – that feedbacks are strongly net-positive, greatly amplifying the warming triggered by a radiative forcing – is also the conclusion.

The paper turns out to be based not on measurement, observation and the application of established theory to the results but – you guessed it – on playing with a notorious computer model of the climate: GISS ModelE. The model, in effect, assumes very large net-positive feedbacks for which there is precious little reliable empirical or theoretical evidence.

At the time when Dr. Lacis’ paper was written, ModelE contained “flux adjustments” (in plain English, fudge-factors) amounting to some 50 Watts per square meter, many times the magnitude of the rather small forcing that we are capable of exerting on the climate.

Dr. Lacis says ModelE is rooted in well-understood physical processes. If that were so, one would not expect such large fudge-factors (mentioned and quantified in the model’s operating manual) to be necessary.

Also, one would expect the predictive capacity of this and other models to be a great deal more successful than it has proven to be. As the formidable Dr. John Christy of the University of Alabama in Huntsville has written recently, in the satellite era (most of which in any event coincides with the natural warming phase of the Pacific Decadal Oscillation) temperatures have been rising at between a quarter and a half of the rate that models such as ModelE have been predicting.

It will be helpful to introduce a little elementary climatological physics at this point – nothing too difficult (otherwise I wouldn’t understand it). I propose to apply the IPCC/GISS central estimates of forcing, feedbacks, and warming to what has actually been observed or inferred in the period since 1750.

Let us start with the forcings. Dr. Blasing and his colleagues at the Carbon Dioxide Information Analysis Center have recently determined that total greenhouse-gas forcings since 1750 amount to 3.1 Watts per square meter.

From this value, using the IPCC’s table of forcings, we must deduct 35%, or 1.1 Watts per square meter, to allow for negative anthropogenic forcings, notably the particles of soot that act as tiny parasols sheltering us from the Sun. Net anthropogenic forcings since 1750, therefore, are 2 Watts per square meter.

We multiply 2 Watts per square meter by the pre-feedback climate-sensitivity parameter 0.313 Kelvin per Watt per square meter, so as to obtain warming of 0.6 K before any feedbacks have operated.

Next, we apply the IPCC’s implicit centennial-scale feedback factor 1.6 (not the equilibrium factor 2.8, because equilibrium is thousands of years off: Solomon et al., 2009).

Accordingly, after all feedbacks over the period have operated, a central estimate of the warming predicted by ModelE and other models favored by the IPCC is 1.0 K.

We verify that the centennial-scale feedback factor 1.6, implicit rather than explicit (like so much else) in the IPCC’s reports, is appropriate by noting that 1 K of warming divided by 2 Watts per square meter of original forcing is 0.5 Kelvin per Watt per square meter, which is indeed the transient-sensitivity parameter for centennial-scale analyses that is implicit (again, not explicit: it’s almost as though They don’t want us to check stuff) in each of the IPCC’s six CO2 emissions scenarios and also in their mean.
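For readers who want to check the arithmetic, the whole chain of steps above can be reproduced in a few lines of Python (a sketch using only the round figures quoted in the text; the variable names and the script itself are mine, not the IPCC's or GISS'):

```python
# Figures as quoted in the text (Blasing et al.; IPCC tables)
total_ghg_forcing = 3.1                        # W/m2, all greenhouse gases since 1750
net_forcing = total_ghg_forcing * (1 - 0.35)   # deduct ~35% negative anthropogenic forcings -> ~2.0 W/m2

pre_feedback_param = 0.313                     # K per W/m2, pre-feedback sensitivity parameter
pre_feedback_warming = net_forcing * pre_feedback_param      # ~0.63 K before feedbacks

centennial_feedback_factor = 1.6               # implicit centennial-scale feedback factor
warming = pre_feedback_warming * centennial_feedback_factor  # ~1.0 K after feedbacks

# Verification step from the text: the implied transient-sensitivity parameter
transient_param = warming / net_forcing        # ~0.5 K per W/m2
```

Running the sketch gives roughly 2.0 W/m2, 0.63 K, 1.0 K and 0.5 K per W/m2 respectively, matching the round numbers quoted above.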

Dr. Lacis’ paper is saying, in effect, that 80% of the forcing from all greenhouse gases is attributable to CO2. The IPCC’s current implicit central estimate, again in all six scenarios and in their mean, is in the same ballpark, at 70%.

However, applying the IPCC’s own forcing function for CO2 – 5.35 times the natural logarithm of (390 ppmv / 280 ppmv), respectively the perturbed and unperturbed concentrations of CO2 over the period of study – gives 1.8 Watts per square meter for CO2 alone.

Multiply this by the IPCC’s transient-sensitivity parameter 0.5 and one gets 0.9 K – which, however, is the whole of the actual warming that has occurred since 1750. What, then, of the 20-30% of warming supposedly contributed by the other greenhouse gases? That is an indication that the CO2 forcing may have been somewhat exaggerated.
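This CO2-only cross-check can be reproduced in a couple of lines of Python (a sketch using only the concentrations and parameter quoted in the text; the script is mine, not the IPCC's):

```python
import math

# IPCC forcing function for CO2: 5.35 ln(C/C0)
co2_forcing = 5.35 * math.log(390.0 / 280.0)   # ~1.77 W/m2 since 1750
transient_param = 0.5                          # K per W/m2, centennial-scale
co2_warming = co2_forcing * transient_param    # ~0.89 K, roughly all observed warming since 1750
```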

The IPCC, in its 2007 report, says no more than that between half and all of the warming observed since 1950 (and, in effect, since 1750) is attributable to us. Therefore, 0.45-0.9 K of observed warming is attributable to us. Even taking the higher value, if we use the IPCC/GISS parameter values and methods, CO2 accounts not for 70-80% of observed warming over the period but for all of it.

In response to points like this, the usual, tired deus ex machina winched creakingly onstage by the IPCC’s perhaps too-unquestioning adherents is that the missing warming is playing hide-and-seek with us, lurking furtively at the bottom of the oceans waiting to pounce. However, elementary thermodynamic considerations indicate that such notions must be nonsense.

None of this tells us how big feedbacks really are – merely what the IPCC imagines them to be. Unless one posits very high net-positive feedbacks, one cannot create a climate problem. Indeed, even with the unrealistically high feedbacks imagined by the IPCC, there is not a climate problem at all, as I shall now demonstrate.

Though the IPCC at last makes explicit its estimate of the equilibrium climate sensitivity parameter (albeit that it is in a confused footnote on page 631 of the 2007 report), it is not explicit about the transient-sensitivity parameter – and it is the latter, not the former, that will be policy-relevant over the next few centuries.

So, even though we have reason to suspect there is a not insignificant exaggeration of predicted warming inherent in the IPCC’s predictions (or “projections”, as it coyly calls them), and a still greater exaggeration in GISS ModelE, let us apply their central estimates – without argument at this stage – to what is foreseeable this century.

The IPCC tells us that each of the six emissions scenarios is of equal validity. That means we may legitimately average them. Let us do so. Then the CO2 concentration in 2100 will be 712 ppmv compared with 392 ppmv today. So the CO2 forcing will be 5.35 ln(712/392), or 3.2 Watts per square meter, which we divide by 0.75 (the average of the GISS and IPCC estimates of the proportion of total greenhouse forcings represented by CO2) to allow for the other greenhouse gases, making 4.25 Watts per square meter.

We reduce this value by about 35% to allow for negative forcings from our soot-parasols etc., giving 2.75 Watts per square meter of net anthropogenic forcings between now and 2100.

Next, multiply by the centennial-scale transient-sensitivity parameter 0.5 Kelvin per Watt per square meter. This gives us a reasonable central estimate of the warming to be expected by 2100 if we follow the IPCC’s and GISS’ methods and values every step of the way. And the warming we should expect this century if we do things their way? Well, it’s not quite 1.4 K.
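The 2100 projection just described can likewise be reproduced in a few lines of Python (a sketch of the stated arithmetic, using only the figures in the text; the variable names are mine):

```python
import math

co2_2100 = 712.0   # ppmv, mean of the six IPCC emissions scenarios
co2_now = 392.0    # ppmv, today's concentration

co2_forcing = 5.35 * math.log(co2_2100 / co2_now)   # ~3.19 W/m2
all_ghg_forcing = co2_forcing / 0.75                # CO2 taken as ~75% of GHG forcing -> ~4.26 W/m2
net_forcing = all_ghg_forcing * (1 - 0.35)          # deduct ~35% negative forcings -> ~2.77 W/m2
warming_2100 = net_forcing * 0.5                    # transient parameter -> ~1.38 K
```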

Now we go back to that discrepancy we noted before. The IPCC says that between half and all of the warming since 1950 was our fault, and its methods and parameter values seem to give an exaggeration of some 20-30% even if we assume that all of the warming since 1950 was down to us, and a very much greater exaggeration if only half of the warming was ours.

Allowing for this exaggeration knocks back this century’s anthropogenic warming to not much more than 1 K – about a third of the 3-4 K that we normally hear so much about.
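In figures, that adjustment amounts to applying the 20-30% correction to the not-quite-1.4 K figure (a one-line sketch of the stated step; the script is mine):

```python
projected = 1.4                          # K by 2100, using the IPCC/GISS chain of values
# Apply the 20-30% hindcast exaggeration inferred in the text
adjusted_high = projected * (1 - 0.20)   # ~1.1 K
adjusted_low = projected * (1 - 0.30)    # ~1.0 K
```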

Note how artfully this tripling of the true rate of warming has been achieved, by a series of little exaggerations which, when taken together, amount to a whopper. And it is quite difficult to spot the exaggerations, not only because most of them are not all that great but also because so few of the necessary parameter values to allow anyone to spot what is going on are explicitly stated in the IPCC’s reports.

The Stern Report in 2006 took the IPCC’s central estimate of 3 K warming over the 21st century and said that the cost of not preventing that warming would be 3% of 21st-century GDP. But GDP tends to grow at 3% a year, so, even if the IPCC were right about 3 K of warming, all we’d lose over the whole century, even on Stern’s much-exaggerated costings (he has been roundly criticized for them even in the journal of which he is an editor, World Economics), would be the equivalent of the GDP growth that might be expected to occur in the year 2100 alone. That is all.

To make matters worse, Stern used an artificially low discount rate for inter-generational cost comparison which his office told me at the time was 0.1%. When he was taken apart in the peer-reviewed economic journals for using so low a discount rate, he said the economists who had criticized him were “confused”, and that he had really used 1.4%. William Nordhaus, who has written many reviewed articles critical of Stern, says that it is quite impossible to verify or to replicate any of Stern’s work because so little of the methodology is explicit and available. And how often have we heard that before? It is almost as if They don’t want us to check stuff.

The absolute minimum commercially-appropriate discount rate is equivalent to the minimum real rate of return on capital – i.e. 5%. Let us oblige Stern by assuming that he had used a 1.4% discount rate and not the 0.1% that his office told me of.

Even if the IPCC is right to try to maintain – contrary to the analysis above, indicating 1 K manmade warming this century – that we shall see 3 K warming by 2100 (progress in the first one-ninth of the century: 0 K), the cost of doing nothing about it, discounted at 5% rather than 1.4%, comes down from Stern’s 3% to just 0.5% of global 21st-century GDP.
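How much the discount rate alone matters over a century can be illustrated with a toy calculation (my own sketch, not Stern's model; the text does not specify the time-profile of damages, on which the exact 0.5% figure depends):

```python
def discount_factor(rate: float, years: int) -> float:
    """Present value today of one unit of cost incurred `years` hence."""
    return (1.0 + rate) ** -years

# A cost falling due in 2100, valued today:
low_rate_value = discount_factor(0.014, 100)   # ~0.25 at Stern's claimed 1.4%
high_rate_value = discount_factor(0.05, 100)   # ~0.008 at a commercial 5%
```

At 5% a year-2100 cost retains less than 1% of its face value, against roughly 25% at 1.4%, which is why the choice of discount rate dominates century-scale cost comparisons.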

No surprise, then, that the cost of forestalling 3 K of warming would be at least an order of magnitude greater than the cost of the climate-related damage that might arise if we just did nothing and adapted, as our species does so well.

But if the warming we cause turns out to be just 1 K by 2100, then on most analyses that gentle warming will be not merely harmless but also beneficial. There will be no net cost at all. Far from it: there will be a net economic benefit.

And that, in a nutshell, is why governments should shut down the UNFCCC and the IPCC, cut climate funding by at least nine-tenths, de-fund all but two or three computer models of the climate, and get back to addressing the real problems of the world – such as the impending energy shortage in Britain and the US because the climate-extremists and their artful nonsense have fatally delayed the building of new coal-fired and nuclear power stations that are now urgently needed.

Time to get back down to Earth and use our fossil fuels, shale gas and all, to give electricity to the billions that don’t have it: for that is the fastest way to lift them out of poverty and, in so doing, painlessly to stabilize the world’s population. That would bring real environmental benefits.

And now you know why building many more power stations won’t hurt the climate, and why – even if there were a real risk of 3 K warming this century – it would be many times more cost-effective to adapt to it than to try to stop it.

As they say at Lloyd’s of London, “If the cost of the premium exceeds the cost of the risk, don’t insure.” And even that apophthegm presupposes that there is a risk – which in this instance there isn’t.

The Viscount Monckton of Brenchley

===========================================================

Part 1 of Sense and Sensitivity can be found here

January 22, 2012 11:30 pm

But, Terry, it is not in relation to climate models that I ponder the parameters; it is in relation to the climate system itself. I am not interested in any of the models when I consider the data to be relatively nebulous rather than scientifically suitable for important decision-making.

Myrrh
January 23, 2012 4:04 am

Terry Oldberg says:
January 22, 2012 at 9:32 pm
Myrrh (Jan. 22, 2012 at 4:12 pm):
Contrary to what the IPCC says, a “projection” is not a “prediction” under any circumstances. A “prediction” is an extrapolation to the outcome of a statistical event. The complete set of statistical events forms a “statistical population”, but the IPCC has yet to identify the statistical population underlying its conjectures.
Ah yes, back to the reality of it. Another example of how they take real science terms out of context and I’ve seen this before, real scientists think they’re using the terms correctly and don’t find out otherwise unless they investigate it for themselves as you’ve done.
So now there’s a two-pronged answer to those who try to downplay the failure of IPCC ‘predictions’ by claiming it doesn’t make them: it makes them in its own nomenclature as ‘most likely’, but this term turns out to denote a projection, mere “opinions of scientists transformed by mathematics and obscured by complex writing”, and not a prediction or forecast as understood in the real science of statistics, since no reference is made to any statistical events or populations from which predictions could be extrapolated and against which they could be tested. As you’ve put it here:
“Statistical analyses require inputs of substance and mathematical definable value and accuracy. None of the climate parameters meet this criteria.”
But isn’t this what Ken is saying is lacking to start with?
I can follow your explanations in English, just about, but without facility in mathematics I would have to expend more effort than I have time for to explore further. There are many here who wouldn’t have this problem, though, and many, I imagine, who would enjoy following your reasoning and examples as long as there was enough English, as you’ve given here. Have you considered offering this to Anthony as a guest post?
With Mike Jonas here, what concerns me most is the way they have manipulated real science to produce the scenarios of doom laden with guilt and illogical demonisation of our natural world contrary to what even a glance at a simple graph of substance such as Vostok conveys..
..how much time do you think we have to introduce this generation educated in AGW fictional fisics to the world of real science about it through such arguments before everyone loses interest in the subject..?

Reply to  Myrrh
January 23, 2012 9:07 am

Myrrh (Jan.23, 2012 at 4:04 am):
Thanks for sharing your ideas. An approach that I’ve found fruitful is to examine the methodology of the IPCC’s inquiry into AGW in light of logic. There is an account of my work at http://judithcurry.com/2011/02/15/the-principles-of-reasoning-part-iii-logic-and-climatology/
I’ve been able to show that this methodology is neither logical nor scientific. However, that this is true is obscured by the ambiguity of reference by terms in the language of climatology to the associated ideas. The ambiguity of reference leads the unwary to false or unproved conclusions though the use of negations of the law of non-contradiction as the premises to specious arguments.

Myrrh
January 23, 2012 3:08 pm

Terry Oldberg:
January 23, 2012 at 9:07 am
Thanks for the link to your work on http://judithcurry.com/2011/02/15/the-principles-of-reasoning-part-iii-logic-and-climatology/
I’ve had a quick glance and I’ll certainly make time to read it.
Just to finalise my own exploration and as well to leave a record of it here, I checked out the IPCC definitions of the standard likelies. Found it together with examples from the 4th and in guidance notes for the 5th:
FOURTH ASSESSMENT REPORT
At a glance: IPCC report
Global climate change is “very likely” to have been human-induced, the Intergovernmental Panel on Climate Change (IPCC) has concluded.
•This is the first of four reports that will be published in 2007 by the IPCC as part of its Fourth Assessment Report (4AR)
KEY FINDINGS
•It is very likely that human activities are causing global warming
•Probable temperature rise by the end of the century will be between 1.8C and 4C (3.2-7.2F)
•Possible temperature rise by the end of the century ranges between 1.1C and 6.4C (2-11.5F)
•Sea levels are likely to rise by 28-43cm
•Arctic summer sea ice is likely to disappear in second half of century
•It is very likely that parts of the world will see an increase in the number of heatwaves
•Climate change is likely to lead to increased intensity of tropical storms
IPCC REPORT DEFINITIONS
Probability of occurrence:
virtually certain – more than 99%
extremely likely – more than 95%
very likely – more than 90%
likely – more than 60%
more likely than not – more than 50%
unlikely – less than 33%
very unlikely – less than 10%
extremely unlikely – less than 5%
(Source: IPCC)
http://news.bbc.co.uk/2/hi/uk_news/6324029.stm
===========
http://www.ipcc.ch/pdf/supporting-material/uncertainty-guidance-note.pdf
Guidance Note for Lead Authors of the
IPCC Fifth Assessment Report on
Consistent Treatment of Uncertainties
IPCC Cross-Working Group Meeting on Consistent Treatment of Uncertainties
Jasper Ridge, CA, USA
6-7 July 2010
Explain the governing factors, key indicators, and
ipcc guidance note, page 3 – Figure 1 grid (Agreement against Evidence):
High agreement: limited evidence | medium evidence | robust evidence
Medium agreement: limited evidence | medium evidence | robust evidence
Low agreement: limited evidence | medium evidence | robust evidence
Axes: Agreement (vertical) against Evidence – type, amount, quality, consistency (horizontal); the shading denotes a Confidence Scale

Figure 1: A depiction of evidence and agreement statements and their relationship to
confidence. Confidence increases towards the top-right corner as suggested by the
increasing strength of shading. Generally, evidence is most robust when there are multiple,
consistent independent lines of high-quality evidence.
Table 1. Likelihood Scale
Term* Likelihood of the Outcome
Virtually certain 99-100% probability
Very likely 90-100% probability
Likely 66-100% probability
About as likely as not 33 to 66% probability
Unlikely 0-33% probability
Very unlikely 0-10% probability
Exceptionally unlikely 0-1% probability
* Additional terms that were used in limited circumstances in the AR4 (extremely likely –
95-100% probability, more likely than not – >50-100% probability, and extremely
unlikely – 0-5% probability) may also be used in the AR5 when appropriate.
========
Hmm, I copied from Page 3 (4/7 of the pdf) in one hit: “Explain the governing factors, key indicators, and”, together with the following table and its note on additional terms. The extra information I put in italics was embedded and became visible through copying.
Anyway, thank you for taking the time to post here, and I wish you success in taking this to a wider audience. We could do with more interest from those in the political arena..
..I hope he’s still reading.
Myrrh
