Monckton: Why current trends are not alarming

Since there has been a lot of discussion about Monckton here and elsewhere, I’ve offered him the opportunity to present his views here. – Anthony

Guest post by Christopher Monckton of Brenchley

At www.scienceandpublicpolicy.org I publish a widely-circulated and vigorously-debated Monthly CO2 Report, including graphs showing changes in CO2 concentration and in global mean surface temperature since 1980, when the satellites went on weather watch and the NOAA first published its global CO2 concentration series. Since some commenters here at Wattsup have queried some of our findings, I have asked Anthony to allow me to contribute this short discussion.

We were among the first to show that CO2 concentration is not rising at the fast, exponential rate that current anthropogenic emissions would lead the IPCC to expect, and that global temperature has scarcely changed since the turn of the millennium on 1 January 2001.

CO2 concentration: On emissions reduction, the international community has talked the talk, but – not least because China, India, Indonesia, Russia, Brazil, and South Africa are growing so quickly – it has not walked the walk. Accordingly, carbon emissions are at the high end of the IPCC’s projections, close to the A2 (“business as usual”) emissions scenario, which projects that atmospheric CO2 will grow at an exponential rate between now and 2100 in the absence of global cuts in emissions:

Exponential increase in CO2 concentration from 2000-2100 is projected by the IPCC on its A2 emissions scenario, which comes closest to today’s CO2 emissions. On the SPPI CO2-concentration graph, this projection is implemented by way of an exponential function that generates the projection zone. This IPCC graph has been enlarged, its ordinate and abscissa labeled, and its aspect ratio altered to provide a comparison with the landscape format of the SPPI graph.

On the A2 emissions scenario, the IPCC foresees CO2 rising from a measured 368 ppmv in 2000 (NOAA global CO2 dataset) to a projected 836 [730, 1020] ppmv by 2100. However, reality is not obliging. The rate of increase in CO2 concentration has been slowing in recent years: an exponential curve cannot behave thus. In fact, the NOAA’s deseasonalized CO2 concentration curve is very close to linear:

CO2 concentration change from 2000-2010 (upper panel) and projected to 2100 (lower panel). The least-squares linear-regression trend on the data shows CO2 concentration rising to just 570 ppmv by 2100, well below the IPCC’s least estimate of 730 ppmv on the A2 emissions scenario.

The IPCC projection zone on the SPPI graphs has its origin at the left-hand end of the linear-regression trend on the NOAA data, and the exponential curves are calculated from that point so that they reach the IPCC’s projected concentrations in 2100.

We present the graph thus to show the crucial point: that the CO2 concentration trend is well below the least IPCC estimate. Some have criticized our approach on the ground that over a short enough distance a linear and an exponential trend may be near-coincident. This objection is more theoretical than real.

First, the fit of the dark-blue deseasonalized NOAA data to the underlying linear-regression trend line (light blue) is very much closer than it is even to the IPCC’s least projection on scenario A2. If CO2 were now in fact rising at a merely linear rate, and if that rate were to continue, concentration would reach only 570 ppmv by 2100.
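The 570 ppmv figure follows from a plain linear extrapolation of the trend; a minimal sketch in Python (the ~2.0 ppmv/yr slope here is back-derived from the article’s 2000 and 2100 endpoints, not re-fitted to the NOAA data):

```python
# Linear extrapolation of CO2 concentration, using the article's figures.
c_2000 = 368.0   # measured global CO2 concentration in 2000 (ppmv, NOAA)
c_2100 = 570.0   # the article's linear-trend projection for 2100 (ppmv)
slope = (c_2100 - c_2000) / 100.0   # implied trend, ppmv/year

def linear_projection(year):
    """CO2 (ppmv) under a constant-rate extrapolation from 2000."""
    return c_2000 + slope * (year - 2000)

print(f"implied trend: {slope:.2f} ppmv/yr")        # 2.02 ppmv/yr
print(f"2050: {linear_projection(2050):.0f} ppmv")  # 469 ppmv
```

By contrast, reaching the IPCC’s least A2 estimate of 730 ppmv by 2100 would require an average rate of (730 − 368)/100 ≈ 3.6 ppmv/yr over the century.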

Secondly, the exponential curve most closely fitting the NOAA data would be barely supra-linear, reaching just 614 ppmv by 2100, rather than the linear 570 ppmv. In practice, the substantial shortfall between prediction and outturn is important, as we now demonstrate. The equation for the IPCC’s central estimate of equilibrium warming from a given rise in CO2 concentration is:

T = 4.7 ln(C/C0),

where the bracketed term represents a proportionate increase in CO2 concentration. Thus, at CO2 doubling, the IPCC would expect 4.7 ln 2 = 3.26 K warming – or around 5.9 F° (IPCC, 2007, ch.10, p.798, box 10.2). On the A2 scenario, CO2 is projected to increase by more than double: equilibrium warming would be 3.86 K, and transient warming would be <0.5 K less, at 3.4 K.

But if we were to take the best-fit exponential trend on the CO2 data over the past decade, equilibrium warming from 2000-2100 would be 4.7 ln(614/368) = 2.41 K, comfortably below the IPCC’s least estimate and a hefty 26% below its central estimate. Combining the IPCC’s apparent overestimate of CO2 concentration growth with the fact that use of the IPCC’s methods for determining climate sensitivity to observed increases in the concentration of CO2 and five other climate-relevant greenhouse gases over the 55 years 1950-2005 would project a transient warming 2.3 times greater than the observed 0.65 K, anthropogenic warming over the 21st century could be as little as 1 K (less than 2 F°), which would be harmless and beneficial.
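The warming figures in the two preceding paragraphs follow directly from the article’s equation T = 4.7 ln(C/C0); a sketch reproducing the arithmetic (the 4.7 coefficient is the article’s reading of IPCC 2007):

```python
import math

def equilibrium_warming(c_final, c_initial, coeff=4.7):
    """Equilibrium warming (K) for a CO2 rise, per the article's T = 4.7 ln(C/C0)."""
    return coeff * math.log(c_final / c_initial)

print(round(equilibrium_warming(2, 1), 2))      # CO2 doubling: 3.26 K
print(round(equilibrium_warming(836, 368), 2))  # A2 central estimate, 2000-2100: 3.86 K
print(round(equilibrium_warming(614, 368), 2))  # best-fit exponential trend: 2.41 K
```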

Temperature: How, then, has observed, real-world global temperature responded?

The UAH satellite temperature record shows warming at a rate equivalent to 1.4 K/century over the past 30 years. However, the least-squares linear-regression trend is well below the lower bound of the IPCC projection zone.

The SPPI’s graph of the University of Alabama in Huntsville’s monthly global-temperature anomalies over the 30 years since 1 January 1980 shows warming at a rate equivalent to 1.4 K/century – almost double the rate for the 20th century as a whole. However, most of the warming was attributable to a naturally-occurring reduction in cloud cover that allowed some 2.6 Watts per square meter of additional solar radiation to reach the Earth’s surface between 1981 and 2003 (Pinker et al., 2005; Wild et al., 2006; Boston, 2010, personal communication).

Even with this natural warming, the least-squares linear-regression trend on the UAH monthly global mean surface temperature anomalies is below the lower bound of the IPCC projection zone.

Some have said that the IPCC projection zone on our graphs should show exactly the values that the IPCC actually projects for the A2 scenario. However, as will soon become apparent, the IPCC’s “global-warming” projections for the early part of the present century appear to have been, in effect, artificially detuned to conform more closely to observation. In compiling our graphs, therefore, we decided not simply to accept the IPCC’s projections at face value, but to establish just how much warming the use of the IPCC’s own methods for determining climate sensitivity would predict, and to take that warming as the basis for defining the IPCC projection zone.

Let us illustrate the problem with a concrete example. On the A2 scenario, the IPCC projects a warming of 0.2 K/decade for 2000-2020. However, given the IPCC’s projection that CO2 concentration will grow exponentially from 368 ppmv in 2000 towards 836 ppmv by 2100, CO2 should have been 368 e^((10/100) ln(836/368)) = 399.5 ppmv in 2010, and equilibrium warming should thus have been 4.7 ln(399.5/368) = 0.39 K, which we reduce by one-fifth to yield transient warming of 0.31 K, more than half as much again as the IPCC’s 0.2 K. Of course, CO2 concentration in 2010 was only 388 ppmv, and, as the SPPI’s temperature graph shows (this time using the RSS satellite dataset), warming occurred at only 0.3 K/century: about a tenth of the transient warming that use of the IPCC’s methods would lead us to expect.
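The 399.5 ppmv figure is an exponential interpolation between the two A2 endpoints; a sketch of the calculation as described:

```python
import math

c_2000, c_2100 = 368.0, 836.0  # A2 endpoints (ppmv): measured 2000 value, projected 2100 value

def a2_exponential(year):
    """Exponential interpolation between the A2 endpoints:
    C(t) = 368 * e^((t/100) * ln(836/368)), t in years since 2000."""
    t = year - 2000
    return c_2000 * math.exp((t / 100.0) * math.log(c_2100 / c_2000))

c_2010 = a2_exponential(2010)
equilibrium = 4.7 * math.log(c_2010 / c_2000)  # the article's T = 4.7 ln(C/C0)
transient = 0.8 * equilibrium                  # reduced by one-fifth, as in the text
print(round(c_2010, 1), round(equilibrium, 2), round(transient, 2))  # 399.5 0.39 0.31
```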

Barely significant warming: The RSS satellite data for the first decade of the 21st century show only a tenth of the warming that use of the IPCC’s methods would lead us to expect.

We make no apology, therefore, for labelling as “IPCC” a projection zone that is calculated on the basis of the methods described by the IPCC itself. Our intention in publishing these graphs is to provide a visual illustration of the extent to which the methods relied upon by the IPCC itself in determining climate sensitivity are reliable.

Some have also criticized us for displaying temperature records for as short a period as a decade. However, every month we also display the full 30-year satellite record, so as to place the current millennium’s temperature record in its proper context. And our detractors were strangely silent when, not long ago, a US agency issued a statement that the past 13 months had been the warmest in the instrumental record, and drew inappropriate conclusions from it about catastrophic “global warming”.

We have made one adjustment to please our critics: the IPCC projection zone in the SPPI temperature graphs now shows transient rather than equilibrium warming.

One should not ignore the elephant in the room. Our CO2 graph shows one elephant: the failure of CO2 concentration over the past decade to follow the high trajectory projected by the IPCC on the basis of global emissions similar to today’s. As far as we can discover, no one but SPPI has pointed out this phenomenon. Our temperature graph shows another elephant: the 30-year warming trend – long enough to matter – is again well below what the IPCC’s methods would project. If either situation changes, followers of our monthly graphs will be among the first to know. As they say at Fox News, “We report: you decide.”


282 Comments
George E. Smith
August 16, 2010 10:03 am

“”” david says:
August 14, 2010 at 6:25 am
Mr Monckton, I do hope you will respond to any comments critical of your presentation. Thanks. “””
Why so formal David? Who else (besides you) at WUWT refers to someone as Mr, or Mrs, or the PC Ms ? You did it twice so I assume it is just habit.
If you really want to be formal; why not address our Guest Poster properly as Lord Monckton; it’s more correct, and not as obvious as Mr.

Francisco
August 16, 2010 12:24 pm

Richard Courtney says:
“In the Middle Ages experts said, “We don’t know what causes crops to fail: it must be caused by witches so we must eliminate them.”
Now, experts say, “We don’t know what causes global climate change: it must be caused by emissions from human activity so we must eliminate them.”
==================
There is a 15th century German treatise called Malleus Maleficarum (The Hammer of Witchcraft) that deals not only with what should be done about witches, but also with what should be done with skeptics, i.e. those who “rashly” doubt the existence of witches or else try to minimize their formidable powers (their ability to conjure up plagues, floods, droughts, and so on). I copy below some quotes from this extraordinary piece of work. The similarities with the current orthodox attitude toward CAGW skeptics are remarkable.
quotes:
[…]
There are others who acknowledge indeed that witches exist, but they declare that the influence of magic and the effects of charms are purely imaginary and phantasmical. A third class of writers maintain that the effects said to be wrought by magic spells are altogether illusory and fanciful, although it may be that the devil does really lend his aid to some witch.
[…] This error seems to be based upon two passages from the Canons where certain women are condemned who falsely imagine that during the night they ride abroad with Diana or Herodias. This may read in the Canon. Yet because such things often happen by illusion or are merely in the imagination, those who suppose that all the effects of witchcraft are mere illusion and imagination are very greatly deceived.
[…]
Accordingly, how can it be that the denial or frivolous contradiction of any of these propositions can be free from the mark of some notable heresy? Let every man judge for himself unless indeed his ignorance excuse him. But what sort of ignorance may excuse him we shall very shortly proceed to explain.
[…]
Here it must be noticed that there are fourteen distinct species which come under the genus superstition, but these for the sake of brevity it is hardly necessary to detail, since they have been most clearly set out by S. Isidore in his Etymologiae, Book 8, and by S. Thomas in his Second of the Second, question 92. Moreover, there will be explicit mention of these rather lower when we discuss the gravity of this heresy, and this will be in the last question of our First Part.
[…]
The question arises whether people who hold that witches do not exist are to be regarded as notorious heretics, or whether they are to be regarded as gravely suspect of holding heretical opinions. It seems that the first opinion is the correct one. For this is undoubtedly in accordance with the opinion of the learned Bernard.
[…]
And yet there are some who rashly opposing themselves to all authority publicly proclaim that witches do not exist, or at any rate that they can in no way afflict and hurt mankind. Wherefore, strictly speaking those who are convicted of such evil doctrine may be excommunicated, since they are openly and unmistakably to be convicted of false doctrine. The reader may consult the works of Bernard, where he will find that this sentence is just, right, and true. Yet perhaps this may seem to be altogether too severe a judgement mainly because of the penalties which follow upon excommunication: for the Canon prescribes that a cleric is to be degraded and that a layman is to be handed over to the power of the secular courts, who are admonished to punish him as his offence deserves. Moreover, we must take into consideration the very great numbers of persons who, owing to their ignorance, will surely be found guilty of this error. And since the error is very common the rigor of strict justice may be tempered with mercy.
[…]
and there are some who, owing to the fact that they are badly informed and insufficiently read, waver in their opinions and cannot make up their minds, and since an idea merely kept to oneself is not heresy unless it be afterwards put forward, obstinately and openly maintained, it should certainly be said that persons such as we have just mentioned are not to be openly condemned for the crime of heresy. But let no man think he may escape by pleading ignorance. […] It is true that according to Raymond of Sabunde
and S. Thomas, those who have the cure of souls are certainly not bound to be men of any extraordinary learning, but they certainly should have a competent knowledge, that is to say, knowledge sufficient to carry out the duties of their state.
[…]
For sometimes persons do not know, they do not wish to know, and they have no intention of knowing. For such persons there is no excuse, but they are to be altogether condemned.

John from CA
August 16, 2010 1:42 pm

Lucy Skywalker says:
August 14, 2010 at 5:14 am
I have the impression that Monckton here deals specifically with [presents] the exact maths, science, [“]IPCC[“], and quotation issues behind the commonest detractors’ [–] comments along the line of “Monckton has been falsified by xxxxxxxxxx” [group].
This post therefore looks like [is] a good resource to refer such [for] detractors to. And[,] to remind other [“]Monckton-has-bad-evidence-asserters[“] of the likelihood [fact] that their assertions can also be answered – indeed, the likelihood that they have already been answered somewhere by Monckton.
I agree — ; )

August 16, 2010 2:34 pm

In response to Brad Beeson, the central and simple point I make is that the trend in CO2 concentration remains a long way short of the least trend projected by the IPCC on its A2 emissions scenario, which comes close to today’s emissions.
Furthermore, if one takes cumulative CO2 concentration determined by reference to emissions of CO2 from fossil-fuel burning and subtracts total observed CO2 concentration net of the natural concentration, decade by decade since 1960, there is a growing difference between the two, suggesting that an increasing proportion of anthropogenic CO2 is indeed being sequestered both in the biosphere (indicated by satellite data showing rapidly-improving net primary productivity) and in the oceans (where, however, measurements are insufficiently numerous or precise to tell us whether there has been any appreciable global dealkalinization).
It is possible, therefore, that the correct model for future CO2 concentrations, even if they continue at or even above their present gross 4 ppmv/year or net 2 ppmv/year, is exponential decay. If so, one might expect atmospheric CO2 concentration to saturate eventually – and, on one calculation I have seen and am seeking to verify, to peak, on current trends, just 90 ppmv above today’s concentration.
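The “exponential decay” model alluded to here can be sketched as a concentration relaxing toward a ceiling. The parameters below – a 90 ppmv ceiling above today’s ~388 ppmv, and a time constant chosen so the initial slope matches today’s ~2 ppmv/yr net rise – are purely illustrative assumptions, not values from the correspondent’s calculation:

```python
import math

c_now = 388.0        # approximate 2010 concentration (ppmv)
ceiling = 90.0       # assumed eventual rise above today (ppmv) - illustrative only
tau = ceiling / 2.0  # time constant (years) chosen so the initial slope is ~2 ppmv/yr

def decaying_co2(years_ahead):
    """CO2 under an exponential-decay-to-saturation model:
    rises at ~2 ppmv/yr today, flattening toward c_now + ceiling = 478 ppmv."""
    return c_now + ceiling * (1.0 - math.exp(-years_ahead / tau))

print(round(decaying_co2(0), 1))     # 388.0 today
print(round(decaying_co2(100), 1))   # ~468 after a century, already flattening
print(round(decaying_co2(1000), 1))  # 478.0 - the saturation level
```

Unlike an exponential or even a linear trend, such a curve never reaches the IPCC’s A2 range at all; whether it fits the data is exactly the question the comment says remains to be established.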
To establish this intriguing case definitively, a great deal of data gathering and analysis will have to be done, and I have asked the distinguished correspondent who has sent the calculation to me to assist in this process. He wrote to me because, in my reply to you demonstrating the decay from exponentiality over the past couple of decades, he saw a glimmer of empirical evidence for the hypothesis he had already formulated. As I said in a previous note, my calculations have not yet taken account of the growing discrepancy between exponentially-increasing emissions and (recently, at any rate) exponentially-decaying atmospheric concentrations. This is something which, in the light of what my correspondent has kindly sent, will need to be addressed.
It may also be appropriate to apply the exponential-decay model, rather than the current logarithmic model, to determination of the CO2 radiative forcing itself. The logarithmic model suffers from the obvious defect that every doubling of CO2 concentration, even ad infinitum, leads to a further equal quantum of warming, when in practice there comes a point when the principal absorption bands of CO2 are saturated, and little or no further warming can occur. Indeed, in the lower troposphere there are indications that the absorption bands of CO2 are already close to saturation; and in the upper troposphere – where measurements continue to fail to identify the tripling of the tropical surface warming rate that the models predict – it is possible that subsidence drying (additional water vapor simply subsiding to lower altitudes where its own absorption bands are already saturated) may greatly reduce the water-vapor feedback, even if some CO2-driven warming were to occur at altitude.
In short, the model used by the IPCC tends to exaggerate the radiative forcing from any given proportionate increase in CO2 concentration.
One point I should perhaps have made clearer is that I am not attempting to push the numbers in any particular direction: merely to come to an informed opinion on whether and to what extent the anthropogenic increases in CO2 concentration are likely to cause dangerous warming. It is a long process of enquiry, because – frankly – the data are generally inadequate, the methods of analysis crude and (as with the models that take insufficient account of the Lorenz constraint) over-ambitious, and, regrettably, scientists on both sides of the case seem unduly anxious to impart an angular momentum to the facts, making it far more difficult than it usually is to discover the unspun truth. But it interests me to keep looking.

John from CA
August 16, 2010 2:53 pm

“One point I should perhaps have made clearer is that I am not attempting to push the numbers in any particular direction: merely to come to an informed opinion on whether and to what extent the anthropogenic increases in CO2 concentration are likely to cause dangerous warming.”
=====
The point is that CO2 levels far higher than anything we could achieve by burning everything to the ground for the next century would not create a tipping point.
Beyond this, we have yet to develop a “Global Awareness” of the opportunities.

Jim D
August 16, 2010 6:18 pm

Re: Richard S Courtney: August 16, 2010 at 2:09 am
From your previous posts on this thread, it seems you even doubt that the observed rise in CO2 is manmade, despite the isotope signature that proves this new carbon is of the fossil-fuel kind. Does this evidence still make it an “argument from ignorance” that CO2 increases are manmade? To gauge what you take as a starting point for “assertion”, I would also ask whether you believe at all in Pielke Sr.’s post here linking CO2 to warming. If you doubt that, are you also taking his as an “argument from ignorance”? Or is the “ignorance” just the exact value of the feedback factor, even if observations suggest one? It is obviously hard to debate when there is absolutely no common ground for assertion, which seems to be the problem here.

August 16, 2010 6:46 pm

Jim D.August 16, 2010 at 6:18 pm
Read my presentation http://www.kidswincom.net/climate.pdf for a different interpretation of the isotope depletion data. The oceans and forests are a much greater source of organic carbon than anthropogenic emissions and they change naturally as evidenced by the annual cycle.

Jim D
August 16, 2010 10:06 pm

Re: Fred H. Haynie: August 16, 2010 at 6:46 pm
I come to WUWT to read these interesting contrarian views, and this is one for sure. Basically it says that warming of the Arctic Ocean reduces the CO2 sink and is responsible for the current rise in CO2, if I get the gist right. That raises the question of what caused the Arctic to start warming in the first place, and of whether quite small local sea-surface temperature changes can drive global CO2 so quickly.
I prefer the view that 280 ppm represented a long-term equilibrium between land/ocean/atmosphere reservoirs. In the last century, emissions have put out something like enough to double the CO2 in the atmosphere, and the equilibrium can’t be restored that quickly because the surface ocean is replaced by deeper water slowly, clouds and rain can extract CO2 only inefficiently, and vegetation can grow only so quickly. So, we now have 390 ppm, which is said to be more than has been in the atmosphere for 15 million years, and this would be a very unnatural variation to have taken place spontaneously, and coincidentally when we started burning fossil fuels. We can agree that equilibrium exchanges between the reservoirs are large compared to emissions, but changing the actual equilibrium level is not so easy, in my view.

C.W. Schoneveld
August 16, 2010 11:44 pm

How many years were there between 1 BC and 1 AD?

Robert
August 17, 2010 4:41 am

[Snip. O/T.]

Richard S Courtney
August 17, 2010 5:01 am

Jim D:
Your post at August 16, 2010 at 6:18 pm asks a series of questions that are not pertinent in any way to what I have written on this thread. This is yet another use by you of a logical fallacy (in this case, ‘straw man’).
Three logical fallacies from you and still counting.
Richard

August 17, 2010 5:54 am

Jim D
August 16, 2010 at 10:06 pm
The idea that the earth is somehow in dynamic equilibrium and that anthropogenic emissions disturb that equilibrium is at the heart of CAGW modelling. It is a bad assumption. The ice core data tell us the earth has always been in a state of change that goes in cycles. The observable wavelengths of these cycles range from 24 hours to millions of years associated with plate tectonics. In between we have such things as El Niño and other oceanic temperature cycles and seasonal changes that are not in any way associated with anthropogenic emissions. The 280 ppmv “equilibrium” value that you prefer comes from measurements of trapped air bubbles in ice cores, and that is only for the last 10,000 years. The accuracy of both estimated time and magnitude is poor and does not agree with more accurate chemical measurements for the same time periods of the 19th and 20th centuries.

Brad Beeson
August 17, 2010 7:06 am

My Dear Lord Monckton,
Thank you for your response.
To be brief, your “central and simple point” about the CO2 trend is based on a simple mathematical error. You have conflated the IPCC’s projection of a continued exponential increase in CO2 emissions with an exponential increase in total atmospheric CO2 concentration. These two are not equivalent… let me explain:
The existing atmospheric CO2 concentration consists of a pre-industrial equilibrium of about 270-280 ppm. On top of that, anthropogenic emissions have added, at an exponentially increasing rate, an additional component, totalling about 120-130 ppm to date. But your calculation method requires both components to increase at an exponential rate, which is not an accurate physical representation of what is happening, and not what the IPCC has projected.
In other words, the human contribution has grown exponentially, with a doubling time of about 30-40 years (in line with the growth of the fossil-fuel-based world economy), but it is unphysical to expect the pre-existing equilibrium CO2 reservoir to also grow exponentially. The IPCC, in their A2 scenario, projects that human emissions will continue doubling at about the same rate, and that the human component of atmospheric CO2 will continue to double at its current rate (after about 50% is absorbed by natural sinks, as in the past) for the next 20-30 years, but they do not project that the 275 ppm baseline amount will grow at all.
Your projection, on the other hand … “368 e^((10/100) ln(836/368)) = 399.5 ppmv in 2010” … is not the correct mathematical formula to represent this scenario. Instead, you should break up the two components of CO2 into two separate terms: one that is constant, representing the pre-industrial equilibrium, and a second that grows exponentially – the human contribution. Something like this:
CO2 = 275 + 2^((year-1780)/33.5)
As a result of your incorrect math, fitting an exponential curve to total CO2 concentration gives you the wrong answer, and explains why your estimate of the IPCC scenario A2 differs from theirs, as you demonstrate here in your OP:
Secondly, the exponential curve most closely fitting the NOAA data would be barely supra-linear, reaching just 614 ppmv by 2100…
and here:
However, given the IPCC’s projection that CO2 concentration will grow exponentially from 368 ppmv in 2000 towards 836 ppmv by 2100, CO2 should have been 368 e^((10/100) ln(836/368)) = 399.5 ppmv in 2010
If, instead, you had modeled the constant equilibrium CO2 term and the exponentially growing human term separately, you would have found a very close match between A2 projected and 2010 actual CO2 concentration: ~389 ppm.
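Beeson’s two-term model can be checked numerically against the figures he cites; a sketch (the constants – the 275 ppm baseline, the 1780 start year, and the 33.5-year doubling time – are taken from his comment):

```python
def beeson_co2(year):
    """Constant pre-industrial baseline plus an exponentially doubling human component."""
    return 275.0 + 2.0 ** ((year - 1780) / 33.5)

print(round(beeson_co2(2000), 1))  # 369.8 - close to the 368 ppmv measured in 2000
print(round(beeson_co2(2010), 1))  # 391.6 - within a few ppm of the ~389 he reports
```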
Another corroborating point… extrapolating backwards using your method, past CO2 levels are not accurately modelled (as pointed out by Jim D on August 14). But the corrected formula above is valid throughout the historical CO2 record back to pre-industrial levels. This is true even if you use only the last 10 or 12 years of NOAA data, as you did in a later comment, so even the most recent data points support this method, and do not show any deviation from the exponential trend of the human contribution to CO2.
Fortunately, you have presented your findings here, where you can benefit from a rigorous evaluation of your methods and data to correct little problems like this. I am sure the other contributors here would have pointed out this mistake soon, but I am happy to be of service, and to be the first to help you refine the math in this article.
Sincerely, and at your service,
Brad

Russell Seitz
August 17, 2010 1:43 pm

Though Monckton concludes:
“We report. You decide.”,
what he presents invites another outcome :
He distorts. We deride

August 17, 2010 2:33 pm

You are completely right about the linear trend. If you look at the entire Mauna Loa curve from its inception in 1958, that is the only conclusion you can come to. A trend that has been linear that long is simply not going to take off exponentially because Y2K has arrived.

But I strongly advise you to check out my book that analyzes the UAH and RSS satellite temperature records. It is illegitimate to draw a single straight line through that record, as you do, and ascribe a trend to it. The time period of the eighties and the nineties is distinct from what follows and was characterized by temperature oscillations that followed the ENSO system in the Pacific. The average temperature about which these oscillations swing did not change for twenty years and should be shown as a horizontal straight line.

The super El Niño of 1998 did not belong to ENSO and brought a huge load of warm water from the Indo-Pacific Warm Pool to South American shores. In four years the global average temperature rose by 0.3 degrees and then stabilized for the next six. Show that as another horizontal line, discontinuous from the eighties and nineties. If you extend it to 2010 it very nearly bisects the difference between the 2008 La Niña and the 2010 El Niño. There is no warming in our future, just temperature oscillations like those of the eighties and nineties. The abrupt temperature rise from 1998 to 2002 was not carboniferous but had an oceanic origin. It is the only warming within the last thirty years and is responsible for the first decade of this century being the warmest on record. If you consider that 0.3 degrees is half of the warming attributed to the entire twentieth century by the IPCC, this should not be too surprising.

Bart
August 17, 2010 5:08 pm

duckster August 14, 2010 at 7:34 am
RW August 14, 2010 at 8:03 am
eddieo August 14, 2010 at 12:13 pm
You guys need to plot the data under “Annual Mean Growth Rate for Mauna Loa, Hawaii” at the link you cite, which Henry Galt August 14, 2010 at 12:39 pm helpfully reproduces. The leveling off of the rate of rise in the past decade is immediate and obvious.
Brad Beeson August 14, 2010 at 8:54 am
Jim D August 14, 2010 at 9:28 am
R. Gates August 14, 2010 at 9:41 am
Bill Illis August 14, 2010 at 10:40 am
Icarus August 14, 2010 at 12:41 pm
orkneygal August 14, 2010 at 4:42 pm
You guys need to do the same thing.

Jim D
August 17, 2010 5:17 pm

Richard S Courtney
Obviously, I am not getting anywhere with my last three posts to you. Let’s go back to your original statement (at 3:20 pm on the 15th) that an explanation is “dangerous” in some way. Surely explanations mark progress, and only cease to be useful when they are proved wrong, which AGW hasn’t been. It is surviving the tests, much in the way other scientific theories do. Satellite data all but prove the CO2 effect with the fingerprint of a cooling stratosphere, on which other “explanations” fall apart. So to me, it remains very certainly an explanation.

Jim D
August 17, 2010 5:34 pm

Re:
http://wattsupwiththat.com/2010/08/14/monckton-why-current-trends-are-not-alarming/#comment-459403
You are making assumptions about the existence of cycles that are not themselves explainable by any natural means. If you can explain exactly what causes a 308-year cycle, that would help your theory, but I suspect it is a quirk of statistics: any series can be broken into a few component cycles, and those are only useful if they are seen to repeat several times and so have predictive value. You expect SST to start dropping soon, so let’s see how that holds up.

Jim D
August 17, 2010 5:45 pm

Re: Bart
It is noticeable, and not fully explained, that CO2 increases more quickly in globally warm years (especially warm-SST years; a thread here called “The Trend” was interesting). Now that warming has resumed after the lull of the 2000s, I fully expect the CO2 growth rate to pick up in the same way. The warming itself is already back on the pace it has held since 1979.

Brad Beeson
August 17, 2010 5:58 pm

Bart,
How is this “leveling off of the rate of rise” different from the previous times it levelled off at solar minimums?
(1965, 1976, 1987, 1997, 2008)
Another way to look at the data is to find the cumulative change by decade:
1960’s = 8.5 ppm
1970’s = 12.8 ppm
1980’s = 15.7 ppm
1990’s = 15.3 ppm
2000’s = 19.4 ppm
The recent decade had the largest increase. The “rate of rise” is not always constant, and sometimes has reversed, even for a decade at a time. If you look at the monthly data, the dips at the solar minimums are quite obvious also.
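Dividing Beeson’s decadal totals by ten turns them into average annual growth rates, which makes the acceleration he describes explicit; a sketch using the figures listed above:

```python
# Cumulative CO2 rise per decade (ppm), as listed in the comment above.
decade_totals = {
    "1960s": 8.5,
    "1970s": 12.8,
    "1980s": 15.7,
    "1990s": 15.3,
    "2000s": 19.4,
}

for decade, total in decade_totals.items():
    print(f"{decade}: {total / 10:.2f} ppm/yr average")
# The 2000s show the fastest average rise of the five decades,
# despite the levelling-off around the 2008 solar minimum.
```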
What do you think was responsible for the increasing rate of rise in all those previous decades, and what do you think will be different this time?
Sincerely, Brad

August 17, 2010 6:08 pm

Russell Seitz says:
“Though Monckton concludes: ‘We report. You decide'”…
…Michael Mann tells his media pals: “We decide. You report.”
And they do what they’re told.
Jim D says:
“You are making assumptions about the existence of cycles that themselves are not explainable by any natural means.”
Classic Argumentum ad Ignorantiam: ‘Since I can’t explain cycles by any natural means, the only possible answer is that humans are at fault.’
See, Jim, that is a logical fallacy. Maybe this chart by Willis Eschenbach will show you what’s wrong with using that particular fallacy: click
You may be entirely correct in believing that human emissions cause a change in cycles. But you could just as easily be wrong.
And even if you turn out to be correct, the next question is: Is more CO2 harmful or beneficial on balance?
If you believe it is harmful, please point out exactly how the recent 40% increase in CO2 has harmed anyone. OTOH, we know that more CO2 is beneficial:
click1
click2
click3
click4
click5
That more CO2 is beneficial is beyond question. But there is no evidence of any harm. Is there?

Joel Shore
August 17, 2010 6:32 pm

Brad Beeson:

To be brief, your “central and simple point” about the CO2 trend is based on a simple mathematical error. You have conflated the IPCC’s projection of a continued exponential increase in CO2 emissions with an exponential increase in total atmospheric CO2 concentration.

In other words, the human contribution has grown exponentially, with a doubling time of about 30-40 years (in line with the growth of the fossil-fuel-based world economy), but it is unphysical to expect the pre-existing equilibrium CO2 reservoir to also grow exponentially.

As a result of your incorrect math, fitting an exponential curve to total CO2 concentration gives you the wrong answer, and explains why your estimate of the IPCC scenario A2 differs from theirs, as you demonstrate here in your OP:
“However, given the IPCC’s projection that CO2 concentration will grow exponentially from 368 ppmv in 2000 towards 836 ppmv by 2100, CO2 should have been 368 · e^[(10/100) · ln(836/368)] = 399.5 ppmv in 2010”
If, instead, you had modeled the constant equilibrium CO2 term and the exponentially growing human term separately, you would have found a very close match between the A2 projection and the actual 2010 CO2 concentration of about 389 ppm.

Good call, Brad. That indeed seems to be the error in Monckton’s CO2 calculation. I just tried redoing the calculation but used the corrected formula and assumed that the background value of CO2 is 280ppm. Then the expression for the expected CO2 concentration in 2010 reads
([CO2] in 2010) = 280 + (368 − 280) · exp[(10/100) · ln((836 − 280)/(368 − 280))],
which gives a value of 386 ppm for 2010. So, in fact, the rise in CO2 levels that we have seen is a bit above what is expected for the A2 scenario, if we assume that the amount above the 280 ppm background is increasing exponentially toward a total concentration of 836 ppm in 2100, and that this total concentration had the value 368 ppm in 2000. Of course, this is more in line with what I have seen in the peer-reviewed literature, i.e., that CO2 concentrations are running at expectations, if not a little higher (e.g., http://www.pik-potsdam.de/~stefan/Publications/Nature/rahmstorf_etal_science_2007.pdf ).
By the way, if one does the same calculation but starts instead in 1990 with the level of 354 ppm (from ftp://ftp.cmdl.noaa.gov/ccg/co2/trends/co2_annmean_gl.txt ) then one gets a predicted level of 387 ppm for 2010.
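The corrected calculation is easy to reproduce. Here is a minimal Python sketch, assuming (as above) a 280 ppm pre-industrial baseline and exponential growth of the above-baseline excess only:

```python
import math

def projected_co2(c_start, year_start, year_target=2010,
                  c_end=836.0, year_end=2100, baseline=280.0):
    """Project CO2 by growing only the above-baseline excess exponentially."""
    frac = (year_target - year_start) / (year_end - year_start)
    growth = math.log((c_end - baseline) / (c_start - baseline))
    return baseline + (c_start - baseline) * math.exp(frac * growth)

print(round(projected_co2(368, 2000)))  # 386 (starting from the 2000 value)
print(round(projected_co2(354, 1990)))  # 387 (starting from the 1990 value)
```

Both projections land within a few ppm of the observed ~389 ppm, unlike the ~399.5 ppm obtained by growing the whole concentration exponentially.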
I look forward to seeing Lord Monckton correct this error.

Joel Shore
August 17, 2010 7:25 pm

Finally, I will add that I have just used Brad Beeson’s correct exponential formula (with 280ppm as the pre-industrial baseline) to test what the predicted CO2 concentrations would be in 2010 if it were headed to a value of 614ppm in 2100 (as Monckton suggests is a better forecast), rather than 836ppm. What I find is that given the 1990 value of 354ppm, the expected value in 2010 would be 377ppm. If you instead use the 2000 value of 368ppm in the formula then the expected value in 2010 would be 381ppm. Clearly, using Monckton’s expectation for the value in 2100, the correct exponential formula is underpredicting the actual 2010 CO2 concentration. And, the underprediction is getting worse as you try to predict the 2010 concentration over a longer time period (i.e., 20 years vs. 10 years).
By the way, in case someone wants to quibble with the assumed value of 280ppm for the pre-industrial baseline, I will note that the forecasts are only weakly dependent on this. For example, using a baseline of 260ppm instead of 280ppm only changes the two values I give above for 2010 prediction (based on the value in 2000) by +1.1ppm. And, 260ppm is lower than anything I have ever seen suggested before for the pre-industrial baseline.
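The same baseline-subtracted formula can be rerun with a 2100 target of 614 ppm instead of 836 ppm. A sketch, again assuming the 280 ppm pre-industrial baseline:

```python
import math

def projected_co2(c_start, year_start, c_end, year_end=2100,
                  baseline=280.0, year_target=2010):
    """Grow the above-baseline CO2 excess exponentially toward c_end."""
    frac = (year_target - year_start) / (year_end - year_start)
    growth = math.log((c_end - baseline) / (c_start - baseline))
    return baseline + (c_start - baseline) * math.exp(frac * growth)

# A 614 ppm target for 2100 underpredicts the observed ~389 ppm in 2010:
print(round(projected_co2(354, 1990, 614)))  # 377 (20-year projection)
print(round(projected_co2(368, 2000, 614)))  # 381 (10-year projection)
```

As noted above, the shortfall is larger over the longer projection interval.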
It looks like Lord Monckton has some serious corrections to make to what he has been posting here and elsewhere.

Joel Shore
August 17, 2010 7:53 pm

Monckton of Brenchley says:

Furthermore, if one takes cumulative CO2 concentration determined by reference to emissions of CO2 from fossil-fuel burning and subtracts total observed CO2 concentration net of the natural concentration, decade by decade since 1960, there is a growing difference between the two, suggesting that an increasing proportion of anthropogenic CO2 is indeed being sequestered both in the biosphere (indicated by satellite data showing rapidly-improving net primary productivity) and in the oceans (where, however, measurements are insufficiently numerous or precise to tell us whether there has been any appreciable global dealkalinization).

The best estimates of added CO2 also include those from cement production and estimated land-use changes. However, independent of that issue: if I understand what you say you are doing by your statement “if one takes cumulative CO2 concentration determined by reference to emissions of CO2 from fossil-fuel burning and subtracts total observed CO2 concentration net of the natural concentration, decade by decade since 1960, there is a growing difference between the two”, then the growing difference is not an indication that “an increasing proportion of anthropogenic CO2 is indeed being sequestered both in the biosphere…and in the oceans”. In fact, one would expect that difference to grow even if the fraction of emitted CO2 that gets sequestered remained unchanged.
For example, let’s say that 40% of our emissions get sequestered and this doesn’t change with time. Then once the equivalent of 100 ppm has been emitted, we would be 60 ppm above the pre-industrial baseline, for a difference of 40 ppm (the amount sequestered). Once the equivalent of 200 ppm has been emitted, we would be 120 ppm above the baseline, for a difference of 80 ppm. The difference grows, just as you mentioned, but it is not an indication that “an increasing proportion of anthropogenic CO2 is indeed being sequestered”.
Perhaps you meant something different from what I believe that you said…But, it certainly does not seem to mean that the calculation that you describe shows what you think it shows.
Your analysis would also seem to contradict a recent paper reported here at WUWT that showed that the fraction of our emissions that gets sequestered has remained about constant in time. Of course, that paper was touted as a big deal here since there is some belief that eventually some of the sinks will saturate and the fraction sequestered will DECREASE (a positive feedback in the carbon cycle). And, indeed, while it is good news if their analysis is correct and there really is no evidence of saturation yet, I don’t know of any compelling evidence that has been presented that shows that the fraction sequestered is actually INCREASING.

Jim D
August 17, 2010 8:22 pm

Brad and Joel,
I couldn’t help playing with the numbers. The rates averaged over the last three decades very closely fit an exponential with a doubling time of 33.3 years, i.e., eightfold growth in a century: starting from 370 ppm in 2000 (90 + 280), that gives exactly 1000 ppm in 2100 (720 + 280). Food for thought.
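The arithmetic checks out; a short sketch (treating, as in the comments above, only the excess over a 280 ppm baseline as growing):

```python
# Doubling time of ~33.3 years -> three doublings per century -> eightfold
growth_per_century = 2 ** 3          # = 8
excess_2000 = 370 - 280              # 90 ppm above the 280 ppm baseline
total_2100 = 280 + excess_2000 * growth_per_century
print(total_2100)                    # 1000 (= 720 + 280)
```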
