Since there has been a lot of discussion about Monckton here and elsewhere, I’ve offered him the opportunity to present his views here. – Anthony
Guest post by Christopher Monckton of Brenchley
At www.scienceandpublicpolicy.org I publish a widely-circulated and vigorously-debated Monthly CO2 Report, including graphs showing changes in CO2 concentration and in global mean surface temperature since 1980, when the satellites went on weather watch and the NOAA first published its global CO2 concentration series. Since some commenters here at Wattsup have queried some of our findings, I have asked Anthony to allow me to contribute this short discussion.
We were among the first to show that CO2 concentration is not rising at the fast, exponential rate that current anthropogenic emissions would lead the IPCC to expect, and that global temperature has scarcely changed since the turn of the millennium on 1 January 2001.
CO2 concentration: On emissions reduction, the international community has talked the talk, but – not least because China, India, Indonesia, Russia, Brazil, and South Africa are growing so quickly – it has not walked the walk. Accordingly, carbon emissions are at the high end of the IPCC’s projections, close to the A2 (“business as usual”) emissions scenario, which projects that atmospheric CO2 will grow at an exponential rate between now and 2100 in the absence of global cuts in emissions:
Exponential increase in CO2 concentration from 2000-2100 is projected by the IPCC on its A2 emissions scenario, which comes closest to today’s CO2 emissions. On the SPPI CO2-concentration graph, this projection is implemented by way of an exponential function that generates the projection zone. This IPCC graph has been enlarged, its ordinate and abscissa labeled, and its aspect ratio altered to provide a comparison with the landscape format of the SPPI graph.
On the A2 emissions scenario, the IPCC foresees CO2 rising from a measured 368 ppmv in 2000 (NOAA global CO2 dataset) to a projected 836 [730, 1020] ppmv by 2100. However, reality is not obliging. The rate of increase in CO2 concentration has been slowing in recent years: an exponential curve cannot behave thus. In fact, the NOAA’s deseasonalized CO2 concentration curve is very close to linear:
CO2 concentration change from 2000-2010 (upper panel) and projected to 2100 (lower panel). The least-squares linear-regression trend on the data shows CO2 concentration rising to just 570 ppmv by 2100, well below the IPCC’s least estimate of 730 ppmv on the A2 emissions scenario.
The IPCC projection zone on the SPPI graphs has its origin at the left-hand end of the linear-regression trend on the NOAA data, and the exponential curves are calculated from that point so that they reach the IPCC’s projected concentrations in 2100.
We present the graph thus to show the crucial point: that the CO2 concentration trend is well below the least IPCC estimate. Some have criticized our approach on the ground that over a short enough distance a linear and an exponential trend may be near-coincident. This objection is more theoretical than real.
First, the fit of the dark-blue deseasonalized NOAA data to the underlying linear-regression trend line (light blue) is very much closer than it is even to the IPCC’s least projection on scenario A2. If CO2 were now in fact rising at a merely linear rate, and if that rate were to continue, concentration would reach only 570 ppmv by 2100.
Secondly, the exponential curve most closely fitting the NOAA data would be barely supra-linear, reaching just 614 ppmv by 2100, rather than the linear 570 ppmv. In practice, the substantial shortfall between prediction and outturn is important, as we now demonstrate. The equation for the IPCC’s central estimate of equilibrium warming from a given rise in CO2 concentration is:
ΔT = 4.7 ln(C/C₀),
where C/C₀, the bracketed term, represents the proportionate increase in CO2 concentration. Thus, at CO2 doubling, the IPCC would expect 4.7 ln 2 = 3.26 K warming – or around 5.9 F° (IPCC, 2007, ch. 10, p. 798, box 10.2). On the A2 scenario, CO2 is projected to more than double: equilibrium warming would be 3.86 K, and transient warming would be <0.5 K less, at 3.4 K.
But if we were to take the best-fit exponential trend on the CO2 data over the past decade, equilibrium warming from 2000-2100 would be 4.7 ln(614/368) = 2.41 K, comfortably below the IPCC’s least estimate and a hefty 26% below its central estimate at doubling. Combine the IPCC’s apparent overestimate of CO2 concentration growth with a second fact: applying the IPCC’s methods for determining climate sensitivity to the observed increases in the concentration of CO2 and five other climate-relevant greenhouse gases over the 55 years 1950-2005 would project a transient warming 2.3 times greater than the observed 0.65 K. On that basis, anthropogenic warming over the 21st century could be as little as 1 K (less than 2 F°), which would be harmless and beneficial.
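For readers who wish to check this arithmetic, here is a minimal Python sketch (the coefficient 4.7 and the concentrations are those quoted above; the code is illustrative only and forms no part of the SPPI analysis):

    import math

    def equilibrium_warming(c_final, c_initial, k=4.7):
        # IPCC central-estimate equilibrium warming (K) for a rise in CO2
        # concentration from c_initial to c_final (ppmv): k * ln(C/C0)
        return k * math.log(c_final / c_initial)

    print(equilibrium_warming(2 * 368, 368))  # doubling: ~3.26 K
    print(equilibrium_warming(836, 368))      # A2 central projection: ~3.86 K
    print(equilibrium_warming(614, 368))      # best-fit exponential: ~2.41 K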
Temperature: How, then, has observed, real-world global temperature responded?
The UAH satellite temperature record shows warming at a rate equivalent to 1.4 K/century over the past 30 years. However, the least-squares linear-regression trend is well below the lower bound of the IPCC projection zone.
The SPPI’s graph of the University of Alabama in Huntsville’s monthly global-temperature anomalies over the 30 years since 1 January 1980 shows warming at a rate equivalent to 1.4 K/century – almost double the rate for the 20th century as a whole. However, most of the warming was attributable to a naturally-occurring reduction in cloud cover that allowed some 2.6 Watts per square meter of additional solar radiation to reach the Earth’s surface between 1981 and 2003 (Pinker et al., 2005; Wild et al., 2006; Boston, 2010, personal communication).
Even with this natural warming, the least-squares linear-regression trend on the UAH monthly global mean surface temperature anomalies is below the lower bound of the IPCC projection zone.
Some have said that the IPCC projection zone on our graphs should show exactly the values that the IPCC actually projects for the A2 scenario. However, as will soon become apparent, the IPCC’s “global-warming” projections for the early part of the present century appear to have been, in effect, artificially detuned to conform more closely to observation. In compiling our graphs, we therefore decided not simply to accept the IPCC’s projections as a true representation of the warming that its own methods for determining climate sensitivity would lead us to expect, but instead to establish just how much warming the use of those methods would predict, and to take that warming as the basis for the definition of the IPCC projection zone.
Let us illustrate the problem with a concrete example. On the A2 scenario, the IPCC projects a warming of 0.2 K/decade for 2000-2020. However, given the IPCC’s projection that CO2 concentration will grow exponentially from 368 ppmv in 2000 towards 836 ppmv by 2100, CO2 should have been 368 e^[(10/100) ln(836/368)] = 399.5 ppmv in 2010, and equilibrium warming should thus have been 4.7 ln(399.5/368) = 0.39 K, which we reduce by one-fifth to yield transient warming of 0.31 K, more than half as much again as the IPCC’s 0.2 K. Of course, CO2 concentration in 2010 was only 388 ppmv, and, as the SPPI’s temperature graph shows (this time using the RSS satellite dataset), warming occurred at only 0.3 K/century: about a tenth of the transient warming that use of the IPCC’s methods would lead us to expect.
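A short sketch of that interpolation, for anyone wishing to reproduce it (the endpoints and the one-fifth transient reduction are those quoted above; the code is illustrative only):

    import math

    c0, c100 = 368.0, 836.0                                    # ppmv in 2000 and 2100 (A2)
    c_2010 = c0 * math.exp((10 / 100) * math.log(c100 / c0))   # exponential path, 10 years in: ~399.5 ppmv
    dT_equilibrium = 4.7 * math.log(c_2010 / c0)               # ~0.39 K
    dT_transient = 0.8 * dT_equilibrium                        # four-fifths of equilibrium: ~0.31 K
    print(round(c_2010, 1), round(dT_equilibrium, 2), round(dT_transient, 2))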
Barely significant warming: The RSS satellite data for the first decade of the 21st century show only a tenth of the warming that use of the IPCC’s methods would lead us to expect.
We make no apology, therefore, for labelling as “IPCC” a projection zone that is calculated on the basis of the methods described by the IPCC itself. Our intention in publishing these graphs is to provide a visual illustration of the extent to which the methods relied upon by the IPCC itself in determining climate sensitivity are reliable.
Some have also criticized us for displaying temperature records for as short a period as a decade. However, every month we also display the full 30-year satellite record, so as to place the current millennium’s temperature record in its proper context. And our detractors were strangely silent when, not long ago, a US agency issued a statement that the past 13 months had been the warmest in the instrumental record, and drew inappropriate conclusions from it about catastrophic “global warming”.
We have made one adjustment to please our critics: the IPCC projection zone in the SPPI temperature graphs now shows transient rather than equilibrium warming.
One should not ignore the elephant in the room. Our CO2 graph shows one elephant: the failure of CO2 concentration over the past decade to follow the high trajectory projected by the IPCC on the basis of global emissions similar to today’s. As far as we can discover, no one but SPPI has pointed out this phenomenon. Our temperature graph shows another elephant: the 30-year warming trend – long enough to matter – is again well below what the IPCC’s methods would project. If either situation changes, followers of our monthly graphs will be among the first to know. As they say at Fox News, “We report: you decide.”





Sorry, all bets are off; it disappeared from moderation already. I posted at Tamino’s blog, and I am sure it, too, will disappear, as if ignoring it will make the paper go away.
Dear Lord Monckton,
Thank you so much for this article! You are illustrating quite clearly the failure of the IPCC in predicting both the increase of the mass fraction of CO2 and the “anomalies” of the tropospheric temperature.
In your article, you have pointed at the criticism for not including the whole path length in your assessments; we have been criticized for the same reason when assessing other issues. We have carried out meticulous theoretical analyses that include path lengths, and we have demonstrated that our approach at ground level is not significantly different from the outcome obtained by considering the whole length of the troposphere’s column. We have demonstrated that even on Venus, a planet with an atmosphere composed of 95% carbon dioxide, the latter cannot cause such a “greenhouse” effect.
The real problem is that those people do not listen to the truth, they do not recognize facts, and they do not know about reality, because their set of beliefs is like a religion which confers on them such fabulous dividends that they are compelled to maintain their pseudoscience by denigrating honest scientists.
They are a bunch of science illiterates who have inundated the media and the Internet with their false “theories” and weird “algorithms” that, when confronted with experimental and observational information, fall apart. Their solitary purpose is denigrating real scientists.
Those people continue ignoring science, and they continue spreading their pseudoscience through diverse means. The rea$ons are more than obvious. This is why they get angry at any honest scientist who enlightens them on real science and shows them how the real world works.
Thanks again for your brilliant article and thanks to Anthony Watts and his team of collaborators for allowing Lord Monckton to share his knowledge with us.
All the best,
Nasif S. Nahle
Monckton: Why current trends are not alarming
To the contrary, there is a trend which is alarming indeed: not global warming, but another 1960s-style cooling, or worse, during the next 15-20 years. Winter heating energy requirements will be huge.
http://www.vukcevic.talktalk.net/Driver.htm
A driver impulse is required to keep up the temperatures; if it is absent, weak (e.g. 1640-1700), or not frequent enough, temperatures may fall to LIA levels.
I wish climate science behaved more like medical science. I understand medical science. Medical science constantly tests and retests various and sundry hypotheses and is constantly making adjustments according to the findings (i.e. empirical data). This is where we get the concept of “evidence based” medicine.
The AGW theory has been extant for at least 20 years. To date it remains an unproven theory. It has become heresy to dare challenge this theory. So let’s look at medicine as an example. Twenty years ago the theory that peptic ulcer disease may be due to infection with Helicobacter pylori was a novel idea (back then it was called Campylobacter pylori)… microbiologists earn their Ph.D.s by redefining taxonomy. Jeepers… turns out a couple of Australian FPs were right and the entire world of gastroenterologists was wrong. Most peptic ulcer disease actually IS due to infection by H. pylori. But it goes far beyond this.
Twenty years ago it was believed that antidepressants were ineffective in cases of reactional depression. Turns out they were, indeed, effective. Twenty years ago every med student was taught that beta-adrenergic blockers were absolutely contraindicated in cases of CHF. There was pretty sound pharmacologic theory to back this up. Today beta-blockers are a mainstay of treatment of CHF. In the interim billions of dollars have been wagered by pharmaceutical concerns on each side of the debate.
What stuns me is that climate science remains so rooted to a single, unprovable hypothesis… CO2 will heat up the planet. Even with the most convoluted proxies, anthropogenic CO2 has NEVER been PROVEN to cause global warming (or “climate change” if you prefer). It’s all theory backed by no empirical evidence. Using structure-activity relationships I can design a drug in a computer model. Do you think I could bring it to market without empirical evidence of efficacy? Why do we buy this crap?
Asked another way… would you cross a bridge designed by Michael Mann?
Shorter sentences, or at least semi-colons breaking up comma-containing list items, please. And, speaking, of commas, less of, them.
Fred H. Haynie says:
August 14, 2010 at 1:28 pm
“……..What appears to you as exponential is a segment of a natural wave near it’s minimum and rising. …………”
That’s brilliant, Fred, and much more plausible than the alarming exponentials. It is certainly worth further consideration, and I’ll look forward to seeing it borne out in the years to come. Just for the sake of this argument, though, it is good that Christopher is giving the IPCC as much benefit of the doubt as one possibly can, before considering more likely scenarios such as you suggest. One hopes he might consider taking it on board.
Gone from Tamino also. Romm and Tamino are cowards.
Bill Illis,
thank you for catching my mistake. When I did the screenshot of the IPCC PDF, I slipped into the wrong column. It does not materially change the graphical impact in the period 2000 to 2010. Here is the corrected version:
http://bluegrue.files.wordpress.com/2010/08/co2-ipcc-a2-scenario1.png
In addition, here are the IPCC scenarios you kindly provided the link for and the NOAA data plotted from 2000 to 2010.5, using the same scaling as Lord Monckton’s ten year plot.
http://bluegrue.files.wordpress.com/2010/08/co2-ipcc-vs-observation.png
Again, a stark contrast to the depiction by Lord Monckton.
Since people are fitting lines, and following up on what I said about the widget scaling of CO2 versus UAH temperature, I invite anyone with spreadsheet tools and annual CO2 and UAH data since 1979 to plot the annual CO2, scaled by 0.01 with 350 subtracted, against the 9-year running average of UAH temperature.
The agreement is somewhat amazing, showing that
UAH9 = 0.01*(CO2-350)-0.1
to a good approximation.
That is, the 9-year running mean temperature increases by 0.1 K for every 10 ppm CO2 increase. Nice round numbers to remember, too. As I mentioned, this is consistent with a feedback factor of 3. I hope someone with a Web page can plot this calculation to show it.
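For anyone who prefers Python to a spreadsheet, a rough sketch of the requested plot follows (the file names and column layouts here are assumptions, not real dataset locations):

    import numpy as np
    import matplotlib.pyplot as plt

    # hypothetical inputs: two-column text files of (year, value)
    years, co2 = np.loadtxt("co2_annual.txt", unpack=True)
    uyears, uah = np.loadtxt("uah_annual.txt", unpack=True)

    # 9-year centred running mean of the UAH annual anomalies
    uah9 = np.convolve(uah, np.ones(9) / 9, mode="valid")

    plt.plot(years, 0.01 * (co2 - 350.0) - 0.1, label="0.01*(CO2-350)-0.1")
    plt.plot(uyears[4:-4], uah9, label="UAH 9-yr running mean")
    plt.xlabel("Year")
    plt.legend()
    plt.show()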
The numerics in this thread are interesting. I have worked with the Mauna Loa CO2 data too, quite carefully I think, and have found that a very good fit indeed can be produced by first “deseasonalising” the monthly values to give what I call monthly differences. A preliminary step is to fit a simple regression to these data and generate the residuals. Plotting these against the time axis clearly invites modification of the model, and looks rather as if a quadratic would do a good job.
Given the numbers involved, it is useful to centre the time-axis data (the central date is May 1984) before generating the squared term. Now fit the monthly differences to Centred Decimal Year and its square, and you get a very good fit indeed to the observations. The adjusted R-squared statistic, which seems to be favoured in the climate business, comes out at 0.9986, a staggering value. Extrapolating gives values that translate to 492 ppm at 2050 and 675 ppm at 2100. The 95% confidence interval is about 5 units at the latter date!
These values seem to fall between Lord Monckton’s figures and the IPCC’s lower limit.
Now, no-one would seriously believe these figures. They are simply what comes out if you do the sums carefully /and/ assume that the current behaviour of factors influencing CO2 concentration will remain the same. This of course might also mean that the primary increases in CO2 generation caused by significant newcomers to the fuel burning scene would also be at a “constant” rate. Neither of these is likely to be true in practice, I guess.
So, we do careful projections but they are almost sure to be invalid.
Ho hum!
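The quadratic fit described above can be sketched in a few lines (the input file and the exact centring value are assumptions; the commenter’s own deseasonalising step is not reproduced here):

    import numpy as np

    # hypothetical input: deseasonalised monthly values, columns (decimal_year, ppmv)
    year, co2 = np.loadtxt("maunaloa_deseasonalised.txt", unpack=True)

    t = year - 1984.4                # centre the time axis near May 1984
    coef = np.polyfit(t, co2, 2)     # quadratic least-squares fit
    for target in (2050, 2100):
        print(target, round(np.polyval(coef, target - 1984.4), 1))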
@Dale Rainwater. Dave says:
August 14, 2010 at 2:12 pm
—–
Nice to meet you, Doc! Thanks for the post, the comparison to medicine is very compelling. The H. pylori story is a good one!
Back in the old days, folks thought bad smells and gases (miasmas) caused infectious illnesses. One of my heroes of history, Dr. John Snow, proved the relationship between contaminated water and cholera in London in 1854.
Did Snow over-react? Here’s a brief read on the topic:
http://www.ph.ucla.edu/epi/snow/removal.html
At least Snow had some sound data, was dealing with a lethal disease (unlike AGW, which hasn’t even been proven to exist, much less have deleterious effects) and backed up his policy with good, no B.S. politics and public relations.
The AGW team? Not so much….
From “A New And Effective Climate Model” (Stephen Wilde, Watts Up With That, April 6 ’10) http://wattsupwiththat.com/2010/04/06/a-new-and-effective-climate-model/
“Despite a substantial increase in the power of the Sun over billions of years the temperature of the Earth has remained remarkably stable.
My proposition is that the reason for that is the existence of water in liquid form in the oceans combined with a relatively stable total atmospheric density. If the power input from the sun changes then the effect is simply to speed up or slow down the hydrological cycle.”
“A change in the speed of the entire hydrological cycle does have a climate effect but as we shall see on timescales relevant to human existence it is too small to measure in the face of internal system variability from other causes.”
From “The Thermostat Hypothesis”, http://wattsupwiththat.com/2009/06/14/the-thermostat-hypothesis/ (Willis Eschenbach, June 14, ’09)
The Thermostat Hypothesis is that tropical clouds and thunderstorms actively regulate the temperature of the Earth. This keeps the Earth at an equilibrium temperature.
Several kinds of evidence are presented to establish and elucidate the Thermostat Hypothesis – historical temperature stability of the Earth, theoretical considerations, satellite photos, and a description of the equilibrium mechanism.
In response to Bluegrue, who says that the NOAA data fall within the bounds of the IPCC’s A2 projection while on the SPPI graph they fall below the projection zone: the SPPI graph is zeroed so that the projection zone commences from the left-hand (lower) end of the linear-regression trend-line. This is the clearest way to see whether the direction the data are taking is consistent with the projections, and that – as I have explained often before – is the purpose of the graph. No conclusion is to be drawn (or is drawn) from the fact that on our graph the data do not fall within the projection zone at the left-hand end.
In response to Geoflynx, who cites Wikipedia as authority for the “precautionary” “principle”: in common with many universities, I do not cite or use Wikipedia for any purpose, since it has justly been described as “the encyclopedia that any idiot can edit but only a cretin would credit”. That is not meant as an ad-hom against you, but Wikipedia is not an authority on anything.
As for the “precautionary” “principle”, it is neither precautionary nor a principle. It is a political expedient, and a costly and undesirable one. Before taking precautions, one should of course also take steps to ensure that the precautions themselves will not cause more harm than the supposed catastrophe that they are intended to prevent. The failure of the international community to remember that the “precautionary” “principle” should also be applied to the precautions themselves led directly to the deaths of some 40 million people, most of them children, from malaria when DDT was carelessly banned. It was only on 15 September 2006, long after those responsible for this murderous ban had retired or died, that Dr. Arata Kochi of the World Health Organization was able to say, “In this field, politics usually comes first and science second. We will now take a stand on the science and the data.” He lifted the ban on DDT, and the WHO once again recommended it as the first line of defense against the mosquito.
Mutatis mutandis, much the same could be said of the “precautions” against “global warming” today: there has been insufficient attention paid to the cost of the “precautions”, and the supposed consequences of “global warming”, as well as the extent of “global warming” itself, have been exaggerated. That is not a good or mature way to make policy.
In response to R. Gates, who asks whether there is an exponential growth rate for CO2, yes there is, but in the past dozen years there has been a decline from exponentiality in the direction of linearity. On any view, CO2 concentration is not rising at anything like even the lower-bound exponential curve projected by the IPCC on the A2 scenario. The implications for the final quantum of 21st-century warming are considerable: there will be a lot less of it than the IPCC projects, if the present decay towards linearity continues. Of course, it may not continue: but that is how things stand at present.
Icarus says:
August 14, 2010 at 12:41 pm
“Considering the climate chaos already being seen from the ~100ppm increase in atmospheric CO2 so far”
Icarus, the rest of your post is intelligent and constructive, but this statement is just asinine. Nobody has seen anything but weather, so far. Our only view of climate is into the past and that is peering through a lens almost opaque.
The data of IPCC scenario A2 as stored at ipcc-data.org does not correspond to the thick blue line in Lord Monckton’s third plot:
http://bluegrue.files.wordpress.com/2010/08/overlay-monckton-notalarming3-vs-a2.png
The data does, however, correspond to scenario A2 as depicted in fig. 10.26 of the IPCC AR4, page 803.
I downloaded the Mauna Loa CO2 monthly data from March 1958 through July 2010 and plotted it (Interpolated) using a common spreadsheet program that has trending analysis functions built in. There are 629 monthly data points.
Results-
Linear Trend
Y=0.1201x+308.62
R^2=0.9771
Exponential Trend
y=310.13e^0.000x
R^2=0.9822
The conclusion, based upon the data, is that the monthly level of CO2 increase at Mauna Loa can be accurately represented by either a Linear Trend or an Exponential Trend, with the Exponential Trend having a slightly higher R^2 value.
However, the plot for the month to month Delta PPM is hardly convincing that the rate of change is growing alarmingly.
For the monthly rate of change in Delta PPM (Interpolated), the results are as follows-
Linear Trend
y=0.0002x+0.0566
R^2=0.0009
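The linear and exponential fits to the level are easy to reproduce outside a spreadsheet. A sketch (the file name is an assumption; note that spreadsheet exponential trends fit a straight line to ln y, which is what is done here):

    import numpy as np

    # hypothetical input: columns (month_index, ppmv)
    x, y = np.loadtxt("maunaloa_monthly.txt", unpack=True)

    m, c = np.polyfit(x, y, 1)              # linear: y = m*x + c
    b, ln_a = np.polyfit(x, np.log(y), 1)   # exponential: ln(y) = b*x + ln(a)

    print(f"linear:      y = {m:.4f}x + {c:.2f}")
    print(f"exponential: y = {np.exp(ln_a):.2f} e^({b:.4f}x)")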
Julian in Wales
I wrote a non-tech bit on why the shape of a molecule does or does not contribute to that molecule’s role as a GHG. This is from an Aug. 5th post (No. 2) about CO2. The post and comments are a bit tough to follow, but if you search using some of the terms you now have, there are many papers on the web for various levels of expertise. Anyway, I wrote:
“Here is a very rough analogy. Recall the small hand-exercise thing shaped like a V with a spring. Squeeze the handles, let go, and the thing springs back to its original shape. Think of your squeezing as the absorption and the spring-back as the release of that energy. Think of this as a CO2 molecule. Now cut two 25 cm long pieces, one blue and one green, from wooden broom handles. Blue can be N2 and green can be O2. There is nothing to squeeze together. No squeeze – no energy absorbed. So, think of the nitrogen and oxygen gases (major components of earth’s atmosphere) as having no role to play in this little game. Only molecules with particular characteristics (read the post for them), and CO2 is one such, can play in this game.”
August 5 at 7:11 am: http://wattsupwiththat.com/2010/08/05/co2-heats-the-atmosphere-a-counter-view/
A man with outstanding credentials such as the ones mentioned in the link below, and the knowledge in Biochemistry to cure his ailment, will have my vote anytime.
http://www.ukip.org/content/latest-news/1675-christopher-a-man-of-many-talents
2008-present: RESURREXI Pharmaceutical: Director responsible for invention and development of a broad-spectrum cure for infectious diseases. Patents have now been filed. Patients have been cured of various infectious diseases, including Graves’ Disease, multiple sclerosis, influenza, and herpes simplex VI. Our first HIV patient had his viral titre reduced by 38% in five days, with no side-effects. Tests continue.
PJP says:
August 14, 2010 at 10:26 am
Thanks, PJP, for that interesting and informative description of feedback. The only issue I might take with it is that you describe a feedback of 10x, or 1000%. This would be exceedingly rare in most systems. I would expect a feedback to be below 100% in most cases, or any ‘movement’ would be wildly exaggerated.
Also, as I understand it (and that understanding may well be limited), any feedback below 100% will not result in a runaway feedback loop, as it will reach a plateau. Take 10% feedback: the steps go 100%, 110%, 111%, 111.1%, I believe, as you only get additional feedback on the feedback (if that makes sense).
BTW, the scenario with the microphone feedback, although a well-known one, is not ‘typical’ as there is a powerful amplifier in the system itself generating many times the original signal.
I fully agree that, without the imagined forcing and feedback, there is little or nothing to worry about. In fact, only if the feedback itself were massive, i.e. over 100%, would any scary scenario be possible.
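The plateau is easy to verify numerically; a minimal sketch with the 10% feedback from the example above:

    f = 0.10                 # feedback fraction, below 100%
    total, term = 0.0, 1.0
    for step in range(5):
        total += term
        print(f"step {step}: {100 * total:.2f}%")
        term *= f            # each round feeds back a fraction of the previous increment
    # converges to 1/(1-f) = 111.11...%, not a runaway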
Sorry, in that last comment of mine it should read:
Exponential Trend
y=310.13e^0.003x
R^2=0.9822
Let’s remember that not only are the IPCC’s main CO2 assumptions/projections/scenarios (or whatever RealClimate wants to call them) likely to be too high by the year 2100; the temperature response to that increased CO2 is also only half of that currently being projected.
This discussion should be mainly directed toward the temperature response per unit CO2 increase, not the CO2 increase. I may have inadvertently participated in that.
Now RealClimate says some model runs don’t have much warming in the last ten years so they accurately reflect the current 50% response trends (and somehow that says the models, as a whole, are therefore accurate). Well, that really says that the low temperature growth models/runs are the more accurate models so we should throw out the high temperature growth ones and focus on the low temperature growth ones. Or more accurately, we should be using the accurate almost-no-temperature-growth models.
CO2 (and methane) are / will be lower, and the temperature response per unit CO2 has been / will be lower.
To paraphrase James Sexton (August 14, 2010 at 12:59 pm) and CRS, Dr.P.H. (August 14, 2010 at 1:19 pm) on GeoFlynx (August 14, 2010 at 11:40 am)
a) The precautionary principle does not trump the scientific method.
b) Notwithstanding a), any proposed action in mitigation of the hypothesized CAGW must itself pass the precautionary test.
The burden, Mr Flynx, remains yours.
Monckton of Brenchley says:
August 14, 2010 at 4:04 pm
I cannot see why this, which seems to be the main point, is so hard to get across. Many seem to be either just not reading properly, or being wilfully ignorant. The projected exponential curve is just not being sustained.
I see on further reading of comments that my last was also paraphrasing the author of the post. My apologies for the omission, your Lordship.