Monckton: Why current trends are not alarming

Since there has been a lot of discussion about Monckton here and elsewhere, I’ve offered him the opportunity to present his views here. – Anthony

Guest post by Christopher Monckton of Brenchley

At www.scienceandpublicpolicy.org I publish a widely-circulated and vigorously-debated Monthly CO2 Report, including graphs showing changes in CO2 concentration and in global mean surface temperature since 1980, when the satellites went on weather watch and the NOAA first published its global CO2 concentration series. Since some commenters here at Wattsup have queried some of our findings, I have asked Anthony to allow me to contribute this short discussion.

We were among the first to show that CO2 concentration is not rising at the fast, exponential rate that current anthropogenic emissions would lead the IPCC to expect, and that global temperature has scarcely changed since the turn of the millennium on 1 January 2001.

CO2 concentration: On emissions reduction, the international community has talked the talk, but – not least because China, India, Indonesia, Russia, Brazil, and South Africa are growing so quickly – it has not walked the walk. Accordingly, carbon emissions are at the high end of the IPCC’s projections, close to the A2 (“business as usual”) emissions scenario, which projects that atmospheric CO2 will grow at an exponential rate between now and 2100 in the absence of global cuts in emissions:

Exponential increase in CO2 concentration from 2000-2100 is projected by the IPCC on its A2 emissions scenario, which comes closest to today’s CO2 emissions. On the SPPI CO2-concentration graph, this projection is implemented by way of an exponential function that generates the projection zone. This IPCC graph has been enlarged, its ordinate and abscissa labeled, and its aspect ratio altered to provide a comparison with the landscape format of the SPPI graph.

On the A2 emissions scenario, the IPCC foresees CO2 rising from a measured 368 ppmv in 2000 (NOAA global CO2 dataset) to a projected 836 [730, 1020] ppmv by 2100. However, reality is not obliging. The rate of increase in CO2 concentration has been slowing in recent years: an exponential curve cannot behave thus. In fact, the NOAA’s deseasonalized CO2 concentration curve is very close to linear:

CO2 concentration change from 2000-2010 (upper panel) and projected to 2100 (lower panel). The least-squares linear-regression trend on the data shows CO2 concentration rising to just 570 ppmv by 2100, well below the IPCC’s least estimate of 730 ppmv on the A2 emissions scenario.

The IPCC projection zone on the SPPI graphs has its origin at the left-hand end of the linear-regression trend on the NOAA data, and the exponential curves are calculated from that point so that they reach the IPCC’s projected concentrations in 2100.

We present the graph thus to show the crucial point: that the CO2 concentration trend is well below the least IPCC estimate. Some have criticized our approach on the ground that over a short enough distance a linear and an exponential trend may be near-coincident. This objection is more theoretical than real.

First, the fit of the dark-blue deseasonalized NOAA data to the underlying linear-regression trend line (light blue) is very much closer than it is even to the IPCC’s least projection on scenario A2. If CO2 were now in fact rising at a merely linear rate, and if that rate were to continue, concentration would reach only 570 ppmv by 2100.
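For readers who wish to reproduce this comparison, here is a minimal sketch. The series below is a placeholder standing in for the deseasonalized NOAA monthly data, not the real record; the same machinery also yields the best-fit exponential discussed next.

```python
import numpy as np

# Placeholder standing in for the deseasonalized NOAA monthly data:
# decimal years 2000-2010 and a near-linear CO2 record (ppmv).
years = np.arange(2000, 2010, 1 / 12.0)
co2 = 368.0 + 1.9 * (years - 2000)

# Least-squares linear trend, extrapolated to 2100.
slope, intercept = np.polyfit(years, co2, 1)
print("linear fit, 2100: %.0f ppmv" % (slope * 2100 + intercept))

# Best-fit exponential, found the same way by regressing log(CO2) on time.
b, log_a = np.polyfit(years, np.log(co2), 1)
print("exponential fit, 2100: %.0f ppmv" % np.exp(b * 2100 + log_a))
```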

Secondly, the exponential curve most closely fitting the NOAA data would be barely supra-linear, reaching just 614 ppmv by 2100, rather than the linear 570 ppmv. In practice, the substantial shortfall between prediction and outturn is important, as we now demonstrate. The equation for the IPCC’s central estimate of equilibrium warming from a given rise in CO2 concentration is:

T = 4.7 ln(C/C0),

where the bracketed term represents a proportionate increase in CO2 concentration. Thus, at CO2 doubling, the IPCC would expect 4.7 ln 2 = 3.26 K warming – or around 5.9 F° (IPCC, 2007, ch. 10, p. 798, box 10.2). On the A2 scenario, CO2 concentration is projected to more than double: equilibrium warming would be 3.86 K, and transient warming would be <0.5 K less, at 3.4 K.

But if we were to take the best-fit exponential trend on the CO2 data over the past decade, equilibrium warming from 2000-2100 would be 4.7 ln(614/368) = 2.41 K, comfortably below the IPCC’s least estimate and a hefty 26% below its central estimate. Moreover, applying the IPCC’s methods for determining climate sensitivity to the observed increases in the concentration of CO2 and five other climate-relevant greenhouse gases over the 55 years 1950-2005 would project a transient warming 2.3 times greater than the observed 0.65 K. Combining that overshoot with the IPCC’s apparent overestimate of CO2 concentration growth, anthropogenic warming over the 21st century could be as little as 1 K (less than 2 F°), which would be harmless and beneficial.
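Readers who wish to check these figures can do so in a couple of lines; the function below is simply the equation above, and the calls reproduce the numbers quoted in the text.

```python
import math

def equilibrium_warming(c, c0, lam=4.7):
    """IPCC central-estimate equilibrium warming (K) for a rise from c0 to c (ppmv)."""
    return lam * math.log(c / c0)

print(equilibrium_warming(2 * 368, 368))  # CO2 doubling: ~3.26 K
print(equilibrium_warming(836, 368))      # A2 central projection by 2100: ~3.86 K
print(equilibrium_warming(614, 368))      # best-fit exponential trend: ~2.41 K
```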

Temperature: How, then, has observed, real-world global temperature responded?

The UAH satellite temperature record shows warming at a rate equivalent to 1.4 K/century over the past 30 years. However, the least-squares linear-regression trend is well below the lower bound of the IPCC projection zone.

The SPPI’s graph of the University of Alabama at Huntsville’s monthly global-temperature anomalies over the 30 years since 1 January 1980 shows warming at a rate equivalent to 1.4 K/century – almost double the rate for the 20th century as a whole. However, most of the warming was attributable to a naturally-occurring reduction in cloud cover that allowed some 2.6 Watts per square meter of additional solar radiance to reach the Earth’s surface between 1981 and 2003 (Pinker et al., 2005; Wild et al., 2006; Boston, 2010, personal communication).

Even with this natural warming, the least-squares linear-regression trend on the UAH monthly global mean surface temperature anomalies is below the lower bound of the IPCC projection zone.

Some have said that the IPCC projection zone on our graphs should show exactly the values that the IPCC actually projects for the A2 scenario. However, as will soon become apparent, the IPCC’s “global-warming” projections for the early part of the present century appear to have been, in effect, artificially detuned to conform more closely to observation. In compiling our graphs, therefore, we decided not simply to accept the IPCC’s published projections at face value, but to establish just how much warming the use of the IPCC’s own methods for determining climate sensitivity would predict, and to take that warming as the basis for the IPCC projection zone.

Let us illustrate the problem with a concrete example. On the A2 scenario, the IPCC projects a warming of 0.2 K/decade for 2000-2020. However, given the IPCC’s projection that CO2 concentration will grow exponentially from 368 ppmv in 2000 towards 836 ppmv by 2100, CO2 should have been 368 exp[(10/100) ln(836/368)] = 399.5 ppmv in 2010, and equilibrium warming should thus have been 4.7 ln(399.5/368) = 0.39 K, which we reduce by one-fifth to yield transient warming of 0.31 K, more than half as much again as the IPCC’s 0.2 K. Of course, CO2 concentration in 2010 was only 388 ppmv, and, as the SPPI’s temperature graph shows (this time using the RSS satellite dataset), warming occurred at only 0.3 K/century: about a tenth of the transient warming that use of the IPCC’s methods would lead us to expect.
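The same arithmetic, spelled out as a check; the one-fifth reduction from equilibrium to transient warming follows the approximation used above.

```python
import math

C0, C2100 = 368.0, 836.0             # ppmv in 2000 and (A2 central estimate) 2100
c2010 = C0 * (C2100 / C0) ** 0.1     # exponential path evaluated at 2010: ~399.5 ppmv
eq = 4.7 * math.log(c2010 / C0)      # equilibrium warming, 2000-2010: ~0.39 K
transient = 0.8 * eq                 # less one-fifth: ~0.31 K
print(round(c2010, 1), round(eq, 2), round(transient, 2))
```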

Barely significant warming: The RSS satellite data for the first decade of the 21st century show only a tenth of the warming that use of the IPCC’s methods would lead us to expect.

We make no apology, therefore, for labelling as “IPCC” a projection zone that is calculated on the basis of the methods described by the IPCC itself. Our intention in publishing these graphs is to provide a visual illustration of the extent to which the methods relied upon by the IPCC itself in determining climate sensitivity are reliable.

Some have also criticized us for displaying temperature records for as short a period as a decade. However, every month we also display the full 30-year satellite record, so as to place the current millennium’s temperature record in its proper context. And our detractors were strangely silent when, not long ago, a US agency stated that the past 13 months had been the warmest in the instrumental record, and drew inappropriate conclusions from it about catastrophic “global warming”.

We have made one adjustment to please our critics: the IPCC projection zone in the SPPI temperature graphs now shows transient rather than equilibrium warming.

One should not ignore the elephant in the room. Our CO2 graph shows one elephant: the failure of CO2 concentration over the past decade to follow the high trajectory projected by the IPCC on the basis of global emissions similar to today’s. As far as we can discover, no one but SPPI has pointed out this phenomenon. Our temperature graph shows another elephant: the 30-year warming trend – long enough to matter – is again well below what the IPCC’s methods would project. If either situation changes, followers of our monthly graphs will be among the first to know. As they say at Fox News, “We report: you decide.”

James Davidson
August 14, 2010 11:32 am

All the IPCC’s “scenarios” are based on computer climate models. The first person to try to make a computer climate model was Edward Lorenz. He found that tiny changes in the initial conditions – changes in the fourth, fifth and sixth places after the decimal point – made huge differences to the outcome (the “butterfly effect”). Since measurements of initial conditions (temperature, pressure, etc.) can never be made to that degree of accuracy, he concluded that computer climate modeling was not possible in principle (“Deterministic Nonperiodic Flow”, Journal of the Atmospheric Sciences, 1963). This paper is seen as the foundation of the mathematical theory of deterministic chaos. As one of the scientists writing the Third Assessment Report of the IPCC put it: “In climate research and modeling, we should realise that we are dealing with a coupled, non-linear chaotic system, and therefore that long range forecasting of future climate states is not possible.” (TAR, Sec. 14.2.2.2.) Of course, computer models are still used in weather forecasting, and they give fairly accurate forecasts for about three days ahead. From three to seven days they are more speculative, and beyond that the Old Farmer’s Almanac is just as likely to give you an accurate forecast. The UK Meteorological Office has acknowledged as much: it will no longer issue seasonal forecasts after last year’s debacle. In 2009 it forecast a “barbecue summer” (it turned out cold and wet, with record flooding in some areas), to be followed by a “mild winter” (record snowfalls and record low temperatures). Now the Met Office issues forecasts one month ahead, updated weekly.
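Lorenz’s sensitivity is easy to reproduce. A minimal sketch of his 1963 system – the classic parameters, a simple Euler integration, and two initial states differing in the fourth decimal place:

```python
import numpy as np

def lorenz_step(s, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of the Lorenz (1963) system."""
    x, y, z = s
    return s + dt * np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

a = np.array([1.0, 1.0, 1.0])
b = np.array([1.0001, 1.0, 1.0])   # differs in the fourth decimal place
for step in range(3001):
    if step % 500 == 0:
        print("t=%5.1f  separation=%.6f" % (step * 0.01, np.linalg.norm(a - b)))
    a, b = lorenz_step(a), lorenz_step(b)
```

The separation grows by orders of magnitude, which is the commenter’s point about initial-condition accuracy.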

GeoFlynx
August 14, 2010 11:40 am

“The precautionary principle states that if an action or policy has a suspected risk of causing harm to the public or to the environment, in the absence of scientific consensus that the action or policy is harmful, the burden of proof that it is not harmful falls on those taking the action …..(Wikipedia).”
GeoFlynx – Smokey, this principle was drilled into all who studied under a certain Prof. Robert E. Riecker in the late sixties. It forms the foundation of risk analysis in civil engineering and many other disciplines that have brought society great benefit. While most who will read this post have expressed a great affinity for carbon dioxide, many others have indicated that our continued dumping of some 28 billion tons yearly into the Earth’s atmosphere is changing the environment in ways we are only beginning to understand. By requesting unassailable proof that these changes are harmful you are, in kind, reversing the meaning of this long-held tenet of science and engineering.

Tamsie
August 14, 2010 11:46 am

Monckton and Fox News, now that makes sense.

Dr. Dave
August 14, 2010 11:56 am

[moderators…could you check the black pit of lost comments, please]

Phil's Dad
August 14, 2010 12:04 pm

(I have posted the data/math behind this statement on this forum before.) The simple fact is there is not enough fossil fuel available to us to sustain an exponential increase in CO2 to the end of the century.
With regard to GeoFlynx’s (August 14, 2010 at 9:47 am) rather tired old “prove me wrong” challenge, I would suggest that the real danger is of the proposed mitigations (of the unproven, hypothesised cause of future potential harm) causing more certain harm than the thing they propose to avoid.

August 14, 2010 12:04 pm

GeoFlynx says:
“…our continued dumping of some 28 billion tons yearly into the Earth’s atmosphere is changing the environment in ways we are only beginning to understand.”
Well, try. Try to show any actual damage resulting. And keep in mind that your scary “28 billion tons” is only a few percent of the total.
Find the CO2. ☺

latitude
August 14, 2010 12:12 pm

Peter Whale says:
August 14, 2010 at 11:25 am
Could someone on either side of the debate tell me what weather conditions, over an agreed period of time, would turn the observations into what could be called climate, and would then either confirm catastrophic warming or confirm natural cause and variation?
=======================================================
Peter, zip zero zilch
I hate to keep bringing it up, but it’s such a good one I can’t help it.
The American dust bowl lasted for almost a decade. If that happened now, even most un-believers would believe it.
But what happened right after it?
Almost the same people that are now claiming catastrophic global warming, were predicting catastrophic ice age….
Hansen just got bit in the rear on that same thing.
His predictions of only 20 years have made him a joke.
So if a decade of drought and heat is followed by predictions of an ice age, which are followed by 20-year predictions that are wrong….
And anyone with any common sense can look back at the record and see what good shape we are really in….
My guess would be around 500-1000 years.

eddieo
August 14, 2010 12:13 pm

http://www.esrl.noaa.gov/gmd/ccgg/trends/#mlo_full
I’m afraid that the good Lord is cherry-picking when he claims that CO2 is increasing linearly. The long-term data suggest a slow growth in the rate of growth, which is not characteristic of a linear increase.
We are quick to accuse the alarmists of cherry-picking data, so let’s not fall into the same trap; it just gives them an easy target.

Lady Life Grows
August 14, 2010 12:15 pm

Drat! I want more CO2 because I am a biologist and more CO2 means more life–animals as well as plants.
I think it also means more longevity.
Esther Cook

GeoFlynx
August 14, 2010 12:25 pm

Smokey says:
August 14, 2010 at 12:04 pm
Well, try. Try to show any actual damage resulting. And keep in mind that your scary “28 billion tons” is only a few percent of the total.
GeoFlynx – Try typing “environmental damage due to anthropogenic co2” into Google Scholar. I got 23,600 hits that would meet your request.

August 14, 2010 12:39 pm

duckster says:
August 14, 2010 at 7:34 am
“Have a look at the recent trends in CO2 emissions from Mauna Loa, and they don’t look anything like linear. Of course, if you cherry pick a couple of years from the record, it’s very easy to make them look like they aren’t increasing, but when you look at the whole record from 1960, it is VERY apparent that they are increasing exponentially. But hey, don’t believe me – check for yourself: Full Mauna Loa CO2 record.”
year ppm/yr
1959 0.95
1960 0.51
1961 0.95
1962 0.69
1963 0.73
1964 0.29
1965 0.98
1966 1.23
1967 0.75
1968 1.02
1969 1.34
1970 1.02
1971 0.82
1972 1.76
1973 1.18
1974 0.78
1975 1.10
1976 0.92
1977 2.09
1978 1.31
1979 1.68
1980 1.80
1981 1.43
1982 0.72
1983 2.16
1984 1.37
1985 1.24
1986 1.51
1987 2.33
1988 2.09
1989 1.27
1990 1.31
1991 1.02
1992 0.43
1993 1.35
1994 1.90
1995 1.98
1996 1.19
1997 1.96
1998 2.93
1999 0.94
2000 1.74
2001 1.59
2002 2.56
2003 2.29
2004 1.55
2005 2.56
2006 1.72
2007 2.17
2008 1.66
2009 1.89
I checked.
The year-to-year growth rates look, to me, highly uncorrelated with the CO2 increase itself (though maybe correlated with temperature). E.g.:
1982 0.72
1983 2.16
and
1998 2.93
1999 0.94
stand out to this mark one eyeball, not exponential anything.
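One way to go beyond the mark-one eyeball is to fit a trend to those annual increments themselves: flat increments mean linear growth, rising increments mean faster-than-linear. A quick sketch using the figures listed above:

```python
import numpy as np

years = np.arange(1959, 2010)
growth = np.array([0.95, 0.51, 0.95, 0.69, 0.73, 0.29, 0.98, 1.23, 0.75, 1.02,
                   1.34, 1.02, 0.82, 1.76, 1.18, 0.78, 1.10, 0.92, 2.09, 1.31,
                   1.68, 1.80, 1.43, 0.72, 2.16, 1.37, 1.24, 1.51, 2.33, 2.09,
                   1.27, 1.31, 1.02, 0.43, 1.35, 1.90, 1.98, 1.19, 1.96, 2.93,
                   0.94, 1.74, 1.59, 2.56, 2.29, 1.55, 2.56, 1.72, 2.17, 1.66,
                   1.89])  # ppm/yr increments as listed in the comment
slope = np.polyfit(years, growth, 1)[0]
print("trend in annual growth: %+.3f ppm/yr per year" % slope)
```

A positive slope in the increments would favour the “exponential” reading even though individual years scatter widely.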

James Sexton
August 14, 2010 12:40 pm

duckster says:
August 14, 2010 at 11:31 am
“You are actually willing to accept Beck and Frieburg uncritically? Have you looked at their graphs? What would be the mechanisms for such a massive release of CO2, and then its reabsorption into environmental sinks – and in such a tiny period of time? Almost none of the other proxies even remotely reflect this massive amount of CO2. You’d need to do a lot of accounting for all this to show how it would work.”
Beck and Frieburg do make a compelling statement, but I don’t accept very much uncritically. Why do you (by implication) reject it out of hand? Wouldn’t this point to factors in the climate we don’t fully understand yet? The fact is I was merely pointing out that atmospheric CO2 existed prior to 1960. I gave two examples of higher CO2 levels by other people’s assertion.
It has been stated that plant life is maintained by atmospheric CO2 being > 150 ppmv. Given our understood history of the planet, or even the history of mankind, does it stand to reason that CO2 levels were static? How is it, through all of the campfires, smelts, coal consumption, volcanoes, forest fires, exhaling, etc., that from, say, 10,000 BC to 1960 AD the CO2 level never exceeded what it is today? Is it logical to believe CO2 levels started at 150 and ever so slowly graduated to 300 by 1960? Not to me. The thought doesn’t stand to logic.
If it were me arguing that CO2 levels are approaching dangerous levels, I’d probably feel compelled to find examples in the past where one may draw those conclusions, instead of trying to pretend the history of CO2 didn’t exist before 1960. Given man’s inability to create elements such as carbon and oxygen, I’d be inclined to believe the changing of C and O into various forms has happened at times in the past, given that the climatic conditions of the earth have been and always will be volatile. But that is only if I were advocating significant social and economic change for the earth’s inhabitants.

Icarus
August 14, 2010 12:41 pm

It’s easy to see that the rise in CO2 has been greater than linear for many decades – simply projecting this increase forwards as a mathematical exercise would get it to ~660ppm in the year 2100 (there is no legitimate reason to choose a linear projection, as Monckton has done, when the historical rate of increase hasn’t been linear).
However, it’s extremely unlikely that a simple mathematical projection for 90 years into the future is of any value. Considering the climate chaos already being seen from the ~100ppm increase in atmospheric CO2 so far, the fact that fossil fuels are getting harder and more expensive to extract, and all the other constraints associated with growth, ‘business as usual’ for global industrial society for the next 90 years seems unlikely.
As far as the rise in global temperature is concerned, this is proceeding at around 0.18°C per decade, with no sign of any change in this warming trend. Any analysis of global temperature based on 10 years of data is of course completely worthless, as the signal of any long-term trend would not be distinguishable from the noise of interannual variability. The current global warming, which Monckton acknowledges, is more than enough to be alarmed about, given its effects so far and the fact that more warming is inevitable for some decades to come regardless of what happens – from the unrealised warming of the current radiative imbalance, plus whatever increase comes from the continued rise of greenhouse-gas emissions from human activity in the next few decades.
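The signal-versus-noise point can be quantified with a toy Monte Carlo: impose a fixed 0.18 K/decade trend, add autocorrelated noise – the AR(1) parameters below are illustrative assumptions, not estimates from real data – and compare the scatter of fitted 10-year and 30-year trends.

```python
import numpy as np

rng = np.random.default_rng(0)
trend = 0.018                # assumed true trend, K/yr (= 0.18 K/decade)
phi, sigma = 0.6, 0.1        # illustrative AR(1) persistence and noise level

def trend_spread(n_years, n_trials=1000):
    """Std. dev. of least-squares trends fitted to synthetic monthly records."""
    t = np.arange(n_years * 12) / 12.0
    slopes = []
    for _ in range(n_trials):
        noise = np.zeros(t.size)
        for i in range(1, t.size):
            noise[i] = phi * noise[i - 1] + rng.normal(0.0, sigma)
        slopes.append(np.polyfit(t, trend * t + noise, 1)[0])
    return np.std(slopes)

for n in (10, 30):
    print("%2d-year record: fitted trend spread = +/-%.3f K/yr" % (n, trend_spread(n)))
```

The 10-year spread comes out several times larger than the 30-year spread, which is why decade-long trends say little either way.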

Bill Illis
August 14, 2010 12:52 pm

bluegrue says:
August 14, 2010 at 11:24 am
Your chart is the A1FI scenario, which ends up being the highest one. Here is a short table of all the main IPCC CO2 scenarios. 2010 will be around 388 ppm, so one can see that A1FI, A2 and the more commonly cited A1B are a little too high right now. The out-year growth rates will make them far too high by 2100. The “B” slow-growth group looks to be closer.
http://www.ipcc-data.org/ancilliary/tar-isam.txt
Monthly global CO2 levels to the end of June are here.
ftp://ftp.cmdl.noaa.gov/ccg/co2/trends/co2_mm_gl.txt

R. Gates
August 14, 2010 12:53 pm

Monckton of Brenchley says:
August 14, 2010 at 10:09 am
In answer to R. Gates, I did not state in my posting that the CO2 concentration graph for the past ten years was “linear”: merely that it was “very close to linear”.
Summing the absolute differences between the monthly NOAA data and the corresponding points on the least-squares linear regression trend generates a very substantially lower value than similarly comparing the monthly NOAA data and the IPCC’s lower-bound exponential curve.
Furthermore, I did not confine the analysis to the linear regression trend. I also determined the exponential curve over the past decade that would give the closest fit to the NOAA data, and, as I explained in my posting, that curve would reach 614 ppmv by 2100 – not much above the 570 ppmv towards which the linear trend is heading.
My conclusion was, and is, that if the current decay of the NOAA data from true exponentiality – which has persisted now for a dozen years – were to continue, a substantial fraction of the anthropogenic warming that would otherwise have been expected to occur over the 21st century on the basis of the IPCC’s methods would not in fact occur. That is the central point demonstrated by our CO2 graph.
_______
Thanks for the reply. I’d be very curious to get your response to these two graphs. The first is a log transform of the whole CO2 data set, which, as you know, will be a straight line if CO2 is growing exponentially. The graph looks like this:
http://tamino.files.wordpress.com/2010/04/mloco21.jpg
Which of course shows that there is an exponential growth rate, and perhaps this is what you mean by “nearly linear”, though you’ve only looked at 10 years. But a second graph seems to be far more interesting: it is a log transform of the complete CO2 data set that shows not just whether there is an exponential rate of growth in CO2, but how that growth rate (over the longest term) is changing. Here’s what it shows:
http://tamino.files.wordpress.com/2010/04/logco2.jpg
It would appear that not only is there definitely an exponential growth rate, but that growth rate seems to be changing in an exponential fashion as seen in the log data.
Doesn’t it seem more accurate to take this longer-term perspective, where we can smooth out the noise from shorter-term climate fluctuations that might show up in a 10-year chart such as you’re using?
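That check is easy to run for oneself: fit a straight line to log(CO2) and look for curvature in the residuals – a positive quadratic term means the growth rate is itself increasing. A minimal sketch, with a synthetic series standing in for the Mauna Loa record:

```python
import numpy as np

# Synthetic stand-in for the Mauna Loa annual means (ppm); substitute the real record.
t = np.arange(1959, 2010)
co2 = 315.0 * np.exp(0.004 * (t - 1959) + 2e-5 * (t - 1959) ** 2)

log_co2 = np.log(co2)
resid = log_co2 - np.polyval(np.polyfit(t, log_co2, 1), t)   # curvature shows up here
quad = np.polyfit(t, log_co2, 2)[0]
print("largest residual from a straight line in log space: %.5f" % np.abs(resid).max())
print("quadratic term in log space (positive = accelerating): %.2e" % quad)
```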
Also, the IPCC presented several scenarios for future climate projections, of which A2 was only one; none of them represents an actual forecast or prediction, and some show temperatures and CO2 more in line with 600-650 ppm. Why did you choose this one of several IPCC scenarios over the others, when none of them is any more valid than another, and several have model points that match what we’ve actually seen happen over the past decade?
Finally, I do want to say how pleased I am to actually get a chance to converse with you, as I’ve followed you for several years. Though I am one of the few AGW “warmists” here on WUWT and disagree with some of your positions on climate, I do respect the fact that your efforts generate honest thought and dialog on these topics. As usual, hats off to Anthony for creating this opportunity for that dialog.

James Sexton
August 14, 2010 12:59 pm

GeoFlynx says:
August 14, 2010 at 11:40 am
“The precautionary principle states that if an action or policy has a suspected risk of causing harm to the public or to the environment, in the absence of scientific consensus that the action or policy is harmful, the burden of proof that it is not harmful falls on those taking the action …..(Wikipedia).”
Which is exactly why we shouldn’t proceed with any of these alleged cures for CO2 levels.
The advocates of the CAGW theory are asking us to shut down our industrial growth. The implications are far-reaching as to land use and the movement of social and economic goods and services. Further, applying the various suggestions for lowering global CO2 emissions necessitates stunted growth in third-world nations and would preclude them from the benefits of industrialization, such as cheap and reliable electricity, plants able to make water potable, power for computers and servers, communication, etc. With this in mind, the third-world nations’ problems would necessarily expand to the rest of the world in terms of civil and uncivil unrest.
Therefore, applying the “precautionary principle”, I say we should avoid the restructuring of the socio-economic systems of the planet. I think the dangers far exceed any potential gain. But then, it isn’t for me to make the case; it would be for the ones advocating such things as “cap and trade”, the shutdown of coal-fired energy plants, the withholding of aid to impoverished nations for energy advancement, etc.

August 14, 2010 1:13 pm

GeoFlynx,
Please. You can type just about anything into an internet search and get thousands of hits. That means nothing.
Try to give a verifiable, testable observation that shows an increase in CO2 is causing measurable damage. If you can.
R. Gates: put this in your non-“linear” pipe and smoke it.☺

CRS, Dr.P.H.
August 14, 2010 1:19 pm

GeoFlynx says:
August 14, 2010 at 9:47 am
The contention and absolute burden of proof, that the continued artificial alteration of the Earth’s atmosphere is harmless, remains with those who would continue that alteration. Sadly, the changes wrought on this planet in the last half century have rendered that position untenable. Margaret Thatcher herself would agree that we have a duty and obligation, for the benefit of future generations, to end this experiment as quickly and thoroughly as we can.
—-
What exactly are you talking about?? Ever since passage of the Clean Air Act in the 1970s (signed into law by Pres. Nixon), vast strides have been made in reducing the burden of particulates, aerosols, and compounds including those of sulfur and mercury emitted to the atmosphere.
According to your Precautionary Principle, the trial is over, the data is established, so let’s now save the world despite the fact that the costs will likely drive the already fragile global economy into a tailspin.
Sorry, I’ve worked in climate change science for over 30 years (natural wetland and manure methane mitigation), and I don’t buy the CAGW argument. Destroying the world’s wealth to chase chimeras like that is supremely counterproductive, and it doesn’t appear that many world citizens are lining up to participate.
As we say in Chicago, “stick a fork in it”….it’s done.

August 14, 2010 1:28 pm

Duckster,
What appears to you as exponential is a segment of a natural wave near its minimum and rising. I have analyzed the raw flask data from all over the globe and have come up with a global model for the natural background. What Scripps reports as monthly averages is natural background, because they do not include in their averages the measured high values that could potentially be anthropogenic. Do the math. First convert time to radians by multiplying by 2*PI. Then regress the monthly averages on the following cycles. The annual cycle at Mauna Loa has a sawtooth form which can be expressed as sin(x-0.81)-sin(2*(x-0.81))/4. Four other statistically significant cycles have the form sin(x/a+b). The x term is 2*pi*year. The a term is wavelength in years and the b term positions the wave. The four (a, b) pairs for the cycles are: 307.5 and -0.18, 20.5 and -1.28, 9.93 and 2.5, and 5.1 and 0.65. The global model accounts for the changing shape and position of the annual cycle and has an R-squared better than 0.99. Extrapolating this model into the future suggests that CO2 at Mauna Loa will max out within 2 ppmv of 499 in March 2091. Since I won’t be around then, pick a date within about 15 years and see how well the model predicts. I’ll bet it will do a lot better than exponential and a little better than linear.
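For anyone who wants to test this recipe, here is a sketch of the regression as described – the basis terms are taken from the comment verbatim, while the CO2 series below is a placeholder that must be replaced with the real Scripps/NOAA monthly averages:

```python
import numpy as np

def basis(year):
    """Regressors as described: x = 2*pi*year, a sawtooth annual cycle, and
    four long cycles sin(x/a + b) with the quoted (a, b) pairs. The very long
    307.5-year cycle supplies the secular rise in this model."""
    x = 2 * np.pi * year
    cols = [np.ones_like(x),
            np.sin(x - 0.81) - np.sin(2 * (x - 0.81)) / 4]
    for a, b in [(307.5, -0.18), (20.5, -1.28), (9.93, 2.5), (5.1, 0.65)]:
        cols.append(np.sin(x / a + b))
    return np.column_stack(cols)

# Placeholder series; replace with the real monthly averages.
year = np.arange(1959, 2010, 1 / 12.0)
co2 = 315.0 + 1.5 * (year - 1959)

coef, *_ = np.linalg.lstsq(basis(year), co2, rcond=None)
fitted = basis(year) @ coef
print("R-squared: %.4f" % (1 - np.var(co2 - fitted) / np.var(co2)))
```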

thechuckr
August 14, 2010 1:32 pm

There has been an amazing paper in the Annals of Applied Statistics that may put the final stake in the “hockey stick” graph. “Robust” is an understatement.
http://www.e-publications.org/ims/submission/index.php/AOAS/user/submissionFile/6695?confirm=63ebfddf
It can also be found at Climate Audit. Just for a few laughs I posted about this paper at Climate Progress. Anyone want to give me odds on Romm actually letting it get through moderation?

Julian in Wales
August 14, 2010 1:38 pm

PJP, that is the brilliant summary I was looking for. Thank you too to James Sexton and Steve. Being a regular reader of WUWT I had absorbed bits of the picture but not put it together into a coherent whole. I think all science is in the end an extension of common sense, and the only science worth taking seriously is science that is understandable to anyone willing to be sensible. But it is possible to have the wool pulled over one’s eyes by scientists who are dishonest.
So I think that honesty is the other half of common sense; it really is not sensible to trust someone who inverts graphs twice or sets out to deceive about the number of papers they have written. Anyone who is dishonest on such a basic level is not going to be a trustworthy source. This alone rules out a non-scientist ever being able to take the hockey-stick graph seriously.
I am an artist by trade, and my work involves finding patterns that are consistent; I do not know if a pattern is consistent in nature until I have seen it maybe hundreds of times. For instance, I will sit in front of the television and draw faces for three hours every evening, and in the end I see patterns that are common to all faces; these consistent repeating patterns become the basis of all my drawings of faces. I believe a chess player sees the chess board in terms of patterns too. Surely science is the same: you can see any number of patterns in nature, but only a few are consistent. When you see a pattern jumping out at you again and again, you know it has some fundamental truth to it.
The CO2 theory has no such consistency; it is a pattern we may see once or twice, but when you look for it again it simply is not there anymore. It is an illusion.

thechuckr
August 14, 2010 1:42 pm

It has been submitted, not yet published, my bad.

Joe Spencer
August 14, 2010 1:46 pm

Ralph says:
August 14, 2010 at 10:26 am


Isn’t it rather, more simply, Ralph, that they just know there’s nothing to be gained from blaming it on nations who either can’t afford, or simply aren’t going to indulge, their elaborate but fantastic fabrications about CO2?
They know how to play a guilt trip on those self-indulgent enough to be taken in by it, which seems to be mainly the developed Western world.

Joe Spencer
August 14, 2010 1:48 pm

That was following Ralph’s observation:
Ralph says:
August 14, 2010 at 10:26 am
“……….
Only Westerners can be evil bogey-men, in the eyes of AGW Greenies, while all other nations are as pure as the driven snow.
……..”

Harold Pierce Jr
August 14, 2010 1:56 pm

PJP says:
“The AGW proponents realize that on its own, increasing CO2 concentrations are not going to make a huge difference, so they have to include other factors to multiply the effect a small temperature increase due to increasing CO2. Typically, they assume that increasing CO2/temperature will drastically increase the concentration of water vapor, and that will make the big change in temperature.”
The humidity table does not support the positive water-vapor feedback mechanism. Air at 14 deg C, one atm pressure and 100% humidity holds 12.1 g of water vapor per cubic meter. If the temperature of this air is increased to 15 deg C, one cubic meter will now hold 12.8 g of water vapor, an increase of only 0.7 g per cubic meter – about 14 drops of water from an eye dropper. Air only has 100% humidity if there is rain, snow, or heavy fog.
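These table values can be checked numerically. The Magnus approximation used below is one standard formula (an assumption here; the figures above presumably come from a printed humidity table):

```python
import math

def sat_vapor_density(t_c):
    """Saturation water-vapor density (g/m^3) at t_c deg C, from the Magnus
    approximation for vapor pressure plus the ideal-gas law (Rv = 461.5 J/kg/K)."""
    e_s = 611.2 * math.exp(17.62 * t_c / (243.12 + t_c))   # saturation pressure, Pa
    return 1000.0 * e_s / (461.5 * (t_c + 273.15))

print("%.1f g/m^3" % sat_vapor_density(14.0))   # ~12.0 (table gives 12.1)
print("%.1f g/m^3" % sat_vapor_density(15.0))   # ~12.8
```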
So how does the enormous amount of water from surface waters enter the atmosphere? The wind is the major force that transports water into the air. A strong wind can transport water into the air at 1-2 orders of magnitude the rate of still-air evaporation. Even though a strong wind will cool the surface water, the molecules of nitrogen and oxygen have so much momentum that they blast water molecules right out of the surface.
Air pressure also controls the transport of water into the air. High-pressure cells have dry air, whereas low-pressure cells have moist air and often bring rain, snow, sleet and hail. The lake effect is an example of a strong wind that moves water from the lake surface onto the land.
The ever-decreasing pressure of a hurricane as it moves over warm water causes water to “flash evaporate” into the air.
See how easy it is to shoot down the AGW hypothesis.
