Statistical Significances – How Long Is "The Pause"? (Now Includes September Data)


Image Credit: WoodForTrees.org

Guest Post By Werner Brozek, Edited By Just The Facts, Update/Additional Explanatory Commentary from Nick Stokes

UPDATE: RSS for October has just come out and the value was 0.207. As a result, RSS has now reached the 204-month, or 17-year, mark. The slope over the last 17 years is -0.000122111 °C per year.

The graphic above shows five lines. The long horizontal line shows that RSS is flat from November 1996 to September 2013, a period of 16 years and 11 months, or 203 months. All three programs are unanimous on this point. The two sloped lines that are closer together mark the error bars from Nick Stokes' Temperature Trend Viewer page; the two sloped lines that are further apart mark the error bars from SkS's Temperature Trend Calculator. Nick Stokes' program gives much tighter error bars, so his periods without significance at the 95% level are shorter than those from SkS.

In my previous post on August 25, I said: on six different data sets, there has been no statistically significant warming for between 18 and 23 years. That statement was based on the trend from the SkS page. Based on the trend from Nick Stokes' page, however, there has been no statistically significant warming for between 16 and 20 years on several different data sets. In this post, I have used Nick Stokes' numbers in Section 2 as well as in row 8 of the table below. Please let us know what you think of this change. I have asked Nick Stokes to join this thread to answer any questions about the different methods of calculating 95% significance and to defend his chosen method. Nick's trend page offers numbers for Hadsst3, so I have also switched from Hadsst2 to Hadsst3. WFT offers numbers for Hadcrut3, but I can no longer offer error bars for that set since Nick's program only covers Hadcrut4.

In the future, I am not interested in using the trend methodology/page that offers the longest times, nor the one that offers the shortest times, nor the one that enjoys the widest consensus. What I am interested in is the trend methodology/page that offers the most accurate representation of Earth's temperature trend. I thought it was SkS, but I may have been wrong. Please let us know in the comments whether you think SkS's or Nick Stokes's methodology/page is more accurate, and if you can offer a more accurate one, please let us know that too.

According to NOAA’s State of the Climate In 2008 report:

The simulations rule out (at the 95% level) zero trends for intervals of 15 yr or more, suggesting that an observed absence of warming of this duration is needed to create a discrepancy with the expected present-day warming rate.

In this 2011 paper “Separating signal and noise in atmospheric temperature changes: The importance of timescale” Santer et al. found that:

Because of the pronounced effect of interannual noise on decadal trends, a multi-model ensemble of anthropogenically-forced simulations displays many 10-year periods with little warming. A single decade of observational TLT data is therefore inadequate for identifying a slowly evolving anthropogenic warming signal. Our results show that temperature records of at least 17 years in length are required for identifying human effects on global-mean tropospheric temperature.

In 2010 Phil Jones was asked by the BBC, “Do you agree that from 1995 to the present there has been no statistically-significant global warming?”, Phil Jones replied:

Yes, but only just. I also calculated the trend for the period 1995 to 2009. This trend (0.12C per decade) is positive, but not significant at the 95% significance level. The positive trend is quite close to the significance level. Achieving statistical significance in scientific terms is much more likely for longer periods, and much less likely for shorter periods.

I’ll leave it to you to draw your own conclusions based upon the data below.

Note: If you read my recent article RSS Flat For 200 Months (Now Includes July Data) and just wish to know what is new with the August and September data, you will find the most important new information from row 7 to the end of the table. And as mentioned above, all lines for Hadsst3 are new.

In the sections below, we will present you with the latest facts. The information will be presented in three sections and an appendix. The first section will show for how long there has been no warming on several data sets. The second section will show for how long there has been no statistically significant warming on several data sets. The third section will show how 2013 to date compares with 2012 and the warmest years and months on record so far. The appendix will illustrate sections 1 and 2 in a different way. Graphs and a table will be used to illustrate the data.

Section 1

This analysis uses the latest month for which data is available on WoodForTrees.org (WFT). All of the data on WFT is also available at the specific sources outlined below. We start with the present and go back to the furthest month in the past where the slope is at least slightly negative. So if the slope from September is +4 x 10^-4 but is -4 x 10^-4 from October, we give the time from October, so no one can accuse us of being less than honest when we say the slope is flat from a certain month.
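The search just described can be sketched in a few lines of Python. This is only an illustration of the procedure, not WFT's actual code; `flat_since` and the series passed to it are hypothetical names:

```python
import numpy as np

def flat_since(anoms):
    """Return the index of the earliest start month from which the
    least-squares slope of the series is zero or negative, i.e. the
    start of the longest 'flat' period ending at the latest month.
    anoms: 1-D array of monthly temperature anomalies, oldest first.
    Returns None if every possible start month gives a positive slope."""
    n = len(anoms)
    for start in range(n - 2):          # need at least 3 points for a trend
        y = anoms[start:]
        x = np.arange(len(y), dtype=float)
        slope = np.polyfit(x, y, 1)[0]  # OLS slope per month
        if slope <= 0.0:
            return start                # earliest (longest) flat period
    return None
```

Scanning from the oldest possible start month forward guarantees the first hit is the longest flat period, which matches the "furthest month in the past" rule above.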

On the data sets below, the periods for which the slope is at least very slightly negative range from 8 years and 9 months to 17 years.

1. For GISS, the slope is flat since September 1, 2001 or 12 years, 1 month. (goes to September 30, 2013)

2. For Hadcrut3, the slope is flat since May 1997 or 16 years, 5 months. (goes to September)

3. For a combination of GISS, Hadcrut3, UAH and RSS, the slope is flat since December 2000 or 12 years, 10 months. (goes to September)

4. For Hadcrut4, the slope is flat since December 2000 or 12 years, 10 months. (goes to September)

5. For Hadsst3, the slope is flat since November 2000 or 12 years, 11 months. (goes to September)

6. For UAH, the slope is flat since January 2005 or 8 years, 9 months. (goes to September using version 5.5)

7. For RSS, the slope is flat since November 1996 or 17 years (goes to October)

RSS is 203/204 or 99.5% of the way to Ben Santer’s 17 years.

The next link shows just the trend lines to illustrate the above for the data sets that can be shown. Think of it as a sideways bar graph where the lengths of the lines indicate the relative periods over which the slope is 0. In addition, the sloped wiggly line shows how CO2 has increased over this period.

[Graph: WoodForTrees.org – Paul Clark]

When two quantities are plotted as I have done, the left axis shows only the temperature anomaly.

The actual numbers are meaningless since all slopes are essentially zero and the position of each line is merely a reflection of the base period from which anomalies are taken for each set. No numbers are given for CO2. Some have asked that the log of the concentration of CO2 be plotted. However WFT does not give this option. The upward sloping CO2 line only shows that while CO2 has been going up over the last 17 years, the temperatures have been flat for varying periods on various data sets.
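For readers who want the logarithmic version that WFT cannot plot, the conversion can be done offline. The sketch below uses the commonly cited simplified expression F = 5.35 ln(C/C0) W/m^2 (Myhre et al.); the CO2 concentrations shown are approximate annual means used purely for illustration:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate CO2 radiative forcing in W/m^2 relative to a
    reference concentration, using the simplified logarithmic
    expression F = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Illustrative: forcing change over roughly the flat period discussed above
f_1996 = co2_forcing(362.0)   # approximate 1996 annual mean, ppm
f_2013 = co2_forcing(396.0)   # approximate 2013 annual mean, ppm
delta = f_2013 - f_1996       # about 0.5 W/m^2 over the period
```

Because the relationship is logarithmic, plotting log(CO2) instead of CO2 itself would flatten the curve slightly, but over 17 years the visual difference is small.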

The next graph shows the above, but this time, the actual plotted points are shown along with the slope lines and the CO2 is omitted:

[Graph: WoodForTrees.org – Paul Clark]

Section 2

For this analysis, data were retrieved from Nick Stokes' moyhu.blogspot.com. This analysis indicates for how long there has not been statistically significant warming according to Nick's criteria. Data go to their latest update for each set. In every case, note that the lower error bar is negative, so a slope of 0 cannot be ruled out from the month indicated.

On several different data sets, there has been no statistically significant warming for between 16 and 20 years.

The details for several sets are below.

For UAH: Since November 1995: CI from -0.001 to 2.501

For RSS: Since December 1992: CI from -0.005 to 1.968

For Hadcrut4: Since August 1996: CI from -0.006 to 1.358

For Hadsst3: Since May 1993: CI from -0.002 to 1.768

For GISS: Since August 1997: CI from -0.030 to 1.326
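The intervals above come from an autocorrelation-aware trend calculation. The sketch below shows the general idea using the AR(1)/Quenouille correction that Nick's viewer is described as using; it is not his actual code, and the units (per century, to match the numbers above) and the helper name `trend_ci` are my own choices:

```python
import numpy as np

def trend_ci(anoms, z=1.96):
    """OLS trend of a monthly anomaly series with an approximate 95% CI.
    The naive standard error is inflated for AR(1) autocorrelation using
    the Quenouille effective-sample-size correction.
    Returns (trend, lower, upper) in units per century."""
    n = len(anoms)
    x = np.arange(n, dtype=float)
    slope, intercept = np.polyfit(x, anoms, 1)
    resid = anoms - (slope * x + intercept)
    # lag-1 autocorrelation of the residuals
    r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
    # clamp: negative autocorrelation treated as zero for simplicity
    r1 = max(min(r1, 0.99), 0.0)
    # naive standard error of the slope
    se = np.sqrt(np.sum(resid**2) / (n - 2) / np.sum((x - x.mean())**2))
    # Quenouille: inflate the variance by (1 + r1) / (1 - r1)
    se_adj = se * np.sqrt((1.0 + r1) / (1.0 - r1))
    per_century = 1200.0  # months per century
    return (slope * per_century,
            (slope - z * se_adj) * per_century,
            (slope + z * se_adj) * per_century)
```

When the lower bound returned here is negative, a zero trend cannot be ruled out at the 95% level, which is exactly the criterion applied in the list above.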

Section 3

This section shows data about 2013 and other information in the form of a table. The table lists the six data sources along the top, with the header row repeated partway down so the sources remain visible at all times. The sources are UAH, RSS, Hadcrut4, Hadcrut3, Hadsst3, and GISS. Down the columns are the following rows:

1. 12ra: This is the final ranking for 2012 on each data set.

2. 12a: Here I give the average anomaly for 2012.

3. year: This indicates the warmest year on record so far for that particular data set. Note that two of the data sets have 2010 as the warmest year and four have 1998 as the warmest year.

4. ano: This is the average of the monthly anomalies of the warmest year just above.

5. mon: This is the month where that particular data set showed the highest anomaly. The months are identified by the first three letters of the month and the last two numbers of the year.

6. ano: This is the anomaly of the month just above.

7. y/m: This is the longest period of time where the slope is not positive given in years/months. So 16/2 means that for 16 years and 2 months the slope is essentially 0.

8. sig: This is the first month for which warming is not statistically significant according to Nick's criteria. The first three letters of the month are followed by the last two numbers of the year. (No entry is given for Hadcrut3 since Nick's program only covers Hadcrut4.)

9. Jan: This is the January, 2013, anomaly for that particular data set.

10. Feb: This is the February, 2013, anomaly for that particular data set, etc.

21. ave: This is the average anomaly of all months to date taken by adding all numbers and dividing by the number of months. However if the data set itself gives that average, I may use their number. Sometimes the number in the third decimal place differs by one, presumably due to all months not having the same number of days.

22. rnk: This is the rank that each particular data set would have if the anomaly above were to remain that way for the rest of the year. It may not, but think of it as an update 45 minutes into a game. Due to different base periods, the rank is more meaningful than the average anomaly.

Source   UAH    RSS    Had4   Had3   Sst3   GISS
1. 12ra  9th    11th   9th    10th   9th    9th
2. 12a   0.161  0.192  0.448  0.406  0.346  0.58
3. year  1998   1998   2010   1998   1998   2010
4. ano   0.419  0.55   0.547  0.548  0.416  0.67
5. mon   Apr98  Apr98  Jan07  Feb98  Jul98  Jan07
6. ano   0.66   0.857  0.829  0.756  0.526  0.94
7. y/m   8/9    16/11  12/10  16/5   12/11  12/1
8. sig   Nov95  Dec92  Aug96  —      May93  Aug97
Source   UAH    RSS    Had4   Had3   Sst3   GISS
9. Jan   0.504  0.440  0.450  0.390  0.292  0.63
10. Feb  0.175  0.194  0.479  0.424  0.309  0.51
11. Mar  0.183  0.204  0.405  0.384  0.287  0.60
12. Apr  0.103  0.218  0.427  0.400  0.364  0.48
13. May  0.077  0.139  0.498  0.472  0.382  0.57
14. Jun  0.269  0.291  0.457  0.426  0.314  0.61
15. Jul  0.118  0.222  0.514  0.488  0.479  0.54
16. Aug  0.122  0.167  0.527  0.491  0.483  0.61
17. Sep  0.297  0.257  0.534  0.516  0.455  0.74
Source   UAH    RSS    Had4   Had3   Sst3   GISS
21. ave  0.205  0.237  0.474  0.444  0.374  0.588
22. rnk  6th    8th    9th    7th    6th    9th

If you wish to verify all of the latest anomalies, go to the following sources: UAH (version 5.5 was used since that is what WFT uses), RSS, Hadcrut4, Hadcrut3, Hadsst3, and GISS.

To see all points since January 2013 in the form of a graph, see the WFT graph below:

[Graph: WoodForTrees.org – Paul Clark]

Appendix

In this section, we summarize data for each set separately.

RSS

The slope is flat since November 1996 or 16 years and 11 months. (goes to September) RSS is 203/204 or 99.5% of the way to Ben Santer’s 17 years.

For RSS: There is no statistically significant warming since December 1992: CI from -0.005 to 1.968

The RSS average anomaly so far for 2013 is 0.237. This would rank 8th if it stayed this way. 1998 was the warmest at 0.55. The highest ever monthly anomaly was in April of 1998 when it reached 0.857. The anomaly in 2012 was 0.192 and it came in 11th.

UAH

The slope is flat since January 2005 or 8 years, 9 months. (goes to September using version 5.5)

For UAH: There is no statistically significant warming since November 1995: CI from -0.001 to 2.501

The UAH average anomaly so far for 2013 is 0.205. This would rank 6th if it stayed this way. 1998 was the warmest at 0.419. The highest ever monthly anomaly was in April of 1998 when it reached 0.66. The anomaly in 2012 was 0.161 and it came in 9th.

Hadcrut4

The slope is flat since December 2000 or 12 years, 10 months. (goes to September)

For HadCRUT4: There is no statistically significant warming since August 1996: CI from -0.006 to 1.358

The Hadcrut4 average anomaly so far for 2013 is 0.474. This would rank 9th if it stayed this way. 2010 was the warmest at 0.547. The highest ever monthly anomaly was in January of 2007 when it reached 0.829. The anomaly in 2012 was 0.448 and it came in 9th.

Hadcrut3

The slope is flat since May 1997 or 16 years, 5 months (goes to September, 2013)

The Hadcrut3 average anomaly so far for 2013 is 0.444. This would rank 7th if it stayed this way. 1998 was the warmest at 0.548. The highest ever monthly anomaly was in February of 1998 when it reached 0.756. One has to go back to the 1940s to find the previous time that a Hadcrut3 record was not beaten in 10 years or less. The anomaly in 2012 was 0.406 and it came in 10th.

Hadsst3

For Hadsst3, the slope is flat since November 2000 or 12 years, 11 months. (goes to September, 2013).

For Hadsst3: There is no statistically significant warming since May 1993: CI from -0.002 to 1.768

The Hadsst3 average anomaly so far for 2013 is 0.374. This would rank 6th if it stayed this way. 1998 was the warmest at 0.416. The highest ever monthly anomaly was in July of 1998 when it reached 0.526. The anomaly in 2012 was 0.346 and it came in 9th.

GISS

The slope is flat since September 1, 2001 or 12 years, 1 month. (goes to September 30, 2013)

For GISS: There is no statistically significant warming since August 1997: CI from -0.030 to 1.326

The GISS average anomaly so far for 2013 is 0.588. This would rank 9th if it stayed this way. 2010 was the warmest at 0.67. The highest ever monthly anomaly was in January of 2007 when it reached 0.94. The anomaly in 2012 was 0.58 and it came in 9th.

Conclusion

It appears we can say accurately from what point in time the slope is zero, or any other value. Determining the period over which warming is statistically significant, however, seems to be more of a challenge: different programs give different results. What I found really surprising was that, according to Nick's program, GISS shows warming significant at the 95% level for start months from November 1996 to July 1997 inclusive, yet over those same nine start months the slope for RSS is not even positive! Can we trust both data sets?

———-

Update: Additional Explanatory Commentary from Nick Stokes

Trends and errors:

A trend coefficient is just a weighted average of a time series, which describes the rate of increase. You can calculate it without any particular statistical model in mind.

If you want to quantify the uncertainty you have about it, you need to be clear what kind of variations you have in mind. You might want to describe the uncertainty of actual measurement. You might want to quantify the spatial variability. Or you might want to say how typical that trend is given time variability. In other words, what if the weather had been different?

It’s that last variability that we’re talking about here, and we need a model for the variation. In all kinds of time series analysis, ARIMA models are a staple. No-one seriously believes that their data really is a linear trend with AR(1) fluctuations, or whatever, but you try to get the nearest fitting model to estimate the trend uncertainty.

In my trend viewer, I used AR(1). It's conventional, because it allows for autocorrelation based on a single delay coefficient, and there is a widely used approximation (Quenouille). I've described here how you can plot the autocorrelation function to show what is being fitted. The uncertainty of the trend is proportional to the area under the fitted ACF. Foster and Rahmstorf argued, reasonably, that AR(1) underfits, and an ARMA(1,1) approximation does better. Here is an example from my post. SkS uses that approach, following F&R.

You can see from the ACF that it's really more complicated. The real ACF does not taper exponentially; it oscillates, with a period of about 4 years, likely ENSO related. Some of that effect reaches back near zero lag, where the ARIMA fitting is done. If it were taken out, the peak would be more slender than AR(1). But there is uncertainty with ENSO too.

So the message is, trend uncertainty is complicated.
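As a companion to Nick's description, the empirical ACF he refers to can be computed directly from the detrended residuals. This is a generic sketch (the function name `residual_acf` is mine, and plotting is left to the reader); the area under this curve is what drives the trend uncertainty he describes:

```python
import numpy as np

def residual_acf(anoms, max_lag=60):
    """Detrend a monthly series by OLS and return the empirical
    autocorrelation function of the residuals at lags 0..max_lag.
    acf[0] is always 1; an AR(1) series shows geometric decay, while
    ENSO-like behaviour shows up as oscillation at multi-year lags."""
    x = np.arange(len(anoms), dtype=float)
    slope, intercept = np.polyfit(x, anoms, 1)
    r = anoms - (slope * x + intercept)
    r = r - r.mean()
    var = np.dot(r, r)
    return np.array([np.dot(r[:len(r) - k], r[k:]) / var
                     for k in range(max_lag + 1)])
```

Plotting this for a real anomaly series and overlaying the fitted AR(1) curve makes the underfit Nick mentions visible: the empirical curve sits above the exponential near zero lag and then oscillates instead of tapering away.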


Konrad
November 3, 2013 8:35 pm

Latitude says:
November 3, 2013 at 6:22 pm
“Face it people, they’ve been flat out lying for 15 years…and they are lying now”
—————————————————————————————————–
Tom Karl’s early 1985 paper on TOB adjustment is notable for mentioning “global warming” in the concluding remarks. The paper starts with a quite reasonable adjustment for time zones, but sneaks off into computer programs for applying TOB adjustment for changes between morning and evening surface-station readings that make no use of actual station metadata.
There is virtually no warming in US surface station records without adjustments. TOB adjustment is the single largest.
Flat out lying for 15 years? And the rest!

Werner Brozek
November 3, 2013 8:59 pm

UPDATE:
RSS for October has just come out and the value was 0.207. As a result, RSS has now reached the 204 month or 17 year mark. The slope over the last 17 years is -0.000122111 per year.

Thrasher
November 3, 2013 8:59 pm

RSS out for October at +0.207

November 3, 2013 9:03 pm

Let’s extend Nick’s analogy a bit…
I’ve been commuting to work and keeping track of my commute times for 10 years. I figure my commute time is 30 minutes, plus or minus 5 minutes. The modelers take my data and calculate that my commute time is 30.49 minutes, plus or minus 4.558 minutes. They calculate that based on the data, my commute time has been increasing by 0.014 minutes per year. Then they notice that I forgot to write down my times fairly often in the first year or two. No problem, they just ram a linear trend through the data and calculate the missing values from that. Of course if you extend the linear trend far enough back in time, at some point I apparently start arriving at work just before I leave home. Well, let’s just ignore that for now. But now we have a new problem, which is when the modelers were graphing all the data, they noticed that five years ago there was a six month period when my commute times were 45 minutes, not 30. They quickly conclude that since the city is larger now than it was then, and traffic congestion is worse, that my records from that six month period must be wrong, and so they adjust them to match the balance of the data.
I then tell the modelers that there is a new housing development going up that will add 1,000 new cars to the immediate area, and ask if they can tell me how this will impact my commute time. They run their computers ragged, and then advise me that my commute time will increase year over year 10 times as fast as it has been up until now. I’m dubious because there’s new roads and better traffic light sequences, and sure enough, my commute time drops to 26 minutes.
The modelers tell me that this is impossible, there must be something wrong with the way I’m measuring time. Time goes on and my commute time drops to 24 minutes. The modelers tell me I am nuts, my commute time is actually well over 30 minutes. Next year I average 24 minutes again, and 24 minutes the year after. The modelers tell me that it is just a pause. Any day now the awful truth will dawn on me that my commute time is actually increasing so fast that I will never get to work at all. Another year goes by…
Now you know everything you need to know about models.

James Allison
November 3, 2013 9:12 pm

When the temperature is trending up Warmists tell us with absolute certainty that CO2 causes it. When the temperature is flatlining or trending down one of em says “trend uncertainty is complicated” and another adds his support by telling us breathlessly that that is actually an understatement. LOL

Nick Stokes
November 3, 2013 9:43 pm

Konrad says: November 3, 2013 at 8:35 pm
“The paper starts with a quite reasonable adjustment for time zones, but sneaks off into computer programs for applying TOB adjustment for changes between morning and evening surface station readings that make no use of actual station metadata.”

I presume you mean this 1986 paper. The method certainly does require station data on observation times. This 2003 NOAA paper says:
“This time of observation bias is addressed in the adjusted HCN using the method described in Karl et al. [1986]. This adjustment approach requires as input an a priori knowledge of the actual observation time in each year at each HCN station.”

November 3, 2013 9:48 pm

Leonard Weinstein says:
November 3, 2013 at 2:29 pm
Due to the chaotic underlying nature of climate, and due to the fact that we are likely nearing the end of the Holocene, any variation up, down, or flat for even several decades demonstrates nothing useful. The flat or even downward trend could be a large natural downward trend that has been significantly temporarily overcome by the large human warming effect, or the variation could be totally natural dominated variation. Playing statistics games on such processes is truly just a game, with no meaning. At this point we do not know what is going on or which way the trend will go from here, and to say otherwise is hubris.
+++++++++++
Hold on just a minute.
We do know something from the data here. Guess what that is? We know that the models’ predictions and/or projections have been proven to be wrong based on their own metrics.
That you suggest they have no predictive value is true. But that is NOT the point. Again the point is that we do know that the models which predicted doom are incorrect.

RossP
November 3, 2013 9:49 pm

Then you have to take into account the data tampering that has gone on. Steven Goddard shows what NOAA did to September:
http://stevengoddard.wordpress.com/2013/11/03/noaa-data-tampering-reaches-a-tipping-point/
I assume Werner is using the adjusted data for the NOAA data set

November 3, 2013 10:00 pm

Leonard;
Scientific truth, and the search therefor, is not driving this oxcart. A claimed connection between a so-called “forcing” increase in CO2 and global temperature has been asserted and challenged. Decisions of vast and fatal import have been taken and demanded based on the assertion. Would you say the assertion or the challenge has more to back it at this time?

RossP
November 3, 2013 10:26 pm

I should have added the following link to my comment on data tampering above. Again from Steve Goddard
http://stevengoddard.wordpress.com/2013/11/02/latest-nasa-global-data-tampering/

November 3, 2013 10:31 pm

RossP says:
November 3, 2013 at 9:49 pm
I assume werner is using the adjusted data for the NOAA data set
I actually do not use NOAA since it is not on WFT. However, I use GISS, and the September GISS anomaly, 0.74, was at a record high for September, shared with 2005.

Magicjava
November 3, 2013 11:33 pm

We all know that CO2, other items being constant, will reflect a certain band of radiation back to earth. The results should be an increase in temperature.
————
I don’t think we all know that. I don’t think it’s ever been demonstrated that adding *any* kind of greenhouse gas to the atmosphere causes the temperature to increase.
In fact, if you removed the most powerful greenhouse gas, water vapor, from the atmosphere, the temperature wouldn’t go down, it would go *up*! A lot. By about 20 degrees or so, if I remember correctly. This is because water vapor creates clouds, which have an overall powerful cooling effect on the planet.
AGW depends on water vapor to get the heating it claims. But the facts are that water vapor has the exact opposite effect: it cools the planet.

David A
November 3, 2013 11:38 pm

Yes, and record adjustments and differential from RSS.

November 3, 2013 11:54 pm

The Global Warming Industry costs us almost $1.0 billion per day.
Apart from filling the pockets of government and quasi-government bureaucrats and ‘scientists’, is there anything else tangible this incredible waste of money achieves?
Observations, when not routinely manipulated by the data gatekeepers, continue to make a mockery of the models.
The recent mild warming of the past 150 years is no more than a typical natural climate cycle, which this planet has witnessed millions of times before. Man has probably affected this latest natural climate cycle in a very minor way, but we are totally unable to quantify this amount and do not know the weighting (IPCC guesses can be safely ignored) to apply to CO2, soot, aerosols, irrigation, farming etc.

King of Cool
November 4, 2013 12:22 am

Don’t know about hiding in the deep ocean, I reckon the missing heat went underground and has appeared through a vent in the centre of Australia.
Still 2 months to go but we are having a warm one:
http://www.bom.gov.au/climate/updates/articles/a003-2013-temperature.shtml
Good news for the alarmists and tax loving politicians who are now vigorously promoting an emissions trading scheme. Doesn’t matter whether Central Europe has had 5 consecutive winters colder than normal or that the Antarctic re-supply ship Aurora Australis is icebound in record sea ice 10 days behind schedule and still 177 nm from her destination.
You will not hear about this on our ABC. All you will hear about is bush fires and heat waves. Yep, global warming for many is all about the cherries in your own back yard.
Last year was looking dicey for the demon CAGW but now we will need at least another 2-3 years of a line that looks more like a sine curve than a straight line on the temperature anomaly chart before it will be endangered here.

RossP
November 4, 2013 12:25 am

Sorry Werner @10.31, my mistake . I got the two links mixed up.

thisisnotgoodtogo
November 4, 2013 1:07 am

How much CO2 does it take to support the AGW industry?

clovis man
November 4, 2013 1:23 am

Are we allowed to call the late 20th century a ‘blip’ yet?

SandyInLimousin
November 4, 2013 1:39 am

Re Commute to work
When a recession hits, thanks to politicians and bankers, the commute time goes down as does fuel usage and the model can change. For instance you can have a ten minute water cooler chat about the weather. Then there are the seasonal adjustments for school holidays when, in the UK at least, the volume of traffic drops by about 15% and a shorter journey time is the result.
It’s a great analogy.

Disko Troop
November 4, 2013 1:47 am

Take away their computers and make them use slide rules and all this silliness would end.

Scottish Sceptic
November 4, 2013 1:52 am

Thanks Anthony, William and the rest of the crew – as a result of reading one of these papers, I now have the answer and know why these climate models fail. I know why we get these predictions that don’t work. This will take time to put together, but the thanks can come sooner.

Konrad
November 4, 2013 2:21 am

Nick Stokes says:
November 3, 2013 at 9:43 pm
—————————————————————————————
Yes Nick, that would be the one, written 1985, accepted 1986. Tom Karl’s pet rat TOBy, chewing on raw surface station data since 1986 😉
From the abstract-
“A self-contained computer program has been developed which allows a user to estimate the time of observation bias anywhere in the contiguous United States without the costly exercise of accessing 24-hourly observations at first order stations.”
From the conclusion –
“…or temporal analysis of climate, especially climate change…”
OCR is a wonderful thing, is it not? Although I suspect a certain Dr. Pierrehumbert, writer of the sacred texts and player of the devil’s instrument, wishes it had never been invented.
Nick, the real problem is not that adding radiative gases to the atmosphere will not reduce the atmosphere’s radiative cooling ability, nor that radiative gases act to cool our atmosphere at all concentrations above 0.0 ppm. The real problem is that some of those promoting and profiting from AGW knew this before 1995, and, this being the age of the Internet, they have left a trail of “climate science” behind them that would fertilise the Simpson Desert.

Greg Goodman
November 4, 2013 2:29 am

Good informative post, but the way you are representing CO2 is totally misleading. You are deliberately scaling and offsetting so that the slope fills the graph range. You could do that with absolutely anything that has a positive trend, and it would not show a damn thing about its relation to surface temps.
You have chosen a particularly useless web tool to show such a comparison because you can do nothing apart from detrending and running means on the data. (Duh!)
Probably the best you can do is scale CO2 to 400 ppm full range. This is enough to make the point that it has been steadily rising despite the ‘pause’.
http://www.woodfortrees.org/plot/esrl-co2/from:1975/scale:0.0025/offset/plot/rss/plot/rss/from:1997/trend
Since CO2 “forcing” is supposed to be a log relationship, and then you need to consider how to scale it (i.e. climate sensitivity, which is the big question), this is still not right, but at least you are showing the actual quantity of CO2, not rigging it to make your point.