Has Global Warming Stalled?

Guest Post By Werner Brozek, Edited By Just The Facts

In order to answer the question in the title, we need to know what time period is a reasonable period to take into consideration. As well, we need to know exactly what we mean by “stalled”. For example, do we mean that the slope of the temperature-time graph must be 0 in order to be able to claim that global warming has stalled? Or do we mean that we have to be at least 95% certain that there indeed has been warming over a given period?

With regards to what a suitable time period is, NOAA says the following:

“The simulations rule out (at the 95% level) zero trends for intervals of 15 yr or more, suggesting that an observed absence of warming of this duration is needed to create a discrepancy with the expected present-day warming rate.”

To verify this for yourself, see page 23 of this NOAA Climate Assessment.

Below we present you with just the facts and then you can assess whether or not global warming has stalled in a significant manner. The information will be presented in three sections and an appendix. The first section will show for how long there has been no warming on several data sets. The second section will show for how long there has been no significant warming on several data sets. The third section will show how 2012 ended up in comparison to other years. The appendix will illustrate sections 1 and 2 in a different way. Graphs and tables will be used to illustrate the data.

Section 1

This analysis uses the latest month for which data is available on WoodForTrees.org (WFT). (If any data is updated after this report is sent off, I will note the updates in the comments for this post.) All of the data on WFT is also available at the specific sources as outlined below. We start with the present date and go back to the furthest month in the past where the slope is at least slightly negative. So if the slope from September is 4 x 10^-4 but it is -4 x 10^-4 from October, we give the time from October, so no one can accuse us of being less than honest when we say the slope is flat from a certain month.
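The procedure just described can be sketched in code: walk back from the present and find the earliest start month from which the least-squares slope is still at least slightly negative. This is an illustrative sketch only; the `anomalies` values below are invented for the example, not real satellite or surface data.

```python
def slope(values):
    """Ordinary least-squares slope of values against their index."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def flat_since(anomalies):
    """Earliest index i such that slope(anomalies[i:]) is non-positive,
    or None if no suffix of the series has a non-positive slope."""
    for i in range(len(anomalies) - 2):
        if slope(anomalies[i:]) <= 0:
            return i  # earliest (furthest back) qualifying start month
    return None

# Invented monthly anomalies: warming early on, flat-to-cooling later.
anomalies = [0.1, 0.15, 0.2, 0.3, 0.35, 0.4, 0.42, 0.41, 0.40, 0.39, 0.38, 0.40]
print(flat_since(anomalies))  # 5: the slope is non-positive from index 5 onward
```

With real monthly data loaded in place of the invented list, the returned index would translate directly into the "flat since month X" statements in the list below.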

On all data sets below, the different times for a slope that is at least very slightly negative range from 8 years and 3 months to 16 years and 1 month:

1. For GISS, the slope is flat since May 2001 or 11 years, 7 months. (goes to November)

2. For Hadcrut3, the slope is flat since May 1997 or 15 years, 7 months. (goes to November)

3. For a combination of GISS, Hadcrut3, UAH and RSS, the slope is flat since December 2000 or an even 12 years. (goes to November)

4. For Hadcrut4, the slope is flat since November 2000 or 12 years, 2 months. (goes to December.)

5. For Hadsst2, the slope is flat since March 1997 or 15 years, 10 months. (goes to December)

6. For UAH, the slope is flat since October 2004 or 8 years, 3 months. (goes to December)

7. For RSS, the slope is flat since January 1997 or 16 years and 1 month. (goes to January) RSS is 193/204 or 94.6% of the way to Ben Santer’s 17 years.
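The arithmetic behind item 7 is simple month counting against Ben Santer's 17-year (204-month) benchmark. A quick sketch:

```python
# Months from January 1997 through January 2013 inclusive, compared
# against a 17-year (204-month) benchmark.
def months_inclusive(y1, m1, y2, m2):
    return (y2 - y1) * 12 + (m2 - m1) + 1

flat_months = months_inclusive(1997, 1, 2013, 1)
print(flat_months)                              # 193, matching 193/204 above
print(round(100 * flat_months / (17 * 12), 1))  # 94.6 percent of 204 months
```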

The following graph, also used as the header for this article, shows just the lines to illustrate the above. Think of it as a sideways bar graph where the lengths of the lines indicate the relative times where the slope is 0. In addition, the sloped wiggly line shows how CO2 has increased over this period:

The next graph shows the above, but this time, the actual plotted points are shown along with the slope lines and the CO2 is omitted:

WoodForTrees.org – Paul Clark – Click the pic to view at source

Section 2

For this analysis, data was retrieved from WoodForTrees.org and the ironically named SkepticalScience.com. This analysis indicates how long there has not been significant warming at the 95% level on various data sets. The first number in each case was sourced from WFT; however, the second +/- number was taken from SkepticalScience.com.

For RSS the warming is not significant for over 23 years.

For RSS: +0.127 +/-0.136 C/decade at the two sigma level from 1990

For UAH, the warming is not significant for over 19 years.

For UAH: 0.143 +/- 0.173 C/decade at the two sigma level from 1994

For Hadcrut3, the warming is not significant for over 19 years.

For Hadcrut3: 0.098 +/- 0.113 C/decade at the two sigma level from 1994

For Hadcrut4, the warming is not significant for over 18 years.

For Hadcrut4: 0.095 +/- 0.111 C/decade at the two sigma level from 1995

For GISS, the warming is not significant for over 17 years.

For GISS: 0.116 +/- 0.122 C/decade at the two sigma level from 1996

If you want to know the times to the nearest month that the warming is not significant for each set, they are as follows: RSS since September 1989; UAH since April 1993; Hadcrut3 since September 1993; Hadcrut4 since August 1994; GISS since October 1995 and NOAA since June 1994.
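The significance test used throughout this section reduces to checking whether the 2-sigma interval around each trend includes zero. A minimal sketch using the values quoted above:

```python
# A trend is "statistically significant at the 95% level" here when the
# 2-sigma interval excludes zero, i.e. when trend - error > 0.
# Values are the ones quoted in Section 2 (C/decade, 2-sigma half-width).
datasets = {
    "RSS":      (0.127, 0.136),
    "UAH":      (0.143, 0.173),
    "Hadcrut3": (0.098, 0.113),
    "Hadcrut4": (0.095, 0.111),
    "GISS":     (0.116, 0.122),
}

for name, (trend, err) in datasets.items():
    lower, upper = trend - err, trend + err
    verdict = "significant" if lower > 0 else "not significant"
    print(f"{name}: [{lower:+.3f}, {upper:+.3f}] C/decade -> {verdict}")
```

Every lower bound above is at or below zero, which is exactly why each data set is reported as "not significant" for the stated period.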

Section 3

This section shows data about 2012 in the form of tables. Each table shows the six data sources along the left, namely UAH, RSS, Hadcrut4, Hadcrut3, Hadsst2, and GISS. Along the top, are the following:

1. 2012. Below this, I indicate the present rank for 2012 on each data set.

2. Anom 1. Here I give the average anomaly for 2012.

3. Warm. This indicates the warmest year on record so far for that particular data set. Note that two of the data sets have 2010 as the warmest year and four have 1998 as the warmest year.

4. Anom 2. This is the average anomaly of the warmest year just to its left.

5. Month. This is the month where that particular data set showed the highest anomaly. The months are identified by the first two letters of the month and the last two numbers of the year.

6. Anom 3. This is the anomaly of the month immediately to the left.

7. 11ano. This is the average anomaly for the year 2011. (GISS and UAH were 10th warmest in 2011. All others were 13th warmest for 2011.)

Anomalies for different years:

Source  2012  Anom 1  Warm  Anom 2  Month  Anom 3  11ano
UAH     9th   0.161   1998  0.419   Ap98   0.66    0.130
RSS     11th  0.192   1998  0.55    Ap98   0.857   0.147
Had4    10th  0.436   2010  0.54    Ja07   0.818   0.399
Had3    10th  0.403   1998  0.548   Fe98   0.756   0.340
sst2    8th   0.342   1998  0.451   Au98   0.555   0.273
GISS    9th   0.56    2010  0.66    Ja07   0.93    0.54

If you wish to verify all rankings, go to the following:

For UAH, see here, for RSS see here and for Hadcrut4, see here. Note the number opposite the 2012 at the bottom. Then going up to 1998, you will find that there are 9 numbers above this number. That confirms that 2012 is in 10th place. (By the way, 2001 came in at 0.433 or only 0.001 less than 0.434 for 2012, so statistically, you could say these two years are tied.)

For Hadcrut3, see here. You have to do something similar to Hadcrut4, but look at the numbers at the far right. One has to go back to the 1940s to find the previous time that a Hadcrut3 record was not beaten in 10 years or less.

For Hadsst2, see here. View as for Hadcrut3. It came in 8th place with an average anomaly of 0.342, narrowly beating 2006 by 2/1000 of a degree as that came in at 0.340. In my ranking, I did not consider error bars; however, 2006 and 2012 would statistically be a tie for all intents and purposes.

For GISS, see here. Check the J-D (January to December) average and then check to see how often that number is exceeded back to 1998.
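The manual checks described above all amount to ranking annual-mean anomalies from warmest to coldest. A sketch with invented annual values (not the real record):

```python
# Ranking a year within a table of annual-mean anomalies.
# The values here are made up purely for illustration.
annual = {1998: 0.548, 2005: 0.47, 2010: 0.50, 2012: 0.403, 2011: 0.34}

def rank_of(year, table):
    """1-based rank of `year` when years are sorted warmest-first."""
    ordered = sorted(table, key=table.get, reverse=True)
    return ordered.index(year) + 1

print(rank_of(2012, annual))  # 4 in this invented table
```

Counting "how many numbers are above this number", as the verification instructions put it, is exactly this sort: the rank is one plus the count of warmer years.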

For the next two tables, we again have the same six data sets, but this time the anomaly for each month is shown. [The table is split in half to fit; if you know how to compress it so a full year fits on one line, please let us know in the comments.] The last column has the average of all points to the left.

Source   Jan     Feb     Mar     Apr     May     Jun
UAH     -0.134  -0.135   0.051   0.232   0.179   0.235
RSS     -0.060  -0.123   0.071   0.330   0.231   0.337
Had4     0.288   0.208   0.339   0.525   0.531   0.506
Had3     0.206   0.186   0.290   0.499   0.483   0.482
sst2     0.203   0.230   0.241   0.292   0.339   0.352
GISS     0.36    0.39    0.49    0.60    0.70    0.59

Source   Jul     Aug     Sep     Oct     Nov     Dec     Avg
UAH      0.130   0.208   0.339   0.333   0.282   0.202   0.161
RSS      0.290   0.254   0.383   0.294   0.195   0.101   0.192
Had4     0.470   0.532   0.515   0.527   0.518   0.269   0.434
Had3     0.445   0.513   0.514   0.499   0.482   0.233   0.403
sst2     0.385   0.440   0.449   0.432   0.399   0.342   0.342
GISS     0.51    0.57    0.66    0.70    0.68    0.44    0.56

To see the above in the form of a graph, see the WFT graph below:

Appendix

In this part, we are summarizing data for each set separately.

RSS

The slope is flat since January 1997 or 16 years and 1 month. (goes to January) RSS is 193/204 or 94.6% of the way to Ben Santer’s 17 years.

For RSS the warming is not significant for over 23 years.

For RSS: +0.127 +/-0.136 C/decade at the two sigma level from 1990.

For RSS, the average anomaly for 2012 is 0.192. This would rank 11th. 1998 was the warmest at 0.55. The highest ever monthly anomaly was in April of 1998 when it reached 0.857. The anomaly in 2011 was 0.147 and it will come in 13th.

Following are two graphs via WFT. Both show all plotted points for RSS since 1990. Then two lines are shown on the first graph. The first upward sloping line is the line from where warming is not significant at the 95% confidence level. The second straight line shows the point from where the slope is flat.

The second graph shows the above, but in addition, there are two extra lines. These show the upper and lower lines for the 95% confidence limits. Note that the lower line is almost horizontal but slopes slightly downward. This indicates that there is slightly more than a 5% chance that cooling has occurred since 1990 according to RSS, per graph 1 and graph 2.

UAH

The slope is flat since October 2004 or 8 years, 3 months. (goes to December)

For UAH, the warming is not significant for over 19 years.

For UAH: 0.143 +/- 0.173 C/decade at the two sigma level from 1994

For UAH the average anomaly for 2012 is 0.161. This would rank 9th. 1998 was the warmest at 0.419. The highest ever monthly anomaly was in April of 1998 when it reached 0.66. The anomaly in 2011 was 0.130 and it will come in 10th.

Following are two graphs via WFT. Everything is identical as with RSS except the lines apply to UAH. Graph 1 and graph 2.

Hadcrut4

The slope is flat since November 2000 or 12 years, 2 months. (goes to December.)

For Hadcrut4, the warming is not significant for over 18 years.

For Hadcrut4: 0.095 +/- 0.111 C/decade at the two sigma level from 1995

With Hadcrut4, the anomaly for 2012 is 0.436. This would rank 10th. 2010 was the warmest at 0.54. The highest ever monthly anomaly was in January of 2007 when it reached 0.818. The anomaly in 2011 was 0.399 and it will come in 13th.

Following are two graphs via WFT. Everything is identical as with RSS except the lines apply to Hadcrut4. Graph 1 and graph 2.

Hadcrut3

The slope is flat since May 1997 or 15 years, 7 months (goes to November)

For Hadcrut3, the warming is not significant for over 19 years.

For Hadcrut3: 0.098 +/- 0.113 C/decade at the two sigma level from 1994

With Hadcrut3, the anomaly for 2012 is 0.403. This would rank 10th. 1998 was the warmest at 0.548. The highest ever monthly anomaly was in February of 1998 when it reached 0.756. One has to go back to the 1940s to find the previous time that a Hadcrut3 record was not beaten in 10 years or less. The anomaly in 2011 was 0.340 and it will come in 13th.

Following are two graphs via WFT. Everything is identical as with RSS except the lines apply to Hadcrut3. Graph 1 and graph 2.

Hadsst2

The slope is flat since March 1997 or 15 years, 10 months. (goes to December)

The Hadsst2 anomaly for 2012 is 0.342. This would rank 8th. 1998 was the warmest at 0.451. The highest ever monthly anomaly was in August of 1998 when it reached 0.555. The anomaly in 2011 was 0.273 and it will come in 13th.

Sorry! The only graph available for Hadsst2 is this.

GISS

The slope is flat since May 2001 or 11 years, 7 months. (goes to November)

For GISS, the warming is not significant for over 17 years.

For GISS: 0.116 +/- 0.122 C/decade at the two sigma level from 1996

The GISS anomaly for 2012 is 0.56. This would rank 9th. 2010 was the warmest at 0.66. The highest ever monthly anomaly was in January of 2007 when it reached 0.93. The anomaly in 2011 was 0.54 and it will come in 10th.

Following are two graphs via WFT. Everything is identical as with RSS except the lines apply to GISS. Graph 1 and graph 2.

Conclusion

Above, various facts have been presented along with the sources from which all facts were obtained. Keep in mind that no one is entitled to their own facts. It is only in the interpretation of the facts that legitimate discussions can take place. After looking at the above facts, do you think that we should spend billions to prevent catastrophic warming? Or do you think that we should take a “wait and see” attitude for a few years to be sure that future warming will be as catastrophic as some claim it will be? Keep in mind that even the MET office felt the need to revise its forecasts. Look at the following and keep in mind that the MET office believes that the 1998 mark will be beaten by 2017. Do you agree?

WoodForTrees.org – Paul Clark – Click the pic to view at source
185 Comments
Tom Jones
February 10, 2013 6:39 pm

On Feb 10 at 2:16 pm, Steven Mosher says:
“We are quite sure that the underlying process is not linear. The imposition of a linear model is an analyst choice.”
I am quite sure that nobody has any idea what the underlying process is. It has been implied frequently that the underlying process is linear, but I have never seen anybody step up to the plate and say that the underlying process is one thing or another. If someone was willing to do that, one could at least devise an experiment to confirm or falsify that hypothesis. It seems pretty intuitive that the process is not linear, but the notion that it is up to the experimentalist to prove that, boggles my mind.
The business of taking the average of an ensemble of simulations is not even vaguely convincing. The best case is that one is correct and the others are not, but we don’t know which one is correct, do we? Another possibility is that none is correct. Given that all of the GCM’s use the basic notion that GHG concentration is the prime driver, and that CO2 is very, very important, they could all have the same fallacy built in. Maybe that’s not true, you know.

SkepticGoneWild
February 10, 2013 6:42 pm

For us (or maybe just me) statistics dummies, could you briefly explain what “2 sigma” means, and how it relates to the term “statistically significant”. I went to the SkS website and played with their trend calculator, and read some of the definitions. For example if a linear trend is 0.003 +/- 0.003, then the slope of the trend is between 0.000 and 0.006. But what does the 2 sigma mean, and how do you determine statistical significance? And what does that +/- range mean?
Thanks
[Reply: Sigma is shorthand for ‘standard deviation’ (that probably doesn’t explain much if you are not familiar with statistics already). The more standard deviations you have, the further you are from average. The +/- means how much the value is likely to vary. So saying I have an 8 ounce cup of tea +/- 1 means I might have anywhere from a 7 ounce to a 9 ounce cup of tea. Statistical significance takes more than a note to explain, but mostly says how much you can trust a conclusion. If I have a sample of 6 cups of tea and they average 8 ounces, it might be that the next one is 9 or 10. If I have a sample of 1200 cups, and they are 8 ounces +/- 0.1 ounce, it is highly likely (very statistically significant) that my next cup will be between 7.9 and 8.1 ounces. I can’t say that about my first sample as it is too small to be ‘statistically significant’. Back on that trend and +/-: When a trend has a value the same as the error range, it means you don’t know much about the trend. Having $2 +/- $2 says I might have money, or not, in equal probability… -ModE]

Ian H
February 10, 2013 6:51 pm

Not too happy about the way you put CO2 levels on the combined graph. What is the vertical axis in this picture? A temperature anomaly? Where is the axis for the CO2 levels? It clearly doesn’t start at zero. A wiggly line going up? I guess CO2 increased. But without numbers attached this is pretty meaningless. With no axis to pin this down you could scale it all over the page. So what does that wiggly line actually mean. Without an axis to give it context, not much.

February 10, 2013 7:36 pm

Tom Jones says:
February 10, 2013 at 6:39 pm
The business of taking the average of an ensemble of simulations is not even vaguely convincing.
========
It is a nonsense. The earth has a near infinite number of possible futures. This is clearly shown by quantum mechanics. Some of those futures are more probable than others, but it is beyond our ability to calculate the probabilities.
Taking the average of 10 wrong answers does not improve the quality of the answer. If you ask 10 idiots the square root of a 100, will the average be closer to the correct answer than any one of the answers? No, it is 10 times more likely that one of the idiots will be closer to the true answer than is the average.
The reason for this is simple. The idiots have 10 chances to get closer to the correct answer. The average only has 1. Thus you would expect that if you repeated the exercise 11 times, only once would the average be closer. The other 10 times one of the idiots would beat the average.

pottereaton
February 10, 2013 7:49 pm

Can you imagine the panic and pandemonium if temperatures start trending downward?
For a variety of reasons, nothing would please me more.

ferd berple
February 10, 2013 7:59 pm

SkepticGoneWild says:
February 10, 2013 at 6:42 pm
[Reply: Sigma is shorthand for ‘standard deviation’
=========
To complicate things further, most calculations of ‘standard deviation’ rely on the assumption that the underlying data is a “normal distribution”. Most of our statistical theory relies on this assumption.
However, it has been found that most time-series data (such as stock markets and climate) does not follow a normal distribution. This means that our confidence levels are often incorrect about how “likely” something is to happen. This leads people to bet the farm on a “sure thing” in the stock market, and is very similar to the current situation where politicians, scientists and countries are betting their economic future on climate.
We assume that the future is fairly certain because we expect the future to behave like the past. However, this is an entirely naive belief because we already know the past while we do not know the future. This leads us to significantly underestimate the size of the unknown when we try and consider all the things that might happen going forward.
Thus, the plans of mice and men aft go awry.

Werner Brozek
February 10, 2013 8:33 pm

John Finn says:
February 10, 2013 at 4:43 pm
I’ve got a bit of a problem with the representation of CO2 (compared to linear trends) in the first graph.
Ian H says:
February 10, 2013 at 6:51 pm
What is the vertical axis in this picture?
I would like to thank you both for raising an excellent point. The way WFT works, you can plot several different things at the same time. Naturally the y axis is different for each case. If I were to just plot CO2 alone, it would look as follows and the y axis would go from about 360 ppm to about 397 ppm CO2. See:
http://www.woodfortrees.org/plot/esrl-co2/from:1997/to:2013
When two things are plotted as I have done, the left only shows a temperature anomaly. It happens to go from -0.2 to +0.8 C. I did not plan it this way, but as it turns out, a change of 1.0 C over 16 years is about 6.0 C over 100 years. And 6.0 C is what some say may be the temperature increase by 2100. See:
http://www.independent.co.uk/environment/climate-change/temperatures-may-rise-6c-by-2100-says-study-8281272.html
“The world is destined for dangerous climate change this century – with global temperatures possibly rising by as much as 6C – because of the failure of governments to find alternatives to fossil fuels, a report by a group of economists has concluded.“
So for this to be the case, the slope for three of the data sets would have to be as steep as the CO2 slope. Hopefully the graphs show that this is totally untenable.

Chad Wozniak
February 10, 2013 8:34 pm

Interesting that even with all the obvious fudging of numbers still going on, the alarmies can’t do any better than to say the temperature curve is “flat.” Any statement by them that it is flat infallibly means it is trending downward, and that cooling, not merely a pause in warming, has been occurring for the last 15 years.
A correction to one poster’s comment: There have never been any years anywhere near as warm as the 1934-1938 period since that time, NOAA and IPCC lies to the contrary notwithstanding – not even 1953, a relatively hot one, and certainly not 1997-1998. There has in fact been an overall cooling trend, as demonstrated by the negative slope of the regression line for the past 80 years. Cyclical variations in the meantime have not affected the overall downward trend over the longer term. And interestingly again, this long-term cooling has taken place while CO2 in the atmosphere increased by nearly 40 percent. Therefore, no CO2-caused, hence no man-caused, global warming. Q.E.D.
Second proof: Inter alia, animal respiration alone pours at least 30, maybe as much as 75 times as much CO2 into the air as human activity. And then, there is ordinarily 30 to 140 times as much water vapor – a substantially more effective heat-trapping substance than there is CO2, in the atmosphere at any given time. And these are only a couple of many examples that can be cited on both sides of the CO2 equation. Man is an infinitesimal of an infinitesimal – man’s role in climate change is therefore, mathematically speaking, one over infinity squared. Q.E.D. again.
The hubris and effrontery of people who think they can control climate – especially by controlling only one tiny fraction of one tiny factor in climate change – is truly breathtaking. Back off, God/Mother Nature (take your pick of which), we’re in charge now, they’re saying.

SkepticGoneWild
February 10, 2013 8:34 pm

Ok. Thanks moderator and people. Things are a little bit clearer. Say for example the RSS data you posted:
“For RSS the warming is not significant for over 23 years.
For RSS: +0.127 +/-0.136 C/decade at the two sigma level from 1990.”
Could you run through how you determined it was 23 years for the above? Is it because the minus value of 0.136 gives one a negative trend of -0.009 C/decade (0.127 – 0.136)? So you are going back as far as you can to obtain a zero or slightly negative slope?

Richard of NZ
February 10, 2013 8:52 pm

ferd berple says:
February 10, 2013 at 7:59 pm
“Thus, the plans of mice and men aft go awry.”
I think I prefer the original
“The best laid schemes of mice and men gang aft agley
An’ lea’e us nought but grief an’ pain,
For promised joy.”
It encapsulates the entire CAGW proposition from “scheme” with its connotation of dishonesty, to the obfuscation of “gang aft agley” which needs an expert to interpret, to the promise of an idealised future for a little inconvenience now.
Perhaps Mr. Burns was the true visionary.

Phil.
February 10, 2013 8:57 pm

Mod E you misunderstand statistical significance. When you say you have $2 +/- $2 it does not mean what you claim. If you’re using 1-sigma bands then you’re saying there’s a 68% chance that you have between $0 and $4 and a 16% chance that there’s less than $0. If you’re using the more likely 2-sigma limits then you’re saying there’s a 95% chance of $0 to $4 and a 2.5% chance of less than $0.
Reply: OK, I mis-spoke. I was picturing a $2 bill that I either have, or do not. A discrete event. Ought to have said “Have $4 or No $ in equal probability”. Not so much a nonunderstanding of statistics as trying to make a simple example for a newbie to have a vague idea what’s going on. So I said “have money” when I ought to have said “have $4”. Please, feel free to actually answer the posters question instead of just nit-picking. Oh, and remember to keep it at a level that requires no prior understanding of statistics. Oh, and make NO simplifying statements for illustration that might in any way be at variance with hard core statistical truth… But make sure it is absolutely clear. And do it in less than 3 sentences. -ModE]
Reply 2: No need to be snippy ModE. Phil. was making a valid point. ~uberMod

February 10, 2013 9:04 pm

Wow, this may be the final nail. The U.N. conspiracy has stalled temperature rise to really throw off those who don’t see the conspiracy.

Werner Brozek
February 10, 2013 9:12 pm

SkepticGoneWild says:
February 10, 2013 at 6:42 pm
could you briefly explain what “2 sigma” means
To add to what the moderator said, I would like to apply it to the numbers for RSS since 1990, namely: For RSS: +0.127 +/-0.136 C/decade at the two sigma level from 1990. See this illustrated on this graphic from the post:
http://www.woodfortrees.org/plot/rss/from:1990/plot/rss/from:1990/trend/plot/rss/from:1997/plot/rss/from:1990/plot/rss/from:1990/trend/plot/rss/from:1990/detrend:0.3128/trend/plot/rss/from:1990/detrend:-0.3128/trend/plot/rss/from:1997/trend
Now 0.127 – 0.136 is -0.009; and 0.127 + 0.136 is 0.263. What this is saying is that since 1990, on RSS, we can be 95% certain that the rate of change of temperature is between -0.009 and +0.263 C/decade. So while we may be 93% certain that it has warmed since 1990, we cannot be 95% certain. For some reason, in climate circles, 95% certainty is required in order for something to be considered statistically significant.
Now having said all this, RSS shows no change for 16 years. This means there is a 50% chance that it cooled over the last 16 years and a 50% chance that it warmed over the last 16 years. Since this exceeds NOAA’s 15 years, their climate models are in big trouble.
So you are going back as far as you can to obtain a zero or slightly negative slope?
Yes, but that was only to the nearest whole year. If you go to:
http://www.skepticalscience.com/trend.php
And if you then start from 1989.67, you would find:
“0.131 ±0.132 °C/decade (2σ)”
In other words, the warming is not significant at 95% since September 1989. However this is just barely true and it only goes until October. If it went to January with the huge jump, it could change.
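The "93% certain" figure in the comment above can be reproduced by treating the trend estimate as normally distributed (an assumption) and converting the trend-to-sigma ratio into a two-sided confidence:

```python
# With a trend of +0.127 C/decade and a 2-sigma half-width of 0.136,
# one sigma is 0.068. The two-sided confidence that the trend differs
# from zero is 1 - 2*(1 - Phi(z)), where z = trend / sigma and Phi is
# the standard normal CDF (built here from math.erf).
from math import erf, sqrt

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

trend, two_sigma = 0.127, 0.136
sigma = two_sigma / 2
z = trend / sigma                        # about 1.87
confidence = 1 - 2 * (1 - phi(z))
print(round(100 * confidence, 1))        # about 93.8: just below the 95% bar
```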

E.M.Smith
Editor
February 10, 2013 9:25 pm

@Ferd Berple:
I touch on that here:
http://chiefio.wordpress.com/2012/12/10/do-temperatures-have-a-mean/
Oh, and as temperature is an intrinsic property, it is not legitimate to average temperatures anyway (you need to convert to heat using mass and specific heat / enthalpy terms)
http://chiefio.wordpress.com/2011/07/01/intrinsic-extrinsic-intensive-extensive/
So the whole “Global Average Temperature” idea is broken in two very different ways, either one of which makes the whole idea bogus… But nobody wants to hear that, warmers or skeptics, as then they have to admit all their arguments over average temperature are vacuous…

SkepticGoneWild
February 10, 2013 10:07 pm

Werner Brozek says:
February 10, 2013 at 9:12 pm
Now 0.127 – 0.136 is -0.009; and 0.127 + 0.136 is 0.263. What this is saying is that since 1990, on RSS, we can be 95% certain that the rate of change of temperature is between -0.009 and +0.263/decade. So while we may be 93% certain that it has warmed since 1990, we cannot be 95% certain.
Werner, thanks for your patient explanation. I understood most of what you stated, but I am somewhat confused by your statement in bold above. So according to the above data, the warming is not significant at 95%. But say the RSS data happened to be 0.127 +/- 0.127 C/decade at 2 sigma, you could then state that you are 95% certain that it warmed? But since the above real data had a negative value (-0.009), you could not be 95% certain it warmed?

Climate Control Central Command (Modelling Secretariat)
February 10, 2013 10:34 pm

Dear Supporter
The General Secretary of the CCCC has asked me to write directly to you to inform you of an important change in the way in which our operations will henceforth be conducted.
Following on from the poor publicity and consequent Denier opportunities presented by the mishandling of the recent temperature data and trends by our ‘colleagues’ in the D&O Secretariat, they have all accepted compulsory early retirement.
We have taken this opportunity to remove D&O from our strategic portfolio of work and so the Modelling Secretariat will now have complete control of all ‘data’ released for any form of publication. To boost our capabilities in this area we hope soon to acquire the services (on secondment) of a valued team of experienced climatalarmists currently located in Norwich UK.
Please welcome them to our happy bunch of crusaders.
I urge you all to redouble your efforts to ensure that only Modelling Secretariat approved data is ever released. There is only one True Message and we must ensure that there are no further misunderstandings because of a lack of zeal and enthusiasm for the cause.
Though regrettable, we also need to introduce severe disciplinary sanctions for any deviations from the path. To emphasise this point the previous leader of D&O will be making a public confession of his errors live on webcam tomorrow at noon. The memorial service will begin at 14:00.
The General Secretary also wishes to convey his best wishes to you all. His well-earned retirement begins forthwith – at an undisclosed location. Rest assured GS, that we will be able to find you so that your past ‘achievements’ can be put in their proper context and the judgement of history applied!
Of course, our honoured Emeritus Professor – who has recently added a further honour to his distinguished record – will continue to use Twitter to remind you of the scientific facts you need to use in your day-to-day work
Best Wishes to You All
LA
[/sarcasm ? Mod]

Climate Control Central Command (Modelling Secretariat)
February 10, 2013 10:55 pm

@mod
See the earlier message from the Climate Control Central Command (‘Data’ and ‘Observations’ Secretariat)
http://wattsupwiththat.com/2013/02/10/has-global-warming-stalled/#comment-1221832
All should become clear.
LA

Sandor
February 10, 2013 11:07 pm

Did anyone calculate the correlation between CO2 levels and temperature? Would be interesting to see.

Werner Brozek
February 10, 2013 11:43 pm

SkepticGoneWild says:
February 10, 2013 at 10:07 pm
But say the RSS data happened to be o.127+/- 0.127 C/decade at 2 sigma, you could then state that you are 95% certain that it warmed?
You are 95% certain the real slope is between 0.000 and 0.254. So I guess there would be a 2.5% chance the slope is above 0.254 and a 2.5% chance the slope is less than 0. So you would be 97.5% certain there was warming in this case.
But since the above real data had a negative value ( -0.009), you could not be 95% certain it warmed?
I was tempted to say yes that “you could not be 95% certain it warmed”, but I think I may have to ask for help here. In view of the 97.5% that I mentioned above, I am starting to wonder if there is a very small negative value that would allow you to say that you can be sure it warmed at a 95% confidence level. Can Phil. or someone else help me out? Thanks!
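Under the usual normality assumption, the answer to the question above is yes: the two-sigma interval corresponds to 97.5% one-sided confidence, so a trend whose lower two-sigma bound is slightly negative can still clear 95% one-sided confidence of warming (one-sided 95% only needs z ≥ 1.645, while the two-sigma bound corresponds to z = 1.96). A sketch using the RSS numbers from the thread:

```python
# One-sided vs two-sided: the lower 2-sigma bound is slightly negative,
# yet the one-sided probability of warming still exceeds 95%.
from math import erf, sqrt

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

trend, two_sigma = 0.127, 0.136
sigma = two_sigma / 2
z = trend / sigma                  # about 1.87, which is above 1.645

print(trend - two_sigma)           # lower 2-sigma bound: negative
print(phi(z) >= 0.95)              # True: one-sided P(warming) >= 95%
```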

EternalOptimist
February 11, 2013 1:12 am

It’s enough to make a grown man cry. Here we have a set of facts that no one is disputing, and the catastrophists are spinning it as a success
the lukewarmers are claiming it proves them right
and the sceptics (including me) are agog that anyone can doubt they were right all along

Philip Shehan
February 11, 2013 1:27 am

More cherry picking.
Let’s just concentrate on one of the data sets shown – the claim that Hadcrut4 temperature data is flat since November 2000.
Look also at the data from 1999. Or compare the entire Mauna Loa data set from 1958 with temperature.
And remember, “statistical significance” here is at the 95% level. That is, even a 94% probability that the data is not a chance result fails at this level.
http://tinyurl.com/atsx4os

Nigel Harris
February 11, 2013 1:51 am

The final chart, with the question “the Met Office believes that the 1998 mark will be beaten by 2017. Do you agree?”, looks rather different if you put a simple linear trend line through the data presented.
http://www.woodfortrees.org/plot/hadcrut4gl/from:1990/mean:12/offset:-0.16/plot/hadcrut4gl/from:1990/mean:12/offset:-0.16/trend

Greg Goodman
February 11, 2013 1:58 am

Steven Mosher says:
Fitting a straight line to time series data assumes that the data generating process is linear.
We are quite sure that the underlying process is not linear. The imposition of a linear model is an analyst choice. This choice has an associated uncertainty such that you must either demonstrate that the physical process is linear or add uncertainty due to your model selection.
next, you need to look at autocorrelation before you make confidence intervals.
http://www.woodfortrees.org/plot/hadcrut4gl/from:1990/mean:12/mean:9/mean:7/derivative/plot/sine:10/scale:0,00001/from:1990/to:2015
=====
This is a very good point. Sadly, this is what most of climate science seems to have been feeding us for the last 30 years, especially the IPCC version of climate history. Show a trend over, say, the last 50 years (an innocently chosen round number, NOT). Then say “if present trends continue”, with the unstated yet implied assumption that this is obviously what will happen if we don’t change our ways.
Now we are all agreed that this is bad and totally unscientific. So can a layman do any better, messing around with the limited facilities offered by WFT.org?
Firstly, if we are interested in whether temperatures are rising faster or slower, why the hell aren’t we looking at the rate of change, rather than trying to guess it by looking at the time series and squinting a bit? Take the DERIVATIVE.
If the rate of change is above zero it’s warming; below zero it’s cooling. You don’t need a PhD to see that.
dT/dt, the time derivative, is available on WFT.org, so let’s use it.
While we are there, let’s stop distorting the data with these bloody running means that even top climate scientists of all persuasions can’t seem to get beyond. Running means let through a lot of the stuff we imagine we have filtered out and, worse still, actually INVERT some frequencies, i.e. turn a negative cycle into a positive one. That is why you can often see the running mean show a trough where the data shows a peak.
Again, we can do better even with the limited tools WFT offers.
Without going into detail on the maths: if you do a running mean of 12, then 9, then 7 months, this makes a half-decent filter that does not invert anything and gives you a much smoother result, because it really does filter out the stuff you intended to remove.
So I’ll just take one data series used above at random (without suggesting it is more or less relevant than the others).
http://www.woodfortrees.org/plot/hadcrut4gl/from:1990/mean:12/mean:9/mean:7/derivative/plot/sine:10/scale:0,00001/from:1990/to:2015
BTW, if anyone knows how to get a grid or at least an x-axis, let me know; I just hacked in a sine wave with minute amplitude, but it won’t go all the way across. WFT.org is a bit of a mess, but easy to use for those who don’t know how to get and plot data themselves.
Just The Facts may like to add this sort of graph for all the datasets reported here.
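The 12-9-7 cascade and the derivative step described above can be sketched in a few lines of plain Python; the function names and the toy series are ours, purely illustrative:

```python
def running_mean(data, window):
    """Centred running mean; shrinks the series by window - 1 points."""
    return [sum(data[i:i + window]) / window
            for i in range(len(data) - window + 1)]

def cascade_filter(data, windows=(12, 9, 7)):
    """Apply successive running means. Cascading the three windows gives
    a smoother whose response has no negative sidelobes, so, unlike a
    single running mean, it cannot turn a peak into a trough."""
    for w in windows:
        data = running_mean(data, w)
    return data

def derivative(data):
    """First difference: above zero means warming, below zero cooling."""
    return [b - a for a, b in zip(data, data[1:])]

# Usage on a toy series of monthly values:
series = [0.1 * i for i in range(60)]   # steadily "warming" data
smooth = cascade_filter(series)
rate = derivative(smooth)               # flat at the underlying slope
```

For a purely linear series the filtered output stays linear and the derivative sits flat at the underlying slope, which is exactly the zero-crossing behaviour the WFT plots above are exploiting.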

Greg Goodman
February 11, 2013 2:07 am

Here’s another view of this, upping the filter to remove periods of two years and shorter.
http://www.woodfortrees.org/plot/hadcrut4gl/from:1970/mean:24/mean:18/mean:14/derivative/plot/sine:10/scale:0,00001/from:1990/to:2015
Here we see that the 80s and 90s each had two warming periods and one cooling period. Since 1997 it has been swinging about evenly on each side of zero.
I’d call that ‘stalled’, if we want to draw a simplistic conclusion.
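Steven Mosher’s autocorrelation caveat, quoted earlier in the thread, can also be put in numbers: with lag-1 autocorrelation r, the standard AR(1) rule of thumb shrinks the effective number of independent points to about n(1 - r)/(1 + r), which is why confidence intervals on monthly temperature trends are wider than a naive fit suggests. A minimal sketch, with function names of our choosing:

```python
def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a sequence."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1))
    den = sum((v - mean) ** 2 for v in x)
    return num / den

def effective_n(x):
    """AR(1) approximation to the effective number of independent
    points: n * (1 - r) / (1 + r)."""
    r = lag1_autocorr(x)
    return len(x) * (1 - r) / (1 + r)

# A persistent series (positive r) has far fewer effective points
# than its raw length, so its trend uncertainty is larger.
print(effective_n([0, 0, 0, 0, 1, 1, 1, 1]))
```

Monthly temperature anomalies are strongly persistent, so the effective sample size over a decade or two can be several times smaller than the number of months, and naive trend error bars are correspondingly too tight.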

Max™
February 11, 2013 2:18 am

Switch the columns for rows on the last table.
___Data|Sets|Across|Here
M
O
N
T
H
S
H
E
R
E