No statistically significant warming since 1995: a quick mathematical proof

Physicist Luboš Motl of The Reference Frame demonstrates how easy it is to show that there has been no statistically significant warming since 1995.

First, since it wasn’t in his original post, here is the UAH data plotted:

[Figure: UAH lower-troposphere global temperature anomaly, 1979 through November 2009]

By: Luboš Motl

Because there has been some confusion – and maybe deliberate confusion – among some (alarmist) commenters about the non-existence of a statistically significant warming trend since 1995, i.e. in the last fifteen years, let me dedicate a full article to this issue.

I will use the UAH temperatures, whose final 2009 figures are de facto known by now (with sufficient accuracy) because UAH publishes daily temperatures, too:

Mathematica can calculate the confidence intervals for the slope (warming trend) by concise commands. But I will calculate the standard error of the slope manually.

x = Table[i, {i, 1995, 2009}]

y = {0.11, 0.02, 0.05, 0.51, 0.04, 0.04, 0.2, 0.31, 0.28, 0.19, 0.34, 0.26, 0.28, 0.05, 0.26};

data = Transpose[{x, y}]


n = 15

xAV = Total[x]/n

yAV = Total[y]/n

xmav = x - xAV;

ymav = y - yAV;

lmf = LinearModelFit[data, xvar, xvar];

Normal[lmf]


(* http://stattrek.com/AP-Statistics-4/Estimate-Slope.aspx?Tutorial=AP *)

slopeError = Sqrt[Total[ymav^2]/(n - 2)]/Sqrt[Total[xmav^2]] (* NB: this uses total deviations from the mean; the residual-based formula at the link above gives a slightly smaller 0.84 *)

The UAH 1995-2009 slope was calculated to be 0.95 °C per century. And the standard error of this figure, calculated via the formula on the page linked above, is 0.88 °C/century. So the positive slope is merely a 1-sigma result – noise. Can we be more rigorous about it? You bet.
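For readers without Mathematica, the same arithmetic can be cross-checked in plain Python (a sketch using only the fifteen data points above, not part of the original session). Note that the textbook formula at the stattrek link uses the fit residuals rather than the raw deviations from the mean, which gives a slightly smaller standard error of about 0.84 °C/century:

```python
# Cross-check of the slope and its standard error in plain Python,
# using only the UAH annual anomalies listed above.
x = list(range(1995, 2010))
y = [0.11, 0.02, 0.05, 0.51, 0.04, 0.04, 0.2, 0.31, 0.28, 0.19,
     0.34, 0.26, 0.28, 0.05, 0.26]
n = len(y)

x_av = sum(x) / n
y_av = sum(y) / n
sxx = sum((xi - x_av) ** 2 for xi in x)
sxy = sum((xi - x_av) * (yi - y_av) for xi, yi in zip(x, y))

slope = sxy / sxx                      # deg C per year
intercept = y_av - slope * x_av

# Residual-based standard error (the textbook formula):
sse = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
se_slope = (sse / (n - 2)) ** 0.5 / sxx ** 0.5

print(round(slope * 100, 2))           # ~0.95 deg C per century
print(round(se_slope * 100, 2))        # ~0.84 deg C per century
```

The slope matches the 0.95 °C/century quoted above; the residual-based standard error of about 0.84 °C/century is the one implicit in the confidence intervals that Mathematica reports below.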

Mathematica actually has compact functions that can tell you the confidence intervals for the slope:

lmf = LinearModelFit[data, xvar, xvar, ConfidenceLevel -> .95];

lmf["ParameterConfidenceIntervals"]

(* rerun with ConfidenceLevel -> .99 and .90 to obtain the other intervals quoted below *)

The 99% confidence interval is (-1.59, +3.49) in °C/century. Similarly, the 95% confidence interval for the slope is (-0.87, +2.8), and the 90% confidence interval is (-0.54, +2.44). All these intervals contain both negative and positive numbers, so no conclusion about the sign of the slope can be drawn at the 99%, 95%, or even the 90% confidence level.

Only at about the 72% confidence level does the interval for the slope first touch zero. That means the probability that the underlying slope is negative equals half of the remaining 28%, i.e. a substantial 14%.
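These intervals can also be reproduced by hand from the slope and its standard error, using two-sided Student-t critical values for n − 2 = 13 degrees of freedom (the critical values below are standard table entries; this is a sketch, not part of the original Mathematica session):

```python
# Reproduce the slope confidence intervals from standard t-table
# critical values for 13 degrees of freedom (two-sided).
x = list(range(1995, 2010))
y = [0.11, 0.02, 0.05, 0.51, 0.04, 0.04, 0.2, 0.31, 0.28, 0.19,
     0.34, 0.26, 0.28, 0.05, 0.26]
n = len(y)
x_av, y_av = sum(x) / n, sum(y) / n
sxx = sum((xi - x_av) ** 2 for xi in x)
slope = sum((xi - x_av) * (yi - y_av) for xi, yi in zip(x, y)) / sxx
sse = sum((yi - (y_av + slope * (xi - x_av))) ** 2 for xi, yi in zip(x, y))
se = (sse / (n - 2)) ** 0.5 / sxx ** 0.5

# Two-sided critical values of Student's t with 13 d.o.f. (table entries):
t_crit = {0.99: 3.012, 0.95: 2.160, 0.90: 1.771}

for level, t in t_crit.items():
    lo = (slope - t * se) * 100      # deg C per century
    hi = (slope + t * se) * 100
    print(f"{level:.0%} CI: ({lo:+.2f}, {hi:+.2f})  contains zero: {lo < 0 < hi}")
```

Every interval straddles zero, matching the numbers quoted above to rounding.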

We can only say that it is “somewhat more likely than not” that the underlying trend in 1995-2009 was a warming trend rather than a cooling trend. Saying that the warming since 1995 was “very likely” is already far too ambitious a claim for these data to support.

300 Comments
Tenuc
December 26, 2009 5:57 pm

Here’s some examples of trends from Roger Pielke Jr’s blog:-
“Knappenberger provides some robust conclusions as well as some examples of creative cherrypicking:”
• For the past 8 years (96 months), no global warming is indicated by any of the five datasets.
• For the past 5 years (60 months), there is a statistically significant global cooling in all datasets.
• There has been no (statistically significant) warming for the past 13 years. [Using the satellite records of the lower atmosphere]
• The globe has been cooling rapidly for the past 8 years. [Using the CRU and satellite records]
• Global warming did not ‘stop’ 10 years ago, in fact, it was pretty close to model projections. [Using the GISS and NCDC records beginning in 1998 and 1999]
• Global warming is proceeding faster than expected. [Using the GISS record starting in 1991 or 1992, the cool years just after the volcanic eruption of Mt. Pinatubo]
• For the past 15 years, global warming has been occurring at a rate that is below the average climate model expected warming
We know that temperature is the result of the large number of processes that drive our climate system. We also know that climate exhibits all the attributes of deterministic chaos.
So the real question is: why do climate scientists still try to find trends in non-linear data, when doing so produces meaningless answers?

Baa Humbug
December 26, 2009 6:19 pm

Why are posters referring to 1998? Wasn’t 1998 just another year in the variability of climate? The fact that it influences the “confidence interval” is immaterial. 1998 is what it is and can’t be changed.
As regards start years, the first question to answer is: what shape of slope do you want? Going back over time, anyone can produce any shape of “slope” graph they wish.
All these minutiae of the Mathematica exercise may be fun in themselves, but they are irrelevant to climate, as shown in the above paragraph.
It became irrelevant once the infamous hockey stick was shown to be a lie. There is nothing unusual about the last 5-10-50-100 years.
By all means debate the maths, but don’t make any assumptions about the climate.

photon without a Higgs
December 26, 2009 6:22 pm

856 snowfall records, in The Lower 48, Dec 19–27
HAMweather:
http://mapcenter.hamweather.com/records/7day/us.html?c=snow
I wanted to put this in ‘Tips’ but it’s not available now
p.s. map is for previous 7 days and will change soon, so record total will change

photon without a Higgs
December 26, 2009 6:23 pm

photon without a Higgs (18:22:18)
oops, 854, not 856

Paul Vaughan
December 26, 2009 6:27 pm

Mike D. (16:52:48) is the only contributor who passes an in-page search for “assum” (assume, assumed, assumption, assumptions, etc.)
The assumption of randomness is absolutely ridiculous.
However, alarmists rely heavily on such unrealistic assumptions, so Luboš Motl certainly has good reason for firing such a shot.

Richard M
December 26, 2009 6:29 pm

Icarus, using 15 years is what was put forth by the AGW crowd. Lubos is using their definition. Why would he use anything but the last 15 years?
Rob, the reason we use years is because that is the only unit reasonable from a climate perspective. A year represents the smallest time period where inter-year variances cancel out. Even years are probably too fine (a better unit might be PDO full cycles).

Sydney Sceptic
December 26, 2009 6:32 pm

Hmm.. you know that if you draw a trend line from the start of that graph to the end of the graph, you get .1 degree C per decade warming?
Quick, better break out the sunblock, we’re seeing a deviation from the ‘normal’ 1 degree per century or .1 degree per decade of.. oh gawsh.. 0.0 degrees?
AGW. Where is it?

DirkH
December 26, 2009 6:42 pm

“photon without a Higgs (18:22:18) :
856 snowfall records, in The Lower 48, Dec 19–27”
Compare Hansen’s weather forecast from 1988:

kdkd
December 26, 2009 6:44 pm

This analysis is flawed. Any estimate of short-term trends that relies on a single date as the start point is worthless. For this kind of thing to be valid, you’d want to bootstrap multiple start points and different time periods.
However, a recent paper from Energy and Environment (“Trend Analysis of Satellite Global Temperature Data”, Loehle, C.) showed that the effect of 1998 is quite strong in producing a short-term cooling trend, but as time moves on from 1998 this trend gets smaller, showing it is merely an outlier.
The paper also shows that a meaningful timeframe for detecting statistically significant trends is in the region of 15 years plus. Of course, the authors use much more sophisticated statistical analysis than Motl’s, and being from Energy and Environment, the scientific conclusions are over-extended, but there is some useful stuff there. For Motl’s methodology, I’d be unhappy with anything less than 20-25 years.
[snip]
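kdkd’s point about start-point sensitivity is easy to illustrate with the annual anomalies from the post itself: recomputing the least-squares trend for every possible start year (a hypothetical Python illustration, not taken from the Loehle paper) shows how strongly the answer depends on where the window begins:

```python
# Illustrate start-point sensitivity: least-squares trend (deg C/century)
# of the UAH annual anomalies for every start year from 1995 to 2005,
# always ending in 2009.
years = list(range(1995, 2010))
anoms = [0.11, 0.02, 0.05, 0.51, 0.04, 0.04, 0.2, 0.31, 0.28, 0.19,
         0.34, 0.26, 0.28, 0.05, 0.26]

def trend_per_century(xs, ys):
    """Ordinary least-squares slope, scaled from deg C/yr to deg C/century."""
    n = len(xs)
    x_av, y_av = sum(xs) / n, sum(ys) / n
    sxx = sum((x - x_av) ** 2 for x in xs)
    sxy = sum((x - x_av) * (y - y_av) for x, y in zip(xs, ys))
    return 100 * sxy / sxx

for start in range(1995, 2006):
    i = years.index(start)
    print(start, round(trend_per_century(years[i:], anoms[i:]), 2))
```

Moving the start year from 1998 to 1999 flips the trend from slightly negative to more than +1 °C/century, which is precisely the cherry-picking hazard described above.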

Basil
Editor
December 26, 2009 6:44 pm

Richard M (18:29:19) :
Icarus, using 15 years is what was put forth by the AGW crowd. Lubos is using their definition. Why would he use anything but the last 15 years?
Rob, the reason we use years is because that is the only unit reasonable from a climate perspective. A year represents the smallest time period where inter-year variances cancel out.

There is some poor reasoning here. The comment might have merit, if the interannual variances were random, but they are not. There are systematic seasonal variations throughout the year, and those are not without some interest to the study of climate. Even if the interest is in yearly changes, we can get to those using monthly data (seasonal differencing, or moving averages, for example). No need to throw out monthly data. Truth is, canceling out the interannual variance makes climate seem a lot less variable than it really is.

Tom P
December 26, 2009 7:00 pm

Luboš Motl,
The monthly data from UAH is readily available and can be used to provide a much better estimate of the confidence limits – we now have 180 monthly points rather than 15 averaged annual points to look at the variability of the temperature measurements.
Following precisely your method, the trend now drops to 0.94C/century from your figure of 0.95C, not a significant difference. However, with the additional information from twelve times the data, the standard error of this slope drops to 0.31C, substantially less than your figure of 0.88C.
Hence the warming slope is more than three standard deviations above zero, which means we can say with 99.9% confidence, not your 86%, that there has been warming on the basis of the UAH data.
I have followed your algorithm exactly here, though as has been pointed out, for the correlated temperatures in a time series the confidence level will be even higher than 99.9%.
I would have thought you’d have been well aware that averaging measurements throws away important information about their variability. For some reason you must have forgotten this important mathematical property of measurement science when performing the calculation in your article.

Icarus
December 26, 2009 7:04 pm

Richard M (18:29:19) :
Icarus, using 15 years is what was put forth by the AGW crowd. Lubos is using their definition. Why would he use anything but the last 15 years?

Actually 30 years is what is ‘put forth’ by climate scientists, and with good reason – if average interannual variation is around 0.2C, and the observed warming trend is around 0.2C per decade, then obviously you need substantially more than 10 years to distinguish trend from stochastic variation. 15 years would be a bare minimum on that basis but it’s better to work with 30.

Kevin Kilty
December 26, 2009 7:12 pm

Disputin (13:10:16) :
I am far from expert in statistics, but is this really valid? By including the obvious outlier of 98 (El Nino) the SD is increased so widening the confidence intervals. While I should agree that in the long run the 98 jump is just a part of the variability, over this restricted timescale it is a major anomaly.
But then what do I know?

1998 is within a standard deviation, so why consider it an outlier?
Paul (15:39:43) :
I believe that this posted analysis is flawed. You cannot apply this type of simple linear trend analysis to serially correlated data, since the precision of the parameter estimates is strongly a function of the autocorrelation in temperature data.

I don’t know that this is meaningful in this particular, short series, but autocorrelation reduces degrees of freedom. Applies to the suggested monthly time series even more so?
scienceofdoom (17:27:49) :
Following on from DocMartyn..
I agree with DocMartyn’s main point – you need a hypothesis to test against. Just plotting a graph and fitting a linear trend is a pointless exercise.

Isn’t the implicit null hypothesis a zero trend in this case? You will find even less significance testing against an assumed positive trend.
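Kevin Kilty’s point about autocorrelation can be made concrete with a common rough correction: assuming the residuals follow an AR(1) process (an assumption, not something established in the thread), the effective sample size is n·(1 − r₁)/(1 + r₁), where r₁ is the lag-1 autocorrelation of the regression residuals. A sketch on the annual data:

```python
# Rough AR(1) correction for serial correlation: effective sample size
# n_eff = n * (1 - r1) / (1 + r1), with r1 the lag-1 autocorrelation of
# the regression residuals (a standard approximation, applied here to
# the annual UAH anomalies from the post).
years = list(range(1995, 2010))
anoms = [0.11, 0.02, 0.05, 0.51, 0.04, 0.04, 0.2, 0.31, 0.28, 0.19,
         0.34, 0.26, 0.28, 0.05, 0.26]
n = len(anoms)
x_av, y_av = sum(years) / n, sum(anoms) / n
sxx = sum((x - x_av) ** 2 for x in years)
slope = sum((x - x_av) * (y - y_av) for x, y in zip(years, anoms)) / sxx

# Residuals of the least-squares fit
resid = [y - (y_av + slope * (x - x_av)) for x, y in zip(years, anoms)]

# Lag-1 autocorrelation of the residuals
r1 = (sum(resid[i] * resid[i + 1] for i in range(n - 1))
      / sum(r * r for r in resid))

n_eff = n * (1 - r1) / (1 + r1)
print(round(r1, 2), round(n_eff, 1))
```

For these fifteen annual points r₁ actually comes out slightly negative, so the annual fit loses no degrees of freedom to serial correlation; the correction bites mainly on monthly data, where lag-1 autocorrelation is strongly positive, consistent with the caveat above about the monthly series.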

December 26, 2009 7:15 pm

kdkd: The paper by Loehle costs $18.
You said “The paper also shows that a meaningful timeframe for detecting statistically significant trends is in the region of 15 years plus.”
Can you elaborate on that?
I suspect, from everything else that I have read, that the analysis will simply show that year to year variation is high and tells you nothing about “longer term trends”.
We may lack the data to prove or disprove the hypothesis that any particular period in question can be subject to exactly the same analysis.
My hypothesis: “The GMST variation over 30 years periods is statistically insignificant in the context of 200 year temperature trends and should be ignored.”
And my later hypothesis: “The GMST variation over 1000 year periods is simply noise in the context of longer term 50,000 year trends.”
I would be very interested to know a little more about what exactly Loehle demonstrates.

December 26, 2009 7:20 pm

Richard M wrote: “…the reason we use years is because that is the only unit reasonable from a climate perspective. A year represents the smallest time period where inter-year variances cancel out
Aren’t the inter-year variances already removed from temperature anomaly data?

December 26, 2009 7:46 pm

pwl (14:23:55)
Sunspot activity may have an influence on Earth, but it is not a matter of the Earth receiving more direct radiation from the sun. The sun’s output varies by only 0.1%, of which we would receive only about a quarter, so the sun’s direct influence could not exert any real variation on the Earth. However, that does not mean there isn’t a myriad of other mechanisms that may operate because of the sun in an indirect manner. A volcano, likewise, is not directly the cause of cooling; the resultant aerosols contribute something by blanketing the Earth and reflecting radiation.
However this is simply pure conjecture of the same kind as CO2, correlation does not mean causation. Lest we forget.

Steve J
December 26, 2009 7:49 pm

Funny,
I thought the temp increase precedes the co2 increase by 800 years.
Based upon what we know (Thanks Anthony) about the dubious quality of the data.
Why would any rational being attempt to develop any trends based upon such a small data slice?
2,500 years should be the min. or maybe 10,000 years.
The entire argument evaporates under those conditions.

December 26, 2009 7:52 pm

Bret (19:20:24) :
Aren’t the inter-year variances already removed from temperature anomaly data

Lol, welcome to the wonderful world of temperature anomalies. Let me do you one better: what is the likelihood that the data has been transformed by an algorithm in a way that renders its actual use irrelevant?
As for your question: yes, if you are compressing a year down to a single data point; however, the point is that a year is a simple method of creating a start and end point for the data to be correlated. In the grander scheme of things, based on data that I have seen, anything short of a couple of centuries (perhaps as much as half a millennium) is probably too short a time period to get an accurate gauge on temperature…

December 26, 2009 8:03 pm

I just realized that after my last post someone might ask: what is the point of this article if 15 years is too short a time period to know what is going on with temperature? I for one think that is exactly one of the points of this article. Plus, it shows how little the confidence levels really vary.
Let’s put temperature into perspective and get away from this microcosm we call Celsius. Remember that 14 degrees Celsius is actually 14 degrees plus 273 degrees above absolute zero, so if someone were to come along and mention that the Earth has warmed approx. 0.5 degrees, taking you from 287 K to 287.5 K, would you really be that worried?
That means the Earth increased in temperature by a whopping 0.17%. Would you not place that within a normal cycle of statistical variance? The Earth’s temperature has not varied by even a quarter of a percent.
Anyway, statistically speaking, we have not managed to effect much change on the Earth with 150 years of spewing CO2 into the atmosphere. And if the Earth does not hit the tipping point before CO2 cannot help warm us any more (I am fairly willing to wager pretty heavily that it will not; CO2 is a gas that absorbs radiation, let’s face it), then the natural cycles of the planet hold far more sway than the paltry excuse for a greenhouse gas that CO2 is.
Sorry for the rant; I’m just tired of explaining this to people over and over again, and the blank look on their faces as they go, “oh, I see…”
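For what it’s worth, the commenter’s arithmetic checks out (a quick sketch; 287 K is the round figure used above, and whether a percentage of the absolute scale is a meaningful climate metric is a separate question):

```python
# Check the commenter's percentage: a 0.5 K rise on an absolute
# mean surface temperature of about 287 K (~14 deg C).
t_abs = 273.15 + 14      # mean surface temperature in kelvin (round figure)
rise = 0.5               # claimed warming in kelvin
pct = 100 * rise / t_abs
print(round(pct, 2))     # ~0.17 percent
```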

kdkd
December 26, 2009 8:05 pm

scienceofdoom: I found the paper here http://icecap.us/images/uploads/05-loehleNEW.pdf (as my university has quite rightly stopped holding E&E since its move from a social science journal to a political rag). The interesting bit is Figure 2, which shows the confidence interval plotted along with the trend for a couple of satellite data sets. The conclusion that the paper supports evidence against anthropogenic global warming is extremely weak, but the regression techniques are interesting.

December 26, 2009 8:07 pm

Tom in Florida (15:20:10) (re: Eve (14:08:45))
What about burner efficiency declining with age?
Perhaps the thermostat needs adjusting?

Those issues were preemptively dealt with:
“Asked and answered.”

kdkd
December 26, 2009 8:12 pm

scienceofdoom: (again). If we’re talking about anthropogenic warming, the correct time frame for looking at trends is the 200-year period, although subsets of that time are interesting (e.g. in the early 20th century, co2 levels accounted for about 25% of observed warming, whereas in the late 20th/early 21st century co2 accounts for ~80% of observed warming; correlation is not causation, but it is rather suggestive given the cohesive body of scientific theory associated with this information).
The other error I’m seeing in the comments here is the “800 year lag” between temperature and co2. This is correct when we’re coming out of an ice age, because there is a lot of carbon locked up in the ice that takes a while to come out, and eventually the biogeochemically stored carbon will reach equilibrium for the current temperature. Perhaps we will see a similar phenomenon with the melting of the arctic permafrosts. Anyway, this 800-year lag is conceptually quite distinct from the current warming, which appears to be caused by the burning of fossil fuels.

cohenite
December 26, 2009 8:25 pm

Lubos has picked 1995 and most critics are referring to 1998 being an outlier; that’s ironic because if 1998 is an outlier then by including it Lubos is actually tilting the scales towards a warming trend. But 1998 IS NOT an outlier; it is a legitimate phase change point verified by prominent and well documented climatic events;
http://arxiv.org/PS_cache/arxiv/pdf/0907/0907.1650v3.pdf

Michael Jankowski
December 26, 2009 8:27 pm

kdkd (20:12:57) :
So which temperature data set shall we use for the 200 year period?
And since CO2 has been increasing and is supposed to account for “80% of the observed warming in the late 20th century/early 21st century,” then I assume it should be easiest to spot even in a subset of, say, the most recent 15 years?

INGSOC
December 26, 2009 8:33 pm

Someone may already have said this, but it is “somewhat more likely than not” that none of this matters to the purveyors of doom. Recent events such as the CRU scandal and Copenhagen make it clear that science and data are irrelevant to the eco crusaders. In fact, it is plain that the environment doesn’t matter to them either. It should be clear to everyone by now.
Nice work though.