Unadjusted data of long-period stations in GISS show a virtually flat century-scale trend

Hohenpeissenberg Meteorological Observatory (image from GAWSIS)

Temperature averages of continuously reporting stations from the GISS dataset

Guest post by Michael Palmer, University of Waterloo, Canada

Abstract

The GISS dataset includes more than 600 stations within the U.S. that have been in operation continuously throughout the 20th century. This brief report looks at the average temperatures reported by those stations. The unadjusted data of both rural and non-rural stations show a virtually flat trend across the century.

The Goddard Institute for Space Studies provides a surface temperature dataset that covers the entire globe but, for long periods of time, contains mostly U.S. stations. For each station, monthly temperature averages are tabulated, in both raw and adjusted versions.

One problem with the calculation of long term averages from such data is the occurrence of discontinuities; most station records contain one or more gaps of one or more months. Such gaps could be due to anything from the clerk in charge being a quarter drunkard to instrument failure and replacement or relocation. At least in some examples, such discontinuities have given rise to “adjustments” that introduced spurious trends into the time series where none existed before.

1 Method: Calculation of yearly average temperatures

In this report, I used a very simple procedure to calculate yearly averages from raw GISS monthly averages, one that deals with gaps without making any assumptions or adjustments.

Suppose we have 4 stations, A, B, C and D. Each station covers 4 time points, without gaps:

         t0   t1   t2   t3
    A:   A0   A1   A2   A3
    B:   B0   B1   B2   B3
    C:   C0   C1   C2   C3
    D:   D0   D1   D2   D3

In this case, we can obviously calculate the average temperatures as:

    T0 = (A0 + B0 + C0 + D0)/4,  T1 = (A1 + B1 + C1 + D1)/4,  and so on.

A more roundabout, but equivalent scheme for the calculation of T1 would be:

    T1 = T0 + [(A1 - A0) + (B1 - B0) + (C1 - C0) + (D1 - D0)]/4

With a complete time series, this scheme offers no advantage over the first one. However, it can be applied quite naturally in the case of missing data points. Suppose now we have an incomplete data series, such as:

         t0   t1   t2   t3
    A:   A0   A1   A2   A3
    B:   B0   --   B2   B3
    C:   C0   C1   C2   C3
    D:   D0   D1   D2   D3

…where a dash denotes a missing data point. In this case, we can estimate the average temperatures as follows:

    T1 = T0 + [(A1 - A0) + (C1 - C0) + (D1 - D0)]/3
    T2 = T1 + [(A2 - A1) + (C2 - C1) + (D2 - D1)]/3
    T3 = T2 + [(A3 - A2) + (B3 - B2) + (C3 - C2) + (D3 - D2)]/4

The upshot of this is that missing monthly Δtemperature values are simply dropped and replaced by the average (Δtemperature) from the other stations.
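A minimal Python sketch of this scheme (the article's code is in Python, but this sketch, its function name, and the sample station values are my own illustration, not the author's actual code):

```python
def chained_average(series):
    """Average temperature course from station records with gaps.

    `series` is a list of per-station value lists, with None marking a
    missing reading.  For each pair of consecutive time points, the
    delta is averaged over all stations that report both points; the
    result is the running sum of those average deltas, i.e. an anomaly
    relative to the first time point.
    """
    n = len(series[0])
    anomalies = [0.0]  # anomaly at the first time point is zero by definition
    for t in range(1, n):
        deltas = [s[t] - s[t - 1] for s in series
                  if s[t] is not None and s[t - 1] is not None]
        # a missing delta is simply dropped; the average delta of the
        # remaining stations stands in for it
        step = sum(deltas) / len(deltas) if deltas else 0.0
        anomalies.append(anomalies[-1] + step)
    return anomalies

# Station B has a gap at the second time point; B2 is never compared
# to B0, so an instrument offset introduced during the gap cannot
# bias the trend.
stations = [
    [10.0, 11.0, 10.5, 11.5],  # A
    [ 9.0, None, 10.5, 11.5],  # B
    [12.0, 13.0, 12.5, 13.5],  # C
]
print(chained_average(stations))  # [0.0, 1.0, 0.5, 1.5]
```

Note how the gap isolates B's pre-gap and post-gap readings from each other: only deltas within each contiguous stretch of B's record ever enter the average.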

One advantage that may not be immediately obvious is that this scheme also removes systematic errors due to a change of instrument or instrument siting that may have occurred concomitantly with a data gap.

Suppose, for example, that data point B1 went missing because the instrument in station B broke down and was replaced, and that the calibration of the new instrument was offset by 1 degree relative to the old one. Since B2 is never compared to B0, this offset will not affect the calculation of the average temperature. Of course, spurious jumps not associated with gaps in the time series will not be eliminated.

In all following graphs, the temperature anomaly was calculated from unadjusted GISS monthly averages according to the scheme just described. The code is written in Python and is available upon request.

2 Temperature trends for all stations in GISS

The temperature trends for rural and non-rural US stations in GISS are shown in Figure 1.

Figure 1: Temperature trends and station counts for all US stations in GISS between 1850 and 2010. The slope for the rural stations is 0.0039 deg/year, and for the other stations 0.0059 deg/year.

This figure resembles other renderings of the same raw dataset. The most notable feature in this graph is not in the temperature but in the station count. Both to the left of 1900 and to the right of 2000, there is a steep drop in the number of available stations. While this seems quite understandable before 1900, the even steeper drop after 2000 seems peculiar.

If we simply lop off these two time periods, we obtain the trends shown in Figure 2.

Figure 2: Temperature trends and station counts for all US stations in GISS between 1900 and 2000. The slope for the rural stations is 0.0034 deg/year, and for the other stations 0.0038 deg/year.

The upward slope of the average temperature is reduced; this reduction is more pronounced with non-rural stations, and the remaining difference between rural and non-rural stations is negligible.

3 Continuously reporting stations

There are several examples of long-running temperature records that fail to show any substantial long-term warming signal; examples are the Central England Temperature record and the one from Hohenpeissenberg, Bavaria. It therefore seemed of interest to look for long-running US stations in the GISS dataset. Here, I selected stations that had continuously reported at least one monthly average value (but usually many more) for each year between 1900 and 2000. This criterion yielded 335 rural stations and 278 non-rural ones.
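The selection criterion can be expressed as a short filter, sketched here in Python under an assumed data layout (the dictionary structure and names are hypothetical illustrations, not the author's actual code):

```python
def continuously_reporting(stations, first=1900, last=2000):
    """Keep stations that report at least one monthly value in every
    year of the interval.

    `stations` maps a station id to {year: [12 monthly values, with
    None marking a missing month]}.  A year absent from the dict, or
    one with all twelve months missing, disqualifies the station.
    """
    return {
        sid: years
        for sid, years in stations.items()
        if all(any(v is not None for v in years.get(y, []))
               for y in range(first, last + 1))
    }

# Station "A" reports one month per year throughout; "B" misses 1950
# entirely and is therefore excluded.
stations = {
    "A": {y: [5.0] + [None] * 11 for y in range(1900, 2001)},
    "B": {y: [5.0] * 12 for y in range(1900, 2001) if y != 1950},
}
print(sorted(continuously_reporting(stations)))  # ['A']
```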

The temperature trends of these stations are shown in Figure 3.

Figure 3: Temperature trends and station counts for all US stations in GISS reporting continuously, that is containing at least one monthly data point for each year from 1900 to 2000. The slope for the rural stations (335 total) is -0.00073 deg/year, and for the other stations (278 total) -0.00069 deg/year. The monthly data point coverage is above 90% throughout except for the very first few years.

While the sequence and the amplitudes of upward and downward peaks are closely similar to those seen in Figure 2, the trends for both rural and non-rural stations are virtually zero. Therefore, the average temperature anomaly reported by long-running stations in the GISS dataset does not show any evidence of long-term warming.

Figure 3 also shows the average monthly data point coverage, which is above 90% for all but the first few years. The less than 10% of raw data points that are missing are unlikely to have a major impact on the calculated temperature trend.

4 Discussion

The number of US stations in the GISS dataset is high and reasonably stable during the 20th century. In the 21st century, the number of stations has dropped precipitously. In particular, rural stations have almost entirely been weeded out, to the point that the GISS dataset no longer seems to offer a valid basis for comparison of the present to the past. If we confine the calculation of average temperatures to the 20th century, there remains an upward trend of approximately 0.35 degrees.

Figure 4: Locations of US stations continuously reporting between 1900 and 2000 and contained in the GISS dataset. Rural stations in red, others in blue. This figure clearly shows that the US is large, but the world (shown in FlatEarth™ projection) is even larger.

Interestingly, this trend is virtually the same with rural and non-rural stations.

The slight upward temperature trend observed in the average temperature of all stations disappears entirely if the input data is restricted to long-running stations only, that is, those stations that have reported monthly averages for at least one month in every year from 1900 to 2000. This discrepancy remains to be explained.

While the long-running stations represent a minority of all stations, they would seem most likely to have been looked after with consistent quality. The fact that their average temperature trend runs lower than the overall average and shows no net warming in the 20th century should therefore not be dismissed out of hand.

Disclaimer

I am not a climate scientist and claim no expertise relevant to this subject other than basic arithmetic. In case I have overlooked equivalent previous work, this is due to my ignorance of the field, is not deliberate, and will be amended upon request.

265 Comments
October 24, 2011 9:57 am

The strength of the global warming narrative is in the satellite data beginning in 1979. There is little doubt that the data is reliable, accurate to the degree necessary, and coverage is near global and around the clock.

I admire your faith in satellite data that is not calibrated against earth-based thermometers… satellite data that cannot be independently verified / processed / checked… although the satellite data starts in 1979, the series has been accumulated from various satellites with differing equipment and differing failure rates… I do not share your unquestioning faith in the accuracy of the data, the reliability of the equipment, the scope & timeliness of the coverage… let alone the subsequent processing of the raw data by the usual suspects.

October 24, 2011 10:22 am

I am not trying to downplay this work, but again, any study that does not show the development of maxima and minima TOGETHER with the averages is pretty useless, I think.
http://www.letterdash.com/HenryP/henrys-pool-table-on-global-warming

Rob Potter
October 24, 2011 10:24 am

Dave,
I agree that there is no reliable global thermometer, but there is a set of global records that people use to create an artificial construct called the global temperature and – for some reason – everyone looks at it and thinks it means something.
I was simply pointing out that the supposed disconnect in this article was the comparison of the (artificial) US temp with the (artificial) world temp.
The whole notion of a global temperature (even if you use satellites) is an artificial construct. Heck, the concept of temperature itself is an artificial construct of a way to refer to energy. However, it serves a useful purpose because it is something that can be defined simply and compared over time and space.

DirkH
October 24, 2011 10:27 am

Rob Potter says:
October 24, 2011 at 9:03 am
“A number of people are questioning why there is a post on this site arguing the the temps have not gone up when there is also widespread acceptance of warming during the same time period and I think the problem is that this posting is not talking about “global” temperature, but the US.”
One of the longest running records in Europe is Berlin; nearly no trend over 300 years:
http://notrickszone.com/2010/09/23/own-weather-records-contradict-germanys-weather-service-director/
There are, though, smaller waves, and I think it is pretty clear that we are at the moment at the top of one of these waves; so one can construct 30 year trends that show a warming. The CAGW movement uses this to shout “This time is different” – like the people who believed in ever-increasing house prices. We know how that one ended.

October 24, 2011 10:31 am

Springer
“The strength of the global warming narrative is in the satellite data beginning in 1979. ”
Yes, the great argument is that land temperature data is not too far from the satellite data, some say.
But the largest difference occurred 1950-78, it seems.
K.R. Frank

October 24, 2011 10:32 am

DirkH says:
October 24, 2011 at 2:50 am
But if you do area weighting your result will be hugely biased towards the trends of isolated thermometers with no second data source a thousand miles around,
You MUST use some form of area weighting. This may be difficult to do depending on whether there is location data available. But a poor man’s weighting would be to calculate the average temp for each state [if that location is available], then average the 48 states.
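The "poor man's weighting" suggested here (average within each state first, then average the state means) could look like the following Python sketch; the function name and sample readings are invented for illustration:

```python
from collections import defaultdict

def state_weighted_mean(readings):
    """Average within each state first, then average the state means,
    so that densely instrumented states do not dominate the result.

    `readings` is a list of (state, temperature) pairs.
    """
    by_state = defaultdict(list)
    for state, temp in readings:
        by_state[state].append(temp)
    state_means = [sum(v) / len(v) for v in by_state.values()]
    return sum(state_means) / len(state_means)

# Two Texas readings and one from North Dakota: the naive mean would
# be pulled toward Texas, while the state-weighted mean gives each
# state equal say.
print(state_weighted_mean([("TX", 20.0), ("TX", 22.0), ("ND", 10.0)]))  # 15.5
```

This is only a crude proxy for true area weighting (states differ in size), but it illustrates the idea of equalizing regions before forming the overall mean.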

October 24, 2011 10:39 am

KR says:
October 24, 2011 at 8:47 am
“The lack of area weighting and discarding of 90% of the data, on the other hand, are quite serious issues.”
You persevere in misunderstanding the intention of my post. But, if throwing away 90% of the stations is so terrible: Why does GISS do the same, then? They axed 90% of their stations themselves.
“Area weighting data simply allows you to use the other 90% of the available data.”
Thank you. I already suspected that you were clueless, but now I’m sure of it.
‘“Thanks for playing” – Oh? You consider this a game?’
Yes. For me, it is – my day job is in real science.

Brian
October 24, 2011 10:45 am

Interesting that posts like this keep popping up when Anthony has admitted the earth is warming, just not that humans are causing it. Are you going back on your pledge to “accept whatever result they produce, even if it proves my premise wrong.”?
Has Anthony ever detailed what exactly it would take for him to accept AGW? How high do temperatures have to get? Who has to do the analysis? Because after his reversal on the BEST study, it seems that any study that supports AGW must be wrong for some reason or another.

Ivor Ward
October 24, 2011 10:46 am

Dave Springer says:
October 24, 2011 at 9:49 am
The difficulty is in deciding which point of the cloud base is directly above the horizon. Due to the curvature of the Earth the cloud base continues over the horizon until eventually it appears to meet the horizon. Without the plumb line to indicate exactly where the point of measurement should be you can pick the observed height all the way down to zero. We have corrections for refraction, parallax and dip(height of eye) and vertical sextant angle can be used to determine the distance of an object of known height, or the height of an object of known distance.
(I still use rule of thumb when yachting!)

Louis
October 24, 2011 10:55 am

Has anyone estimated the margin of error associated with calculating global temperature? That could be the elephant in the room. I suspect that the margin of error is greater than the estimated warming of about 1 degree C over the past century. If a margin of error has been estimated, can someone please provide a link to it.
Local temperatures can change several degrees in less than an hour. So, unless all temperature stations around the world are synchronized to record temperature at the exact same time of day, the margin of error could be greater than 1 degree C. Just differences in Daylight Savings Time around the world could play havoc with the data.
Then you have to consider whether the number of data points is sufficient to get an accurate estimate of the world’s temperature. The BEST data claims that a third of temperatures show a decline and two-thirds show an increase. This indicates a large variability from region to region and implies that you need a great number of data points around the world to accurately measure an average temperature. But, instead, the number of data points has been drastically reduced, leaving large regions without any measurements. The large polar regions, which are supposed to be the most affected by warming, have no temperature recordings but are entirely estimated. This too increases the margin of error for any global temperature calculation. Am I the only one who suspects that the margin of error dwarfs the small increase in temperature over the past century?

October 24, 2011 10:57 am

Anthony, et al,
I would love to see you address the following in a post on your site.
An even easier way to demonstrate a long term flat USA trend is to simply insist that the date range begin and end at similar points in the AMO cycle. See the following post:
http://sbvor.blogspot.com/2011/10/amo-as-driver-of-climate-change.html

Gosport Mike
October 24, 2011 11:25 am

Just a couple of points if I may.
1. I believe average Global temperature to be immeasurable and meaningless. Local average temperature variations may have some use but, surely, it is the extremes that matter. After all, a climate which freezes at night and fries during the day would hardly be characterised by its average.
2. Apart from the effrontery of the pseudo science the only thing that really matters about AGW is the suggestion that we should be doing something about it. This has led to vast sums being wasted on Carbon trading, windmills and second rate climate studies – all of which should stop now.

Ken Harvey
October 24, 2011 11:31 am

I am one of those who is sceptical as to whether there has been any increase in average temperature over the last century or so. I am one of those who is sceptical as to whether an average can be arrived at and whether it would have anything other than a conceptual meaning if it could. I am one of those that deny that an average can be calculated using existing resources.
Having regard for the width of the error bars arising from the shortcomings in the metrology, from the hodgepodge availability and distribution of data and the inconsistency of the data sources, the current temperature number is no more than a guess, and I can see no reason to believe that it is correct to within 2 degrees.
How is it that lacking any qualification regarding climate science, I can be so adamant in what I say? It is because I am blessed (or cursed) with being numerate. If the data is dodgy one cannot manipulate it in a way that would resolve to a valid answer.

Steve C
October 24, 2011 11:34 am

Lansner – Thanks for that! – I wasn’t familiar with RUTI, so was assuming that it was *only* rural. But as a mix close to what I was asking for, it certainly looks a darn sight more convincing, as expected. I shall have to come over to hidethedecline and look around. 🙂

Sean
October 24, 2011 11:37 am

malagaview:
While not a new idea, the mixing of max and min temp can be justified on the basis of consistency with practices from when there were no data loggers. You do what you can with the information you have. However, the real ‘no no’ is collating daily readings into monthly figures and calling that monthly raw data. Months are not all the same length. Over much of the recorded period, the months did not even start within the same week in different countries. Working with months, you have infilling and dropping of days within the month, and again of months within years.
Raw means what you saw. That means the max and the min, or the reading and the time of day as written down, with the gaps where there are gaps.

October 24, 2011 11:43 am

Sean,
Quite correct. For example, the December temperature changes look like this.

Interstellar Bill
October 24, 2011 11:43 am

This discards the stations that, in effect,
are the ones reporting level or declining temperatures.
Didn’t I read here at WUWT the same thing about sea-level?
Before they calculate the ‘global’ sea-level,
they throw out all stations showing decline or no rise,
since they ‘know’
that something they call ‘global sea level’
is on the brink of catastrophically rising.
They can’t have those meaningless outliers contaminating their message.

crosspatch
October 24, 2011 11:43 am

What I find interesting is that NOAA’s National Climate Data Center shows significant cooling for the continental US since 1998 using the USHCNv2 data set. The rate of cooling for the most recent 12-month period (October to September) since 1998 is -0.77 degF/decade. That is a significant cooling trend in the US.

More Soylent Green!
October 24, 2011 11:51 am

Brian says:
October 24, 2011 at 10:45 am
Interesting that posts like this keep popping up when Anthony has admitted the earth is warming, just not that humans are causing it. Are you going back on your pledge to “accept whatever result they produce, even if it proves my premise wrong.”?
Has Anthony ever detailed what exactly it would take for him to accept AGW? How high to temperatures have to get? Who has to do the analysis? Because after his reversal on the BEST study, it seems that any study that supports AGW must be wrong for some reason or another.

No matter how high the temperatures get, it’s still not evidence of AGW! Global warming is not AGW! We could set a record high temperature everywhere on the globe from now until the sun burns out and it still wouldn’t be evidence of AGW because global warming does not mean AGW.
Remember these two things
1) Evidence of global warming is not evidence of anthropogenic global warming.
2) Repeat #1 until you get it.

Harry Snape
October 24, 2011 12:15 pm

Roger Knights wrote:
“The GISS dataset includes more than 600 stations within the U.S. …
So no worries about Antarctica or the Equator.”
I’d expect quite a difference in continental US temps between Florida and North Dakota.
“It isn’t the temperature that’s replaced, but the delta (the little triangle is the delta) of the temperature; i.e., the anomaly. (If I’m reading the formula correctly.)”
Replacing a missing figure like the example I gave (Sydney’s 50-year low in October) with an average for October would be quite wrong. If deltas are used, and the average delta is infilled, the error bars should be extended by the maximum variance in the deltas seen historically for that date, and further still if the number of historical records is low.

Brian
October 24, 2011 12:19 pm

@Soylent Green
But once it’s clear the earth is warming (really, it already is), you need to suggest a cause. Either it’s the human emissions of gases that are known to have warming effects, or something else. The possible list of “something else” shrinks as temperatures keep going up. Claiming that it’s a coincidence is pretty hard to swallow.

October 24, 2011 12:29 pm

More Soylent Green! says:
October 24, 2011 at 8:32 am
… It’s no wonder it seems hot out, because we’re no longer used to the normal summer heat.

I’ve thought about that myself quite often. I vaguely remember suffering from the heat back then, but it was a natural part of our life and we just made do. Much the same thing has happened with hygiene. 100 years ago or more, being somewhat dirty all the time was simply a matter of course. Now, however, we’re used to being clean virtually all the time.
Another factor is the “humidex.” While I’d never argue against the merit of having a humidex, it does tend to fool people into thinking it’s hotter now than before. I have, so very, very often, heard people saying things like, “My God, it’s 42 degrees (Celsius). It never reached those temperatures here when I was a kid!” Well, it hasn’t reached those temperatures here now, either, you moron, because that’s the freakin’ humidex!

October 24, 2011 12:35 pm

Henry and Soylent green
guys, please get it in your head. Stop spreading lies.
http://wattsupwiththat.com/2011/10/24/unadjusted-data-of-long-period-stations-in-giss-show-a-virtually-flat-century-scale-trend/#comment-776475
most of the warming is natural (witness the large increases in maxima) and a small % is caused by the increase in vegetation (that traps the extra heat), mostly in the NH
stick with the truth.
http://www.letterdash.com/HenryP/more-carbon-dioxide-is-ok-ok

October 24, 2011 12:42 pm

Brian sez:
“But once it’s clear the earth is warming (really it already is) you need to suggest a cause.”
Warming over what time frame? The only dataset I trust is the UAH satellite data.
But, the UAH data begin around the bottom of an AMO cooling cycle and currently end around the peak of an AMO warming cycle. The next AMO cooling cycle will bottom out somewhere around 2040 to 2045. So, we’ll have to wait at least that long to even begin to have enough data to draw any sort of reasonable conclusions.
Meantime, a century scale flat USA trend is easily demonstrated by simply insisting that the date range begin and end at similar points in the AMO cycle. See the following post:
http://sbvor.blogspot.com/2011/10/amo-as-driver-of-climate-change.html
I am reasonably certain the same would hold true for Greenland and most of Europe. In 2045, once we have credible global data, we’ll begin to have some idea to what extent that holds true for the entire planet.
In the following post, I have cited several examples of peer reviewed science demonstrating the extent to which the roughly 70 year AMO cycle drives global temperature cycles:
http://sbvor.blogspot.com/2010/12/how-amo-killed-cagw-cult.html

October 24, 2011 12:50 pm

Michael Palmer says: Yes. For me, it is – my day job is in real science.

BRAVO! Give this man a cigar 🙂
