Why Automatic Temperature Adjustments Don't Work

The automatic adjustment procedure is almost guaranteed to produce spurious, artificial warming, and here’s why.

Guest essay by Bob Dedekind

Auckland, NZ, June 2014

In a recent comment on Lucia’s blog The Blackboard, Zeke Hausfather had this to say about the NCDC temperature adjustments:

“The reason why station values in the distant past end up getting adjusted is due to a choice by NCDC to assume that current values are the “true” values. Each month, as new station data come in, NCDC runs their pairwise homogenization algorithm which looks for non-climatic breakpoints by comparing each station to its surrounding stations. When these breakpoints are detected, they are removed. If a small step change is detected in a 100-year station record in the year 2006, for example, removing that step change will move all the values for that station prior to 2006 up or down by the amount of the breakpoint removed. As long as new data leads to new breakpoint detection, the past station temperatures will be raised or lowered by the size of the breakpoint.”

In other words, an automatic computer algorithm searches for breakpoints, and then automatically adjusts the whole prior record up or down by the amount of the breakpoint.
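
To make the mechanism concrete, here is a minimal sketch in Python of the rule Zeke describes: once a step change is detected, every value before the breakpoint is shifted by the size of the step, so the most recent data are treated as the "true" values. The series, breakpoint position, and step size below are all invented for illustration; this is not NCDC's actual code.

```python
import numpy as np

def remove_breakpoint(series, break_index, step):
    """Shift every value before break_index by `step`, where `step` is the
    detected jump (later segment minus earlier segment). This anchors the
    whole record to its most recent values, as described above."""
    adjusted = series.copy()
    adjusted[:break_index] += step
    return adjusted

# Hypothetical 100-year annual series with an artificial +0.5 C step
# introduced at index 96 (e.g. the year 2006 in a 1907-2006 record).
rng = np.random.default_rng(0)
temps = 14.0 + 0.3 * rng.standard_normal(100)
temps[96:] += 0.5

# Estimate the step from the data, then "remove" it by adjusting the past.
step = temps[96:].mean() - temps[86:96].mean()
adjusted = remove_breakpoint(temps, break_index=96, step=step)
```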

This is not something new; it’s been around for ages, but something has always troubled me about it. It’s something that should also bother NCDC, but I suspect confirmation bias has prevented them from even looking for errors.

You see, the automatic adjustment procedure is almost guaranteed to produce spurious, artificial warming, and here’s why.

Sheltering

Sheltering occurs at many weather stations around the world. It happens when something (anything) stops or hinders airflow around a recording site. The most common causes are vegetation growth and human-built obstructions, such as buildings. A prime example of this is the Albert Park site in Auckland, New Zealand. Photographs taken in 1905 show a grassy, bare hilltop surrounded by newly-planted flower beds, and at the very top of the hill lies the weather station.

If you take a wander today through Albert Park, you will encounter a completely different vista. The Park itself is covered in large mature trees, and the city of Auckland towers above it on every side. We know from the scientific literature that the wind run measurements here dropped by 50% between 1915 and 1970 (Hessell, 1980). The station history for Albert Park mentions the sheltering problem from 1930 onwards. The site was closed permanently for temperature measurements in 1989.

So what effect does the sheltering have on temperature? According to McAneney et al. (1990), each 1m of shelter growth increases the maximum air temperature by 0.1°C. So for trees 10m high, we can expect a full 1°C increase in maximum air temperature. See Fig 5 from McAneney reproduced below:

[Figure: Fig. 5 from McAneney et al. (1990), showing maximum air temperature increasing with shelter-belt height]

It’s interesting to note that the trees in the McAneney study grow to 10m in only 6 years. For this reason weather stations will periodically have vegetation cleared from around them. An example is Kelburn in Wellington, where cut-backs occurred in 1949, 1959 and 1969. What this means is that some sites (not all) will exhibit a saw-tooth temperature history, where temperatures increase slowly due to shelter growth, then drop suddenly when the vegetation is cleared.

[Figure: idealized saw-tooth temperature record, with slow warming as shelter grows and sudden drops when the vegetation is cleared at years 10 and 20]

So what happens now when the automatic computer algorithm finds the breakpoints at year 10 and 20? It automatically removes them, shifting everything before each breakpoint downward, as follows.

[Figure: the same record after automatic breakpoint adjustments at years 10 and 20, with the earlier segments shifted down, producing an artificial warming trend]

So what have we done? We have introduced a warming trend for this station where none existed.
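
Here is a rough numerical sketch of that saw-tooth case, again in Python and with invented numbers. It is not the actual NCDC pairwise algorithm, just the naive "shift the past by the breakpoint" rule applied by hand: shelter growth adds about 0.1°C per year, clearings at years 10 and 20 knock the temperature back, and the underlying climate is flat.

```python
import numpy as np

# Hypothetical 30-year record: shelter growth adds 0.1 C per year,
# and clearings at years 10 and 20 reset the shelter effect to zero.
# There is no underlying climatic trend in this series.
years = np.arange(30)
raw = 14.0 + 0.1 * (years % 10)        # saw-tooth, resets each decade

def shift_past(series, break_index):
    """Naive breakpoint removal: estimate the jump across break_index
    and shift every earlier value by that amount."""
    step = series[break_index] - series[break_index - 1]
    out = series.copy()
    out[:break_index] += step
    return out

adjusted = shift_past(raw, 10)          # clearing detected at year 10
adjusted = shift_past(adjusted, 20)     # clearing detected at year 20

raw_slope = np.polyfit(years, raw, 1)[0]
adj_slope = np.polyfit(years, adjusted, 1)[0]
print(f"raw trend:      {raw_slope * 10:+.2f} C/decade")  # small (~+0.1)
print(f"adjusted trend: {adj_slope * 10:+.2f} C/decade")  # ~+0.9, all spurious
```

In this toy series the adjusted record ends up warming at roughly the full shelter-growth rate, even though the underlying climate is flat.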

Now, not every station is going to have sheltering problems, but there will be enough of them to introduce a certain amount of warming. The important point is that there is no countering mechanism – there is no process that will produce slow cooling, followed by sudden warming. Therefore the adjustments will always be only one way – towards more warming.

UHI (Urban Heat Island)

The UHI problem is similar (Zhang et al. 2014). A diagram from Hansen et al. (2001) illustrates this quite well.

[Figures: diagrams from Hansen et al. (2001) illustrating urban heat island warming and the effect of a station move away from the city centre]

In this case the station has moved away from the city centre, out towards a more rural setting. Once again, an automatic algorithm will most likely pick up the breakpoint, and perform the adjustment. There is also no countering mechanism that produces a long-term cooling trend. If even a relatively small fraction of stations is affected in this way (say 10%), it will be enough to skew the trend.
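
As a back-of-the-envelope check on that last point (the 10% figure and the 1°C per century of spurious warming are purely illustrative assumptions, not measurements): if a tenth of the stations each pick up a spurious degree per century and the rest are unbiased, a simple network average inherits about a tenth of a degree per century.

```python
# Illustrative arithmetic only: assumed fraction of affected stations
# and assumed spurious trend at those stations.
affected_fraction = 0.10      # say 10% of stations
spurious_trend = 1.0          # C per century of artificial warming at each
network_bias = affected_fraction * spurious_trend
print(f"network-average bias: {network_bias:+.1f} C per century")  # +0.1
```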

References

1. Hansen, J., Ruedy, R., Sato, M., Imhoff, M., Lawrence, W., Easterling, D., Peterson, T. and Karl, T. (2001) A closer look at United States and global surface temperature change. Journal of Geophysical Research, 106, 23947–23963.

2. Hessell, J. W. D. (1980) Apparent trends of mean temperature in New Zealand since 1930. New Zealand Journal of Science, 23, 1–9.

3. McAneney, K.J., Salinger, M.J., Porteus, A.S. and Barber, R.F. (1990) Modification of an orchard climate with increasing shelter-belt height. Agricultural and Forest Meteorology, 49, 177–189.

4. Zhang, L., Ren, G.-Y., Ren, Y.-Y., Zhang, A.-Y., Chu, Z.-Y. and Zhou, Y.-Q. (2014) Effect of data homogenization on estimate of temperature trend: a case of Huairou station in Beijing Municipality. Theoretical and Applied Climatology, 115(3–4), 365–373.

Comments

Bloke down the pub
June 10, 2014 3:40 am

The fact that adjustments always seem, at least, to cool the past and warm the present should have set alarm bells ringing, but there’s none so deaf as those that don’t want to hear.

johnmarshall
June 10, 2014 3:44 am

Very interesting, many thanks.

June 10, 2014 3:58 am

And thus the problem with all the models. GIGO will not be denied.

Rob
June 10, 2014 4:03 am

Changing the entire prior record is a quick, dirty and often erroneous methodology. More accurately, all validated change points should be treated as entirely new and independent stations.

Nick Stokes
June 10, 2014 4:26 am

Here is some detail about the GHCN temperature record in Wellington WMO 93436, which I believe is Kelburn. There weren’t any adjustments in 1949 or 1959, when the trees were cut. Nor is a change clear in 1969, though there was an interruption to the adjustment in the early ’70s.
The main big event was in 1928, when the site moved from Thorndon at sea level to Kelburn at 128 m. The algorithm picked that one.

June 10, 2014 4:27 am

I don’t understand the rationale for the breakpoints and why they would adjust all the station’s past data. Exactly what are they supposedly correcting for? Bad temp data? Bad station location?

June 10, 2014 4:29 am

Climate scientists really aren’t all that bright, are they?

Alex
June 10, 2014 4:41 am

It’s difficult to work out the rationale when some people are not rational.

Admin
June 10, 2014 4:42 am

In my old science class we had a name for data that required adjustments of a similar magnitude to the trend we were attempting to analyse.

Paul Carter
June 10, 2014 4:51 am

Nick Stokes says:
“… Wellington WMO 93436, which I believe is Kelburn. There weren’t any adjustments in 1949 or 1959, when the trees were cut.”
Wellington is very windy – one of the windiest places in NZ – and the Kelburn Stevenson screen is on the brow of a hill which is exposed to strong winds from every angle. The site is visible (about 2 km) from my house and I get much the same winds. With the strength of those winds, the shelter from trees makes less difference to the overall temperature at the site compared with other, less windy, tree-sheltered sites. The biggest impact on temperature at Kelburn is the black asphalt car-park next to the Stevenson screen.

June 10, 2014 4:53 am

We all know that coming up with one “average temperature” for the globe is stupid beyond belief. Your post highlights some of the problems with doing that. But we all also should have known that the government “scientists” will see what they want to see and disregard the rest. Does anyone in the world really think that Hansen was trying to get accurate measurements when he had the past cooled and the present heated up artificially?
The best we can do is use satellites for measurement to try to get some sort of “global temperature” and we will have to wait a long time before that record is long enough to have real meaning. Why is it that the long term stations that have been in rural areas and undisturbed by the heat island effect always seem to show no real 20th century warming outside the normal and natural variation? F’ing luck?

Stephen Richards
June 10, 2014 5:04 am

How many times does it need to be said that the modification of past, pre-calibrated data is unacceptable as part of any scientific activity?

Alex
June 10, 2014 5:04 am

Clearly the man responsible for this is the Marlboro Man (X-Files).

ferdberple
June 10, 2014 5:11 am

Nick Stokes says:
June 10, 2014 at 4:26 am
Here is some detail about the GHCN temperature record
===========
the raw data for this site shows decreasing temperatures over the past 130 years. the adjusted data shows increasing temperatures over the past 130 years.
man-made global warming indeed.
the author has raised a valid point with automated adjustment. as cities and vegetation grow up around a weather station, this will lead to a slow, artificial warming due to sheltering. human intervention to reduce the effects of sheltering will lead to a sudden cooling.
the pairwise homogenization algorithm is biased to recognize sudden events, but fails to recognize slow, long term events. Since sudden events are more likely to be cooling events and slow events are more likely to be warming events (due to human actions) the algorithm over time will induce a warming bias in the signal. thus it can be said that global warming is caused by humans.
the author also correctly identifies that the human subconscious prevents us from recognizing these sorts of errors, because the scientific consensus is that temperatures are warming. thus, the experimenters expect to see warming. any errors that lead to warming are thus not seen as errors, but rather as confirmation.
this article raises a very valid signal processing defect in the pairwise homogenization algorithm.

June 10, 2014 5:15 am

Could anyone post up explicit examples of these types of adjustments in any of the various temperature series?

Nick Stokes
June 10, 2014 5:26 am

ferdberple says: June 10, 2014 at 5:11 am
“the raw data for this site shows decreasing temperatures over the past 130 years. the adjusted data shows increasing temperatures over the past 130 years.”

No, what it shows is mostly steady temperatures up to about 1928, then a big dive, then increasing temperatures since. In 1928 the site moved from Thorndon at 3 m altitude to Kelburn at 128 m. That caused a 0.8°C drop in temperature. The automatic algorithm discovered that and made the correct adjustment. That is why the trend quite properly changed.

ferdberple
June 10, 2014 5:51 am

No computer algorithm can correctly adjust the temperature record based on temperature alone. this is a basic truism of all computer testing. you cannot tell if your “correction” is correct unless you have a “known correct” or “standard” answer to compare against.
to correctly adjust temperatures, you need an additional column of data. something that gives you more information about the temperature, that allows you to determine if an adjustment is valid.
thus the author is correct. the pairwise homogenization algorithm is likely to create errors, because it is more sensitive to errors in the short term than the long term. thus, any bias in the temperature distribution of short and long term errors will guarantee that the pairwise homogenization algorithm will introduce bias in the temperature record.
Unless and until it can be shown that there is no temperature bias in the distribution of short term and long term temperature errors, the use of the pairwise homogenization algorithm is unwarranted. The author’s sheltering argument strongly suggests such a bias exists, and thus any temperature record dependent on the pairwise homogenization algorithm is likely to be biased.

June 10, 2014 5:52 am

Nick Stokes, 5:26: “……… In 1928 the site moved from Thorndon at 3 m altitude to Kelburn at 128 m. That caused a 0.8°C drop in temperature. The automatic algorithm discovered that and made the correct adjustment. That is why the trend quite properly changed.”
‘Properly changed’? Isn’t there an incorrect assumption here that temperatures at 3 m will trend the same as the recorded temperatures at 128 m? 125 m is a big height difference. Or is it me?

Nick Stokes
June 10, 2014 6:04 am

ferdberple says:
“The author’s sheltering argument strongly suggests such a bias exists”

Well, it’s a theoretical argument. But the examples don’t support it. Kelburn does not show adjustment when the trees were cut. And as for Auckland, it’s a composite record between Albert Park and the airport at Mangere, which opened in 1966. I don’t know when the record switched, but there is a break at 1966. Before that there is 100 years of Albert Park, with no adjustment at all except right at the beginning, around 1860.

ferdberple
June 10, 2014 6:04 am

Nick Stokes says:
June 10, 2014 at 5:26 am
No, what it shows
========
The unadjusted data shows temperatures decreasing over 130 years. The adjusted data shows temperatures increasing over 130 years. This is a simple fact.
ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v3/products/stnplots/5/50793436001.gif
you are rationalizing that the corrections have “improved” the data quality. I am arguing that this is unknown based on the temperature data.
your argument that the data is improved is that the station location was changed in 1928. however, that information is not part of the temperature record, which confirms my argument above. you cannot know if the temperature adjustment is valid based on temperature alone. you need to introduce another column of data. in this case station location.
this is the fundamental problem with trying to use the temperature record itself to adjust temperature: it contains insufficient information to validate that the corrections are in fact corrections and not errors.

Alex
June 10, 2014 6:07 am

Nick Stokes says:
June 10, 2014 at 5:26 am
‘The automatic algorithm discovered that and made the correct adjustment. That is why the trend quite properly changed.’
Does that mean that you approve of data change?
To me, raw data is sacrosanct. It may have been gathered ‘incorrectly’, but it should stay the same. It may be considered faulty at some other time, but you don’t change it. You explain why it was faulty or different.
This is not an experiment you can try on different ‘runs’. You only get one shot at getting it right or wrong.

Truthseeker
June 10, 2014 6:11 am

To Nick Stokes,
Try and explain away all of the warming bias that has been introduced into USHCN data that Steve Goddard has uncovered in many posts. For some of the recent analysis, start at this post and go from there.
http://stevengoddard.wordpress.com/2014/06/08/more-data-tampering-forensics/

Peter Azlac
June 10, 2014 6:11 am

Research results from China and India confirm the critical role of wind speed and vapour pressure in changes in surface temperature, whilst answering an apparent paradox: the IPCC claims that increased surface temperature will induce a positive feedback from water vapour, by increasing surface evaporation and leading to higher back radiation from greater low-level cloud formation, yet the measured global decreases in evaporation from Class A pan evaporation units do not support this claim. An example is these data from India, which show the critical influence of soil moisture, hence precipitation, in combination with changes in wind speed that affect the rate of evapo-transpiration.
http://www.tropmet.res.in/~bng/bngpaper/999238-Climatic_Change_2013_Reprint.pdf
Precipitation levels are linked to ocean cycles – ENSO, PDO, AMO etc. – and so we might expect temperature anomaly breakpoints to be affected by them also, especially minimum temperatures. The main effect of shading of the meteorological sites is to reduce evapo-transpiration, hence the cooling effect, whilst lowered precipitation reduces soil moisture and hence ground cover, allowing greater retention of surface radiation that is released at night to increase minimum temperatures. Thus in many, if not most, instances temperature anomalies are a measure of changes in precipitation and wind speed, and not in any significant way of the effects of increases in non-condensing GHGs such as CO2 and methane.

Latitude
June 10, 2014 6:15 am

but there will be enough of them to introduce a certain amount of warming…
Like a fraction of a degree that can’t even be read on a thermometer… that can only be produced by math.
http://suyts.wordpress.com/2014/05/27/how-global-warming-looks-on-your-thermometer/

Bob Dedekind
June 10, 2014 6:15 am

Hi Nick,
Nobody said that the algorithm can’t pick up breakpoints; it’s obvious it would pick up 1928. Also, as Paul mentioned before, Kelburn is less affected than other sites – I just used it because the station history specifically mentioned the cut-back dates.
What you have to do is explain to us all exactly what checks are implemented in the algorithms that PREVENT the artificial adjustments I listed in my post.
My apologies for the slow reply, we have high winds here at the moment and the power went out for a while.
I’ll also be offline for a few hours as it’s past 1AM over here and I could do with some sleep.
Good night all.
