Why Automatic Temperature Adjustments Don't Work

The automatic adjustment procedure is almost guaranteed to produce spurious, artificial warming, and here’s why.

Guest essay by Bob Dedekind

Auckland, NZ, June 2014

In a recent comment on Lucia’s blog The Blackboard, Zeke Hausfather had this to say about the NCDC temperature adjustments:

“The reason why station values in the distant past end up getting adjusted is due to a choice by NCDC to assume that current values are the “true” values. Each month, as new station data come in, NCDC runs their pairwise homogenization algorithm which looks for non-climatic breakpoints by comparing each station to its surrounding stations. When these breakpoints are detected, they are removed. If a small step change is detected in a 100-year station record in the year 2006, for example, removing that step change will move all the values for that station prior to 2006 up or down by the amount of the breakpoint removed. As long as new data leads to new breakpoint detection, the past station temperatures will be raised or lowered by the size of the breakpoint.”

In other words, an automatic computer algorithm searches for breakpoints, and then automatically adjusts the whole prior record up or down by the amount of the breakpoint.
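As a rough illustration of what that does to a series (my own sketch, not NCDC's code; the function name and numbers are invented), the whole record before a detected breakpoint gets shifted by the size of the step:

import numpy as np

def remove_breakpoint(temps, break_index, step):
    # Shift every value before the detected break by the step size, so the
    # series is continuous across the (assumed non-climatic) jump.
    adjusted = temps.copy()
    adjusted[:break_index] += step   # the entire earlier record moves up or down
    return adjusted

# Invented example: a 0.3 degC upward step detected at position 80 of a 100-value series
series = np.full(100, 14.0)
series[80:] += 0.3
homogenized = remove_breakpoint(series, break_index=80, step=0.3)
# homogenized is now 14.3 throughout: every earlier value was raised by 0.3

Note the direction: the present is treated as correct, so it is the past that moves.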

This procedure is not new; it has been around for ages, but something has always troubled me about it. It’s something that should also bother NCDC, but I suspect confirmation bias has prevented them from even looking for errors.

You see, the automatic adjustment procedure is almost guaranteed to produce spurious, artificial warming, and here’s why.

Sheltering

Sheltering occurs at many weather stations around the world. It happens when something (anything) stops or hinders airflow around a recording site. The most common causes are vegetation growth and human-built obstructions, such as buildings. A prime example of this is the Albert Park site in Auckland, New Zealand. Photographs taken in 1905 show a grassy, bare hilltop surrounded by newly-planted flower beds, and at the very top of the hill lies the weather station.

If you take a wander today through Albert Park, you will encounter a completely different vista. The Park itself is covered in large mature trees, and the city of Auckland towers above it on every side. We know from the scientific literature that the wind run measurements here dropped by 50% between 1915 and 1970 (Hessell, 1980). The station history for Albert Park mentions the sheltering problem from 1930 onwards. The site was closed permanently for temperature measurements in 1989.

So what effect does the sheltering have on temperature? According to McAneney et al. (1990), each 1m of shelter growth increases the maximum air temperature by 0.1°C. So for trees 10m high, we can expect a full 1°C increase in maximum air temperature. See Fig 5 from McAneney reproduced below:

[Figure: Fig. 5 from McAneney et al. (1990), showing the increase in maximum air temperature with increasing shelter-belt height.]

It’s interesting to note that the trees in the McAneney study grow to 10m in only 6 years. For this reason weather stations will periodically have vegetation cleared from around them. An example is Kelburn in Wellington, where cut-backs occurred in 1949, 1959 and 1969. What this means is that some sites (not all) will exhibit a saw-tooth temperature history, where temperatures increase slowly due to shelter growth, then drop suddenly when the vegetation is cleared.

[Figure: idealised saw-tooth temperature history – slow warming as shelter grows, with sudden drops at years 10 and 20 when the vegetation is cleared.]

So what happens now when the automatic computer algorithm finds the breakpoints at year 10 and 20? It automatically removes them, shifting all the earlier data down, as follows.

[Figure: the same saw-tooth series after automatic breakpoint adjustment – the sudden drops removed, leaving an artificial warming trend.]

So what have we done? We have introduced a warming trend for this station where none existed.
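To put rough numbers on the figures above, here is a toy simulation (my own, with invented values: about 0.1°C per year of shelter warming, cleared every ten years). The raw saw-tooth has essentially no long-term trend; once the two clearance drops are treated as breakpoints and removed, a spurious warming trend close to the shelter-growth rate appears:

import numpy as np

years = np.arange(30)
true_temp = np.full(30, 14.0)            # no real climate change at this site

shelter_bias = 0.1 * (years % 10)        # ~0.1 degC/yr of warming from shelter growth,
measured = true_temp + shelter_bias      # reset to zero by clearance at years 10 and 20

# An automatic routine sees the sudden drops at years 10 and 20 as breakpoints
# and shifts everything earlier down by the size of each drop (about 0.9 degC).
adjusted = measured.copy()
for brk in (10, 20):
    step = adjusted[brk] - adjusted[brk - 1]
    adjusted[:brk] += step

print(np.polyfit(years, measured, 1)[0])   # ~0.01 degC/yr: effectively no trend
print(np.polyfit(years, adjusted, 1)[0])   # ~0.09 degC/yr: spurious warming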

Now, not every station is going to have sheltering problems, but there will be enough of them to introduce a certain amount of warming. The important point is that there is no countering mechanism – there is no process that will produce slow cooling, followed by sudden warming. Therefore the adjustments will always be only one way – towards more warming.

UHI (Urban Heat Island)

The UHI problem is similar (Zhang et al. 2014). A diagram from Hansen et al. (2001) illustrates this quite well.

[Figures: diagrams from Hansen et al. (2001) illustrating urban heat island warming and the effect of a station move from the city centre to a more rural setting.]

In this case the station has moved away from the city centre, out towards a more rural setting. Once again, an automatic algorithm will most likely pick up the breakpoint and perform the adjustment. There is also no countering mechanism that produces a long-term cooling trend. If even relatively few stations are affected in this way (say 10%), that will be enough to skew the trend.
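A back-of-envelope way to see why a minority of affected stations is enough (the percentages here are assumptions for illustration, not measurements): the bias added to a simple network average is just the affected fraction times the spurious per-station trend.

frac_affected = 0.10       # assume 10% of stations carry the problem
spurious_trend = 1.0       # assume ~1 degC/century of artificial warming at each
network_bias = frac_affected * spurious_trend
print(network_bias)        # 0.1 degC/century added to the network-average trend

Even a tenth of a degree per century is a noticeable fraction of the trends being debated.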

References

1. Hansen, J., Ruedy, R., Sato, M., Imhoff, M., Lawrence, W., Easterling, D., Peterson, T. and Karl, T. (2001) A closer look at United States and global surface temperature change. Journal of Geophysical Research, 106, 23,947–23,963.

2. Hessell, J. W. D. (1980) Apparent trends of mean temperature in New Zealand since 1930. New Zealand Journal of Science, 23, 1–9.

3. McAneney, K. J., Salinger, M. J., Porteus, A. S. and Barber, R. F. (1990) Modification of an orchard climate with increasing shelter-belt height. Agricultural and Forest Meteorology, 49, 177–189.

4. Zhang, L., Ren, G.-Y., Ren, Y.-Y., Zhang, A.-Y., Chu, Z.-Y. and Zhou, Y.-Q. (2014) Effect of data homogenization on estimate of temperature trend: a case of Huairou station in Beijing Municipality. Theoretical and Applied Climatology, 115(3–4), 365–373.

Comments
Bloke down the pub

The fact that adjustments always seem, at least, to cool the past and warm the present should have set alarm bells ringing, but there’s none so deaf as those that don’t want to hear.

johnmarshall

Very interesting, many thanks.

And thus the problem with all the models. GIGO will not be denied.

Rob

Changing the entire prior record is a quick, dirty and often erroneous methodology. Instead, all validated change points should be treated as the start of entirely new and independent stations.

Here is some detail about the GHCN temperature record in Wellington WMO 93436, which I believe is Kelburn. There weren’t any adjustments in 1949 or 1959, when the trees were cut. Nor is a change clear in 1969, though there was an interruption to adjustment in the early 1970s.
The main event was in 1928, when the site moved from Thorndon at sea level to Kelburn at 128 m. The algorithm picked that one.

I don’t understand the rationale for the breakpoints and why they would adjust all the station’s
past data. Exactly what are they supposedly correcting for? Bad temp data? Bad station location?

Climate scientists really aren’t all that bright, are they?

Alex

It’s difficult to work out the rationale when some people are not rational.

In my old science class we had a name for data that required adjustments of a similar magnitude to the trend we were attempting to analyse.

Paul Carter

Nick Stokes says:
“… Wellington WMO 93436, which I believe is Kelburn. There weren’t any adjustments in 1949 or 1959, when the trees were cut.”
Wellington is very windy – one of the windiest places in NZ – and the Kelburn Stevenson screen is on the brow of a hill which is exposed to strong winds from every angle. The site is visible (about 2 km away) from my house and I get much the same winds. With the strength of those winds, shelter from trees makes less difference to the overall temperature at the site compared with other, less windy, tree-sheltered sites. The biggest impact on temperature at Kelburn is the black asphalt car park next to the Stevenson screen.

We all know that coming up with one “average temperature” for the globe is stupid beyond belief. Your post highlights some of the problems with doing that. But we all also should have known that the government “scientists” will see what they want to see and disregard the rest. Does anyone in the world really think that Hansen was trying to get accurate measurements when he had the past cooled and the present heated up artificially?
The best we can do is use satellites for measurement to try to get some sort of “global temperature” and we will have to wait a long time before that record is long enough to have real meaning. Why is it that the long term stations that have been in rural areas and undisturbed by the heat island effect always seem to show no real 20th century warming outside the normal and natural variation? F’ing luck?

Stephen Richards

How many times does it need to be said that the modification of past, pre-calibrated data is unacceptable as part of any scientific activity?

Alex

Clearly the man responsible for this is the Marlboro Man (X-Files).

Nick Stokes says:
June 10, 2014 at 4:26 am
Here is some detail about the GCHN temperature record
===========
the raw data for this site shows decreasing temperatures over the past 130 years. the adjusted data shows increasing temperatures over the past 130 years.
man-made global warming indeed.
the author has raised a valid point with automated adjustment. as cities and vegetation grow up around a weather station, this will lead to a slow, artificial warming due to sheltering. human intervention to reduce the effects of sheltering will lead to a sudden cooling.
the pairwise homogenization algorithm is biased to recognize sudden events, but fails to recognize slow, long term events. Since sudden events are more likely to be cooling events and slow events are more likely to be warming events (due to human actions) the algorithm over time will induce a warming bias in the signal. thus it can be said that global warming is caused by humans.
the author also correctly identifies that the human subconscious prevents us from recognizing these sorts of errors, because the scientific consensus is that temperatures are warming. thus, the experimenters expect to see warming. any errors that lead to warming are thus not seen as errors, but rather as confirmation.
this article raises a very valid signal processing defect in the pairwise homogenization algorithm.

Could anyone post up explicit examples of these types of adjustments in any of the various temperature series?

ferdberple says: June 10, 2014 at 5:11 am
“the raw data for this site shows decreasing temperatures over the past 130 years. the adjusted data shows increasing temperatures over the past 130 years.”

No, what it shows is mostly steady temperatures up to about 1928, then a big dive, then increasing temperatures since. In 1928 the site moved from Thorndon at 3 m altitude to Kelburn at 128 m. That caused a 0.8°C drop in temperature. The automatic algorithm discovered that and made the correct adjustment. That is why the trend quite properly changed.

No computer algorithm can correctly adjust the temperature record based on temperature alone. this is a basic truism of all computer testing. you cannot tell if your “correction” is correct unless you have a “known correct” or “standard” answer to compare against.
to correctly adjust temperatures, you need an additional column of data. something that gives you more information about the temperature, that allows you to determine if an adjustment is valid.
thus the author is correct. the pairwise homogenization algorithm is likely to create errors, because it is more sensitive to errors in the short term than the long term. thus, any bias in the temperature distribution of short and long term errors will guarantee that the pairwise homogenization algorithm introduces bias in the temperature record.
Unless and until it can be shown that there is no temperature bias in the distribution of short term and long term temperature errors, the use of the pairwise homogenization algorithm is unwarranted. The author’s sheltering argument strongly suggests such a bias exists, and thus any temperature record dependent on the pairwise homogenization algorithm is likely to be biased.

Nick Stokes, 5.26 : “……… In 1928 the site moved from Thorndon at 3 m altitude to Kelburn at 128 m. That caused a 0.8°C drop in temperature. The automatic algorithm discovered that and made the correct adjustment. That is why the trend quite properly changed.”
‘Properly changed’? Isn’t there an incorrect assumption here that temperatures at 3 m will trend the same as the recorded temperatures at 128 m? 125 m is a big height difference. Or is it me?

ferdberple says:
“The authors sheltering argument strongly suggests such a bias exists”

Well, it’s a theoretical argument. But the examples don’t support it. Kelburn does not show adjustment when the trees were cut. And as for Auckland, it’s a composite record between Albert Park and the airport at Mangere, which opened in 1966. I don’t know when the record switched, but there is a break at 1966. Before that there is 100 years of Albert Park, with no adjustment at all except right at the beginning, around 1860.

Nick Stokes says:
June 10, 2014 at 5:26 am
No, what it shows
========
The unadjusted data shows temperatures decreasing over 130 years. The adjusted data shows temperatures increasing over 130 years. This is a simple fact
ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v3/products/stnplots/5/50793436001.gif
you are rationalizing that the corrections have “improved” the data quality. I am arguing that this is unknown based on the temperature data.
your argument that the data is improved is that the station location was changed in 1928. however, that information is not part of the temperature record, which confirms my argument above. you cannot know if the temperature adjustment is valid based on temperature alone. you need to introduce another column of data, in this case station location.
this is the fundamental problem with trying to use the temperature record itself to adjust temperature: it contains insufficient information to validate that the corrections are in fact corrections and not errors.

Alex

Nick Stokes says:
June 10, 2014 at 5:26 am
‘The automatic algorithm discovered that and made the correct adjustment. That is why the trend quite properly changed.’
Does that mean that you approve of data change?
To me, raw data is sacrosanct. It may have been gathered ‘incorrectly’ but it should stay the same. It may be considered faulty at some other time, but you don’t change it. You explain why it was faulty or different.
This is not an experiment you can try on different ‘runs’. You only get one shot at getting it right or wrong.

Truthseeker

To Nick Stokes,
Try and explain away all of the warming bias that has been introduced into USHCN data, which Steve Goddard has uncovered in many posts. For some of the recent analysis, start at this post and go from there.
http://stevengoddard.wordpress.com/2014/06/08/more-data-tampering-forensics/

Peter Azlac

Research results from China and India confirm the critical role of wind speed and vapour pressure in changes in surface temperature. They also answer an apparent paradox: the IPCC claims that increased surface temperature will induce a positive feedback from water vapour, by increasing surface evaporation and so leading to higher back radiation from greater low-level cloud formation, yet the measured global decreases in evaporation from Class A pan evaporation units do not support this claim. An example is these data from India, which show the critical influence of soil moisture, hence precipitation, in combination with changes in wind speed that affect the rate of evapo-transpiration.
http://www.tropmet.res.in/~bng/bngpaper/999238-Climatic_Change_2013_Reprint.pdf
Precipitation levels are linked to ocean cycles – ENSO, PDO, AMO etc. – and so we might expect temperature anomaly breakpoints to be affected by them also, especially minimum temperatures. The main effect of shading of the meteorological sites is to reduce evapo-transpiration, hence the cooling effect, whilst lowered precipitation reduces soil moisture and hence ground cover, allowing greater retention of surface radiation that is released at night to increase minimum temperatures. Thus in many, if not most, instances temperature anomalies are a measure of changes in precipitation and wind speed, and not in any significant way of the effects of increases in non-condensing GHGs such as CO2 and methane.

Latitude

but there will be enough of them to introduce a certain amount of warming…
Like a fraction of a degree that can’t even be read on a thermometer… that can only be produced by math.
http://suyts.wordpress.com/2014/05/27/how-global-warming-looks-on-your-thermometer/

Bob Dedekind

Hi Nick,
Nobody said that the algorithm can’t pick up breakpoints; it’s obvious it would pick up 1928. Also, as Paul mentioned before, Kelburn is less affected than other sites – I just used it because the station history specifically mentioned the cut-back dates.
What you have to do is explain to us all exactly what checks are implemented in the algorithms that PREVENT the artificial adjustments I listed in my post.
My apologies for the slow reply, we have high winds here at the moment and the power went out for a while.
I’ll also be offline for a few hours as it’s past 1AM over here and I could do with some sleep.
Good night all.

This problem with automated corrections is not specific to temperature data. Think of the human body. a disease that causes a large, sudden change is almost always recognized and eliminated by the immune system. however, a disease that causes a slow change in the body is poorly recognized by the immune system and can be extremely difficult to eliminate.
data errors act in a similar fashion. normally, if you are only interested in the short term you need not worry about slow acting errors. TB and cancer contracted yesterday do not much affect you today. However, when you want to build a temperature record over 130 years it is the slow acting errors that prove fatal to data quality.

Alex says: June 10, 2014 at 6:07 am
“Does that mean that you approve of data change?”

The data hasn’t changed. It’s still there in the GHCN unadjusted file.
People adjust data preparatory to calculating a global index. Wellington is included as representative of the temperature history of its region. Now the region didn’t undergo a 0.8°C change in 1928. They moved the weather station. That isn’t information that should affect the regional or global history.
ferdberple
” which confirms my argument above. you cannot know if the temperature adjustment is valid based on temperature alone”

Well, this one doesn’t confirm it. The computer looked at the temperature record and got it right.
Steve Wood says: June 10, 2014 at 5:52 am
“Isn’t there an incorrect assumption here that temperatures at 3m will trend the same as the recorded temperatures at 128m? 125m is a big height difference. or is it me?”

No, there’s an observed change of about 0.8°C, and that’s when the altitude change happened. They are saying that that isn’t a climate effect, and changing (for computing the index) the Thorndon temps to match what would have been at Kelburn, 0.8°C colder.

NikFromNYC

The elephant in the room is the fake “former skeptic” Richard Muller and his sidekick Steven Mosher with their extreme and highly parameterized example of Steven Goddard worthy black box data slicing and dicing to form a claimed hockey stick, but oddly enough the alarmist version was funded directly by the Koch brothers. Oh, look, it suddenly matches climate models in the last decade, just like the SkepticalScience.com tree house club Frankenstein version does where Cowtan & Way used satellite data to up-adjust the last decade in a way that the satellite data itself falsifies.
“I have no idea how one deals with this– to be candid, McIntyre or Watts in handcuffs is probably the only thing that will slow things down.” – Robert Way in the exposed secret forum of John Cook’s site.

Alex

Ok Nick.
You don’t approve of mangling raw data

ThinkingScientist

Nick stokes says:
“No, what it shows is mostly steady temperatures up to about 1928, then a big dive, then increasing temperatures since. In 1928 the site moved from Thorndon at 3 m altitude to Kelburn at 128 m. That caused a 0.8°C drop in temperature.”
The elevation change was 125 m, which at the typical lapse rate of 0.64 degC/100 m gives a shift for the elevation change of…0.8 degC, as you state.
BUT…
1. The actual shift from unadjusted to adjusted data over the period 1864 to 1927 in the GHCN data for Wellington, NZ is 0.98 degC, NOT 0.8 degC.
2. There are additional and complex adjustments applied after about 1965.
If we simply apply a correction of 0.8 degC to pre-1928 unadjusted data the regression slope (through annual averages) is +0.44 degC/century
If we only apply a correction of 0.98 degC to pre-1928 unadjusted data the regression slope (through annual averages) is +0.65 degC/century
If we use the final GHCN adjusted data the regression slope (through annual averages) is 0.93 degC/century.
So the simple elevation correction is not the whole picture. The final trend is over 2X greater than the trend with just the elevation correction.
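For anyone wanting to reproduce the arithmetic being discussed, here is a small sketch (a flat invented series stands in for the Wellington data, since the GHCN file isn't being pulled here): the first part is the lapse-rate offset for the 125 m move, the second shows how the choice of pre-1928 offset alone changes the fitted trend.

import numpy as np

# Lapse-rate offset for the 1928 Thorndon (3 m) -> Kelburn (128 m) move
lapse_rate = 0.0064                        # degC per metre (0.64 degC / 100 m)
print((128 - 3) * lapse_rate)              # 0.8 degC

# Effect of the chosen pre-1928 offset on the overall trend (synthetic series)
years = np.arange(1864, 1989)
temps = np.full(years.size, 12.5)          # invented flat record with no real trend
temps[years >= 1928] -= 0.8                # the site move makes later readings cooler

for offset in (0.80, 0.98):
    corrected = temps.copy()
    corrected[years < 1928] -= offset      # lower the Thorndon-era values
    slope = np.polyfit(years, corrected, 1)[0] * 100
    print(offset, round(slope, 2))         # 0.8 -> ~0.0; 0.98 -> ~+0.2 degC/century

On this invented flat series the extra 0.18 degC of offset adds roughly 0.2 degC/century to the fitted slope, which is about the same as the gap between the +0.44 and +0.65 figures quoted above; the further increase to +0.93 presumably comes from the additional post-1965 adjustments ThinkingScientist mentions.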

Nick Stokes says:
June 10, 2014 at 6:24 am
The computer looked at the temperature record and get it right
===========
what the computer got right was the date.
however, you needed to introduce a new column of data to determine that. you could not determine even the date from the temperature record alone.
thus, if you need to add another column to validate the data, then the added column (location) is what should be driving your adjustments. not the temperature column.
this is basic data processing. you don’t use the column you are trying to correct to adjust itself, because this introduces new errors. rather you introduce an additional, independent column on which to base your adjustments.

Alex

ThinkingScientist says:
June 10, 2014 at 6:39 am
Perhaps I was premature in my earlier comment to Nick. I didn’t have your information at my fingertips. I have a great interest in AGW stuff but I’m not that ‘anal’ about it. I mean no disrespect with that last comment. Some people go into deeper research about things that I don’t.

The final trend is over 2X greater than the trend with just the elevation correction.
==========
and the algorithm fails to recognize slow changes, such as the growth of vegetation or the growth of cities and farms.
instead the algorithm is only sensitive to rapid changes. thus, it will introduce bias unless the temperature distribution of slow and fast acting changes is identical. something that is highly unlikely to be true worldwide.
thus, the pairwise homogenization algorithm is unsuitable for data quality enhancement of the temperature record.
what is required is an algorithm that is insensitive to the rate of change: it needs to correct changes that take decades with the same accuracy as it corrects changes that take days.
this cannot be done using only the temperature record, if your intent is to determine whether there is a long term trend in the data. what is required is an algorithm based on non-temperature data, such as station location.

Alex

I apologise for the last sentence. Truly disgusting for an alleged English teacher in a university.

Alex

I’m not getting any better. I’m outta here

Tom In Indy

ThinkingScientist says:
June 10, 2014 at 6:39 am

My thoughts as well. Can you also take a look at the post 1928 trend before the adjustment and after the adjustment?
Maybe Zeke can explain where the increase in trend comes from in the post 1928 data. It looks pretty flat in the “QCU” Chart compared to the post 1928 uptrend in the “QCA” Chart at this link –
ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v3/products/stnplots/5/50793436001.gif
.

Gary Pearse

So abrupt changes in temperature are assumed to need adjustment (it is automatic). Are abrupt changes not possible except by foibles of location and operation of equipment? Balderdash. This is how the all-time US high of 1937 got adjusted some tenths of a degree C below 1998 by Hansen and his homog crew.

latecommer2014

While airport temperatures are necessary at airports, no such stations should be allowed in the national system. Example from yesterday: I live in a semi-rural area 20 miles from an airport station surrounded by concrete. Yesterday the reported temp from this site was 106F while my personal station recorded 101F, and a nearby station more embedded in suburbia read 103F. Which do you think was reported as the official temp? Surprise, it’s a new record and the team says “see?!”.

ThinkingScientist

Just to be clear, the trend is for the full series from 1864 to the latest measure, not just pre-1928. Sorry if not clear.

John Slayton

While sheltering by plant growth may be the most obvious and frequent case of gradual biasing, the fact is that any gradual process may change the measured temperatures, and it will not be caught by the adjustment algorithm unless there is a correction that occurs suddenly. So a Stevenson screen that is ill maintained, its whitewash deteriorating to bare wood, will likely show rising temperatures. When Oscar the Observer (or more likely his wife) decides that it needs painting, will the change be large and sudden enough to be caught by the system and invoke an adjustment? One question I’d like to see answered: “Exactly how big and how sudden a discrepancy will trigger an adjustment?”
There are any number of potential gradual changes in station environments. In Baker City, Oregon,
the current station is at the end of a graveled access road running alongside the airport runway. Since runways tend to be laid out parallel to prevailing winds, it is possible that air warmed by transit along the road frequently passes directly into the station. (OK, it is also possible that exactly the opposite occurs; I have never been able to establish which way the wind blows up there. Sigh…)
What is of interest here is an unexpected change in the road. If the gallery were up, I would put a link here to a photo titled “Baker City Gravel Weathering.” What it shows is that the gravel immediately under the surface is much lighter in color than at the very top. There is a surprising amount of weathering that has taken place in the short time since that road was graveled. To what extent the change would affect heating of passing air, to what extent the air would be traveling into the weather station, etc, I don’t know. I think it unlikely in this case that it has much effect. But that’s not my point.
The point is that any number of unexpected changes in the micro-environment of a station can influence the readings, and no general computer algorithm will even catch them, much less correct the record.
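On the “how big” half of that question, a toy signal-to-noise estimate (assumed numbers, not the actual NCDC criterion) gives a feel for the smallest sudden step that could stand out above monthly noise; the “how sudden” half is the catch, because a change spread over decades only contributes whatever it accumulates within the comparison window, not its full size.

import math

sigma = 0.5                     # assumed std dev of the monthly difference series, degC
n = 60                          # assumed months compared on each side of a candidate break
se = sigma * math.sqrt(2.0 / n) # standard error of the difference in window means
print(1.96 * se)                # ~0.18 degC: sudden steps much smaller than this are hard to flag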

Andrew

Wellington is very windy – one of the windiest places in the solar system (FTFY)

ARW

Taking the Wellington example (ferdberple post at 6.04 am): if the original 3 m ASL station location data was simply recorded as ending in 1928 and the new station at 128 m ASL was recorded as a completely new station, then there would be no need to apply an automatic adjustment to the data. They are different station locations. What are the rules (if any) about moving stations and then recombining the data into a single station? What percentage of the long-term stations suffer from this mangling of the data where there was a change in location but not name? How far apart in xyz do they have to be to be considered new stations?
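A minimal sketch of the segment-based treatment being described here (illustrative only; real implementations such as Berkeley Earth’s “scalpel” differ in detail): cut the record at each documented move or clearance and treat each piece as its own station, so nothing in the earlier data is ever rewritten.

def split_station(years, temps, change_years):
    # Return a list of (years, temps) segments, one per homogeneous period,
    # cutting at each documented site change instead of adjusting the data.
    segments, start = [], 0
    for change in sorted(change_years):
        cut = next((i for i, y in enumerate(years) if y >= change), len(years))
        if cut > start:
            segments.append((years[start:cut], temps[start:cut]))
        start = cut
    segments.append((years[start:], temps[start:]))
    return segments

# Hypothetical Wellington-style usage: the Thorndon record ends and a new
# "station" begins with the 1928 move to Kelburn.
# segments = split_station(years, temps, change_years=[1928])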

ThinkingScientist

Tom in Indy says:
“Can you also take a look at the post 1928 trend before the adjustment and after the adjustment?”
Yes, the linear regression trends for the periods 1929 – 1988 (annual averages) are:
Unadjusted GHCN 1929 – 1988 is +0.96 degC / Century
Adjusted GHCN 1929 – 1988 is +1.81 degC / Century

Theodore

The painting of Stevenson screens provides another source of spurious breakpoints that result in adjusting earlier temperatures down, because fresh white paint absorbs less heat than faded wood.

Bob,
For Kelburn, at least, I don’t see any sort of saw-tooth pattern in the data for either Berkeley or NCDC, and the detected breakpoints don’t correspond with your clearing dates.
Berkeley – http://berkeleyearth.lbl.gov/stations/18625
NCDC – ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v3/products/stnplots/5/50793436001.gif
Sawtooth patterns are one of the harder inhomogeneities to deal with, though they can go both ways (e.g. sheltering can also progressively reduce the amount of sunlight hitting the instrument, as in the case of a tree growing over the sensor in Central Park, NY). Most homogenization approaches are structured to find larger breaks (station moves, instrument changes) and not to over-correct smaller ones for exactly this reason.
We are working on putting together methods of testing and benchmarking automated homogenization approaches that will include sawtooth pattern inhomogeneities. You can read more about it here: http://www.geosci-instrum-method-data-syst-discuss.net/4/235/2014/gid-4-235-2014.pdf
As far as UHI goes, the concern that homogenization will not effectively deal with trend biases is a reasonable one. For the U.S., at least, homogenization seems to do a good job at removing urban-rural differences: ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/papers/hausfather-etal2013.pdf

Rob,
I agree that treating breakpoints as the start of a new station record is a better approach. We do that at Berkeley Earth.
.
Gary Pearse,
It’s not just abrupt changes. It’s sustained step changes. If all temperatures after, say, 1928 are on average 0.8 C colder, AND this pattern is not seen at nearby stations, then it will be flagged as a localized bias (in the Wellington case due to a station move to a new location 100 meters higher than the old one).
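For readers wondering what “not seen at nearby stations” means in practice, here is a bare-bones sketch of the neighbour-comparison idea (a toy version with invented data, not the actual pairwise algorithm): subtract a reference built from nearby stations, so shared regional climate largely cancels and a shift unique to the target stands out.

import numpy as np

def difference_series(target, neighbours):
    # Target minus the mean of its neighbours: shared climate signal mostly
    # cancels, while a step change local to the target survives.
    return np.asarray(target) - np.mean(neighbours, axis=0)

rng = np.random.default_rng(0)
years = np.arange(1900, 1960)
regional = 0.005 * (years - 1900) + rng.normal(0, 0.3, years.size)   # shared signal

neighbours = [regional + rng.normal(0, 0.2, years.size) for _ in range(5)]
target = regional + rng.normal(0, 0.2, years.size)
target[years >= 1928] -= 0.8            # a site move felt only by the target station

diff = difference_series(target, neighbours)
print(diff[years < 1928].mean() - diff[years >= 1928].mean())        # ~0.8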

Barrybrill

Paul Carter says shelter is less important at Kelburn because it is an exceptionally windy site. On the contrary, this windiness means the data is particularly susceptible to contamination by vegetation growth.
After cutbacks in 1949, 1959 and 1969, the Met Service request for a further cutback in 1981 was declined by the Wellington City Council. Complaining that the trees were causing routine 20% distortions in wind speed, the Met Service rebuilt its anemometer in a new location. But the thermometer has stayed put while the trees have continued to grow over the last 32 years.
Amusingly, the Wellington daily, The Dominion, reported the Council’s refusal to allow tree trimming as a deliberate attempt to produce warmer reported temperatures. The Council felt that Wellington’s “Windy City” sobriquet was damaging its image!

The other problem comes in as the number of stations is reduced, if we see a loss of “colder” stations. I *believe* (but am not certain) that I read a few years back that as the number of stations in NOAA’s network has declined, the number of higher-latitude and higher-altitude stations has been declining fastest. If that is true, when looking at surrounding stations and trying to grid temperatures, that would be expected to introduce a warm bias as the colder stations have been removed from the process. I suppose some of this could be compensated for by using some of the data from the SNOTEL sites in some, but not all, parts of the country. Does anyone have any more current information on the nature of the stations being removed from the network?

Ashby Manson

This is an interesting explanation for the systemic cooling of past records. An innocent error that makes sense. How does it check against the data and revisions?

pochas

Every adjustment gives you a little wiggle room to favor your own hypothesis. That’s where Global Warming comes from.