From the “we told you so” department comes this paper out of China, which quantifies many of the very problems with the US and global surface temperature record that we have been discussing for years: the adjustments add more warming than the global warming signal itself.
A paper just published in Theoretical and Applied Climatology finds that the data homogenization techniques commonly used to adjust temperature records for moving stations and the urban heat island effect [UHI] can result in a “significant” exaggeration of warming trends in the homogenized record.
The effect of homogenization is clear and quite pronounced, and what they found in China rests on the same sort of homogenization NOAA applies to the surface temperature record.
According to the authors:
“Our analysis shows that data homogenization for [temperature] stations moved from downtowns to suburbs can lead to a significant overestimate of rising trends of surface air temperature.”
Basically, what they are saying is that the heat-sink effect of all the concrete and asphalt surrounding the station swamps the station’s diurnal variation; when the station is moved away, the true diurnal variation returns, and the homogenization methodology then falsely adjusts the signal in a way that increases the trend.
You can see the heat sink swamping the diurnal signal at the worst stations, Class 5, nearest the urban centers, in the graphs below. Compare urban, semi-urban, and rural Class 5 stations: the effect of the larger UHI heat sink on Tmax and Tmin is evident.
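To make the heat-sink point concrete, here is a minimal synthetic sketch in Python (my own toy numbers, not data from the paper or from the surfacestations survey): heat stored in nearby concrete and asphalt is released at night, which props up Tmin far more than it raises Tmax and so compresses the diurnal range at the urban site.

```python
import numpy as np

# Minimal synthetic illustration (made-up numbers, not the paper's data):
# a heat sink releases stored heat at night, propping up Tmin far more than
# Tmax, so the diurnal temperature range (DTR) is compressed at the urban site.
hours = np.arange(24)
diurnal = 10.0 * np.sin((hours - 9) * np.pi / 12)   # idealized +/-10 C swing

rural = 15.0 + diurnal                               # "true" local climate
heat_sink = np.where((hours < 7) | (hours > 19), 4.0, 1.0)  # assumed nighttime release
urban = rural + heat_sink

for name, series in [("rural", rural), ("urban", urban)]:
    tmax, tmin = series.max(), series.min()
    print(f"{name:6s} Tmax={tmax:5.1f}  Tmin={tmin:5.1f}  DTR={tmax - tmin:4.1f}")
```

In this toy setup the urban Tmin comes up several degrees while Tmax barely moves, which is the same Tmin-heavy signature the paper reports at the downtown site.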
In Zhang et al., the authors study what happens when a station is moved from an urban to a rural environment. An analogy in the USA would be what happened to the signal of those rooftop stations in the center of a city, such as in Columbia, SC, when the station was moved to a more rural setting.
U.S. Weather Bureau Office, Columbia, SC, circa 1915 (courtesy of the NOAA photo library)
Here is the current USHCN station at the University of South Carolina:
The Zhang et al. paper examines the Huairou station in Beijing over 1960 to 2008, including its relocation, and the increase in trend that results once the homogenization adjustments are applied. They find:
The mean annual Tmin and Tmax at Huairou station drop by 1.377°C and 0.271°C respectively after homogenization. The adjustments for Tmin are larger than those for Tmax, especially in winter, and the seasonal differences of the adjustments are generally more obvious for Tmin than for Tmax.
Figures 4 and 5 from the paper are telling for the effect on trend:


Huairou station and reference data for original (dotted lines) and adjusted (solid lines) data series during 1960–2008. The solid straight lines denote linear trends.
Now here is the really interesting part: they propose a mechanism for the increase in trend via the adjustments, and illustrate it.

They conclude:
The larger effects of relocations, homogenization, and urbanization on Tmin data series than on Tmax data series in a larger extent explain the “asymmetry” in daytime and nighttime SAT trends at Huairou station, and the urban effect is also a major contributor to the DTR decline as implied in the “asymmetry” changes of the annual mean Tmin and Tmax for the homogeneity-adjusted data at the station.
In my draft paper of 2012 (now nearing completion, with all of the feedback and criticisms we received addressed; it is a complete rework, thank you), we pointed out how much adjustments, including homogenization, added to the trend of the USHCN network in the USA. This map from the draft paper pretty much says it all: the adjusted-data trend is about twice as warm as the trend of the compliant stations (thermometers with the least impact from siting, UHI, and moves):
The Zhang et al. paper is open access and well worth reading. Let’s hope Peterson, Karl, and Menne at NCDC (whose papers are cited as references in this new paper) read it, for they are quite stubborn in insisting that their methodology solves all the ills of the dodgy surface temperature record, when in fact it creates more unrecognized problems in addition to the ones it solves.
The paper:
Effect of data homogenization on estimate of temperature trend: a case of Huairou station in Beijing Municipality. Theoretical and Applied Climatology, February 2014, Volume 115, Issue 3–4, pp. 365–373.
Lei Zhang, Guo-Yu Ren, Yu-Yu Ren, Ai-Ying Zhang, Zi-Ying Chu, Ya-Qing Zhou
Abstract
Daily minimum temperature (Tmin) and maximum temperature (Tmax) data of Huairou station in Beijing from 1960 to 2008 are examined and adjusted for inhomogeneities by applying the data of two nearby reference stations. Urban effects on the linear trends of the original and adjusted temperature series are estimated and compared. Results show that relocations of station cause obvious discontinuities in the data series, and one of the discontinuities for Tmin are highly significant when the station was moved from downtown to suburb in 1996. The daily Tmin and Tmax data are adjusted for the inhomogeneities. The mean annual Tmin and Tmax at Huairou station drop by 1.377°C and 0.271°C respectively after homogenization. The adjustments for Tmin are larger than those for Tmax, especially in winter, and the seasonal differences of the adjustments are generally more obvious for Tmin than for Tmax. Urban effects on annual mean Tmin and Tmax trends are −0.004°C/10 year and −0.035°C/10 year respectively for the original data, but they increase to 0.388°C/10 year and 0.096°C/10 year respectively for the adjusted data. The increase is more significant for the annual mean Tmin series. Urban contributions to the overall trends of annual mean Tmin and Tmax reach 100% and 28.8% respectively for the adjusted data. Our analysis shows that data homogenization for the stations moved from downtowns to suburbs can lead to a significant overestimate of rising trends of surface air temperature, and this necessitates a careful evaluation and adjustment for urban biases before the data are applied in analyses of local and regional climate change
Download the PDF (531 KB) Open Access
h/t to The Hockey Schtick
=============================================================
UPDATE 1/30/14: Credit where it is due: Steve McIntyre found and graphed the physical response to station moves three years ago, with this comment at Climate Audit.
Here’s another way to think about the effect.
Let’s suppose that you have a station originally in a smallish city which increases in population and that the station moves in two discrete steps to the suburbs. Let’s suppose that there is a real urbanization effect and that the “natural” landscape is uniform. When the station moves to a more remote suburb, there will be a downward step change. E.g. the following:
The Menne algorithm removes the downward steps, but, in terms of estimating “natural” temperature, the unsliced series would be a better index than concatenating the sliced segments.
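Here is a quick numerical sketch of that thought experiment in Python (my own toy numbers and an idealized step-removal stand-in, not McIntyre’s figure and not the actual NOAA pairwise code):

```python
import numpy as np

# Toy version of McIntyre's thought experiment (made-up numbers).
# The "natural" climate is flat; the city adds a slowly growing urban signal;
# two relocations to less built-up sites each knock part of that excess off.
years = np.arange(1960, 2009)
rng = np.random.default_rng(0)
natural = 14.0 + rng.normal(0.0, 0.15, years.size)   # no underlying trend
urban_excess = 0.03 * (years - 1960)                 # assumed UHI growth, C/yr

raw = natural + urban_excess
raw[years >= 1980] -= 0.6                            # first move: step down
raw[years >= 1996] -= 0.8                            # second move: step down

# Removing the downward steps (lifting the later segments back up) splices the
# record into one continuous series, which restores the urban-contaminated
# slope rather than the flat "natural" one.
adjusted = raw.copy()
adjusted[years >= 1980] += 0.6
adjusted[years >= 1996] += 0.8

def trend_per_decade(t, x):
    return 10.0 * np.polyfit(t, x, 1)[0]

for label, series in [("natural", natural), ("raw", raw), ("adjusted", adjusted)]:
    print(f"{label:8s} trend: {trend_per_decade(years, series):+.2f} C/decade")
```

In this sketch the spliced series carries the full urban slope even though the station finished its life at the least urban site, which is exactly the unsliced-versus-sliced point McIntyre makes above.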
Has the journal been closed yet?
That’s beautiful. Looking forward to seeing your paper too.
What it needs is someone to examine the placement of all the temperature stations and then.. oh.. hang on. Never mind.
Does this explain most (or all) of the warming since the 1930s?
Or, are we really cooling since the 1930s, keeping in mind the NOAA adjustments chart: http://www.sustainableoregon.com/data_adjustments.html
Thanks
JK
What we ultimately find is that for USHCN stations from 1979–2008, the adjusted data shows warming >50% faster than the compliant (Class 1/2) stations, and >100% faster for the Rural MMTS subset. (And, yes, that’s after bumping up the MMTS Tmean trend by +0.02 C/decade as per Menne, 2010.)
And homogenization gets a good chunk of the blame.
It has been a somewhat lonely road, and a very, very long one. Over 2000 man-hours by now. I wonder how many peer-reviewed papers can say as much. But I have enjoyed every minute of it.
REPLY: and let’s not forget the thousands of collective man-hours put in by surface-station volunteers – Anthony
Gosh, no! — Evan
Fig 6 is a puzzle. It looks like the adjusted trend is right. The moves caused two artificial drops, reducing the apparent trend, and adjustment restored it.
Now the apparent trend may include UHI. But that is not what homogenization is about. It tries to get the correct trend observed, allowing for artificial changes. It doesn’t attribute the trend.
REPLY: Still in “racehorse” mode, I see. Despite your puzzlement, homogenization is a highly imperfect tool that does little more than blend good data and bad data to make a big sticky unrepresentative mess. Better simply to remove stations that have been compromised than to try to make a temperature milkshake with strawberries and dead fish.
The problem with NOAA is that they think “one size fits all” when it comes to these adjustments, and clearly, as this paper demonstrates, it doesn’t – Anthony
Cue for Mosher drive-by to the effect that BEST proves UHI doesn’t have any measurable effect on the temperature record.
omnologos, it does not look like they violated the publisher’s peer-review policy or appointed editors with no relevant qualifications.
[snip – let’s not start a fight before all the participants are present, shall we? – Anthony]
BEST and their scalpel method should also produce Figure 6-type errors, as they are just evaluating the inflated slope of each piece.
Even without relocation, the same effect is happening and is not corrected for.
What the scientists who still do real science are basically saying is what most people realize intuitively. It is not getting warmer. At best, we are holding our own, and not slipping into another LIA.
Land-based temperature records are simply a farce. If businesses used the same type of homogenization techniques in accounting, they would be incarcerated.
Let alone the exotic extrapolation and smoothing!
@Anthony Watts or Steve Mosher?
“…The analysis result based on the homogeneity-adjusted data in this paper also shows that the significant increase in annual mean Tmin at Huairou station might have been completely explained by urbanization, and the increase in annual mean Tmax might have been partially caused by urbanization,…”
A criterion to separate UHI-affected from non-affected stations may then be whether the difference Tmax − Tmin remained stable or decreased.
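Something like the following rough sketch could implement that screen (a hypothetical helper with a made-up threshold, not anything from the paper):

```python
import numpy as np

# Rough sketch of the screening idea (hypothetical helper, made-up threshold):
# if Tmax - Tmin (the diurnal range) trends clearly downward over the record,
# flag the station as possibly UHI/heat-sink affected.
def dtr_trend_flag(years, tmax, tmin, threshold=-0.1):
    """Return (flag, slope): flag is True if the DTR trend (C/decade) falls below threshold."""
    dtr = np.asarray(tmax) - np.asarray(tmin)
    slope_per_decade = 10.0 * np.polyfit(years, dtr, 1)[0]
    return slope_per_decade < threshold, slope_per_decade

# Example with synthetic annual means: Tmin creeping up faster than Tmax.
years = np.arange(1960, 2009)
tmax = 20.0 + 0.005 * (years - 1960)
tmin = 8.0 + 0.030 * (years - 1960)
flag, slope = dtr_trend_flag(years, tmax, tmin)
print(f"DTR trend {slope:+.2f} C/decade -> UHI-suspect: {flag}")
```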
I would like to see exactly what “homogenized” means as it relates to each temperature measurement and its homogenized result, expressed as a function or algorithm.
REPLY — You only think you would. I’ve barely managed to isolate the bottom line, and my eyes are still bleeding. ~ Evan
Say what you will about Obama, but with a State of the Union address bracketed by two Polar Vortex events, he’s certainly got that Global Warming thing licked.
When they homogenize the data, why does it always wind up pasteurized?
Well, I can tell y’all what the problem is with homogenization. Pops out of the data like a sore thumb. And the procedure is worse than a mere sticky mess that smears the errors around. That was my first impression, but there’s more to it than just that.
Homogenization compares a station with the average of the surrounding stations. If that station’s readings do not conform to its neighbors, it is considered an outlier and is adjusted to conform.
The problem is that they do not account for microsite. Only about one out of five stations is properly sited (vis-a-vis heat sink), and those show lower readings; the result is that roughly 4 out of 5 of the surrounding stations are poorly sited. So the well-sited station is adjudged an outlier and adjusted accordingly.
The net result is worse than a mere smear: they end up adjusting the well-sited stations UPward to match the poorly sited stations rather than adjusting the poorly sited stations DOWNward to match the well-sited stations.
Now, if 4 out of 5 stations were well sited instead of badly sited, homogenization would sort of work. Or if they adjusted the poorly sited stations downward before homogenizing (as they do with the TOBS adjustment). Or even just plain dropped the badly sited stations.
But they don’t, so it ain’t, see?
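A deliberately simplified caricature of that failure mode (my own made-up trend numbers and outlier threshold, not NOAA’s actual pairwise homogenization algorithm):

```python
import numpy as np

# Caricature of the neighbor-comparison step described above (made-up numbers):
# one well-sited station among four heat-sink-contaminated neighbors gets
# "corrected" upward because it is the one that looks like the outlier.
trends = {                 # warming trends, C/decade (assumed)
    "well_sited": 0.15,
    "neighbor_1": 0.32,
    "neighbor_2": 0.29,
    "neighbor_3": 0.35,
    "neighbor_4": 0.31,
}

target = "well_sited"
neighbor_mean = np.mean([v for k, v in trends.items() if k != target])
difference = trends[target] - neighbor_mean

# If the target departs "too far" from its neighbors it is treated as the
# problem and nudged toward them, regardless of which side is actually biased.
if abs(difference) > 0.1:                    # arbitrary outlier threshold
    adjusted = trends[target] - difference   # i.e., set to the neighbor mean
else:
    adjusted = trends[target]

print(f"well-sited trend {trends[target]:.2f} -> adjusted {adjusted:.2f} "
      f"(neighbor mean {neighbor_mean:.2f})")
```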
Besides, why go to all the (large) bother of oversampling in the first place if you are going to mush it all up by homogenizing it?
I may have misread Muller’s comments on the Urban Heat Island effect.
What I thought he was claiming was that the energy that causes UHI is so minuscule that it could raise the temperature of the globe by only a thousandth of a degree C. If so, then he was badly misunderstanding the whole issue related to UHI effects.
I don’t have the source, it was some time ago, and I wasn’t completely sure he meant to be taken that way, so it’s probable that I’m the one who was wrong, not him. But did anyone else read him that way? And do you have the source?
Dear Mr. W, these folks
Have that odd figure 6. Nick Stokes
Has got a point, to my surprise
The trend should be as he implies
It’s odd to be in this agreement
But it is clear to me what he meant
Adjustments fuzz the temps, that’s clear
But Fig 6 is the issue here
That slope is not the common case
The stations aren’t so hot in place
But if that one in Figure 6
Did what they say, then Nick’s point sticks
That data in particular
Shows rises that are secular
And so, without moves in between
The steep slope is what would be seen
Now maybe those moves had a reason
Creating warmth all out of season
But that’s not what the graph depicts
I’m puzzled, still, by Figure 6.
===|==============/ Keith DeHavelle
Noam Chomsky famously said: “Generally speaking, it seems fair to say that the richer the intellectual substance of a field, the less there is a concern for credentials, and the greater is the concern for content.”
An individual who often posts on WUWT whose name starts with P and reminds us all of a popular breakfast treat (not to be cute – but writing it out seems to kick a post into automatic moderation) apparently disagrees with Chomsky, or believes there is no great content to be found in what is posted on this blog, as he is OBSESSED with Credentials. Favorite targets are well-respected commenters like Willis (recently, ad nauseam) and Mosher (above at 5:10), who apparently do not have the proper imprimatur for P. Without a stamped ticket, your actual current WORK counts for nothing in his book.
Making things worse, P hides behind a screen name, or apparently denies any real name from which one could learn of his own credentials.
P. Please stop doing this. We already got your point of view, and it is tiresome. Give it a long, long rest. And anyway, you are wrong.
For what it’s worth, Anthony’s grand discovery, the MICROSITE effect on trend, packs a punch that knocks the UHI effect clean out of the park. That shows up in the trend data, too.
The two effects are not entirely unrelated, but let’s not go conflating our microsite with our mesosite: well-sited urban stations average a lot cooler than poorly sited urban stations.
And, sure, cities show a much greater offset than rural. But we must be ever so careful not to confuse offset and trend.
I never knew that strawberry and dead-fish milkshakes were such a favorite among AGWers!
Well, heck, I believe in AGW.
But you have to bear in mind the Three Principles of Global Warming:
1.) Size Matters.
2.) So does the Motion of the Ocean.
3.) Regarding data “adjustment”: If you shake it more than three times, yer playin’ with it.
evanmjones says:
January 29, 2014 at 6:01 pm
——————————————–
The troubling part of the whole adjustment process is that it does similar magic on historical records as well. There is no justification for manipulating 80-120 year old records.
Just saying, the comments on malice or [incompetence] are both right.
Cheers! From very cold Sarasota, Florida, once again!
What it needs is someone to examine the placement of all the temperature stations and then.. oh.. hang on. Never mind.
#B^D
And, as Anthony says, we’d like to thank all the unfunded volunteers out there who made this joke possible.
The troubling part of the whole adjustment process is that it does similar magic on historical records as well. There is no justification for manipulating 80-120 year old records.
You said it, brother.
It’s my ultimate ambition to get ahold of the entire GHCN raw data and the metadata (TOBS, moves, equipment changes, etc.) and do the Muller job properly (i.e., using the correct microsite metric). If it’s possible, which it may or may not be, depending on whether that information even exists.
But . . .
If the data on moves and TOBS is not sufficient, though, not only can it not be done, but the record we have — both adjusted and raw — is fatally compromised.
Mods, a little help with my shivering spelling of incompetence would be appreciated 😎