Important study on temperature adjustments: 'homogenization…can lead to a significant overestimate of rising trends of surface air temperature.'

From the “we told you so” department comes this paper out of China, which quantifies many of the very problems with the US and global surface temperature record we have been discussing for years: the adjustments add more warming than the global warming signal itself.

A paper just published in Theoretical and Applied Climatology finds that the data homogenization techniques commonly used to adjust temperature records for station moves and the urban heat island effect (UHI) can result in a “significant” exaggeration of warming trends in the homogenized record.

The effect of homogenization is clear and quite pronounced, and the homogenization the authors examine in China is based on how NOAA treats the surface temperature record.

According to the authors:

“Our analysis shows that data homogenization for [temperature] stations moved from downtowns to suburbs can lead to a significant overestimate of rising trends of surface air temperature.”

Basically, what they are saying is that the heat-sink effect of all the concrete and asphalt surrounding the station swamps the station’s diurnal variation; when the station is moved away, the true diurnal variation returns, and the homogenization methodology then falsely adjusts the signal in a way that increases the trend.
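The mechanism can be sketched with a toy example. To be clear, this is a deliberate simplification, not NOAA’s actual pairwise homogenization algorithm, and the trend, UHI ramp, and noise values are invented purely for illustration: a synthetic station accumulates urban heat until a 1996 move to the suburbs, and a naive step-removal adjustment erases the relocation drop while keeping the urban warming in the trend.

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1960, 2009)
t = years - years[0]
n = len(years)

# Invented numbers for illustration only:
true_trend = 0.005                    # "real" background warming, deg C / year
uhi = np.linspace(0.0, 1.0, n)        # urban heat-sink warming building over time
noise = rng.normal(0.0, 0.1, n)       # weather noise

at_city = years < 1996                # station sits downtown until a 1996 move
raw = true_trend * t + np.where(at_city, uhi, 0.0) + noise

# Naive step-removal "homogenization": align the segment means around the
# breakpoint by lifting the post-move (suburban) years up to the urban level.
# The relocation step disappears, but the urban ramp that caused it stays in.
step = raw[at_city][-5:].mean() - raw[~at_city][:5].mean()
adjusted = raw.copy()
adjusted[~at_city] += step

def trend_per_decade(series):
    """Least-squares linear trend in deg C per decade."""
    return 10.0 * np.polyfit(t, series, 1)[0]

print(f"true trend:     {10 * true_trend:.3f} C/decade")
print(f"raw trend:      {trend_per_decade(raw):.3f} C/decade")
print(f"adjusted trend: {trend_per_decade(adjusted):.3f} C/decade")
```

Removing the step makes the series look continuous, but the urban ramp that produced the step is now baked into the adjusted trend, which is precisely the kind of overestimate the paper describes.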

You can see the heat-sink swamping of the diurnal signal in the worst-sited stations (Class 5, nearest urban centers) in the graphs below. Comparing urban, semi-urban, and rural Class 5 stations, the effect of the larger UHI heat sink on Tmax and Tmin is evident.

Watts_etal_fig17

In Zhang et al., they study what happens when a station is moved from an urban to a rural environment. An analogy in the USA would be what happened to the signal of those rooftop stations in the center of the city, such as in Columbia, SC, when the station was moved to a more rural setting.

U.S. Weather Bureau Office, Columbia, SC, circa 1915 (courtesy of the NOAA photo library). Here is the current USHCN station at the University of South Carolina:

The Zhang et al. paper studies the relocation of Huairou station in Beijing over the 1960–2008 record, and the increase in trend that results when homogenization adjustments are applied. They find:

The mean annual Tmin and Tmax at Huairou station drop by 1.377°C and 0.271°C respectively after homogenization. The adjustments for Tmin are larger than those for Tmax, especially in winter, and the seasonal differences of the adjustments are generally more obvious for Tmin than for Tmax.

Figures 4 and 5 from the paper are telling for the effect on trend:

Zhang_et_al_homogenization_china_fig4
Fig. 4 The annual mean Tmax (a) and Tmin (b) of original and adjusted data series at Huairou station and of reference series during 1960–2008. The solid straight lines denote linear trends
Zhang_et_al_homogenization_china_fig5
Fig. 5 The differences of annual mean Tmax (a) and Tmin (b) between
Huairou station and reference data for original (dotted lines) and adjusted (solid lines) data series during 1960–2008. The solid straight lines denote linear trends

Now here is the really interesting part: they propose a mechanism for the increase in trend via the adjustments, and illustrate it.

Zhang_et_al_homogenization_china_fig6
Fig. 6 A sketch of effects of Huairou station relocations on annual mean minimum temperature trends of the adjusted and unadjusted data series

They conclude:

The larger effects of relocations, homogenization, and urbanization on Tmin data series than on Tmax data series to a larger extent explain the “asymmetry” in daytime and nighttime SAT trends at Huairou station, and the urban effect is also a major contributor to the DTR decline as implied in the “asymmetry” changes of the annual mean Tmin and Tmax for the homogeneity-adjusted data at the station.

In my draft paper of 2012 (now nearing completion, with all of the feedback and criticisms we received dealt with, thank you; it is a complete rework), we pointed out how much adjustments, including homogenization, added to the trend of the USHCN network in the USA. This map from the draft paper pretty much says it all: the adjusted data trend is about twice the trend of the stations (compliant thermometers) least affected by siting, UHI, and station moves:

Watts_et_al_2012 Figure20 CONUS Compliant-NonC-NOAA

The Zhang et al. paper is open access and well worth reading. Let’s hope Peterson, Karl, and Menne at NCDC (whose papers are cited as references in this new paper) read it, for they are quite stubborn in insisting that their methodology solves all the ills of the dodgy surface temperature record, when in fact it creates more unrecognized problems in addition to the ones it solves.

The paper:

Effect of data homogenization on estimate of temperature trend: a case of Huairou station in Beijing Municipality. Theoretical and Applied Climatology, February 2014, Volume 115, Issue 3–4, pp. 365–373.

Lei Zhang, Guo-Yu Ren, Yu-Yu Ren, Ai-Ying Zhang, Zi-Ying Chu, Ya-Qing Zhou

Abstract

Daily minimum temperature (Tmin) and maximum temperature (Tmax) data of Huairou station in Beijing from 1960 to 2008 are examined and adjusted for inhomogeneities by applying the data of two nearby reference stations. Urban effects on the linear trends of the original and adjusted temperature series are estimated and compared. Results show that relocations of station cause obvious discontinuities in the data series, and one of the discontinuities for Tmin is highly significant when the station was moved from downtown to suburb in 1996. The daily Tmin and Tmax data are adjusted for the inhomogeneities. The mean annual Tmin and Tmax at Huairou station drop by 1.377°C and 0.271°C respectively after homogenization. The adjustments for Tmin are larger than those for Tmax, especially in winter, and the seasonal differences of the adjustments are generally more obvious for Tmin than for Tmax. Urban effects on annual mean Tmin and Tmax trends are −0.004°C/10 year and −0.035°C/10 year respectively for the original data, but they increase to 0.388°C/10 year and 0.096°C/10 year respectively for the adjusted data. The increase is more significant for the annual mean Tmin series. Urban contributions to the overall trends of annual mean Tmin and Tmax reach 100% and 28.8% respectively for the adjusted data. Our analysis shows that data homogenization for the stations moved from downtowns to suburbs can lead to a significant overestimate of rising trends of surface air temperature, and this necessitates a careful evaluation and adjustment for urban biases before the data are applied in analyses of local and regional climate change.

Download the PDF (531 KB)  Open Access

h/t to The Hockey Schtick

=============================================================

UPDATE 1/30/14: Credit where it is due: Steve McIntyre found and graphed the physical response to station moves three years ago, in this comment at Climate Audit.

Posted Oct 31, 2011 at 3:24 PM | Permalink

Here’s another way to think about the effect.

Let’s suppose that you have a station originally in a smallish city which increases in population and that the station moves in two discrete steps to the suburbs. Let’s suppose that there is a real urbanization effect and that the “natural” landscape is uniform. When the station moves to a more remote suburb, there will be a downward step change. E.g. the following:

The Menne algorithm removes the downward steps, but, in terms of estimating “natural” temperature, the unsliced series would be a better index than concatenating the sliced segments.

194 Comments
RichardLH
January 30, 2014 4:40 am

Gail Combs says:
January 30, 2014 at 3:36 am
“As someone who ran chemical QC labs for years I am in complete agreement.”
As someone who has been ‘modelling the world inside a computer’ for many years……. 🙂

MikeN
January 30, 2014 5:16 am

Is the SurfaceStations Project still accepting additions to the database?

RichardLH
January 30, 2014 5:45 am

Gail Combs says:
January 30, 2014 at 4:29 am
“Agreed as long as it is calibrated.”
Well if it’s not calibrated……
“I would still like to know when the min-max started to be used at weather stations. makes the Tobs adjustment questionable especially if it is across the board adjustment.”
Back in the days when men wore Top Hats daily I believe. And pen and paper to record the results. 🙂
RMS values (what ARE they?) of a voltage derived from an instrument, set up to reflect voltage as a representation of temperature (to keep some here slightly happier?), would allow (Min + Max)/2 only for sine-wave 12-hour days.
All the rest should be some form of half wave sine plus offset and a ‘power factor’ to get it right.
Or a ‘continuously’ sampled output, ADC with anti-alias filter and then averaged down to the period in question.
Hour, Day, Month, Year, Decade.

RichardLH
January 30, 2014 5:50 am

You know, a stepwise integral of voltage/temperature. Now when did I learn that in school? All that graph paper and counting squares……..

MarcK
January 30, 2014 6:01 am

Isn’t measuring temps using urban thermometers like trying to measure the kitchen temp by using a thermometer in a pot of hot water while someone occasionally, randomly, increases the flame under that pot? The thermometer in the pot really only tells you about the pot not the room. Why do we even pay attention to urban thermometers?

rogerknights
January 30, 2014 6:10 am

Gail Combs says:
January 30, 2014 at 3:51 am
rogerknights says: January 29, 2014 at 9:44 pm
Tonight in 20 minutes (10 PM Pacific, 1 AM Eastern), Coast to Coast AM radio will be interviewing:
>>>>>>>>>>>>>>>>>>>>>>>>
Ghost to Ghost? Home of paranormal and UFO ‘Science’?
Not Good. That lumps Skeptics in with the Flat Earthers and other nuts and fringe groups. Lewandowsky and friends must be grinning with glee.

Well, Art Bell, the show’s legendary former host, was the co-author of “The Day after Tomorrow,” so there’s that.
Robert Zimmerman’s views can be linked to independently of C2C at his website:
http://behindtheblack.com/

Reply to  rogerknights
January 30, 2014 12:24 pm

– You mean George Nouri has not been the host since Marconi invented Radio?
You destroyed my faith! 😉

RichardLH
January 30, 2014 6:27 am

MarcK says:
January 30, 2014 at 6:01 am
“Isn’t measuring temps using urban thermometers like trying to measure the kitchen temp by using a thermometer in a pot of hot water while someone occasionally, randomly, increases the flame under that pot? The thermometer in the pot really only tells you about the pot not the room. Why do we even pay attention to urban thermometers?”
Nearly true. The pot does reflect the temperature in the room as well. That’s its ‘heat sink’.
If we track how temps change in all the parts of the system and give them appropriate weights then we do get a correct overall picture.
Like adding the thermometer in the oven to that in the room to that in the outside air.
Yon need them all to get the true ‘average’. But you do also need to get the weighting factors right 🙂

JJ
January 30, 2014 6:42 am

charles the moderator says:
In what field are you preeminent Roger?

Beg pardon, but I think that you have read it incorrectly. Monckton did not claim Roger was preeminent. He said that Roger and the others were "pre-eminent". See:
“To save time, the 19 authors of the 12 papers – all of them pre-eminent in their various fields – …”
I think we should all be able to agree that Monckton is correct, and that all 19 authors (including Roger) are not yet eminent in their various fields. This simple recognition of common ground should allow all of you to stop playing out this post-eminently silly and increasingly tedious game of “my ad verecundiam fallacy can beat up your ad verecundiam fallacy” on every thread. That situation being utterly embarrassing to all involved – testy-peer and toaster-pastry alike…

January 30, 2014 6:54 am

RichardLH:
At January 30, 2014 at 6:27 am you say

But you do also need to get the weighting factors right :-)</blockquote
Adopting the very big and very dubious assumption that everything else you say is true.
How would you know how to get the weighting factors right?
And how could you know you had got them right?
Richard

January 30, 2014 6:56 am

Sorry about the formatting error. Richard

RichardLH
January 30, 2014 7:08 am

richardscourtney says:
January 30, 2014 at 6:54 am
RichardLH:
At January 30, 2014 at 6:27 am you say
“And how could you know you had got them right?”
By doing an RMS (step wise integral) averaged sum of how they change over time associated with a volume calculation?

January 30, 2014 7:16 am

RichardLH:
re your answer to me at January 30, 2014 at 7:08 am.
Ignoring the computing power and time required to do “an RMS (step wise integral) averaged sum of how they change over time associated with a volume calculation” for all the possible combinations and permutations of weightings, what the Dickens would that tell you?
You are not doing a regression where minimising differences from the trend line has a meaning.
Richard

negrum
January 30, 2014 7:20 am

charles the moderator says:
January 30, 2014 at 2:54 am
I hear ya Gail, and I feel somewhat the way you do, but this issue goes to the heart of the ethics and principles that brought many of us into these subjects, the quest for truth and integrity of science, wherever that takes us.
—-l
Without wishing to offend or detract from the remarkable effort put in by poptech in this matter:
The intense focussing on credentials seems to be counter-productive to me. There are enough other transgressions to tackle and I feel that this has become a too personal matter for those involved. It might even be that there is a bit of baiting occurring, since the other lines of attack did not work.
I fully support the exposure of hypocrisy, but I think in this case Willis got it most right (in his usual style:))

Mark Bofill
January 30, 2014 7:23 am

hmm. My popcorn has gone stale. No surprise or outrage from anybody?
Did everybody already know about this except me?

January 30, 2014 7:24 am

Anthony,
The UHI is not caused by any “heat sink.” A heat sink, used in most electrical equipment, conducts heat AWAY from a component which may overheat without it. In other words, the excess heat goes into the “sink” and down the drain, away.
Heat-retaining buildings and pavement are “thermal mass” heated by sunlight, which does not happen to any Earth surface covered by vegetation. Naked rock might behave similarly.
DR! Hear hear! Please confirm. Engineers must know how to take data, and typically treat it with respect, or the data becomes noise. “Climate Scientists” get away with data murder, mostly because most of us engineers are not good at communicating to the public.
Most thermometers are used for weather reports. Weather reports never specify the temperature to a tenth of a degree, so most thermometers are only accurate to +-0.5 degrees, sometimes even less. Reporting the “average temperature” to the tenths, hundredths, or thousandths from these thermometers is simply meaningless, the reporter is claiming information that does not exist.

RichardLH
January 30, 2014 7:30 am

richardscourtney says:
January 30, 2014 at 7:16 am
“what the Dickens would that tell you?”
It would give me a pretty good assessment of how the gas in my cooker was heating the kitchen in which it sat. The external heat sink (outside air) is just as important as the gas flame you know 🙂
Knowing or calculating the volumes would make the temperature conduction/convection sums easier as well.

srvdisciple
January 30, 2014 7:31 am

regarding Anthony’s reply to Nick Stokes at 5:05pm #comment-1554048. Was that a “Bass-O-Matic” reference? If so, 1000+ up votes!

January 30, 2014 7:32 am

This whole mess is why I tried something different.
I went looking for the difference (diff) between how much the temp went up yesterday (rise), and then went down last night(fall) on a station by station basis.
So, take a rural station with a wide signal; day over day is still just the change based on the ratio of day to night (seasons) and weather. Now build a road nearby. Tmin would go up and Tmax would go up, but once it stabilizes the diff stabilizes; once averaged over a year there might be a slight change in diff, but you still end up with a signal containing seasons and weather. Now average the diff between 100 stations: if 10 had roads built this year and 90 didn’t, the more samples you have, the less any individual station’s change impacts the resultant diff. Similarly, as long as each station regularly takes its measurement the same way, TOB is immaterial; whether based on a specific time or actual min/max, the difference between rise and fall (diff) will include only the seasonal and weather signals. Weather is removed by averaging a large number of stations, the seasonal signal by yearly averaging.
I don’t do anything except use actual measurements, no adjustments, no interpolation between stations, what I have is all measurement. It’s definitely not a “Global” average, that was by intention.
And it shows no warming trend (or so small it’s hard to identify). I also include averages on Tmean, Tmin, Tmax, Trise, Tfall, Rel Humidity, station pressure, precipitation, sample counts, station by station counts, station by station average Tmean, and google map station location.
My opinion is that all of the models of surface temps vastly over estimate modern warming.
Mosh thinks I’m an idiot.
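As I read Mi Cro’s description (this is my reconstruction under stated assumptions, not the commenter’s actual code; the function name and sample values are mine), the per-station “diff” reduces algebraically to the day-over-day change in Tmin, which is why a constant station bias cancels out:

```python
import numpy as np

def rise_fall_diff(tmin, tmax):
    """Day-over-day difference between yesterday's rise and last night's fall
    for one station's aligned daily Tmin/Tmax series:

        rise[i] = tmax[i] - tmin[i]       # morning low up to afternoon high
        fall[i] = tmax[i] - tmin[i + 1]   # afternoon high down to next low
        diff[i] = rise[i] - fall[i]       # algebraically tmin[i+1] - tmin[i]
    """
    tmin = np.asarray(tmin, dtype=float)
    tmax = np.asarray(tmax, dtype=float)
    rise = tmax[:-1] - tmin[:-1]
    fall = tmax[:-1] - tmin[1:]
    return rise - fall

# A constant offset added to both Tmin and Tmax (a stable UHI bias, say)
# cancels out of diff entirely: only the day-over-day change survives.
tmin = np.array([10.0, 11.0, 10.5, 12.0])
tmax = np.array([20.0, 21.5, 19.0, 22.0])
d1 = rise_fall_diff(tmin, tmax)
d2 = rise_fall_diff(tmin + 1.5, tmax + 1.5)
print(d1)                    # the day-over-day Tmin changes: 1.0, -0.5, 1.5
print(np.allclose(d1, d2))   # True
```

A slow-building bias (a road constructed over months) would not cancel perfectly in any single station, which is where the averaging over many stations comes in.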

RichardLH
January 30, 2014 7:32 am

P.S. This is an analogy you do realize, not an attempt to suggest that you could do this for Global Temperatures.

RichardLH
January 30, 2014 7:34 am

Mi Cro says:
January 30, 2014 at 7:32 am
“My opinion is that all of the models of surface temps vastly over estimate modern warming.”
The data says they do.
“Mosh thinks I’m an idiot.”
Mosh plays with jpg pictures of the world with the compression setting set to ‘high’. Me, I prefer raw myself 🙂

January 30, 2014 7:35 am

RichardLH:
There is an old adage which one should never forget.
What you don’t know does not cause you the greatest problems. What you think you know that is wrong does.
Richard

RichardLH
January 30, 2014 7:44 am

richardscourtney says:
January 30, 2014 at 7:35 am
“What you don’t know does not cause you the greatest problems. What you think you know that is wrong does.”
Top down analyse, bottom up implement.

January 30, 2014 8:04 am

Mods,
You didn’t like that? All strictly true from the textbooks…

Tom Stone
January 30, 2014 8:34 am

When a climate scientist homogenizes and smooths data, he gets a grant. When an accountant homogenizes or smooths data, he gets sued and goes to jail.

January 30, 2014 8:41 am

I maintain that apart from homogenisation, questions remain about the accuracy of original observer recordings due to their propensity in the old days to round temperatures to the nearest (or lowest) .0F.
At all 112 ACORN locations in Australia, the average .0 rounding proportion prior to 1972 metrication was about 44%. It now hovers around 12% in the Celsius regime, where it should be, although there’s evidence of a 20%+ jump in the late 90s, early 2000s when Automatic Weather Stations were introduced.
To compare annual .0 rounding proportions and annual min and max temperatures at all ACORN stations 1910-2013, have a look at http://www.waclimate.net/truncate/rounding-raw.html
As the page warns, its data is sourced to the BoM’s Climate Data Online so an unknown quantity of daily temps since 1910 have been homogenised by the BoM, different stations have intermittent missing data gaps, and many shifted from the PO to an airport at some stage – so the results aren’t precise.
Nevertheless, conversion of every day since 1910 from the existing celsius records back to fahrenheit, using a .0 definition of .94><.06 within the converted F temps, suggests 43.73% were .0F pre 1972. Either a majority of daily temps were unadjusted by the BoM or they were adjusted to convert to X.0F … unlikely. Most surviving original Fahrenheit records in newspapers, Year Books, etc, suggest 44% is an accurate estimate of .0 rounding pre metric.
Does .0 rounding affect annual temperature averages? If observers rounded equally up and down it would be indiscernible, but I believe the evidence suggests a greater number used to round down more frequently to the nearest degree on their thermometer – a natural tendency, particularly among amateurs in country towns.
The analysis suggests 1972 had the equal warmest annual min ever recorded in Australia, and the La Nina of 1974/76 was as warm nationally as the El Nino of 1968/70. Australian ACORN station .0 rounding of min dropped from 33.54% in 1971 to 20.84% in 1973, with max rounding down from 31.58% to 18.97%.
Too many variables, particularly natural, to pin a temperature influence, but food for thought.
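The back-conversion test described above can be sketched as follows: a minimal illustration assuming the “.94><.06” definition means a converted Fahrenheit value within 0.06 of a whole degree on either side (the function name and sample values are mine, not from the waclimate analysis):

```python
def fahrenheit_rounding_fraction(temps_c, window=0.06):
    """Fraction of Celsius-archived readings that convert back to a
    whole-degree Fahrenheit value, using the .94><.06 window described
    above (i.e. within `window` of a whole degree F on either side)."""
    hits = 0
    for c in temps_c:
        f = c * 9.0 / 5.0 + 32.0
        frac = f % 1.0                    # fractional part, in [0, 1)
        if frac <= window or frac >= 1.0 - window:
            hits += 1
    return hits / len(temps_c)

# Whole-degree F readings archived in Celsius come straight back out as .0F;
# genuinely metric readings mostly do not.  Sample values are made up.
pre_metric = [20.0, 21.1, 23.9, 25.0]    # 68.0, 69.98, 75.02, 77.0 F
post_metric = [20.3, 21.4, 15.7, 23.8]   # 68.54, 70.52, 60.26, 74.84 F
print(fahrenheit_rounding_fraction(pre_metric))    # 1.0
print(fahrenheit_rounding_fraction(post_metric))   # 0.0
```

Note that archives rounded to 0.1°C can shift a true whole-degree F reading by up to 0.09°F, so the choice of window matters for borderline cases.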