New paper blames about half of global warming on weather station data homogenization

From the told-ya-so department comes this recently presented paper at the European Geosciences Union meeting.

Authors Steirou and Koutsoyiannis, after taking homogenization errors into account, find that global warming over the past century was only about half [0.42°C] of that claimed by the IPCC [0.7-0.8°C].

Here’s the part I really like: for 67% of the weather stations examined, questionable adjustments were made to the raw data that resulted in:

“increased positive trends, decreased negative trends, or changed negative trends to positive,” whereas “the expected proportions would be 1/2 (50%).”

And…

“homogenization practices used until today are mainly statistical, not well justified by experiments, and are rarely supported by metadata. It can be argued that they often lead to false results: natural features of hydroclimatic time series are regarded as errors and are adjusted.”

The paper abstract and my helpful visualization of data homogenization follow:

Investigation of methods for hydroclimatic data homogenization

Steirou, E., and D. Koutsoyiannis, Investigation of methods for hydroclimatic data homogenization, European Geosciences Union General Assembly 2012, Geophysical Research Abstracts, Vol. 14, Vienna, 956-1, European Geosciences Union, 2012.

We investigate the methods used for the adjustment of inhomogeneities of temperature time series covering the last 100 years. Based on a systematic study of scientific literature, we classify and evaluate the observed inhomogeneities in historical and modern time series, as well as their adjustment methods. It turns out that these methods are mainly statistical, not well justified by experiments and are rarely supported by metadata. In many of the cases studied the proposed corrections are not even statistically significant.

From the global database GHCN-Monthly Version 2, we examine all stations containing both raw and adjusted data that satisfy certain criteria of continuity and distribution over the globe. In the United States of America, because of the large number of available stations, stations were chosen after a suitable sampling. In total we analyzed 181 stations globally. For these stations we calculated the differences between the adjusted and non-adjusted linear 100-year trends. It was found that in the two thirds of the cases, the homogenization procedure increased the positive or decreased the negative temperature trends.

One of the most common homogenization methods, ‘SNHT for single shifts’, was applied to synthetic time series with selected statistical characteristics, occasionally with offsets. The method was satisfactory when applied to independent data normally distributed, but not in data with long-term persistence.

The above results cast some doubts in the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is between 0.4°C and 0.7°C, where these two values are the estimates derived from raw and adjusted data, respectively.

Conclusions

1. Homogenization is necessary to remove errors introduced in climatic time series.

2. Homogenization practices used until today are mainly statistical, not well justified by experiments and are rarely supported by metadata. It can be argued that they often lead to false results: natural features of hydroclimatic time series are regarded as errors and are adjusted.

3. While homogenization is expected to increase or decrease the existing multiyear trends in equal proportions, the fact is that in 2/3 of the cases the trends increased after homogenization.

4. The above results cast some doubts in the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is smaller than 0.7-0.8°C.

5. A new approach of the homogenization procedure is needed, based on experiments, metadata and better comprehension of the stochastic characteristics of hydroclimatic time series.
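
To make conclusion 3 concrete, here is a minimal sketch (in Python, and not the authors' code) of the trend comparison the abstract describes: fit a 100-year linear trend to each station's raw and adjusted series and count how often adjustment made the trend more positive. Station series are assumed to be already loaded as yearly NumPy arrays; reading GHCN files is left out.

```python
# Minimal sketch of the raw-vs-adjusted trend comparison described above.
# Assumes each station's yearly mean temperatures are already loaded as
# NumPy arrays (one raw and one adjusted series per station); this is an
# illustration, not the authors' actual analysis code.
import numpy as np

def linear_trend(series):
    """Least-squares slope in degrees per year, ignoring missing values."""
    years = np.arange(len(series), dtype=float)
    mask = ~np.isnan(series)
    slope, _ = np.polyfit(years[mask], series[mask], 1)
    return slope

def fraction_trends_increased(raw_series_list, adjusted_series_list):
    """Fraction of stations whose linear trend rose after adjustment."""
    increased = sum(
        linear_trend(adj) > linear_trend(raw)
        for raw, adj in zip(raw_series_list, adjusted_series_list)
    )
    return increased / len(raw_series_list)

# If adjustments were unbiased, this fraction should sit near 0.5;
# the paper reports roughly 2/3 for the 181 stations it examined.
```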

PDF Full text:

h/t to “The Hockey Schtick” and Indur Goklany

UPDATE: The uncredited source of this on the Hockey Schtick was actually Marcel Crok’s blog here: Koutsoyiannis: temperature rise probably smaller than 0.8°C

 =============================================================

Here’s a way to visualize the homogenization process. Think of it like measuring water pollution. Here’s a simple visual table of CRN station quality ratings and what they might look like as water pollution turbidity levels, rated as 1 to 5 from best to worst turbidity:

[Images: five bowls of water illustrating CRN station quality ratings 1 through 5 as increasing turbidity]

In homogenization, the data are weighted against nearby neighbors within a radius. So a station that starts out as a “1” data-wise might end up getting polluted with the data of nearby stations and take on a new value, say a weighted “2.5”. Even single stations can affect many other stations in the GISS and NOAA data homogenization methods carried out on US surface temperature data here and here.

[Image: US map of stations shown as turbidity bowls, with one east coast and one west coast station marked with question marks]

In the map above, if you apply homogenization smoothing, weighting the nearby stations by distance from the stations with question marks, what would you imagine their (turbidity) values to be? And how close would those two values be for the east coast station in question and the west coast station in question? Each would end up closer to a smoothed central average value based on the neighboring stations.
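
For readers who prefer code to bowls, here is a toy sketch of the kind of distance-weighted blending described above. The station coordinates, the radius, and the “turbidity” values are all made up for illustration; this is not the actual GISS or NOAA algorithm.

```python
# Toy sketch of distance-weighted blending in the spirit of the bowls analogy.
# Coordinates, radius and "turbidity" values are hypothetical; this is not
# the actual GISS or NOAA homogenization code.
import math

def blend_with_neighbors(station, others, radius):
    """Inverse-distance-weighted average of a station with its neighbors.

    Each station is a tuple (x, y, value); only neighbors within `radius`
    contribute, and closer neighbors get larger weights.
    """
    sx, sy, svalue = station
    weights, values = [1.0], [svalue]          # the station itself
    for ox, oy, ovalue in others:
        d = math.hypot(ox - sx, oy - sy)
        if 0 < d <= radius:
            weights.append(1.0 / d)
            values.append(ovalue)
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# A pristine "1" station surrounded by a "3" and a "4" drifts to about 2.7:
print(blend_with_neighbors((0, 0, 1.0), [(1, 0, 3.0), (0, 1, 4.0), (9, 9, 2.0)], radius=2))
```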

UPDATE: Steve McIntyre concurs in a new post, writing:

Finally, when reference information from nearby stations was used, artifacts at neighbor stations tend to cause adjustment errors: the “bad neighbor” problem. In this case, after adjustment, climate signals became more similar at nearby stations even when the average bias over the whole network was not reduced.

kim
July 17, 2012 7:51 pm

Global warming, we hardly knew ye.
===========

DR
July 17, 2012 8:18 pm

Bill Illis says:
July 17, 2012 at 5:04 pm
The lower troposphere is supposed to be warming at a faster rate than the surface, particularly in the Tropics where it is supposed to be 27.3% higher according to the climate models, but also extending to mid-latitudes like the US.

Roy Spencer says:
if the satellite warming trends since 1979 are correct, then surface warming during the same time should be significantly less, because moist convection amplifies the warming with height.

DR
July 17, 2012 8:22 pm

Dr. Koutsoyiannis followed the same steps for a previous publication. First, a presentation, then publish the paper.
http://climateaudit.org/?s=Koutsoyiannis

GeoLurking
July 17, 2012 8:39 pm

Chris Wright says:
July 17, 2012 at 4:31 am
“…However, I don’t think it is the result of any organized conspiracy…”
Never attribute to stupidity that which is best explained by a collection of ###****s with an agenda.
Yeah, it’s not the way it actually goes, but with the flip side of that you are giving the agenda pushers an easy win while you sit around waiting for better indications.

July 17, 2012 8:59 pm

It worries me that when I read these posts, I think, “Wow, there really is something dodgy about global warming!”. Then I read the comments, and somewhere there is always some “alarmist” who points out annoying details, and I start to doubt. How come just about every nail in the coffin of AGW seems to be made of jelly?
[REPLY – What it means is that WUWT, unlike nearly all alarmist blogs, does not censor contrary points of view. Science is a very back-and-forth kind of thing. Anyone can be wrong. Anything can be wrong. Consider that. ~ Evan]

David Gould
July 17, 2012 9:09 pm

It would seem to me that if the warming has been that minute then the earth as a system is incredibly sensitive, given the rapid changes in the Arctic and the quite dramatic fall in soil moisture levels globally (http://climexp.knmi.nl/ps2pdf.cgi?file=data/ipdsi_0-360E_-90-90N_n.eps.gz). If the earth is that sensitive, then we are in just as much trouble as if the temperature increase had been large …

July 17, 2012 9:54 pm

In calculating correlation coefficients between pairs of stations, Tmax and Tmin will give rather different correlations over a given time period. Further, different compilers use different ways to calculate Tmean, sometimes by simply averaging Tmax and Tmin. From these 2 sentences alone, one can see scope for error. (Note to BEST – might be an idea to check this if you have not already).
Here is a short-cut look at correlations, calculated by taking a single site and lagging the annual data by 1, 2, 3 etc years. Calculations like this support the above contention.
http://www.geoffstuff.com/Melb-correl.jpg
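
For anyone wanting to reproduce that kind of check, here is a rough sketch (my own, using synthetic data rather than Melbourne's) of correlating a single site's annual series against lagged copies of itself; a persistent series keeps noticeably positive correlation out to several years of lag.

```python
# Rough sketch of a single-site lag-correlation check, using synthetic data.
# A persistent (AR(1)-like) series keeps positive correlation at lags of
# several years, which complicates neighbor-based homogenization tests.
import numpy as np

def lag_correlations(annual, max_lag=5):
    """Pearson correlation of an annual series with lagged copies of itself."""
    annual = np.asarray(annual, dtype=float)
    return {
        lag: np.corrcoef(annual[:-lag], annual[lag:])[0, 1]
        for lag in range(1, max_lag + 1)
    }

rng = np.random.default_rng(0)
x = np.zeros(100)
for t in range(1, 100):
    x[t] = 0.7 * x[t - 1] + rng.normal()   # simple persistent toy series
print(lag_correlations(x))
```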

SteveS
July 17, 2012 10:40 pm

Steven Mosher says:
July 17, 2012 at 12:12 pm
Victor Venema says:
July 17, 2012 at 11:50 am (Edit)
Steven Mosher says:
“Then of course it would make sense to check the report and see how PHA did? Because they are looking at GHCN v2 here, ”
The pairwise homogenization algorithm used by NOAA to homogenize USHCN version 2 is called “USHCN main” in the article. It performed well. It has a very low False Alarm Rate (FAR). As there is always a trade-off between FAR and detection power, the algorithm could probably have been more accurate overall. And the pairwise algorithm has a fixed correction for every month of the year. Inhomogeneities can, however, also have an annual cycle. For example, in the case of a radiation error, the jump will be larger in summer than in winter. With monthly corrections USHCN would have performed better, especially as the size of the annual cycle of the inhomogeneities in the artificial data used in this study was found to be a little too large.
################
Thanks Victor I’m pretty well aware of how PHA did, but thanks for explaining to others. Most wont take time to read the article or consider the results. Those who do take the time to skim the article for a word ( like PHA ) will not find it. But of course if they were current on the literature they would know that PHA is USHCN main. (hehe. told me everything I needed to know)
Interesting Steve……at 12:12 pm today you chastise people for not being abreast of the latest literature in not knowing what PHA main is. An hour earlier at Steve M’s site, you yourself didn’t seem quite so sure what it is. I take “hmmmm” and “pretty sure” to be, well, not so sure.
“Steven Mosher
Posted Jul 17, 2012 at 10:53 PM | Permalink | Reply
Hmm. yes he describes the process of correcting GHCN v3 as SNHT.
hmm. I pretty sure that the PHA algorithm is not SNHT.”
Apparently you yourself were not up to date on this one, until of course you looked it up and came back to look like a genius. :0)

Spence_UK
July 17, 2012 11:27 pm

And only six of the stations in the diagram are north of 60 degrees latitude, where most of the warming is happening.

How does this compare, as a ratio, to HadCRU/GISS then? I’d be surprised if they were much different. Also, why do you think the homogenisation process will suddenly start working beyond 60 degrees N?

nc
July 17, 2012 11:39 pm

Now, with this revelation and the earlier post about the IPCC, does anyone have thoughts on a Climategate 3 release? Keep punching while the consensus is dazed.

John Finn
July 18, 2012 12:41 am

scarletmacaw says:
……..
As brokenyogi pointed out, the study concluded that the warming is between 0.4 and 0.7.
If the warming from adjustment was 100% in error (i.e. a correct adjustment would have resulted in zero additional warming) then the conclusion is that there has been warming since 1980, and there was similar warming from 1900 to 1940, negated by cooling from 1940 to 1980. Just drawing a trend line through 1900-1980 glosses over the up and down nature of the temperature history.

Which means that solar activity has a negligible effect, since solar activity was higher in 1940-1980 than it was in 1900-1940. It also brings into question the UHI effect.
But you’re also grabbing at the straw whereby you seem to be suggesting that temps rose 0.4 – fell by 0.4 – and then rose again by 0.4. Presumably we’ll now fall by 0.4 deg again. Have fun with that particular hypothesis.

Alexej Buergin
July 18, 2012 1:07 am

“mikelorrey says:
July 17, 2012 at 6:27 pm
Soooooo lemme get this right: 25-50% of climate change is due to solar variation, 25-50% is due to ENSO/AMO/PDO/NAO variations, and now 50% is due to homogenization effects. That means 100-150% of warming is now accounted for”
Let’s get it right: if 50% is due to homogenization, climate change is 0.4°C and not 0.8°C. Of this 0.4°C, 25% would be due to solar variation and 25% to ENSO etc. …
That leaves up to 0.2°C for CO2.

Alexej Buergin
July 18, 2012 1:29 am

“Steven Mosher says:
July 17, 2012 at 11:12 am
Situation: We have a station named Mount Molehill. It is located at 3000 meters above sea level. It records nice cool temperatures from 1900 to 1980. Then in 1981 they decide to relocate the station to the base of Mount Molehill, 5 km away. Mount Molehill suddenly becomes much warmer.
But won’t they rename the station? Nope! They may very well keep the station name the same.”
An example for “Mount Molehill” in the real world can be seen here:
http://wattsupwiththat.com/2009/12/06/how-not-to-measure-temperature-part-92-surrounded-by-science/
Note: They did rename the station instead of calling both of them “Wellington”.
But otherwise …

John Doe
July 18, 2012 1:59 am

Steven Mosher says:
July 17, 2012 at 11:12 am
“Now, my friends, how do you handle such a record. a station at 3000 meters is moved to 0 meters and suddenly gets warmer? That’s some raw data folks. Thats some un adjusted data.
anybody want to argue that it should be used that way??”
What you do is presume that with thousands of stations for every one that moved to a lower altitude another somewhere moved to a higher altitude and they cancel out.
These instruments were never designed for this task in any case. They cover only a tiny fraction of the earth’s surface and are not accurate to tenths of a degree and until recently only recorded two instantaneous temperatures per day. You can’t make a silk purse out of a sow’s ear.
Follow the satellite data. It’s only 33 years but it’s the only network capable of establishing a global average temperature. As of now it shows 0.14C/decade warming and falling as the warming was faster in the earlier part of the record.
http://woodfortrees.org/plot/rss/every/mean:12/offset:0.13/plot/rss/every/trend/offset:0.13/plot/uah/every/mean:12/offset:0.23/plot/uah/every/trend/offset:0.23
When we overlay the satellite record on the AMDO we see a reasonable explanation for why it was rising faster in the early part of the record – the satellite measurements happened to begin coincident with the warming side of a 60-year cycle.
http://woodfortrees.org/plot/rss/every/mean:12/offset:0.13/plot/rss/every/trend/offset:0.13/plot/uah/every/mean:12/offset:0.23/plot/uah/every/trend/offset:0.23/plot/esrl-amo/every
Another 5-10 years of satellite temperature following the AMDO on the downside of the cycle, further reducing the current 0.14C/decade, should settle the matter one way or the other. It’s not looking good for the warmists at this point. IPCC AR1 in 1990 predicted 0.30C/decade warming if CO2 emission was not curtailed. It wasn’t curtailed, and less than half the predicted warming actually occurred. That’s game over as far as IPCC consensus “skill” goes. The only thing left to determine is exactly how badly wrong they were and why. The post mortem will be interesting.

Power Engineer
July 18, 2012 3:51 am

“..and the other half is due to the UHI urban heat island”
I believe we should rename this just “HI”, heat island, as it is present in small towns also. My hometown has only 4000 people, yet over the last 60 years most of the large trees have died, the lawns have been paved over to create parking lots, the homes have been converted to offices with extensive air conditioning, and the car traffic has increased 10-fold.
It is not uncommon to notice a 5 deg F temperature decrease as you leave town.
Had this town been a temperature monitoring location it would have shown warming over the last 60 years but little of it would have been due to the climate.
I see the same thing happening all over America…..and Europe.
This is doubly important because some of the studies minimize UHI by showing that small towns and large cities have similar temp increases. I say they are both showing the heat island effect.

July 18, 2012 4:07 am

The problem with shelters that are open at the bottom is actually also “thermal pollution”. The problem with these shelters is that on days with strong insolation and little wind, the soil heats up and the thermal radiation from the ground heats the thermometer. This is very similar to the case of the UHI, where the surface heats the air and then the thermometer.
See the figure below, where two Stevenson screens are compared with one Montsouri screen (right), which is open to the bottom and to the north. Any “skeptic” can build such an old screen and validate that these measurements were indeed biased and thus need to be corrected to obtain trends in the true climate.
cd_uk says: “If you’re correcting for the UHI then one would expect that most would be lowered not raised.”
Exactly, and the fact that the temperature trend is higher after homogenization means that the UHI is less important than all the other inhomogeneities.
cd_uk says: “As for your link to your page on homogenisation. Thanks for that. It only refers to optimisation, many systems of linear combinations (e.g. some type of weighted mean) are derived via optimisation where the process is to minimise the error between the estimated and true value. This would be a type of averaging. But can’t say one way or another given your page. But thanks anyway.”
There are two types of homogenization algorithm: ones that work with pairs of stations, such as USHCN, and ones that use a reference time series computed from multiple surrounding stations. This reference is indeed a weighted average of the surrounding stations. (Some people use kriging weights, which are optimal if the time series do not contain inhomogeneities; it still has to be studied whether they are optimal for homogenization.) However, this reference is not used to replace the station data; rather, a difference time series is computed by subtracting this reference from the candidate station. In this way the regional climate signal is removed and a jump can be detected much more reliably. The jump size found in this difference series is added to the candidate station for homogenization. The homogenized data are thus the original data plus a homogenization adjustment; they are not the averaged signal of the neighbors, and there is no smearing of the error, as Anthony Watts keeps on repeating.
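
To make the distinction concrete, here is a stripped-down sketch of the reference-series approach as Victor describes it. The break detection is a crude largest-mean-shift search standing in for a proper SNHT statistic, and the series are assumed to be annual NumPy arrays; note that the output is the original series plus a step correction, not the neighbor average.

```python
# Stripped-down sketch of reference-series homogenization as described above.
# Detection is a crude largest-mean-shift search (a stand-in for SNHT), and
# input series are assumed to be annual NumPy arrays of equal length.
import numpy as np

def homogenize_with_reference(candidate, neighbors, weights):
    """Remove one detected step from `candidate` using a neighbor reference."""
    reference = np.average(np.asarray(neighbors), axis=0, weights=weights)
    diff = candidate - reference              # regional climate signal removed

    best_k, best_shift = len(diff) // 2, 0.0
    for k in range(5, len(diff) - 5):         # require a few years on each side
        shift = diff[k:].mean() - diff[:k].mean()
        if abs(shift) > abs(best_shift):
            best_k, best_shift = k, shift

    adjusted = candidate.copy()
    adjusted[:best_k] += best_shift           # original data plus an adjustment,
    return adjusted, best_k, best_shift       # not the averaged neighbor signal
```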

vvenema
July 18, 2012 4:16 am

“Steven Mosher
Posted Jul 17, 2012 at 10:53 PM | Permalink | Reply
Hmm. yes he describes the process of correcting GHCN v3 as SNHT.
hmm. I pretty sure that the PHA algorithm is not SNHT.”
The detection algorithm of PHA is SNHT. The standard version of SNHT uses a kriged reference time series; that is, it computes the difference between a reference time series and the candidate and detects the inhomogeneities on this difference time series.
The PHA uses a pairwise comparison; that is, it computes the difference series for pairs formed by the candidate station and each of its surrounding stations, and then applies the SNHT test to detect the breaks in these pairs. Then you know the breaks in the pairs, but you still need to determine in which station the break actually is. If there is a break at a certain date in the difference between A and B and between A and C, but not between B and C, you can attribute the break to station A. With more stations this becomes much more reliable. This attribution part of the PHA is not part of the original, simpler SNHT algorithm.
Thus both statements are okay. You can call PHA a version of SNHT if you focus on the detection part, but you can also see them as too different if you want to emphasize the full algorithm.
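
A toy illustration of that attribution logic, with hypothetical stations A, B and C and the detection step assumed to have been done already:

```python
# Toy illustration of the attribution step: a break seen in the A-B and A-C
# difference series, but not in B-C, is assigned to station A. The detection
# itself is assumed to have been done already (any SNHT-like test would do).
from collections import Counter

def attribute_breaks(break_years_by_pair):
    """Map each break year to the station implicated in the most broken pairs.

    `break_years_by_pair`: {frozenset({station1, station2}): set of break years}.
    """
    votes = {}                                # year -> Counter of station names
    for pair, years in break_years_by_pair.items():
        for year in years:
            votes.setdefault(year, Counter()).update(pair)
    return {year: c.most_common(1)[0][0] for year, c in votes.items()}

pairs = {
    frozenset({"A", "B"}): {1981},
    frozenset({"A", "C"}): {1981},
    frozenset({"B", "C"}): set(),
}
print(attribute_breaks(pairs))                # -> {1981: 'A'}
```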

wayne Job
July 18, 2012 4:33 am

Take one hundred long-term rural stations across America, making due allowance for any UHI effect, graph them individually for trend, and divide by 100. The AGW crowd are all about trends; they would be pleased with the results. Or maybe not. Statistical homogenization using the algorithms of the AGW crowd is somewhat like the company advertising for an accountant.
The recruiting officer asked only one question: what is two plus two? All failed until one applicant answered, “What would you like it to be?” He got the job.

vvenema
July 18, 2012 6:23 am

DR says: “Dr. Koutsoyiannis followed the same steps for a previous publication. First, a presentation, then publish the paper. ”
That is okay; that is what conferences are for, to discuss your preliminary results with colleagues and improve the analysis before you publish. On your own, you are likely to overlook something, especially in a new topic; as far as I know, Koutsoyiannis did not work on homogenization before. That is why it is such a pity that the climate “skeptics” are never at conferences. (Except for people like Roger Pielke, who do not deny climate change, but only say it is more complex, which is always true; real life is always more complex.)
The problem is Anthony acting as if these PowerPoint slides were a finished scientific study, a “peer reviewed paper”, which he fortunately corrected, although the title of the post still claims it was a “paper”. The main problem is the lack of critical thinking here when the results point in the “right” direction. If this study had shown that the trend is actually twice as strong, it would have been heavily criticized.
Mark Harrigan says it beautifully:
http://tamino.wordpress.com/2012/07/18/wheres-the-skepticism/#comment-64174

J Crew
July 18, 2012 6:27 am

I noticed each troll talked down from their lofty position in climate science. But in open scientific debate they were exposed as narrowly opinionated and not truth seekers, still hung up on CO2 as the control knob. Apart from funding, their foundation continues to crumble beneath them before many.

izen
July 18, 2012 6:56 am

This is a classic example of confirmation bias.
Take a small cherry-picked sample of temperature data records, mainly from areas which have shown less warming than the whole globe and which required more correction for time of observation, sensor type and microclimate change than most, and compare the uncorrected trend with the result after correction.
When this limited sample shows an uncorrected trend lower than the global trend from every other data source, including satellite data that does not have any homogenisation correction, or the BEST temperature series the skeptical response would be to doubt the validity of the analysis. Only the dogmatically devoted who avidly embrace any and all suggestions that the observed warming may be smaller than the full diversity of the data indicates would elevate such tendentious and uncertain work to something that calls into question the mainstream record.

cd_uk
July 18, 2012 7:09 am

Victor
Is the point of the article not, as one would expect from the UHI effect, that most station data are adjusted up rather than adjusted down as in the homogenised data? The other point I’d add is that most urbanisation would be gradual and therefore may not be identified by a discrete jump. Furthermore, if your homogenisation uses neighbouring stations suffering from the same process, the effect would be to push the temperatures up.
As for your qualification on temperature homogenisation thanks. I think the result will still be the same – smoothing.
If I’ve got this right:
1) You use interpolated data (for candidate station) to predict the temperature relative to neighbouring stations. This is carried out for each year of the time series.
2) This difference (for each year): diff = observed – interpolated
3) This diff is then added onto the observed to give the homogenised value and thus:
interpolated = diff + observed
which, as you can see, is the same as just assigning the interpolated value to the candidate station, and hence the smoothing.

cd_uk
July 18, 2012 7:14 am

Sorry Victor, that should’ve been “…adjusted down rather than up as in the homogenised data…”

cd_uk
July 18, 2012 7:28 am

izen
I don’t think that is what is going on here. The homogenisation process does appear to be a purely statistical approach that will effectively smooth real/false climatological data indiscriminately. The reason for these adjustments is to remove experimental error, but this should have a 50:50 split and therefore will not affect the final result. This is not what is happening.
As for global records you are correct. The satellite data does corroborate the instrumental record. However, some of the heaviest adjustment (and downward ones) are pre-satellite.
As for BEST, they still have to release their Kriging Variance maps for each year (as far as I know). Without these we don’t know what proportion of their gridded data has uncertainties equal to the variance of the set. If the majority have “variances” on the order of the range seen in the time series they produced, the chronology has no – casually speaking – statistical significance.

scarletmacaw
July 18, 2012 7:43 am

vvenema says:
July 18, 2012 at 4:16 am

Thank you for your detailed explanation.
That method sounds like it would do a very good job of finding discontinuities due to station moves, equipment changes, and microclimate changes. It doesn’t sound like it would solve the problem of a relatively slow increase in UHI, and might end up correcting the few non-UHI stations in the wrong direction.
Air conditioning and a switch to asphalt paving both occurred (at least in the developed world) from the mid-1960s through the 1970s. This would give a significant increase in UHI mainly during that period, and explains why the relatively flat temperature history of GHCN et al. disagrees with the concerns about rapid cooling expressed by some scientists in the late 1970s. The UHI in the temperature record masked the real cooling apparent in the actual weather at the time.