NCDC's new USHCN hockey stick trick

Yesterday, NCDC released Version 2.5 of the USHCN data set. For those who don’t know, this is the U.S. Historical Climatology Network (USHCN), which NCDC considers a “gold standard” for US temperature measurements. The problem is that the old network is fraught with inconsistencies throughout its record: multiple station moves, equipment changes, time-of-observation changes, encroachment by urbanization, and of course faulty station siting. I discovered that only 1 in 10 USHCN stations meet the criteria of NOAA’s 100 foot rule, a finding backed up by an investigation done by the U.S. Government Accountability Office (GAO). As a result of the most recent research, we found significant positive biases in the raw data:

Of course the line from NCDC is always that “none of this matters” and that such things can be solved by adjustments. After seeing the differences between the USHCN 2.0 and USHCN 2.5 data sets, I ask: in what temperature measurement universe does a hockey stick like this occur? See below.

Graph from Steve Goddard, source here.

How can adjustments that affect only one decade like this be justified? The answer is that they can’t.

Here’s what NCDC said about it yesterday in their monthly State of the Climate report:

===========================================================

USHCN Version 2.5 Transition

Since 1987, NCDC has used observations from the U.S. Historical Climatology Network (USHCN) to quantify national- and regional-scale temperature changes in the conterminous United States (CONUS). To that end, USHCN temperature records have been “corrected” to account for various historical changes in station location, instrumentation, and observing practice. The USHCN is a designated subset of the NOAA Cooperative Observer Program (COOP) Network. USHCN sites were selected according to their spatial coverage, record length, data completeness, and historical stability. The USHCN, therefore, consists primarily of long-term COOP stations whose temperature records have been adjusted for systematic, non-climatic changes that bias temperature trends.

Did you know — the National Climatic Data Center periodically improves the quality of the datasets maintained at the center and releases updated versions. Beginning with the September 2012 processing, NCDC will use USHCN version 2.5 for national temperature calculations as well as in other products, including Climate at a Glance and the Climate Extremes Index.

For additional information on the improvements made to USHCN version 2.5, please see http://www.ncdc.noaa.gov/oa/climate/research/ushcn/.

============================================================

The page at http://www.ncdc.noaa.gov/oa/climate/research/ushcn/ describes some of the procedures, but makes heavy use of references to published papers rather than providing an operational flowchart. This is an impediment to replication, and I suspect that, given the mishmash of references they provided, few if any would be able to replicate the process fully.
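Even without the adjustment code itself, the version-to-version effect is checkable from the published output: take the CONUS annual means as produced under version 2.0 and under version 2.5 and difference them year by year. A minimal sketch, assuming two hypothetical two-column (year, temperature) CSV files rather than the actual USHCN distribution format:

```python
# Sketch of a version-to-version comparison of CONUS annual means.
# The file names and two-column (year, temp °F) CSV layout are assumptions
# for illustration, not the actual USHCN file format.
import csv

def load_annual_means(path):
    """Read a two-column CSV of (year, annual mean temperature in °F)."""
    with open(path, newline="") as f:
        return {int(year): float(temp) for year, temp in csv.reader(f)}

def adjustment_delta(old, new):
    """Per-year difference (new - old) for the years both versions share."""
    return {yr: round(new[yr] - old[yr], 3)
            for yr in sorted(old.keys() & new.keys())}

# Hypothetical usage:
# v20 = load_annual_means("ushcn_v2.0_conus_annual.csv")
# v25 = load_annual_means("ushcn_v2.5_conus_annual.csv")
# delta = adjustment_delta(v20, v25)
```

The interesting result is the shape of that delta series: adjustments justified by gradual, accumulating station changes should produce a gradual delta, not one concentrated in a single decade.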

Steve Goddard did an additional analysis and showed how the temperature changes over the different phases of the USHCN data:

Also in the monthly report from NCDC is this statement:

The average contiguous U.S. temperature during September was 67.0°F, 1.4°F above the 20th century average, tying September 1980 as the 23rd warmest such month on record. September 2012 marks the 16th consecutive month with above-average temperatures for the Lower 48.

According to the “platinum standard” state-of-the-art US Climate Reference Network (USCRN), which has none of the problems and requires none of the adjustments of the problem-plagued and aging USHCN network, the CONUS monthly average was 66.0°F.

That puts the temperature difference (above normal) at 0.4°F, within the bounds of standard deviation. Ever wonder why NCDC never mentions the new state-of-the-art USCRN data in its State of the Climate press releases, but prefers to rely on the old network and its wonky hockey-stick-like adjustments? This is why.
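The 0.4°F figure follows directly from NCDC’s own numbers: if 67.0°F is 1.4°F above the 20th-century average, the implied baseline is 65.6°F, and the USCRN reading of 66.0°F sits only 0.4°F above it. A quick arithmetic check:

```python
# Anomaly arithmetic behind the 0.4 °F figure. The 20th-century baseline
# is inferred from the numbers quoted above, not taken from NCDC directly.
ushcn_sept = 67.0       # °F, NCDC's USHCN-based CONUS average, Sept 2012
ushcn_anomaly = 1.4     # °F above the 20th-century average, per NCDC
baseline = ushcn_sept - ushcn_anomaly   # implied baseline

uscrn_sept = 66.0       # °F, USCRN CONUS average for the same month
uscrn_anomaly = uscrn_sept - baseline   # how far USCRN sits above "normal"

print(round(baseline, 1), round(uscrn_anomaly, 1))  # 65.6 0.4
```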

I’ll have more on the official announcement about the map above soon.

john robertson
October 13, 2012 9:39 am

Is there a difference between desperation and amplifying the vertical scale until a signal appears to appear? Seems your govt is full of people who also need to hear those magic words… “Get a real job”.

cui bono
October 13, 2012 9:40 am

Sorry Americans. If the USA is warming at 0.3C per decade, you’re on your own. Those of us in the rest of the world will just carry on with mostly idling temperatures, as usual.

Bloke down the pub
October 13, 2012 9:45 am

They are heading down a dead end street. It is only possible to keep adding adjustments for so long before they become blatantly obvious to even the most blinkered disinterested observer. Seeing that they’re paid out of the government purse for their work, the looming prospect of criminal fraud charges must be worrying for some of them, but who will be the first to turn State’s evidence?

October 13, 2012 9:47 am

[SNIP – Mr. Barney, I’ve told you in the past that your vicious comparisons are not welcome here. Even though I’m not a fan of what NCDC does, I’m even less a fan of your categorizations and parallels which you attempt to ascribe Marxist and other connections. Get off my blog – Anthony]

pat
October 13, 2012 9:48 am

This is from the Joe Biden school of Warmist Alarmism. Where deranged behavior passes for debate.
Does anyone remember this wild temperature extreme? Why, no. Is it reflected in atmospheric satellite temperature readings? Why, no. Near-surface ocean satellite readings? No again. Any readings in the Southern Hemisphere swing upward like this? No again. Siberian inland readings? No. Alaska inland? No. Greenland Northern? No.

DirkH
October 13, 2012 9:55 am

Defund them all.
Borrowing Chinese money to pay for that crap is idiotic.

Eric Barnes
October 13, 2012 10:09 am

Thanks Anthony! 🙂

Jimbo
October 13, 2012 10:21 am

They are now simply brazen about it, knowing full well that the media are comatose.

Frank Kotler
October 13, 2012 10:57 am

“Teacher, may I have my lab report back so I can improve my observations?”

anthony holmes
October 13, 2012 11:10 am

I’m afraid that the USA – as the only warming country in the world – is just asking to be overrun by European immigrants fleeing from our crop failures and cold. Thanks for letting us know, guys!

John Shade
October 13, 2012 11:20 am

If and when there is a sufficiently calm, rational, and neutral body with sufficient power, recognition, resources and competence, we might hope for a deep ‘official’ audit of adjustments within global and national temperature data sets and algorithms. In the meantime, we can but welcome posts such as these which highlight curious results.

Kelvin Vaughan
October 13, 2012 11:23 am

Frank Kotler says:
October 13, 2012 at 10:57 am
“Teacher, may I have my lab report back so I can improve my observations?”
Yes because the more of my pupils who get good marks the better the teacher I will look!

Editor
October 13, 2012 11:40 am

Meanwhile GHCN and GISS are still tampering with temperatures in Iceland.
http://notalotofpeopleknowthat.wordpress.com/2012/10/12/ghcn-adjustments-in-iceland/

Chuck L
October 13, 2012 11:42 am

I find this profoundly disturbing, since there seems to be no recourse and we appear to be helpless to prevent the perversion of temperature records to further the global warming narrative.

Fred from Canuckustan.
October 13, 2012 11:48 am

“NCDC considers a “gold standard” for US temperature measurements”
Still flogging their pile of pyrite as the real stuff eh?
Fools and their gold are soon parted.

Resourceguy
October 13, 2012 11:56 am

New improved data bias, get it while it’s hot!

wayne
October 13, 2012 11:58 am

Dishonest is not a strong enough word for these shenanigans. Write your representatives and senators on this matter if you deem them honest; most know what to do, and defunding sticks to civility. They can categorize the departments and lobbyists as they appear with hands held out.
I hate to be suspicious, and I haven’t traced this instance down yet to see if it has occurred, but I just bet they gave TX & OK a big upward boost in the latter years, since we have mentioned the flatness more than a few times here on WUWT. That is what I am watching the closest: the adjustments to data cued by simple contrary mentions by commenters here. Has anyone else noticed this tendency? It’s kind of like: shut up or we will make the ‘data’ always prove you a fool (but the old data snapshots tell a different story, like the second graph above).

October 13, 2012 12:21 pm

Just have a look at http://lwf.ncdc.noaa.gov/cmb-faq/anomalies.php or http://www.metoffice.gov.uk/hadobs/hadcrut3/diagnostics/global/nh+sh/
both databases show a different world.
Look at http://www.drroyspencer.com/2012/10/uah-v5-5-global-temp-update-for-sept-2012-0-34-deg-c/ for another take on the same world, both slowly cooling.
It is definitely the cooling that we have to prepare to live with, but also the warming, should it happen.
A solution is more, cheaper energy; it can deal with both scenarios. I think.

October 13, 2012 12:30 pm

Probably a lot like the Bureau of Labor Statistics when it was discovered that California’s jobless claims numbers weren’t in their latest report. They “didn’t leave it out”; California didn’t submit it in time.
So, er… what is the difference?

Anything is possible
October 13, 2012 12:31 pm

Maximum temperatures across eastern Colorado yesterday averaged about 70F……
…Apart from one apparent “freak heat flash” that NCDC measured in Kit Carson :
http://www.ncdc.noaa.gov/extremes/records/monthly/maxt/2012/10/12?sts%5B%5D=US#records_look_up
This sort of thing does not inspire confidence. How many more of their maximum temperature “records” are as dubious-looking as this one?

October 13, 2012 12:40 pm

And here I thought time machines only existed in science fiction stories. How else could they tell what a thermometer really said in the past? Maybe H. G. Wells works for NOAA?

J Martin
October 13, 2012 12:58 pm

Fraudsters. Sadly not stupid enough to alter today’s temperatures, notice that the pivot point is today’s temperature.
One wonders what the future holds for these prostitutes of science ?
If they succeed in panicking a compliant puppet government (Dimocrats) into destroying the US economy (also here in the UK, OZ, Germany) leaving the population unprepared for a possible mini ice age, the US people may then appear less than grateful.
It would appear however, that they have prepared for just such an outcome;
http://wattsupwiththat.com/2012/08/14/wtf-national-weather-service-buying-hollow-point-bullets/
A strategy which may not prove too effective if the US ever find any judges who are not entirely gutless gullible incompetent dimwits, like the three that managed to declare co2 a pollutant.
There are better exit strategies than such a gung-ho amateurish response, time will tell if they can muster the professionalism and necessary sense of judgement to formulate and execute a more sophisticated exit strategy.
Temperatures are currently plateaued for a year or three at the top of the solar cycle, after that down is the only possible direction, a few years later the s89t should start to hit the fan.

Jit
October 13, 2012 1:02 pm

May I propose renaming the “100 foot rule” as the “centipede rule”, if no-one else has.
I hear it’s snowing in Australia…

October 13, 2012 1:25 pm

I have a question. These “adjustments” have obviously been made to past records. Are they only to the final result or have they been made to the individual readings to obtain the final result?
As I’ve mentioned before, I have the records for where I live from 2007, 2009 and 2012 and they don’t match up. Have they messed with the individual records to produce yet another “hockey stick” or have they mainly messed with the final result? A combination of the two?
Again, this is a question. Do we know what the method is?

Berényi Péter
October 13, 2012 1:28 pm

You mean the actual code doing those adjustments (with full documentation) is not published?
As code development is paid for by (American) taxpayers’ money, it can’t possibly be protected by copyright, can it? Nor do I think it was classified. But even if it was, there can be no justification for that move whatsoever, so it can be challenged in court.
On the other hand, if none of the above applies, the code base, full docs included, is subject to FOIA.
That’s what you should go for. However, be wary and never let them get away with giving out low-quality scanned images of a printout, as they are inclined to do just that. Go for a full digital copy instead, for no one wants to bother with a tedious OCR job.
Once we have it in the open (let’s say in a public repo on GitHub), it is only a matter of time & effort to uncover the full diverse set of hidden tricks that make the adjusted dataset trend up faster than even its worst subset does. The job is definitely doable, and I am quite confident there are plenty of qualified experts on the net willing to donate their time to such an extensive code audit.
A professional report to be published online following such an effort, including all drafts in a timely manner, with proper version control in place, would be a bomb. Especially its executive summary for policy makers and the press release about it.