NOAA's National Climatic Data Center caught cooling the past – modern processed records don't match paper records

We’ve seen examples time and again of the cooling of the past via homogenization that goes on with GISS, HadCRUT, and other temperature data sets. By cooling the data from the past, the trend/slope of the temperature for the last 100 years increases.

This time, the realization comes from an unlikely source, Dr. Jeff Masters of Weather Underground via contributor Christopher C. Burt. An excerpt of the story is below:

Inconsistencies in NCDC Historical Temperature Analysis

Jeff Masters and I recently received an interesting email from Ken Towe, who has been researching the NCDC historical temperature database and came across what appeared to be some startling inconsistencies: namely, that the average state temperature records used in the current trends analysis by the NCDC (National Climatic Data Center) do not reflect the actual published records as they appeared in the Monthly Weather Reviews and Climatological Data Summaries of years past. Here is why.

An Example of the Inconsistency

Here is a typical example of what Ken uncovered. Below is a copy of the national weather data summary for February 1934. If we look at, say, Arizona for the month, we see that the state average temperature was 52.0°F.

The state-by-state climate summary for the U.S. in February 1934. It may be hard to read, but the average temperature for the state of Arizona is listed as 52.0°F. From Monthly Weather Review.

However, if we look at the current NCDC temperature analysis (which runs from 1895-present) we see that for Arizona in February 1934 they have a state average of 48.9°F, not the 52.0°F that was originally published:

Here we see a screen capture of the current NCDC long-term temperature analysis for Arizona during Februaries. Note in the bar at the bottom that for 1934 they use a figure of 48.9°.

Ken looked at entire years of data from the 1920s and 1930s for numerous different states and found that this ‘cooling’ of the old data was fairly consistent across the board. In fact, he produced some charts showing this. Here is an example for the entire year of 1934 for Arizona:

The chart above shows how many degrees cooler each monthly average temperature for the entire state of Arizona in 1934 is in the current NCDC database compared to the actual monthly temperatures in the original Climatological Data Summaries published in 1934 by the USWB (U.S. Weather Bureau). Note, for instance, how February is 3.1°F cooler in the current database compared to the historical record. Table created by Ken Towe.

Read the entire story here: Inconsistencies in NCDC Historical Temperature Analysis
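The comparison Ken Towe performed can be sketched in a few lines. Only the February 1934 Arizona figures (52.0°F published vs. 48.9°F in the current database) come from the article; the function name and structure here are illustrative, not NCDC's actual code.

```python
# Hypothetical sketch of the comparison described above: subtract the
# current NCDC database value from the originally published state
# average for a given month.

published_feb_1934 = 52.0   # Monthly Weather Review, Arizona, Feb 1934
current_feb_1934 = 48.9     # current NCDC database value for the same month

def cooling_of_record(published, current):
    """Degrees F by which the modern database is cooler than the
    originally published state average (positive = past was cooled)."""
    return round(published - current, 1)

print(cooling_of_record(published_feb_1934, current_feb_1934))  # 3.1
```

Repeating this for each month of 1934 reproduces the per-month differences in Ken Towe's table.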

================================================================

The explanation given is that they changed from the ‘Traditional Climate Division Data Set’ (TCDD) to a new ‘Gridded Divisional Dataset’ (GrDD) that takes into account inconsistencies in the TCDD.

Yet as we have seen time and time again, with the exception of a -0.05°C cooling applied for UHI (which is woefully under-represented) all “adjustments, improvements, and fiddlings” to data applied by NCDC and other organizations always seem to result in an increased warming trend.

Is this purposeful mendacity, or just another example of confirmation bias at work? Either way, I don’t think the private citizen observers of NOAA’s Cooperative Observer Program, who gave their time and efforts every day for years, really appreciate that their hard work is tossed into a climate data soup and then seasoned to create a new reality that is different from the actual observations they made. In the case of Arizona and the changed Climate Divisions, it would be the equivalent of changing state borders today and then saying fewer people lived in Arizona in 1934 because the borders are different now. That wouldn’t fly, so why should this?

Sure there are all sorts of “justifications” for these things published by NCDC and others, but the bottom line is that they are not representative of true reality, but of a processed reality.

h/t to Dr. Ryan Maue.

UPDATE: Here’s a graph showing cumulative adjustments to the USHCN subset of the entire US COOP surface temperature network done by Zeke Hausfather and posted recently on Lucia’s Blackboard:

This is calculated by taking USHCN adjusted temperature data and subtracting USHCN raw temperature data on a yearly basis. The TOBS adjustment is the lion’s share.
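The adjusted-minus-raw calculation described above can be sketched as follows. The station values here are invented for illustration; only the method (yearly mean of adjusted data minus yearly mean of raw data) comes from the text.

```python
# A minimal sketch, with hypothetical numbers, of the cumulative
# adjustment calculation: for each year, average the adjusted station
# values, average the raw values, and take the difference.

def yearly_adjustment(raw_by_year, adjusted_by_year):
    """Return {year: mean(adjusted) - mean(raw)} in degrees F.
    A negative value means the adjustments cooled that year."""
    out = {}
    for year in raw_by_year:
        raw_mean = sum(raw_by_year[year]) / len(raw_by_year[year])
        adj_mean = sum(adjusted_by_year[year]) / len(adjusted_by_year[year])
        out[year] = round(adj_mean - raw_mean, 2)
    return out

raw = {1934: [52.0, 51.0], 2000: [55.0, 54.0]}   # invented station data
adj = {1934: [51.0, 50.0], 2000: [55.0, 54.0]}   # invented adjusted data
print(yearly_adjustment(raw, adj))  # {1934: -1.0, 2000: 0.0}
```

A downward difference in early years and near-zero difference in recent years is exactly the pattern that steepens the century-scale trend.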

DR
June 6, 2012 3:57 pm

Nick Stokes says:
June 6, 2012 at 3:42 pm
As Phil C noted, the Jeff Masters link goes on to explain the reasons for the change. And I think they make a lot of sense. They are going to a more modern gridded system. I think this bullet point is the key:
“1. For the TCDD, each divisional value from 1931-present is simply the arithmetic average of the station data within it, a computational practice that results in a bias when a division is spatially under sampled in a month (e.g., because some stations did not report) or is climatologically inhomogeneous in general (e.g., due to large variations in topography).”
In the original calc, the average for Arizona was just the average of whatever stations reported in each month. That’s pretty much what you have to do if you’re working with pen and paper.
It’s the reason why most climate work is now done with anomalies. Then you don’t get the effect that a month will seem warmer when mountain stations don’t report, etc.
Anyway, if you have to work with absolute temperatures, then you have to look at what kind of stations you have in the mix when calculating an average. And the least you can do is area weighting, which the new system seems to include. If you have an area represented by few stations, you give them more weight in the average.
That’s probably why the average has gone down. It’s likely that mountain regions in Arizona were underrepresented. If you upweight stations there it brings down the average.
The real question is whether this has anything to do with trend.

Translation: there are no standards in climate “science”, so make it up as you go along. If it cools the past and/or warms the present, it must be correct.
I’d like to see these people pass an A2LA audit.
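Nick Stokes’ point about anomalies and missing mountain stations can be illustrated numerically. All station names, baselines, and readings below are invented; this is a sketch of the statistical effect he describes, not of any actual NCDC computation.

```python
# Illustration (invented numbers): when a cold mountain station fails
# to report, an average of absolute temperatures jumps upward, while
# an average of anomalies (each station relative to its own long-term
# mean) is unaffected.

baselines = {"valley_a": 60.0, "valley_b": 62.0, "mountain": 40.0}

def absolute_mean(report):
    """Plain average of the absolute temperatures that reported."""
    return sum(report.values()) / len(report)

def anomaly_mean(report):
    """Average departure of each reporting station from its own baseline."""
    return sum(t - baselines[s] for s, t in report.items()) / len(report)

# Every station reports, each exactly at its baseline (no real change):
full = {"valley_a": 60.0, "valley_b": 62.0, "mountain": 40.0}
# Same conditions, but the mountain station is missing:
partial = {"valley_a": 60.0, "valley_b": 62.0}

print(absolute_mean(full), absolute_mean(partial))  # 54.0 61.0
print(anomaly_mean(full), anomaly_mean(partial))    # 0.0 0.0
```

The absolute average appears 7°F warmer in the incomplete month even though nothing actually changed; the anomaly average stays at zero in both cases.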

Green Sand
June 6, 2012 3:57 pm

Nick Stokes says:
June 6, 2012 at 3:42 pm

The real question is whether this has anything to do with trend.
==========================================================
Yup Nick, that really is the question. If a guy stands still in Arizona and the average daily temperature where he stands does not change for a century what is the trend?

wayne Job
June 6, 2012 3:59 pm

Once can be a mistake, twice can be a coincidence, thrice is on purpose. If this is computer generated, one can only assume that the programming is deliberately biased, or it would have been corrected by now.

Steptoe Fan
June 6, 2012 4:08 pm

http://www.ncdc.noaa.gov/oa/climate/research/ushcn/ushcn.html#QUAL
The seven (7) references at the bottom of the article show as links, yet none of them are active. How can one look under the hood?
Sigh, why do they want to make it hard?

Glenn
June 6, 2012 4:08 pm

Phil C says:
June 6, 2012 at 2:34 pm
“In your view, if I happened to be stationed a mere ten yards away from another researcher, and we are supposed to be two people covering an area of hundreds of square miles, I should be concerned that my hard work was “tossed into a climate data soup” rather than properly considered as not accurately representative of the region we were supposed to be covering?”
He should be concerned for his own sanity as well as the other’s, by thinking that temp data for an area of hundreds of square miles could be taken from two measurement locations ten yards apart. I’m concerned for those that think this needs to be considered at all.

June 6, 2012 4:09 pm

I would like (love) to see an audit of some of the specific paper results for sites in the current records to the reported results for those sites as shown in Anthony’s surface project. Paper to excel, column to column, single site comps. Adding the audit of the real records to the surface project as a whole would be interesting. As reported meets As Adjusted.
I have a funny feeling that the reported 1 degree of warming in the last century is possibly off a bit more than anyone is ready to accept. The irony would be if they have flipped the trend, like the upside-down series embedded in other work.

Glenn
June 6, 2012 4:10 pm

Green Sand says:
June 6, 2012 at 3:57 pm
” If a guy stands still in Arizona and the average daily temperature where he stands does not change for a century what is the trend?”
Depends on his boot size.

WE TOLD YOU SO
June 6, 2012 4:13 pm

Anthony, I’ll kindly point to you my friend that, many MANY of us have been telling anyone who wouldn’t listen, that
*IT HASN’T WARMED EVEN UNUSUALLY for practically speaking, in instrument-recorded history.*
***All this warming over and above standard deviations leaving last cold period,***
***IS BEING CONSTRUCTED through FALSIFICATION of R.E.C.O.R.D.S.***
And, there’s another thing: this Magic Gas thing everyone was so scared of?
Where’s the infrared astronomy & optical astronomy fields’ constant chorus for us to all
***LOOK at the PHOTOS of EVER RISING EVIDENCE of HEAT in the ATMOSPHERE accompanying rising levels of THE MAGIC GAS?***
It’s not there kids, because there ISN’T any more infrared in the atmosphere now than usual,
READ MY LIPS: THERE’S L.E.S.S.
This is, simply, i.m.p.o.s.s.i.b.l.e.
unless those people are committing C.R.I.M.E.
Anthony I know I come here and type like I’m a walking billboard or something: but my field, is two way radio communications. Do you know what the generic name for this field is?
The electronic engineering associated with the calibration maintenance and usage of all instrumentation
associated with the transmission, capture, and analysis of electromagnetic energy through the atmosphere, space, and industrial compounds.
This includes the controls associated with nearly everything under the sun that has a button: and I assure you,
there are a dozen ways there is proof there’s no more energy in the atmosphere, so many THEY COULDN’T BE HIDDEN: and people have been telling again, anyone who *wouldn’t listen*
that, that magic gas story is utter, utter, fabrication from nearly syllable one. Utter falsification of everything under the sun I say, and the fact that industries associated with advanced instrumentation aren’t making note in their industry rags about the ‘recent accommodations in calibration/instrumentation technologies to the ever warming environment’.
This has been crime
from
the
Beginning.

WE TOLD YOU SO
June 6, 2012 4:16 pm

There’s a typo at the end there where I should have inserted, “…to the ever warming environment, are proof that
This has been crime
etc.
Sorry

AlexS
June 6, 2012 4:18 pm

Taphonomic says:
June 6, 2012 at 1:55 pm
“And if all others accepted the lie which the Party imposed -if all records told the same tale — then the lie passed into history and became truth. ‘Who controls the past,’ ran the Party slogan, ‘controls the future: who controls the present controls the past.’ And yet the past, though of its nature alterable, never had been altered. Whatever was true now was true from everlasting to everlasting. It was quite simple. All that was needed was an unending series of victories over your own memory.”
Very appropriate.

Phil C
June 6, 2012 4:22 pm

And you didn’t read what I said about them. They make no sense to me.
I did read what you said, and what the GrDD authors write makes sense to me. Here’s a simplified description of what they’ve done:
1. draw a tic-tac-toe grid (3 x 3 square)
2. fill in all 9 squares with a temperature reading.
3. fill in the 3 top squares again with an additional temperature reading.
4. You’ve now got 12 readings: average them. That’s the old method (TCDD).
Is that the average for the entire area? Of course it isn’t. You’re taking too many readings in the top row. To correct for the bias in the top row, you should first average the two numbers in each square of the top row, and then use those three readings with the remaining six to find your average over the entire area.
This is exactly how I understand what the GrDD authors are doing when they write this:
For the TCDD, each divisional value from 1931-present is simply the arithmetic average of the station data within it, a computational practice that results in a bias when a division is spatially under sampled in a month (e.g., because some stations did not report) or is climatologically inhomogeneous in general (e.g., due to large variations in topography).
They’ve added additional adjustments for missing data, etc., and of course the computation of the areas is far more complicated, but the impact is the same: an average which treats every reading without introducing bias for the size of the area. Doing it this way means that there’s no reason why the numbers should “balance out” as you write, because that is purely a function of area size and the readings.
REPLY: The failure in your logic is that grids don’t follow state boundaries, thus when giving an area average number from grid data, it cannot accurately represent the state average (as calculated before) because it will also include stations outside of the state boundary. The coarseness of the grid determines how many stations outside of the state get included. Thus when NCDC displays a grid-derived state value, it is not truly representative. But there’s more to it than that. The majority of the states show this flaw. WUWT? I’m trying to locate the author to find out more. – Anthony
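Phil C’s tic-tac-toe example can be made numeric. All readings below are invented, and this sketches only the averaging step he describes, not the actual GrDD code: six squares have one reading each, the three top-row squares have two readings each, and the top row happens to be warmer.

```python
# Numeric version of the tic-tac-toe sketch (invented data). Averaging
# all 12 raw readings overweights the doubly-sampled top row; averaging
# within each square first, then across squares, removes that bias.

readings = {
    # square id: readings in that square (degrees F)
    "top1": [60.0, 60.0], "top2": [60.0, 60.0], "top3": [60.0, 60.0],
    "mid1": [50.0], "mid2": [50.0], "mid3": [50.0],
    "bot1": [50.0], "bot2": [50.0], "bot3": [50.0],
}

def plain_average(data):
    """Old TCDD-style method: average every reading together."""
    all_readings = [r for rs in data.values() for r in rs]
    return sum(all_readings) / len(all_readings)

def gridded_average(data):
    """Average within each square first, then across the 9 squares."""
    per_square = [sum(rs) / len(rs) for rs in data.values()]
    return sum(per_square) / len(per_square)

print(round(plain_average(readings), 2))    # 55.0
print(round(gridded_average(readings), 2))  # 53.33
```

The oversampled warm row pulls the plain average up by about 1.7°F; whether such a correction should lower a state’s 1934 values specifically, as Anthony’s reply notes, depends on how the grid cells line up with the state boundary.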

KenB
June 6, 2012 4:26 pm

Will the great lawsuit flood of this century begin with a trickle, then spread to encompass professional scientific organisations (cozy clubs) that “should have known” but defended the indefensible, or will they quickly join the accusers/lawyers to try and cover their butts?

just some guy
June 6, 2012 4:26 pm

Glenn says:
June 6, 2012 at 4:08 pm
Phil C says:
June 6, 2012 at 2:34 pm
“In your view, if I happened to be stationed a mere ten yards away from another researcher, and we are supposed to be two people covering an area of hundreds of square miles, I should be concerned that my hard work was “tossed into a climate data soup” rather than properly considered as not accurately representative of the region we were supposed to be covering?”
He should be concerned for his own sanity as well as the other’s, by thinking that temp data for an area of hundreds of square miles could be taken from two measurement locations ten yards apart. I’m concerned for those that think this needs to be considered at all.

Exactly true. And therein lies the problem. The resulting product can only possibly be an estimate , and in this case, not a very accurate one. Estimates require the estimator to make judgements in order to fill in the blanks. Whether intentional or not, those judgements are subject to estimator bias.
The result is that we have output from protracted computer models being called “instrumental data”.

Green Sand
June 6, 2012 4:27 pm

Glenn says:
June 6, 2012 at 4:10 pm

“Depends on his boot size.”
================================
We are sure that the boot size was constant but are considering possible adjustments for variations in sole thickness and changes over time from natural to man made materials?

just some guy
June 6, 2012 4:31 pm

“1. draw a tic-tac-toe grid (3 x 3 square)
2. fill in all 9 squares with a temperature reading.
3. fill in the 3 top squares again with an additional temperature reading.
4. You’ve now got 12 readings: average them. That’s the old method (TCDD).”
This would be fine, if only it were that simple. Unfortunately, it is not, since weather station data going back over 100 years does not conveniently provide raw data to fill in all those tic-tac-toe squares.

davidmhoffer
June 6, 2012 4:40 pm

Green Sand says:
June 6, 2012 at 3:30 pm
DocMartyn says:
June 6, 2012 at 3:12 pm
———————————————————–
Just a few minutes of Fahrenheit 451: the autoignition temperature of paper.
>>>>>>>>>>>>>>>
A rather apt observation. Fahrenheit 451 is one of the great works of science fiction of all time, and fitting for this thread: the theme of the book is a government destroying all paper records of everything so that it alone holds the “truth”.
Sadly, Ray Bradbury (the author) passed away yesterday.

June 6, 2012 4:45 pm

If they modify the data that is printed and saved in the Library of Congress, imagine what they are doing with the data from satellites!

Tom in Worcester
June 6, 2012 5:00 pm

If this organization is publicly funded, someone should send a link to someone in Senator Inhofe’s website.

June 6, 2012 5:00 pm

It’s difficult to believe that Anthony still makes a living peddling this twaddle. Go to the NCDC here, http://www.ncdc.noaa.gov/cmb-faq/temperature-monitoring.html, and you will see the adjusted temps, the raw data, and the complete methodology for ensuring the most accurate records. After which, come back here and explain why the less accurate data is preferable. You might also want to explain why the NCDC adjusted SSTs decreased the trend over the raw data. JP

u.k.(us)
June 6, 2012 5:04 pm

LazyTeenager says:
June 6, 2012 at 3:25 pm
“Let’s see, speaking hypothetically cos I don’t know, ……”
=============
Join the club.
We’ll try to integrate the information offered, while paying taxes to our betters that have shown themselves as most qualified to spend our hard earned money.
The good times are over, one way or another.

Evan Jones
Editor
June 6, 2012 5:05 pm

atarsinc, I approved your post, but I’m here to say you really don’t have any idea what you are talking about.
And, yes, I’ve carefully studied the raw and adjusted USHCN data. The adjustment procedure is shocking and scandalous.

just some guy
June 6, 2012 5:10 pm

REPLY: “……I’m trying to locate the author to find out more. – Anthony”
Anthony, if you are chasing down more information about the adjustments, perhaps the “TOB” would be worthwhile.
“Next, the temperature data are adjusted for the time-of-observation bias (Karl, et al. 1986) which occurs when observing times are changed from midnight to some time earlier in the day. ”
http://www.ncdc.noaa.gov/oa/climate/research/ushcn/ushcn.html#QUAL
From the graph on that site, TOB (the dashed line) appears to be the biggest driver of suppressing earlier 20th century temperatures.

June 6, 2012 5:10 pm

Reblogged this on Climate Ponderings and commented:
Caught fingering the cool to make it warm

Rogelio escobar
June 6, 2012 5:12 pm

I would not be surprised if this is removed from the wunderground site soon… make a photocopy

June 6, 2012 5:12 pm

Now I’m really confused. The Medieval Warm Period disappeared because of a tree ring. We’re supposed to trust that tree ring. A tree ring is wood. Paper is made from wood.
Now the recent records recorded on paper are being “disappeared”. We’re not supposed to trust the numbers recorded on that wood. What’s wrong with these tree ring records?