NOAA's National Climatic Data Center caught cooling the past – modern processed records don't match paper records

We’ve seen examples time and again of the cooling of the past via the homogenization applied to GISS, HadCRUT, and other temperature data sets. By cooling the data from the past, the trend/slope of temperature over the last 100 years increases.

This time, the realization comes from an unlikely source, Dr. Jeff Masters of Weather Underground via contributor Christopher C. Burt. An excerpt of the story is below:

Inconsistencies in NCDC Historical Temperature Analysis

Jeff Masters and I recently received an interesting email from Ken Towe, who has been researching the NCDC historical temperature database and came across what appeared to be some startling inconsistencies: namely, that the average state temperature records used in the current trends analysis by the NCDC (National Climatic Data Center) do not reflect the actual published records as they appeared in the Monthly Weather Reviews and Climatological Data Summaries of years past. Here is why.

An Example of the Inconsistency

Here is a typical example of what Ken uncovered. Below is a copy of the national weather data summary for February 1934. If we look at, say, Arizona for the month, we see that the state average temperature was 52.0°F.

The state-by-state climate summary for the U.S. in February 1934. It may be hard to read, but the average temperature for the state of Arizona is listed as 52.0°F. From Monthly Weather Review.

However, if we look at the current NCDC temperature analysis (which runs from 1895-present) we see that for Arizona in February 1934 they have a state average of 48.9°F, not the 52.0°F that was originally published:

Here we see a screen capture of the current NCDC long-term temperature analysis for Arizona during Februaries. Note in the bar at the bottom that for 1934 they use a figure of 48.9°.

Ken looked at entire years of data from the 1920s and 1930s for numerous different states and found that this ‘cooling’ of the old data was fairly consistent across the board. In fact he produced some charts showing such. Here is an example for the entire year of 1934 for Arizona:

The chart above shows how many degrees cooler each monthly average temperature for the entire state of Arizona in 1934 was compared to the current NCDC database (i.e. versus what the actual monthly temperatures were in the original Climatological Data Summaries published in 1934 by the USWB, the U.S. Weather Bureau). Note, for instance, how February is 3.1°F cooler in the current database compared to the historical record. Table created by Ken Towe.

Read the entire story here: Inconsistencies in NCDC Historical Temperature Analysis

================================================================

The explanation given is that they changed from the ‘Traditional Climate Division Data Set’ (TCDD) to a new ‘Gridded Divisional Dataset’ (GrDD) that takes into account inconsistencies in the TCDD.

Yet, as we have seen time and time again, with the exception of a -0.05°C cooling applied for UHI (which is woefully under-represented), all “adjustments, improvements, and fiddlings” applied to the data by NCDC and other organizations always seem to result in an increased warming trend.

Is this purposeful mendacity, or just another example of confirmation bias at work? Either way, I don’t think private citizen observers of NOAA’s Cooperative Observer Program who gave their time and efforts every day for years really appreciate that their hard work is tossed into a climate data soup then seasoned to create a new reality that is different from the actual observations they made. In the case of Arizona and changing the Climate Divisions, it would be the equivalent of changing state borders and saying that fewer people lived in Arizona in 1934 because we changed the borders today. That wouldn’t fly, so why should this?

Sure there are all sorts of “justifications” for these things published by NCDC and others, but the bottom line is that they are not representative of true reality, but of a processed reality.

h/t to Dr. Ryan Maue.

UPDATE: Here’s a graph showing cumulative adjustments to the USHCN subset of the entire US COOP surface temperature network done by Zeke Hausfather and posted recently on Lucia’s Blackboard:

This is calculated by taking USHCN adjusted temperature data and subtracting USHCN raw temperature data on a yearly basis. The TOBS adjustment is the lion’s share.
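For readers who want to see the arithmetic behind that curve, here is a rough sketch of the calculation in Python. The file names and column layout are placeholders for illustration, not the actual USHCN file format:

```python
# Sketch of the calculation described above: the yearly mean of
# (adjusted minus raw) USHCN temperatures. File and column names are
# assumptions; the real USHCN files use their own station format.
import pandas as pd

raw = pd.read_csv("ushcn_raw.csv")        # assumed columns: station, year, temp_f
adj = pd.read_csv("ushcn_adjusted.csv")   # same layout, post-adjustment values

merged = raw.merge(adj, on=["station", "year"], suffixes=("_raw", "_adj"))
merged["adjustment"] = merged["temp_f_adj"] - merged["temp_f_raw"]

# The average adjustment applied in each year, i.e. the curve in the graph above.
yearly_adjustment = merged.groupby("year")["adjustment"].mean()
print(yearly_adjustment.tail())
```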

Andrew Greenfield

Time to call the cops!

This is an outrageous foul! Where’s the Umpire when you need him in this game?

John Bills

It is called modelling. And as far as I know, they are not very good at it.

Joachim Seifert

What are “Inconsistencies”? – Modern term for “Cooking the books”?

CodeTech

We need Ed Begley Jr. on this. He can reassure us that it’s all Peer Reviewed and thus the people who measured the temperature back then, not being Climatologists with letters after their names, didn’t know how to read thermometers.

Owen in GA

If 1934 was cooled by an average of three degrees, wouldn’t that mean we are actually cooler now than then?

Taphonomic

“And if all others accepted the lie which the Party imposed -if all records told the same tale — then the lie passed into history and became truth. ‘Who controls the past,’ ran the Party slogan, ‘controls the future: who controls the present controls the past.’ And yet the past, though of its nature alterable, never had been altered. Whatever was true now was true from everlasting to everlasting. It was quite simple. All that was needed was an unending series of victories over your own memory.”
Doubleplusungood!

Not familiar with the American system. Can you impeach the weatherfolk?
Or is Andrew Greenfield right – and the only option is to call Phoenix PD and lay information about a fraud? That won’t do science, generally, a lot of good.
Are we moving to a post-scientific age, where celebrity – or Teamwork – trumps facts, carefully researched and un-dramatically presented?
[On present trend, DC and much of Maryland will be covered by a kilometer-deep ice cap by 2025, extrapolating the cooling seen last night, using a model I made (up) earlier. /Sarc. /Not real!]
Seriously, a rational human would expect adjustments to be, basically, fairly evenly distributed about neutral.
As described, this smacks of a very biased die.
Cui Bono?

Chris B

War is peace. Hot is cold. What else is new?

I hope these data vandals are keeping a careful record of their ‘adjustments’ so the records can be un-bent later.

People have gone to jail for less than this.

Luther Wu

All who still trust government, raise your hands.

Luther Wu

Both of them… this is a stickup.

Latitude

Don’t they have to adjust for the sea floor sinking or something like that…
Steven Goddard has been posting these “inconsistencies” for years…

Andrew

It’s time the big people that matter, who are likely to be running the show soon, were made aware of this and the Hockey Stick scandals (yes, it has now become plural) – i.e. persons such as Romney, Abbott, and those with money like Gina Rinehart in Australia.

cui bono

“From the People’s Central Statistical Office. The current five-year plan continues to advance well ahead of schedule thanks to the glorious foresight of the Great Leader. Tractor production is up 45.3% since last year. Work harder, dedicated Stakhanovites, for the Great Day will soon dawn for us all.”

SteveSadlov

Got to hide those warm 1930s!

Being NOAA, wouldn’t Congress have oversight??? Someone ought to send this info to the proper oversight Congressional Chairmen!

jitthacker

Well – it should be obvious that if the explanation is true, then as many of the gridcells should have increased temperatures as have decreased temperatures.
Without knowing how thorough the survey and its reporting is, we can’t say this is systematic bias. Hopefully the author will confirm whether they balance out.
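For illustration, here is a minimal sketch (with invented numbers) of the balance check being suggested: collect the old-minus-new differences for a set of divisions and test whether they center on zero.

```python
# Illustrative only: hypothetical per-division differences
# (old published value minus new gridded value), in degrees F.
# If the re-gridding were neutral, these should scatter around zero.
from math import sqrt
from statistics import mean, stdev

diffs = [3.1, 2.4, 0.8, -0.3, 1.9, 2.7, 1.1, 0.2, 2.0, 1.5]

# One-sample t statistic against a mean of zero; a large |t| suggests
# the changes do not balance out.
t_stat = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))
print(f"mean difference = {mean(diffs):.2f} F, t = {t_stat:.2f}")
```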

Phil C

Is this purposeful mendacity, or just another example of confirmation bias at work? Either way, I don’t think private citizen observers of NOAA’s Cooperative Observer Program who gave their time and efforts every day for years really appreciate that their hard work is tossed into a climate data soup then seasoned to create a new reality that is different from the actual observations they made.
You cut off the Weather Underground story just as it gets good. In your view, if I happened to be stationed a mere ten yards away from another researcher, and we are supposed to be two people covering an area of hundreds of square miles, should I be concerned that my hard work was “tossed into a climate data soup” rather than properly considered as not accurately representative of the region we were supposed to be covering? If you had reprinted the full quotation provided over at Weather Underground, which they had taken from Transitioning from the traditional divisional dataset to the Global Historical Climatology Network-Daily gridded divisional dataset, it would be clear that these adjustments make scientific sense and are not biased. I think the right thing to do at this point would be for you to bring the substance of that reference over here to prove me wrong, but I’ll wager you don’t do that.
REPLY: And you didn’t read what I said about them. They make no sense to me. Explain why virtually every adjustment made to the raw data causes a temperature trend increase. That’s your challenge. See also comment directly above. Moving the data boundaries around should balance out. It apparently doesn’t. This sort of adjustment wouldn’t be tolerated in a stock report if it made trends/performance improve. The SEC would be on that like white on rice. Why should it be any different here? – Anthony

Myron Mesecke

Why is it that the older data, which had less influence from man-made structures, roads, and changes to the land (no air conditioning units, fewer parking lots, and smaller airports), is considered to have inconsistencies and must be “adjusted”?
The newer data is what is truly messed up with “man made” changes that affect the measurements.

kadaka (KD Knoebel)

http://wattsupwiththat.com/2012/04/13/warming-in-the-ushcn-is-mainly-an-artifact-of-adjustments/
As Dr. Roy Spencer said about the NOAA-NCDC USHCN record, 1973-2012 (read original post for full context):


2) Virtually all of the USHCN warming since 1973 appears to be the result of adjustments NOAA has made to the data, mainly in the 1995-97 timeframe.

And I must admit that those adjustments constituting virtually all of the warming signal in the last 40 years is disconcerting. When “global warming” only shows up after the data are adjusted, one can understand why so many people are suspicious of the adjustments.

A year ago, I believed global warming was a gentle linear trend, starting around 1850 after the Little Ice Age, easily manageable and not a problem.
Now I’m wondering if there really is any sort of linear trend from about the start of the 20th century to now, or if there are really just “surges”: short ones like the 1998 Super El Niño, and longer ones like a positive PDO, a “charging” of the global temperature system that wears off in time, a “discharging”.
And with the negative PDO and other indicators, we have about 20 years of global cooling coming which should knock down that slope, provided more adjustments don’t “hide the decline”.
Is it now conceivable that the already-seen “climate change” the CAGW doomsayers insist foretells devastation to come, never even happened?

kramer

What is the point of placing (I assume) calibrated thermometers at various locations and then, later on, adjusting what those thermometers recorded in the past?
That’s about the same as measuring and recording the height of, say, trees in a location at a certain date and then, at some point in the future, going back and adjusting the recorded height of those trees.
Fraud comes to mind.
I kind of would like to see a comprehensive report on all the temperature adjustments ever made and ‘weather’ (it’s a pun) or not the vast majority of adjustments have made global warming look worse, about the same, or not as bad. If I were to guess, I’d say these adjustments, on the whole, make AGW look worse.
Reminds me a bit of what James Lovelock said a few years ago about how 80% of the ozone measurements were either faked or incompletely done. (I’m still waiting for the MSM to pick up on this and do an investigation into the claim, just like they would if a scientist from big tobacco or big oil had admitted in a newspaper that 80% of their measurements were faked or incompletely done.)

How extensive are the anomalies? I can’t find where he says that in the article.

just some guy

Well, the problem is that the highly authoritative IPCC has shown that there is an overwhelming consensus that anthropogenic factors have caused warming in the 20th century which is unprecedented. Therefore any data which does not agree with this consensus, such as the Arizona data from February 1934, must clearly represent the fringe viewpoint and must be corrected. The adjustments made to the data are obviously in line with the 98% consensus viewpoint and are therefore scientifically justified. Many, many studies in Science and Nature have confirmed this.
/sarc (did I make you feel nauseous just now?)

LamontT

“Luther Wu says:
June 6, 2012 at 2:10 pm
All who still trust government, raise your hands.”
==================================================
Lost trust? I didn’t have it to begin with.
Typical. We have to fix the data to correct for unspecified errors that led to a much too warm report at the time. ::sigh:: And then they ask us, with a straight face, to believe them.

coalsoffire

Being a bear of very little brain I have a question. Can this sort of trick work forever? Will the artificial wave in the temperature anomaly just keep rolling along? In other words if you constantly adjust the past down and tinker a bit upward with the present to produce a constant upward trend, regardless of what is actually happening in the world, does the wave you have created ever crash on the shore? And if the natural variation is upward a bit too, well… bonus.
If we were dealing with financial fraud or embezzlement of this nature, the trick would eventually become untenable. But with Climate Science it seems to be the perfect crime. Gotta love those anomalies. I guess it works because all you really have to do is create the illusion of rising temperatures, and you are chipping back the old temperatures that you raised earlier on, so it never gets that far out of the ordinary. Whereas in financial fraud the object is to actually remove money from the system; it’s not enough to make it look like you are making money. That eventually sinks you.
Also, I don’t see why this is news to anyone. I’ve been reading about this adjustment exercise for years now on this blog and elsewhere, and no one has actually ever denied it. It’s just a rule of “climate science” that must be taught in CS 101. All past temperature adjustments are downwards and all present temperature adjustments are upwards. This keeps the narrative alive and the funding flowing. It’s a fundamental professional conceit. If you can’t do that you can’t be a true climate scientist.

crosspatch

And if you check monthly, you will notice that the discrepancy increases over time. With every passing month, NCDC adjusts pre-1950 temperatures down a bit more and post-1950 temperatures up.
Shown here is the amount by which the NCDC database changed from May 2008 to April 2012.
http://climate4you.com/images/NCDC%20MaturityDiagramSince20080517.gif

Ian W

Now you see why the Team loses input data or cannot show it due to ‘non-disclosure agreements’, or just flat out refuses to release data under FOIA requests. The team will see their fault here as failing to ‘lose’ (or hide the decline in) the State records.
It is a relatively simple choice – mendacity or incompetence – and in either case they should not be funded and allowed to continue.

This page has a link to a NOAA chart of the adjustments. It essentially shows that there is NO warming since the 1930s without the adjustments:
http://jennifermarohasy.com/2009/06/how-the-us-temperature-record-is-adjusted/
Thanks
JK

noaaprogrammer

Someone should run Benford’s Law on the first and second leading digits of the adjusted data to test whether tampering has been done. This is a statistical test run on accounting data to alert auditors to book cooking. The test is independent of the kind of data and the units used.
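For what it’s worth, a minimal sketch of such a first-digit test follows (purely illustrative, with arbitrary input values; note David Falkner’s caveat further down that temperatures may span too narrow a range for Benford’s Law to apply cleanly):

```python
# Compare observed first-digit frequencies against Benford's expected
# frequencies using a chi-square statistic. Input values are arbitrary.
from collections import Counter
from math import log10

def first_digit(x):
    digits = str(abs(x)).lstrip("0.")   # drop any leading zeros and decimal point
    return int(digits[0])

def benford_chi_square(values):
    counts = Counter(first_digit(v) for v in values if v != 0)
    n = sum(counts.values())
    chi2 = 0.0
    for d in range(1, 10):
        expected = n * log10(1 + 1 / d)          # Benford's Law: P(d) = log10(1 + 1/d)
        chi2 += (counts.get(d, 0) - expected) ** 2 / expected
    return chi2                                   # compare to chi-square with 8 d.o.f.

print(benford_chi_square([48.9, 52.0, 61.3, 33.7, 70.2, 12.5, 19.8, 24.1]))
```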

JFD

quidsapio, you missed a sentence in Anthony’s paste of the original paper. It reads, “Ken looked at entire years of data from the 1920s and 1930s for numerous different states and found that this ‘cooling’ of the old data was fairly consistent across the board.”
JFD

pochas

Lawyers, here’s your chance! Just think up a basis for a legal action based on some injury this fraud has caused. Regulations based on this chicanery have produced financial losses everywhere.

Fraud.
Anybody else anywhere else about anything else, and they’d go to jail.

We the people must stand up to the lies of AGW. This is unbelievable. Changing the truth to fit the lie is deplorable!

We have all been on about this before and will continue for some time yet, I am sure. Homogenization is a fool’s effort, as the results fool even the doer.

DocMartyn

This is something many of us have long suspected. My guess is that the vast majority of paper documents have been, or will be, lost, waterlogged, or shredded.
In 20 years’ time we will be shocked that glaciers were not covering the 1930s Dust Bowl states.

just some guy

Holy Hockey-Sticks, Batman! The NOAA instrumental record is nothing but a protracted series of computer models. Take a look at the “Quality Control, Homogeneity Testing, and Adjustment Procedure”.
http://www.ncdc.noaa.gov/oa/climate/research/ushcn/ushcn.html#QUAL
At nearly every step, the data from the previous step are entered into a computer program which makes the adjustments. Examples:
“The TOB debiased data are input into the MMTS program and is the second adjustment.”
“The debiased data from the second adjustment are then entered into the Station History Adjustment Program or SHAP. ”
“Each of the above adjustments is done is a sequential manner. The areal edits are preformed first and then the data are passed through the following programs (TOBS, MMTS, SHAP and FILNET). ”
It’s impossible for an outsider to verify any of these steps since (at least going by what they put on their website) it is impossible to get “under the hood” on their procedures. I wonder if anyone’s looked into getting the “code” from NOAA.
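Whatever the actual code looks like, the description quoted above amounts to a sequential pipeline: each program’s output becomes the next program’s input. A minimal sketch of that structure, not NOAA’s actual code, with empty stand-in functions for the programs named:

```python
# Stand-in functions for the programs named in the NOAA description.
# Each takes a temperature series and returns an adjusted series; the
# bodies are placeholders, since the real algorithms are not shown here.
def tobs(series):   return series   # time-of-observation bias adjustment
def mmts(series):   return series   # instrument-change (MMTS) adjustment
def shap(series):   return series   # Station History Adjustment Program
def filnet(series): return series   # fill in missing values from neighbors

def run_pipeline(raw_series, steps=(tobs, mmts, shap, filnet)):
    """Feed each step the previous step's output, as the USHCN docs describe."""
    series = raw_series
    for step in steps:
        series = step(series)
    return series

adjusted = run_pipeline([52.0, 48.3, 55.1])
```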

LazyTeenager

Sure there are all sorts of “justifications” for these things published by NCDC and others, but the bottom line is that they are not representative of true reality, but of a processed reality.
———————–
Surely you can’t say this until you understand the differences between the methods by which the two averages were calculated. Assuming that the older values are correct and the new values are incorrect, just because you like the older values better, does not stand up.
Let’s see, speaking hypothetically because I don’t know: if it turns out that the old temps were measured at places where people prefer to live, and those places are cool in a hot state, that will introduce a bias if you just add the numbers up and divide by the number of places. If those places are at a particular altitude, they might warm more than other places, not necessarily with some simple linear relationship. So those old averages might be wrong.
So here is another article lined up for you. When the weather service says there were deficiencies in the old method of calculating average temps, what were those deficiencies? It must be documented somewhere.

DocMartyn

“REPLY: Explain why virtually every adjustment made to the raw data causes a temperature trend increase. – Anthony”
I can never understand why a large pay increase has less of an impact than a small tax increase.

Green Sand

DocMartyn says:
June 6, 2012 at 3:12 pm

———————————————————–
Just a few minutes of Fahrenheit 451: the autoignition temperature of paper.
“In an oppressive future, a fireman whose duty is to destroy all books begins to question his task. “
http://www.imdb.com/title/tt0060390/

NeedleFactory

I’m not playing Devil’s advocate, but trying to find a sympathetic interpretation of the “adjustments” I came up with this:
Imagine the “surface” suggested when the thermometer readings are z-coordinates and the thermometer locations are x- and y-coordinates. With some knowledge of the actual terrain, some readings may be “suspiciously” low (or high), and might be adjusted by some kriging algorithm. (Think of, say, Colorado with five thermometers spaced like the pips on the five-of-hearts playing card, all reading about the same except for the central one.)
Furthermore, faulty thermometers might be more prone to read a bit low rather than a bit high. Were this the case, a non-biased adjustment might actually “cool the past”.
I’m not saying it’s so, just thinking aloud. Whatever adjustments were made, the rationale and the algorithms should be available, as well as the raw data. Too much to ask, I fear.
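A toy version of that five-of-hearts thought experiment, with made-up numbers, might look like the sketch below. It only shows how a simple spatial-consistency rule could flag the odd reading, not how any agency actually does it:

```python
# Flag a reading as suspicious if it departs from the mean of its
# neighbours by more than some threshold (all values hypothetical).
def check_center(center_reading, neighbor_readings, max_dev=5.0):
    neighbor_mean = sum(neighbor_readings) / len(neighbor_readings)
    deviation = center_reading - neighbor_mean
    return abs(deviation) > max_dev, deviation

# Four "corner" stations read about 52 F; the central one reads 45 F.
suspicious, dev = check_center(45.0, [52.0, 51.5, 52.5, 52.0])
print(suspicious, round(dev, 1))   # True, -7.0
```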

NeedleFactory

Should have said “a bit high”

David Corcoran

What will the temperature in 1934 have been tomorrow? And the day after that? Is there some kind of time machine involved in these changes?

juanslayton

The chart above shows how many degrees cooler….
Should this read, “…how many degrees cooler…”?

juanslayton

Nuts. Make that, “how many degrees warmer”.

As Phil C noted, the Jeff Masters link goes on to explain the reasons for the change. And I think they make a lot of sense. They are going to a more modern gridded system. I think this bullet point is the key:
“1. For the TCDD, each divisional value from 1931-present is simply the arithmetic average of the station data within it, a computational practice that results in a bias when a division is spatially under sampled in a month (e.g., because some stations did not report) or is climatologically inhomogeneous in general (e.g., due to large variations in topography).”
In the original calc, the average for Arizona was just the average of whatever stations reported in each month. That’s pretty much what you have to do if you’re working with pen and paper.
It’s the reason why most climate work is now done with anomalies. Then you don’t get the effect that a month will seem warmer when mountain stations don’t report, etc.
Anyway, if you have to work with absolute temperatures, then you have to look at what kind of stations you have in the mix when calculating an average. And the least you can do is area weighting, which the new system seems to include. If you have an area represented by few stations, you give them more weight in the average.
That’s probably why the average has gone down. It’s likely that mountain regions in Arizona were underrepresented. If you upweight stations there it brings down the average.
The real question is whether this has anything to do with trend.
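A toy example of that point, with made-up numbers: a simple arithmetic mean over-weights whichever kind of station happens to report, while an area-weighted mean gives each region its share of the state regardless of how many stations sit in it.

```python
from collections import Counter

# (region, region's share of state area, reading in F) -- numbers invented
stations = [
    ("desert",   0.5, 55.0),
    ("desert",   0.5, 54.0),
    ("desert",   0.5, 56.0),
    ("mountain", 0.5, 40.0),   # only one station covers half the state
]

simple_mean = sum(t for _, _, t in stations) / len(stations)

# Weight each station by its region's area divided by that region's station count.
counts = Counter(region for region, _, _ in stations)
weighted_mean = sum(area / counts[region] * t for region, area, t in stations)

print(round(simple_mean, 2), round(weighted_mean, 2))   # 51.25 vs 47.5
```

Whether such re-weighting should also change the long-term trend, rather than just the absolute level, is the open question noted above.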

DR

Golly gee, I wonder what Mosher will say about this? Will it be the standard “trust the climate scientists, they know what they’re doing”?
Lukewarmers agree with all temperature adjustments. We little people are just too… little… to understand the highly technical procedure of sorting and analyzing data, or reading thermometers for that matter.
We can expect Zeke & Co. will replicate the results, thereby validating the “adjustments”; then later Muller will have to reevaluate his data set so that it too agrees.
Sorry for the snark, but frankly it is beyond the pale this is called “science”. It certainly wouldn’t pass any industrial standard I’m aware of.

David Falkner

noaaprogrammer:
Not sure Benford’s analysis is appropriate here. I think you’d be likely to skew a little heavy towards 1 as the first digit just because of the way the numbers fall on the orders of magnitude involved.

Neville

What is wrong with you Americans? If you’re sure of the info above, why don’t you take them to court?
There are trillions of dollars riding on the CAGW fraud, and you’d be doing your taxpayers, and all other taxpayers on the planet, a favour if you could expose this fraud and con trick.
The first move should be a spot on Fox News or whatever to get the ball rolling. But somehow you must get some REAL publicity. If a major network runs with this story, others must respond, and then print and pollies have to join in and respond as well.
This is the only way it will work; always has, always will. You’re currently discussing this in-house, but you need to break out into the neighbourhood.