We’ve seen examples time and again of the cooling of the past via homogenization that goes on with GISS, HadCRUT, and other temperature data sets. By cooling the data from the past, the trend/slope of the temperature for the last 100 years increases.
This time, the realization comes from an unlikely source, Dr. Jeff Masters of Weather Underground via contributor Christopher C. Burt. An excerpt of the story is below:
Inconsistencies in NCDC Historical Temperature Analysis
Jeff Masters and I recently received an interesting email from Ken Towe, who has been researching the NCDC historical temperature database and came across what appeared to be some startling inconsistencies: namely, that the average state temperature records used in the current trend analysis by the NCDC (National Climatic Data Center) do not reflect the actual published records as they appeared in the Monthly Weather Reviews and Climatological Data Summaries of years past. Here is why.
An Example of the Inconsistency
Here is a typical example of what Ken uncovered. Below is a copy of the national weather data summary for February 1934. If we look at, say, Arizona, we see that the state average temperature for that month was 52.0°F.
The state-by-state climate summary for the U.S. in February 1934. It may be hard to read, but the average temperature for the state of Arizona is listed as 52.0°F. From Monthly Weather Review.
However, if we look at the current NCDC temperature analysis (which runs from 1895-present) we see that for Arizona in February 1934 they have a state average of 48.9°F, not the 52.0°F that was originally published:
Here we see a screen capture of the current NCDC long-term temperature analysis for Arizona during Februaries. Note in the bar at the bottom that for 1934 they use a figure of 48.9°.
Ken looked at entire years of data from the 1920s and 1930s for numerous different states and found that this ‘cooling’ of the old data was fairly consistent across the board. In fact he produced some charts showing such. Here is an example for the entire year of 1934 for Arizona:
The chart above shows how many degrees cooler each monthly average temperature for the entire state of Arizona for each month in 1934 was compared to the current NCDC database (i.e. versus what the actual monthly temperatures were in the original Climatological Data Summaries published in 1934 by the USWB, the U.S. Weather Bureau). Note, for instance, how February is 3.1°F cooler in the current database compared to the historical record. Table created by Ken Towe.
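The discrepancy in the example above can be checked with a line of arithmetic. A minimal sketch, assuming only the February figures from the excerpt (52.0°F published in 1934, 48.9°F in the current NCDC analysis); the helper function and its name are purely illustrative:

```python
# Per-month discrepancy between the originally published state averages
# and the current NCDC database values. Only the February numbers come
# from the excerpt above; the function is an illustrative stand-in.

def monthly_discrepancies(published, current):
    """Return the published-minus-current difference for each month."""
    return {month: round(published[month] - current[month], 1)
            for month in published}

published_1934 = {"Feb": 52.0}   # Monthly Weather Review, Arizona
current_ncdc   = {"Feb": 48.9}   # current NCDC analysis, Arizona

print(monthly_discrepancies(published_1934, current_ncdc))  # {'Feb': 3.1}
```

The 3.1°F result matches the February value in Ken Towe's table.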
Read the entire story here: Inconsistencies in NCDC Historical Temperature Analysis
================================================================
The explanation given is that they changed from the ‘Traditional Climate Division Data Set’ (TCDD) to a new ‘Gridded Divisional Dataset’ (GrDD) that takes into account inconsistencies in the TCDD.
Yet as we have seen time and time again, with the exception of a -0.05°C cooling applied for UHI (which is woefully under-represented) all “adjustments, improvements, and fiddlings” to data applied by NCDC and other organizations always seem to result in an increased warming trend.
Is this purposeful mendacity, or just another example of confirmation bias at work? Either way, I don’t think the private citizen observers of NOAA’s Cooperative Observer Program, who gave their time and effort every day for years, really appreciate that their hard work is tossed into a climate data soup and then seasoned to create a new reality that is different from the actual observations they made. In the case of Arizona and the changing of the Climate Divisions, it would be the equivalent of redrawing state borders today and then saying fewer people lived in Arizona in 1934 because the borders changed. That wouldn’t fly, so why should this?
Sure there are all sorts of “justifications” for these things published by NCDC and others, but the bottom line is that they are not representative of true reality, but of a processed reality.
h/t to Dr. Ryan Maue.
UPDATE: Here’s a graph showing cumulative adjustments to the USHCN subset of the entire US COOP surface temperature network done by Zeke Hausfather and posted recently on Lucia’s Blackboard:
This is calculated by taking USHCN adjusted temperature data and subtracting USHCN raw temperature data on a yearly basis. The TOBS adjustment is the lion’s share.
![USHCN-adjustments[1]](http://wattsupwiththat.files.wordpress.com/2012/06/ushcn-adjustments1.png)
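The calculation described above is simple to sketch: for each year, subtract the raw annual mean from the adjusted annual mean. A minimal illustration, with made-up station values (the real series, per the graph, is dominated by the TOBS adjustment):

```python
# Sketch of the cumulative-adjustment calculation described above:
# adjusted annual mean minus raw annual mean. Values are invented
# for illustration, not actual USHCN data.

def yearly_adjustment(raw_monthly, adjusted_monthly):
    """Mean adjusted temperature minus mean raw temperature for one year."""
    raw_mean = sum(raw_monthly) / len(raw_monthly)
    adj_mean = sum(adjusted_monthly) / len(adjusted_monthly)
    return adj_mean - raw_mean

raw      = [50.1, 52.3, 55.0]   # hypothetical raw monthly means (°F)
adjusted = [50.4, 52.7, 55.3]   # hypothetical adjusted means (°F)
print(round(yearly_adjustment(raw, adjusted), 2))  # 0.33
```

A positive result means the adjusted series runs warmer than the raw observations for that year.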
“Luther Wu says:
June 6, 2012 at 2:10 pm
All who still trust government, raise your hands.”
==================================================
Lost trust? I didn’t have it to begin with.
Typical. We have to fix the data to correct for unspecified errors that led to a much too warm report at the time. ::sigh:: And then, with a straight face, they ask us to believe them.
Being a bear of very little brain, I have a question: can this sort of trick work forever? Will the artificial wave in the temperature anomaly just keep rolling along? In other words, if you constantly adjust the past down and tinker a bit upward with the present to produce a constant upward trend, regardless of what is actually happening in the world, does the wave you have created ever crash on the shore? And if the natural variation is upward a bit too, well… bonus.
If we were dealing with financial fraud or embezzlement of this nature, the trick would eventually become untenable. But with Climate Science it seems to be the perfect crime. Gotta love those anomalies. I guess it works because all you really have to do is create the illusion of rising temperatures, and you are chipping back the old temperatures that you raised earlier on, so it never gets that far out of the ordinary. Whereas in financial fraud the object is to actually remove money from the system; it’s not enough to make it look like you are making money. That eventually sinks you.
Also I don’t see why this is news to anyone. I’ve been reading about this adjustment exercise for years now on this blog and elsewhere and no one has actually ever denied it. It’s just a rule of “climate science” that must be taught in CS 101. All past temperature adjustments are downwards and all present temperature adjustments are upwards. This keeps the narrative alive and the funding flowing. It’s a fundamental professional conceit. If you can’t do that you can’t be a true climate scientist.
And if you check monthly you will notice that the discrepancy increases over time. With every passing month NCDC adjusts pre-1950 temperatures down a bit more and post-1950 temperatures up.
Shown here is the amount by which the NCDC database has changed from May 2008 to April 2012.
http://climate4you.com/images/NCDC%20MaturityDiagramSince20080517.gif
Now you see why the Team loses input data or cannot show it due to ‘non-disclosure agreements’, or just flat out refuses to release data under FOIA requests. The team will see their fault here as failing to ‘lose’ (or hide the decline in) the State records.
It is a relatively simple choice, mendacity or incompetence; in either case they should not be funded and allowed to continue.
This page has a link to a NOAA chart of the adjustments. It essentially shows that there is NO warming since the 1930s without the adjustments:
http://jennifermarohasy.com/2009/06/how-the-us-temperature-record-is-adjusted/
Thanks
JK
Someone should run Benford’s Law of first and second leading digits on the adjusted data to verify that tampering has been done. This is a statistical test that is run on accounting data to alert auditors of book cooking. The test is independent of any kind of data and units used.
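A minimal sketch of the suggested first-digit test, using only the Python standard library. Note the caveat a later comment raises: data confined to a narrow range need not follow Benford's law at all, so this shows only the mechanics, not a valid forensic procedure for temperatures:

```python
# Mechanics of a Benford first-digit check. The sample data are
# arbitrary; a real audit would compare observed digit frequencies
# against the Benford expectation with a chi-squared or similar test.
from collections import Counter
from math import log10

def first_digit_counts(values):
    """Count the leading nonzero digit of each value's magnitude."""
    digits = []
    for v in values:
        s = f"{abs(v):.10e}"      # scientific notation puts the leading digit first
        digits.append(int(s[0]))
    return Counter(digits)

def benford_expected(d):
    """Benford's law: P(first digit = d) = log10(1 + 1/d)."""
    return log10(1 + 1 / d)

data = [1.2, 0.013, 345, 18.7, 2.9, 1.05, 67.2, 0.19, 1.44, 30.1]
counts = first_digit_counts(data)
print(counts[1], round(benford_expected(1), 3))  # 6 0.301
```

In a Benford-conforming data set, about 30.1% of values lead with a 1; large deviations in accounting data are a classic red flag for manipulation.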
quidsapio, you missed a sentence in Anthony’s paste of the original paper. It reads, “Ken looked at entire years of data from the 1920s and 1930s for numerous different states and found that this ‘cooling’ of the old data was fairly consistent across the board.”
JFD
Lawyers, here’s your chance! Just think up a basis for a legal action based on some injury this fraud has caused. Regulations based on this chicanery have produced financial losses everywhere.
Fraud.
Anybody else anywhere else about anything else, and they’d go to jail.
We the people must stand up to the lies of AGW. This is unbelievable. Changing the truth to fit the lie is deplorable!
We have all been on about this before and will continue for some time yet, I am sure. Homogenization is a fool’s effort, as the results fool even the doer.
This is something many of us have long suspected. My guess is that the vast majority of paper documents have been or will be lost, water logged or shredded.
In 20 years time we will be shocked that glaciers were not covering the 30’s dust bowl states.
Holy Hockey-Sticks, Batman! The NOAA instrumental record is nothing but a protracted series of computer models. Take a look at the “Quality Control, Homogeneity Testing, and Adjustment Procedure”.
http://www.ncdc.noaa.gov/oa/climate/research/ushcn/ushcn.html#QUAL
At nearly every step, the data from the previous step are entered into a computer program which makes the adjustments. Examples:
“The TOB debiased data are input into the MMTS program and is the second adjustment.”
“The debiased data from the second adjustment are then entered into the Station History Adjustment Program or SHAP. ”
“Each of the above adjustments is done is a sequential manner. The areal edits are preformed first and then the data are passed through the following programs (TOBS, MMTS, SHAP and FILNET). ”
It’s impossible for an outsider to verify any of these steps since (at least going by what they put on their website) there is no way to get “under the hood” on their procedures. I wonder if anyone’s looked into getting the “code” from NOAA.
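The quoted steps describe a strictly sequential pipeline: each program’s output is the next program’s input. A toy sketch of that structure, where the four adjustment functions are invented stand-ins and not NOAA’s actual algorithms:

```python
# Toy model of the sequential adjustment structure the NCDC page
# describes. Each step consumes the previous step's output. The
# per-step offsets here are invented placeholders.

def tobs(data):    return [t + 0.2 for t in data]   # time-of-observation bias
def mmts(data):    return [t + 0.1 for t in data]   # sensor changeover
def shap(data):    return [t - 0.1 for t in data]   # station history adjustment
def filnet(data):  return data[:]                   # fill missing values (no-op here)

def adjust(raw):
    """Apply the adjustments in the documented order: TOBS, MMTS, SHAP, FILNET."""
    data = raw
    for step in (tobs, mmts, shap, filnet):
        data = step(data)
    return data

print([round(t, 1) for t in adjust([50.0, 51.0])])  # [50.2, 51.2]
```

The point of the sketch is that a reader can only audit such a chain if every step’s code and intermediate output are published, which is exactly the complaint above.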
Sure there are all sorts of “justifications” for these things published by NCDC and others, but the bottom line is that they are not representative of true reality, but of a processed reality.
———————–
Surely you can’t say this until you understand the differences between the methods by which the two averages were calculated. Assuming that the older values are correct and the new values are incorrect just because you like the older values better does not stand up.
Let’s see, speaking hypothetically because I don’t know: if it turns out that the old temps were measured at places where people prefer to live, and those places are cool spots in a hot state, that will introduce a bias if you just add the numbers up and divide by the number of places. If those places are at a particular altitude, they might warm more than other places, not necessarily in some simple linear relationship. So those old averages might be wrong.
So here is another article lined up for you. When the weather service says there were deficiencies in the old method of calculating average temps, what were those deficiencies? It must be documented somewhere.
“REPLY: Explain why virtually every adjustment made to the raw data causes a temperature trend increase. – Anthony”
I can never understand why a large pay increase has less of an impact than a small tax increase.
DocMartyn says:
June 6, 2012 at 3:12 pm
———————————————————–
Just a few minutes of Fahrenheit 451: the autoignition temperature of paper.
“In an oppressive future, a fireman whose duty is to destroy all books begins to question his task. “
http://www.imdb.com/title/tt0060390/
I’m not playing Devil’s advocate, but trying to find a sympathetic interpretation of the “adjustments”, I came up with this:
Imagine the”surface” suggested when the thermometer readings are z-coordinates and the thermometer locations are x- and y-coordinates. With some knowledge of the actual terrain, some readings may be “suspiciously” low (or high), and might be adjusted by some kriging algorithm. (Think of, say, Colorado with five thermometers spaced like the pips on the five-of-hearts playing card, all reading about the same except for the central one.)
Furthermore, faulty thermometers might be more prone to read a bit high rather than a bit low. Were this the case, a non-biased adjustment might actually “cool the past”.
I’m not saying it’s so, just thinking aloud. Whatever adjustments were made, the rationale and the algorithms should be available, as well as the raw data. Too much to ask, I fear.
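The five-of-hearts thought experiment above can be sketched as a crude neighbor check: flag a station whose reading departs sharply from the mean of its neighbors. The threshold and names are invented; real quality control is far more sophisticated than this:

```python
# Toy version of the "five-of-hearts" idea: a central station is flagged
# if it disagrees with the mean of the four corner stations by more than
# a (made-up) threshold. Purely illustrative, not an actual QC algorithm.

def flag_outlier(center, corners, threshold=5.0):
    """Return True if the center reading differs from the corner-station
    mean by more than `threshold` degrees."""
    corner_mean = sum(corners) / len(corners)
    return abs(center - corner_mean) > threshold

corners = [48.0, 49.5, 47.8, 48.9]   # four similar corner stations (°F)
print(flag_outlier(41.0, corners))   # True: central station reads far low
print(flag_outlier(48.5, corners))   # False: central station agrees
```

Whether any such flagged reading should then be altered rather than merely investigated is, of course, the whole dispute.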
What will the temperature in 1934 have been tomorrow? And the day after that? Is there some kind of time machine involved in these changes?
The chart above shows how many degrees cooler….
Should this read, “…how many degrees cooler…”?
Nuts. Make that, “how many degrees warmer“.
As Phil C noted, the Jeff Masters link goes on to explain the reasons for the change. And I think they make a lot of sense. They are going to a more modern gridded system. I think this bullet point is the key:
“1. For the TCDD, each divisional value from 1931-present is simply the arithmetic average of the station data within it, a computational practice that results in a bias when a division is spatially under sampled in a month (e.g., because some stations did not report) or is climatologically inhomogeneous in general (e.g., due to large variations in topography).”
In the original calc, the average for Arizona was just the average of whatever stations reported in each month. That’s pretty much what you have to do if you’re working with pen and paper.
It’s the reason why most climate work is now done with anomalies. Then you don’t get the effect that a month will seem warmer when mountain stations don’t report, etc.
Anyway, if you have to work with absolute temperatures, then you have to look at what kind of stations you have in the mix when calculating an average. And the least you can do is area weighting, which the new system seems to include. If you have an area represented by few stations, you give them more weight in the average.
That’s probably why the average has gone down. It’s likely that mountain regions in Arizona were underrepresented. If you upweight stations there it brings down the average.
The real question is whether this has anything to do with trend.
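The commenter’s averaging point is easy to demonstrate: a plain mean over whatever stations report differs from an area-weighted mean when one region, say cooler mountain terrain, has few stations. All numbers below are illustrative, not Arizona data:

```python
# Plain station average vs area-weighted average. Three hypothetical
# desert stations and one mountain station that represents half the
# state's area; every value is invented for illustration.

def plain_mean(temps):
    """Arithmetic mean of whatever stations reported."""
    return sum(temps) / len(temps)

def weighted_mean(temps, weights):
    """Mean with each station weighted by the area it represents."""
    total = sum(weights)
    return sum(t * w for t, w in zip(temps, weights)) / total

temps   = [55.0, 54.0, 56.0, 40.0]   # °F; last value is the mountain station
weights = [1/6,  1/6,  1/6,  1/2]    # fraction of state area represented

print(round(plain_mean(temps), 1))              # 51.2
print(round(weighted_mean(temps, weights), 1))  # 47.5
```

Upweighting the underrepresented cool region pulls the state average down, which is consistent with the direction of the Arizona change discussed above; whether it also changes the trend is a separate question.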
Golly gee, I wonder what Mosher will say about this? Will it be the standard “trust the climate scientists, they know what they’re doing”?
Lukewarmers agree with all temperature adjustments. We little people are just too… little… to understand the highly technical procedure of sorting and analyzing data, or reading thermometers for that matter.
We can expect Zeke & Co. will replicate the results, thereby validating the “adjustments”, and then later Muller will have to reevaluate his data set so that it too agrees.
Sorry for the snark, but frankly it is beyond the pale that this is called “science”. It certainly wouldn’t pass any industrial standard I’m aware of.
noaaprogrammer:
Not sure Benford’s analysis is appropriate here. I think you’d be likely to skew a little heavy towards 1 as the first digit just because of the way the numbers fall on the orders of magnitude involved.
What is wrong with you Americans? If you’re sure of the info above, why don’t you take them to court?
There are trillions of dollars riding on the CAGW fraud, and you’d be doing your taxpayers and all other taxpayers on the planet a favour if you could expose this fraud and con trick.
First move should be a spot on Fox News or whatever to get the ball rolling. But somehow you must get some REAL publicity. If a major network runs with this story, others must respond, and then print and pollies have to join in and respond as well.
This is the only way it will work, always has, always will. You’re currently discussing this in house but you need to break out into the neighbourhood.