Guest essay by Dan Travers
On Thursday, March 13, 2014, the U.S. National Climatic Data Center switched to the gridded GHCN-D data set it uses to report long-term temperature trends – with a resulting dramatic change in the official U.S. climate record. As seems to happen whenever somebody modifies the temperature record, the new version of the record shows a significantly stronger warming trend than the unmodified or, in this case, discarded version.
The new dataset, called “nClimDiv,” shows the per decade warming trend from 1895 through 2012 for the contiguous United States to be 0.135 degrees Fahrenheit. The formerly used dataset, called “Drd964x,” shows the per decade warming trend over the same period to be substantially less – only 0.088 degrees. Which is closer to the truth?
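For readers who want to check such figures themselves, a "per decade" trend is conventionally an ordinary least-squares slope fitted to the annual means and scaled from degrees per year to degrees per decade. A minimal sketch (the temperature series below is synthetic, not NCDC data):

```python
# Sketch: per-decade warming trend as an OLS slope over annual means.
# The series below is synthetic (a constructed 0.01 F/yr warming),
# not actual nClimDiv or Drd964x data.

def trend_per_decade(years, temps_f):
    n = len(years)
    mean_y = sum(years) / n
    mean_t = sum(temps_f) / n
    cov = sum((y - mean_y) * (t - mean_t) for y, t in zip(years, temps_f))
    var = sum((y - mean_y) ** 2 for y in years)
    return 10 * cov / var  # slope (deg F / year) x 10 = deg F / decade

# A synthetic series warming 0.01 F/yr should show 0.1 F/decade.
years = list(range(1895, 2013))
temps = [50.0 + 0.01 * (y - 1895) for y in years]
print(round(trend_per_decade(years, temps), 3))
```

Running the same fit against the two datasets' annual values is all it takes to reproduce the 0.135 versus 0.088 comparison, given the underlying series.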
As will be illustrated below in the side by side comparison graphs, the increase in the warming trend in the new data set is largely the consequence of significantly lowering the temperature record in the earlier part of the century, thereby creating a greater “warming” trend.
This particular manipulation has a long history. For an outstanding account of temperature record alterations, tampering, modifications and mutilations across the globe, see Joseph D’Aleo and Anthony Watts’ Surface Temperature Records: Policy-Driven Deception?
It should be noted that the 0.088 degree figure above was never reported by the NCDC. The Center’s previous practice was to use one data set for the national figures (nClimDiv or something similar to it) and a different one for the state and regional figures (Drd964x or something similar). To get a national figure using the Drd964x data, one has to derive it from the state data. This is done by taking the per decade warming trend for each of the lower forty-eight states and calculating a weighted average, using each state’s respective geographical area as the weighting.
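The area-weighting just described takes only a few lines. The state trends and land areas below are illustrative placeholders, not the actual Drd964x values:

```python
# Sketch: derive a national per-decade trend from state trends by
# area-weighting, as described above. The trends and areas here are
# hypothetical stand-ins, not the real Drd964x numbers.

state_trend_f_per_decade = {"CA": 0.07, "ME": -0.03, "MI": -0.01}  # invented
state_area_sq_mi = {"CA": 163_696, "ME": 35_380, "MI": 96_714}

def national_trend(trends, areas):
    total_area = sum(areas[s] for s in trends)
    return sum(trends[s] * areas[s] for s in trends) / total_area

print(round(national_trend(state_trend_f_per_decade, state_area_sq_mi), 3))
```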
The chart below shows a state by state comparison for the lower forty-eight states of the per decade warming trend for 1895-2012 under both the old and the new data sets.
In the past, alterations and manipulations of the temperature record have been made frequently and are often poorly documented. See D’Aleo and Watts. In this instance, it should be noted, the NCDC made a considerable effort to be forthcoming about the data set change. The change was planned and announced well in advance. An academic paper analyzing the major impacts of the transition was written by NOAA/NCDC scientists and made available on the NCDC website. See Fenimore et al., 2011. A description of the Drd964x dataset, the nClimDiv dataset, and a comparison of the two was put on the website and can be seen here.
The relatively forthcoming approach of the NCDC in this instance notwithstanding, looking at the temperature graphs side by side for the two datasets is highly instructive and raises many questions – the most basic being which of the two data sets is more faithful to reality.
Below are side by side comparisons under the two data sets for California, Maine, Michigan, Oregon and Pennsylvania for the period 1895-2009, with the annual data points being for the twelve month period in the respective year ending in November. The right-side box is the graph under the new nClimDiv dataset; the left-side box is the graph for the same period using the discarded Drd964x dataset. (The reason this particular period is shown is that it is the only one for which I have the data to make the presentation. In December 2009, I happened to copy from the NCDC website the graph of the available temperature record for each of the lower forty-eight states, and the data from 1895 through November 2009 was the most recent available at that time.)
I will highlight a few items for each state comparison that I think are noteworthy, but there is much that can be said about each of these. Please comment!
California
Left: Before, Right: After – Click to enlarge graphs
- For California, the change in datasets results in a lowering of the entire temperature record, but the lowering is greater in the early part of the century, resulting in the 0.07 degree increase per decade in the Drd964x data becoming a 0.18 degree increase per decade under the nClimDiv data.
- Notice the earliest part of the graphs, up to about 1907. In the graph on left, the data points are between 59 and 61.5 degrees. In the graph on the right, they are between 56 and 57.5 degrees.
- The dips at 1910-1911 and around 1915 in the left graph are between 57 and 58 degrees. In the graph on the right they are between 55 and 56 degrees.
- The spike around 1925 is above 61 degrees in the graph on the left, and is just above 59 degrees in the graph on the right.
Maine
· The change in Maine’s temperature record from the dataset switch is dramatic. The Drd964x data shows a slight cooling trend of negative 0.03 degrees per decade. The nClimDiv data, on the other hand, shows a substantial 0.23 degrees per decade warming.
· Notice the third data point in the chart (1898, presumably). On the left it is between 43 and 44 degrees. On the right it is just over 40 degrees.
· Notice the three comparatively cold years in the middle of the decade between 1900 and 1910. On the left the first of them is at 39 degrees and the other two slightly below that. On the right, the same years are recorded just above 37 degrees, at 37 degrees and somewhere below 37 degrees, respectively.
· The temperature spike recorded in the left graph between 45 and 46 degrees around 1913 is barely discernible on the graph at the right and appears to be recorded at 41 degrees.
Michigan
- Michigan’s temperature record went from the very slight cooling trend under Drd964x data of -0.01 degrees per decade to a warming trend of 0.21 degrees per decade under nClimDiv data.
- In Michigan’s case, the differences between the two data sets are starkly concentrated in the period between 1895 and 1930, where for the entire period the temperatures are on average about 2 degrees lower in the new data set, with relatively modest differences in years after 1930.
Oregon
· Notice the first datapoint (1895). The Drd964x dataset records it at slightly under 47.5 degrees. The new dataset puts it at slightly over 45 degrees, almost 2.5 degrees cooler.
· The first decade appears, on average, to be around 2.5 degrees colder in the new data set than the old.
· The ten years 1917 to 1926 are on average greater than 2 degrees colder in the new data set than the old.
· As is the case with California, the entire period of the graph is colder in the new data set, but the difference is greater in the early part of the twentieth century, resulting in the 0.09 degrees per decade increase shown by the Drd964x data becoming a 0.20 degree increase per decade in the nClimDiv data.
Pennsylvania
· Pennsylvania showed no warming trend at all in the Drd964x data. Under the nClimDiv data, the state experienced a 0.10 degree per decade warming trend.
· From 1895 through 1940, the nClimDiv data shows on average about 1 degree colder temperatures than the Drd964x data, followed by increasingly smaller differences in later years.
I have to admit global warming indeed is 100% CAUSED and controlled by HUMANS… they figured out that no matter what the weather does, WE CONTROL the records and can make them say whatever we desire!
This of course being completely pointless, since we all agree that the USA is only 3% of the planet ((c) J Hansen)
“He who controls the past controls the future, and he who controls the present controls the past” – George Orwell
Which is closer to the truth??? Trick question? If the truth is known why not publish that? If the truth is not known there is no valid answer to the question.
Do they ever give reasons for this?
On its face, this is nothing more than data manipulation.
Just a minor adjustment of the temperature record to better fit the CO2 record. What’s the problem!
How do they justify the lower older temperatures in the new models? Are we saying that “gridding” the data applies lower temperature readings to greater geographical areas and thus brings about a lower average temperature? Has anyone tried to reproduce the grids to see whether there is any “gerrymandering” taking place? Moving the grid around slightly could definitely change the outcome of this type of analysis, I would think. That would be a type of sensitivity analysis that I think should be done to determine whether there is an impact from where the grid is placed.
I would be interested to hear whether other more knowledgeable readers see this as a change promoting consistency, which could at least be defended, or a cynical form of data manipulation. The most important information would be why the authors or producers of this data believe that it is important to produce a gridded product versus a data set that is closer to the raw data. I would also expect that the authors would have to show some form of analysis on different grid placements. I think that it is clear that when you start to average things, you run the risk of completely muddying the waters so that no useful information can be discerned.
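The grid-placement sensitivity test suggested above could be sketched as follows: bin stations into latitude/longitude cells, average within each cell, then average the cell means; shifting the grid origin shows how much the answer depends on where cell boundaries happen to fall. The stations and cell size below are invented for illustration:

```python
# Sketch of a grid-placement sensitivity check: the same four invented
# stations, gridded twice with the cell origin shifted by half a cell.
# If the boundary shift regroups stations, the gridded mean changes.

def gridded_mean(stations, cell_deg, lat0=0.0, lon0=0.0):
    cells = {}
    for lat, lon, temp in stations:
        key = (int((lat - lat0) // cell_deg), int((lon - lon0) // cell_deg))
        cells.setdefault(key, []).append(temp)
    cell_means = [sum(v) / len(v) for v in cells.values()]
    return sum(cell_means) / len(cell_means)

stations = [(40.1, -75.2, 52.0), (40.3, -75.4, 51.5),
            (41.7, -76.8, 48.0), (42.2, -77.1, 47.5)]

base = gridded_mean(stations, cell_deg=1.0)
shifted = gridded_mean(stations, cell_deg=1.0, lat0=0.5, lon0=0.5)
print(base, shifted)  # the two means differ when stations regroup
```

Running this over many origin offsets, and reporting the spread of the resulting means, would be one concrete form of the sensitivity analysis proposed above.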
Many of the “click to enlarge” features are not working.
Is this part of the new “lie and exaggerate” policy of the alarmists?
They are both wrong and that’s the truth
Well, it’s the right time of year for cherries.
Many more of these “revisions” and the 19th century will become an ice age.
I’m just trying to figure out why, when you look at those graphs as a whole, you can’t clearly see World War I, the Depression, or World War II.
You can see the Depression in the Michigan chart (but not the others), but if temps are tied so strongly to CO2, you’d think a couple of world wars (with attendant increases in overall production and emission of CO2) would show up, at least a little.
As with all such changes, whatever the data, this puts the lie to the uncertainty they were claiming before or after, or maybe both.
“the difference is greater in the early part of the twentieth century, resulting in the 0.09 degrees per decade increase shown by the Drd964x data becoming a 0.20 degree increase per decade in the nClimDiv data.”
That’s over a full degree per century. What was the claimed accuracy on the data beforehand and what is it now? Has the uncertainty been reduced by 1 degree per century?
George Orwell had this in 1984.
Re-write history into a version of events that suits the current governments needs.
I challenge anybody to give a valid reason for changing the temperature values in 1895. Did it take until 2014 before they found out that the thermometers were badly calibrated?
According to the abstract, these adjustments were intended to fix:
“1. For the TCDD, each divisional value from 1931-present is simply the arithmetic average of the station data within it, a computational practice that results in a bias when a division is spatially undersampled in a month (e.g., because some stations did not report) or is climatologically inhomogeneous in general (e.g., due to large variations in topography).
2. For the TCDD, all divisional values before 1931 stem from state averages published by the U.S. Department of Agriculture (USDA) rather than from actual station observations, producing an artificial discontinuity in both the mean and variance for 1895-1930 (Guttman and Quayle, 1996).
3. In the TCDD, many divisions experienced a systematic change in average station location and elevation during the 20th Century, resulting in spurious historical trends in some regions (Keim et al., 2003; Keim et al., 2005; Allard et al., 2009).
4. Finally, none of the TCDD’s station-based temperature records contain adjustments for historical changes in observation time, station location, or temperature instrumentation, inhomogeneities which further bias temporal trends (Peterson et al., 1998).”
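Point 1 of the abstract can be illustrated with a toy example: a division with a warm valley station and a cool mountain station. The raw arithmetic average jumps when the mountain station misses a month, even though nothing actually warmed, while averaging anomalies (departures from each station's own long-term mean), one ingredient of gridded methods, does not. All numbers below are invented:

```python
# Toy illustration of spatial undersampling bias (point 1 above).
# Raw averaging shifts by 10 F when the cool station drops out;
# anomaly averaging is unaffected. All figures are invented.

norms = {"valley": 60.0, "mountain": 40.0}   # long-term station means

def raw_average(readings):
    return sum(readings.values()) / len(readings)

def anomaly_average(readings, norms):
    return sum(readings[s] - norms[s] for s in readings) / len(readings)

full = {"valley": 60.5, "mountain": 40.5}    # both stations +0.5 F
missing = {"valley": 60.5}                   # mountain did not report

print(raw_average(full), raw_average(missing))                        # 50.5 vs 60.5
print(anomaly_average(full, norms), anomaly_average(missing, norms))  # 0.5 vs 0.5
```

Whether this legitimate fix explains the particular changes shown in the graphs above is, of course, exactly the question the post raises.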
The land-based temperature data no longer have any meaning. Good stations without siting problems don’t show these same trends, thus invalidating this exercise. The foxes are guarding the henhouse.
How can climatologists predict the future when they can’t even predict the past?
Well I was assuming that they actually provided an error estimation for their data. So far I have not been able to even find one!
However, this is interesting on the removal of degree-day fields, since the underlying data no longer exist 😕 Huh?
>>
Qualitative Note on Degree Day Values: Preliminary analysis of degree day
data indicate that some values may be noticeably different from current
values. There are two primary reasons for this:
1) General: New dataset and methodology (see prior notes on changes to how
climate division data are computed)
2) Specific to the degree-day products: Population weights have been updated
to use 2010 Census data.
>>
Run that again? Degree-day products depend on population count? WTF?
If any aspect of your temperature record moves because of a change in “population weights,” you have a problem with your data processing!
The closer you look the worse it gets.
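For what it’s worth, the population weighting is less strange than it sounds: heating degree days are a proxy for heating demand, so the aggregate product is weighted by where people live rather than by land area, and a new census therefore moves the number even with identical temperatures. A toy sketch with invented figures:

```python
# Sketch: population-weighted heating degree days (HDD). The division
# HDD values and census counts are invented; they only show that
# updating the weights shifts the product with temperatures unchanged.

def weighted_hdd(division_hdd, pop_weights):
    total = sum(pop_weights.values())
    return sum(division_hdd[d] * pop_weights[d] for d in division_hdd) / total

division_hdd = {"north": 35.0, "south": 10.0}   # one day's HDD per division
census_2000 = {"north": 1_000_000, "south": 3_000_000}
census_2010 = {"north": 1_000_000, "south": 4_000_000}

print(weighted_hdd(division_hdd, census_2000))  # 16.25
print(weighted_hdd(division_hdd, census_2010))  # 15.0
```

That explains why a census update can move the degree-day products; it does not, and should not, move the temperature record itself.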
I suspect gridding ends up applying more homogenization resulting in upward adjustments to the very best, rural data. Once that data has been “upgraded”, the UHI contamination is complete. This is not science.
Michael E. Newton says:
April 29, 2014 at 8:38 am
How can climatologists predict the future when they can’t even predict the past?
>>
Sure they can. We can safely predict that in the next few years the past will get colder.
This is obvious from the lack of warming or “pause”. Since there is no warming this decade, the effect of GHG forcing is to make the past cooler.
You have no grasp of how climatology works. Where have you been???
For all the temperature tampering over the years see below.
http://stevengoddard.wordpress.com/?s=ncdc
http://stevengoddard.wordpress.com/?s=tampering
If the observed data do not match the predicted data, then just change the observed data sets to make the predictions look better. Most would call this pure corruption of data, while others seem to pass it off as sound climate science. It has little resemblance to any science.
Mosher Drive By in 3…2…1…
Andrew
Well then. The ‘climate change’ we are currently not experiencing must be that much worse due to the steepness of this new incline.
I am sure the NCDC will provide the computer codes that they used to calculate all of the cited “adjustments” (particularly TOBS). /sarc