One-Way Adjustments: The Latest Alteration to the U.S. Climate Record

Guest essay by Dan Travers

On Thursday, March 13, 2014, the U.S. National Climatic Data Center switched to a gridded GHCN-Daily data set for reporting long-term temperature trends – with a resulting dramatic change in the official U.S. climate record. As always seems to happen when somebody modifies the temperature record, the new version shows a significantly stronger warming trend than the unmodified – or, in this case, discarded – version.

The new dataset, called “nClimDiv,” shows the per decade warming trend from 1895 through 2012 for the contiguous United States to be 0.135 degrees Fahrenheit. The formerly used dataset, called “Drd964x,” shows the per decade warming trend over the same period to be substantially less – only 0.088 degrees. Which is closer to the truth?

As the side-by-side comparison graphs below illustrate, the stronger warming trend in the new data set is largely the consequence of significantly lowering the temperature record in the earlier part of the century, thereby creating a greater apparent warming.
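The mechanics here are easy to demonstrate. The following is a minimal sketch using synthetic temperatures (invented values, not actual NCDC data) showing how lowering only the early part of a series steepens its least-squares trend:

```python
import numpy as np

# Synthetic annual temperatures for 1895-2012 -- illustrative values only,
# not actual NCDC data: a small underlying trend plus noise.
rng = np.random.default_rng(0)
years = np.arange(1895, 2013)
temps = 52.0 + 0.0088 * (years - 1895) + rng.normal(0.0, 0.8, years.size)

def per_decade_trend(years, temps):
    """Least-squares slope, converted from degrees/year to degrees/decade."""
    return np.polyfit(years, temps, 1)[0] * 10

trend_original = per_decade_trend(years, temps)

# Lower only the pre-1930 portion of the record by 0.5 degrees, loosely
# mimicking the kind of early-century lowering described in the article.
temps_lowered_early = np.where(years < 1930, temps - 0.5, temps)
trend_adjusted = per_decade_trend(years, temps_lowered_early)

print(f"original trend: {trend_original:.3f} deg/decade")
print(f"after lowering pre-1930 values: {trend_adjusted:.3f} deg/decade")
```

No post-1930 value changes at all, yet the fitted trend comes out noticeably steeper.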

This particular manipulation has a long history. For an outstanding account of temperature record alterations, tampering, modifications and mutilations across the globe, see Joseph D’Aleo and Anthony Watts’ Surface Temperature Records: Policy-Driven Deception?

It should be noted that the 0.088 degree figure above was never reported by the NCDC. The Center’s previous practice was to use one data set for the national figures (nClimDiv or something similar to it) and a different one for the state and regional figures (Drd964x or something similar). To get a national figure using the Drd964x data, one has to derive it from the state data. This is done by taking the per decade warming trend for each of the lower forty-eight states and calculating a weighted average, using each state’s geographical area as the weighting.
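As a sketch of that derivation, here is the weighted-average arithmetic applied to just the five states discussed below (their Drd964x trends as given in this article, with approximate state areas); the actual national figure would of course weight all forty-eight states:

```python
# Per-decade trends (deg F, Drd964x, 1895-2012) for the five states discussed
# in this article; areas are approximate total square-mile figures.
# A real national figure would include all forty-eight states.
state_trends = {
    "California": 0.07,
    "Maine": -0.03,
    "Michigan": -0.01,
    "Oregon": 0.09,
    "Pennsylvania": 0.00,
}
state_areas = {
    "California": 163_695,
    "Maine": 35_380,
    "Michigan": 96_714,
    "Oregon": 98_379,
    "Pennsylvania": 46_054,
}

# Weighted average: each state's trend counts in proportion to its area.
total_area = sum(state_areas.values())
weighted_trend = sum(
    state_trends[s] * state_areas[s] for s in state_trends
) / total_area

print(f"area-weighted trend for these five states: {weighted_trend:.3f} deg/decade")
```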

The chart below shows a state by state comparison for the lower forty-eight states of the per decade warming trend for 1895-2012 under both the old and the new data sets.

[Chart: state-by-state comparison of per-decade warming trends, 1895-2012, Drd964x vs. nClimDiv]

In the past, alterations and manipulations of the temperature record have been made frequently and are often poorly documented. See D’Aleo and Watts. In this instance, it should be noted, the NCDC made a considerable effort to be forthcoming about the data set change. The change was planned and announced well in advance. An academic paper analyzing the major impacts of the transition was written by NOAA/NCDC scientists and made available on the NCDC website. See Fenimore et al., 2011. A description of the Drd964x dataset, the nClimDiv dataset, and a comparison of the two was put on the website and can be seen here.

The relatively forthcoming approach of the NCDC in this instance notwithstanding, looking at the temperature graphs side by side for the two datasets is highly instructive and raises many questions – the most basic being which of the two data sets is more faithful to reality.

Below are side-by-side comparisons under the two data sets for California, Maine, Michigan, Oregon and Pennsylvania for the period 1895-2009, with the annual data points being for the twelve-month period ending in November of the respective year. The right-side box is the graph under the new nClimDiv dataset; the left-side box is the graph for the same period using the discarded Drd964x dataset. (The reason this particular period is shown is that it is the only one for which I have the data to make the presentation. In December 2009, I happened to copy from the NCDC website the graph of the available temperature record for each of the lower forty-eight states, and the data from 1895 through November 2009 was the most recent available at that time.)

I will highlight a few items for each state comparison that I think are noteworthy, but there is much that can be said about each of these. Please comment!

 

California

[Graphs: California temperature record, 1895-2009 – Drd964x (left), nClimDiv (right)]

Left: Before, Right: After

  • For California, the change in datasets results in a lowering of the entire temperature record, but the lowering is greater in the early part of the century, resulting in the 0.07 degree per decade increase in the Drd964x data becoming a 0.18 degree per decade increase under the nClimDiv data.
  • Notice the earliest part of the graphs, up to about 1907. In the graph on left, the data points are between 59 and 61.5 degrees. In the graph on the right, they are between 56 and 57.5 degrees.
  • The dips at 1910-1911 and around 1915 in the left graph are between 57 and 58 degrees. In the graph on the right they are between 55 and 56 degrees.
  • The spike around 1925 is above 61 degrees in the graph on the left, and is just above 59 degrees in the graph on the right.

Maine

[Graphs: Maine temperature record, 1895-2009 – Drd964x (left), nClimDiv (right)]

· The change in Maine’s temperature record from the dataset switch is dramatic. The Drd964x data shows a slight cooling trend of -0.03 degrees per decade. The nClimDiv data, on the other hand, shows substantial warming of 0.23 degrees per decade.

· Notice the third data point in the chart (1898, presumably). On the left it is between 43 and 44 degrees. On the right it is just over 40 degrees.

· Notice the three comparatively cold years in the middle of the decade between 1900 and 1910. On the left the first of them is at 39 degrees and the other two slightly below that. On the right, the same years are recorded just above 37 degrees, at 37 degrees and somewhere below 37 degrees, respectively.

· The temperature spike recorded in the left graph between 45 and 46 degrees around 1913 is barely discernible on the graph at the right and appears to be recorded at 41 degrees.

Michigan

[Graphs: Michigan temperature record, 1895-2009 – Drd964x (left), nClimDiv (right)]

  • Michigan’s temperature record went from a very slight cooling trend of -0.01 degrees per decade under Drd964x data to a warming trend of 0.21 degrees per decade under nClimDiv data.
  • In Michigan’s case, the differences between the two data sets are starkly concentrated in the period between 1895 and 1930, where for the entire period the temperatures are on average about 2 degrees lower in the new data set, with relatively modest differences in years after 1930.

Oregon

[Graphs: Oregon temperature record, 1895-2009 – Drd964x (left), nClimDiv (right)]

· Notice the first datapoint (1895). The Drd964x dataset records it at slightly under 47.5 degrees. The new dataset puts it at slightly over 45 degrees, almost 2.5 degrees cooler.

· The first decade appears, on average, to be around 2.5 degrees colder in the new data set than the old.

· The ten years 1917 to 1926 are on average more than 2 degrees colder in the new data set than the old.

· As is the case with California, the entire period of the graph is colder in the new data set, but the difference is greater in the early part of the twentieth century, resulting in the 0.09 degree per decade increase shown by the Drd964x data becoming a 0.20 degree per decade increase in the nClimDiv data.

Pennsylvania

[Graphs: Pennsylvania temperature record, 1895-2009 – Drd964x (left), nClimDiv (right)]

 

· Pennsylvania showed no warming trend at all in the Drd964x data. Under the nClimDiv data, the state experienced a 0.10 degree per decade warming trend.

· From 1895 through 1940, the nClimDiv data shows on average about 1 degree colder temperatures than the Drd964x data, followed by increasingly smaller differences in later years.

 

224 Comments
Nick Stokes
April 29, 2014 11:37 pm

John Slayton says: April 29, 2014 at 10:57 pm
“The report clearly states that corrections have been made”

People here have funny ideas about adjustments. They aren’t “alterations to the historic record”. They are usually someone taking a mass of digitized data and putting it through a computer algorithm prior to calculation of a regional or global average. It’s not “correcting the past”. It’s trying to estimate what is representative of the region.
There was no capability to do this in 1892. Thermometer data was read, written into a log book, sent to some central office with telegrams or written forms, when it may have got into the newspaper. Then it sat, until unearthed by digitizers in the 1970’s or thereabouts.
Those records still largely exist, as do the newspaper reports. No-one has gone through manually altering them. And they were transcribed into GHCN as part of a major project in the early 1990’s. GHCN re-issued them on CD’s. That’s not how adjustment happens.
GHCN publishes daily and monthly unadjusted data. I’ve compared some of that with contemporary news reports (some reported here). It matches. And you can check that the monthly data is indeed an arithmetic average of the daily data.
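The arithmetic check Stokes describes – that a published monthly mean is simply the average of the daily values – can be sketched as follows (the daily values here are invented, not actual GHCN records):

```python
# Invented daily TMAX values for one station-month; in practice these would
# come from the GHCN-Daily file for the station being checked.
daily_tmax = [41.2, 39.8, 44.1, 45.0, 40.3, 38.7, 42.9]

def matches_monthly(daily, published_monthly, tol=0.05):
    """True if the published monthly figure is the arithmetic mean of the
    daily values, allowing for rounding in the published number."""
    return abs(sum(daily) / len(daily) - published_monthly) <= tol

# The mean of the dailies is about 41.71; a published 41.7 checks out,
# while a value of 43.0 would flag a discrepancy worth investigating.
print(matches_monthly(daily_tmax, 41.7))   # True
print(matches_monthly(daily_tmax, 43.0))   # False
```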

David A
April 30, 2014 1:06 am

drumphil says: “So, can anyone falsify Mosher’s claims by doing what he suggests, and coming up with a different answer?”
===================================================================
I have some very simple suggestions of my own for Mr. Mosher. My challenge to him is to start by explaining JUST this one station’s adjustments. http://stevengoddard.wordpress.com/2014/03/01/spectacular-data-tampering-from-giss-in-iceland/
http://stevengoddard.wordpress.com/2014/01/17/if-iceland-likes-their-1940s-warm-period-they-cant-keep-their-1940s-warm-period-period/
You see, even the meteorologist in charge of that station says these adjustments are bogus.
Now let’s go from the particular to something a little more general. If Mr. Mosher wishes to graduate from explaining just the Iceland adjustments, perhaps he would be so kind as to explain the US adjustments. TOB does not account for this.
http://stevengoddard.wordpress.com/data-tampering-at-ushcngiss/
The graphs are pretty well labeled, so a detailed rebuttal of each graph is required.
Going global, it appears that GISS and RSS are diverging ever more.
http://stevengoddard.wordpress.com/2013/05/10/giss-rapidly-diverging-from-rss/
I am open to an explanation of this as well.
I have seen similar questions raised about European T records, and Australia’s records as well.
Mr. Mosher will not take the time to answer these challenges, just as he did not answer challenges to his rude assertion calling a thousand-page-plus scientific report, referencing hundreds of cogent peer reviewed studies, a report written by “clowns”, here… http://wattsupwiththat.com/2014/04/20/dueling-climate-reports-this-one-is-worth-sharing-on-your-own-blog/#comment-1617414
Mr. Mosher often admonishes skeptics for not doing their homework. Here he did not do his homework. The NIPCC report is very long, with hundreds of references to historical peer reviewed science on the real world effects of CO2 on the biosphere and biology of this planet. It takes on many of the purported harms of CO2, and shows where and why those suggestions are likely wrong, not using models, but real world studies. It is a study of the real world effects showing that the projected harm did not happen (except in the sense of the predicted harm failing to happen). The study is chock-full of those kinds of statistics. Mosher failed to address ANY of the hundreds of scientific observed, measured and documented facts in a very long report.
BTW, Mr. drumphil, you never answered my rebuttal on the same thread to you here…http://wattsupwiththat.com/2014/04/20/dueling-climate-reports-this-one-is-worth-sharing-on-your-own-blog/#comment-1618029

David A
April 30, 2014 1:10 am

Nick Stokes says:
April 29, 2014 at 4:19 pm
DR says: April 29, 2014 at 3:53 pm
“As observed by Steven Goddard, there have been massive adjustments for Michigan temperatures.”
===============
Nick says..”They aren’t massive adjustments to the same data. They are different datasets with a different mix of stations. You just can’t compare regional averages unless they use anomalies.
=================================
What Mr. Goddard’s graph shows is the total difference between NCDC reported temperatures and GHCN HCN raw temperatures. What Mr. Goddard often does is take ALL GHCN continuously active stations, and use those for a basis. In this case, using those stations, the reported T matches the record-smashing ice on both the northern and more southern Great Lakes, as the clear-cut coldest March on record. This is, IMV, both constructive and informative. In the case of the US as a whole, this method far more closely matches the early official US charts, shown in this post of Mr. Goddard here… http://stevengoddard.wordpress.com/data-tampering-at-ushcngiss/
You see, when you attempt to factor in THOUSANDS of station changes, and just as many disparate siting issues at each station, and a very uncertain science on UHI with disparate views by educated people, and add in a legitimately questionable TOB adjustment, and you shake and bake the entire record into regional averaging and anomalies of ever changing stations, the chances of ending up with a FUBAR record grow exponentially.
E.M. Smith did some excellent articles on this cauldron of mashed assumptions and station changes. Mark Bofill, I highly recommend you go to his site here. E.M. is both brilliant and a very, very rare straight shooter. http://chiefio.wordpress.com/category/agw-and-gistemp-issues/
The concomitant cooling of the past (both the attempted scrubbing of the MWP and the false cooling of the late ’30s and early ’40s in the US and other places), in combination with questionable adjustments, the recent divergence from RSS, UHI questions, etc., all lead to my perspective that the VERY MINOR warming detected (certainly nothing outside of past periods of warming), plus the failed predictions of doom, is “much ado about nothing”.

SandyInLimousin
April 30, 2014 1:13 am

One difference between Nick Stokes and Steve Mosher: SM does a drive-by, while NS comes back and continues what he has started. Whether you agree with what Nick says or not, he is to be congratulated for getting into a discussion. Until Steve Mosher does the same he should be ignored, as a troll would/should be.

April 30, 2014 1:20 am

Fascinating stuff, but you guys should be very wary of seeing conspiracy in everything. A sober analysis of the justifications and the methodology is badly needed, but the changes should not be dismissed as part of THE AGENDA out of hand.
Remember that this is USA-only ground station data, and it will be no more than a footnote to the greater debate, unless of course the same kind of adjustments can be rolled out globally…
But what is also clear is that, if we take the revised data at face value, it becomes much more difficult to establish any ACCELERATION in the warming trend for USA, as it backdates much of the warming to the earliest part of the graph, a time when CO2 emissions were substantially less, thereby flattening the curve. So if it IS a conspiratorial move it may be counter-productive.

Nick Stokes
April 30, 2014 2:07 am

David A says: April 30, 2014 at 1:10 am
“What Mr. Goddard’s graph shows is the total difference between NCDC reported temperatures and GHCN HCN raw temperatures. What Mr. Goddard often does is take ALL GHCN continuously active stations, and use those for a basis.”

And that’s a problem with making up your own index. The answer you get depends on the mix of stations. More North, and the average goes down. USHCN has tried to make their distribution reasonably representative, though USCRN is more systematic. But GHCN historically was just a collection of whatever data could be retrieved. No one ever tried to make it representative of Michigan.
As to whether it “matches” the ice – that’s a very subjective judgment. The data for Alpena suggests otherwise.

David A
April 30, 2014 3:02 am

Alpena is only one location, assuming raw data and no siting issues, and assuming no adjustments to the past record at that location. The ice is not subjective. The Great Lakes are less affected by currents than the ocean ice shelves (no warm or cold currents from far away locations). Wind is likewise less of a factor on the Great Lakes (a landlocked body of water) and can be both positive and negative to ice coverage.
Beyond that, the Great Lakes cover a massive area, and the ice is, to a degree, one massive thermometer covering a vast area. It is the coldest year on record ON the Great Lakes, and in the continuously active GHCN stations, and not by just a little. (This covers a very large area) Adding in the testimony of the Great Lakes ice being massively above the previous record, is a cogent and constructive argument.
Also, my general arguments on using all continuously active locations are sound, as are my points with regard to the likely errors of the current system you advocate, which I call FUBAR.
Your point on the differences between the two data sets is not without potential merit, however. A map of the differences in the locations would be informative. Keep in mind, the location of the storm tracks, and the cold air following those weather systems, can easily overwhelm a hundred or more miles of N/S difference. In addition, Goddard’s number one coolest ranking for those stations is ALSO based on the past of those same locations.
Additionally, most of Mr. Goddard’s maps are not composed of different data sets, as this link shows:
http://stevengoddard.wordpress.com/data-tampering-at-ushcngiss/

David A
April 30, 2014 3:16 am

One final point about the continuously active GHCN stations. If they were in error due to random station location, they would show either a warmer or cooler past and present, compared to your “well chosen” stations. However, the GHCN stations CONSISTENTLY show a warmer past, and a cooler present, than the adjusted, homogenized current data sets. A random selection of stations should average out.
This consistency is testimony to the assertion of a manmade cooling of the past and warming of the present, and is in line with Jim Hansen’s early US T charts.

Nick Stokes
April 30, 2014 3:32 am

David A says: April 30, 2014 at 3:16 am
“One final point about the continuously active GHCN stations. If they were in error due to random station location, they would show either a warmer or cooler past and present, compared to your ‘well chosen’ stations. However the GHCN stations CONSISTENTLY show a warmer past, and a cooler present”

Yes, of course. They aren’t located randomly. GHCN stations are predominantly in the south of the state. That’s why you get a warmer average. But SG attributes that to “adjustment”.
You get more warmth in the past because that’s when GHCN was just collecting whatever it could. When it became an ongoing program in the mid 90s, that got rationalized, with a more even distribution.
People who do averages properly know all this. They use gridding and area weighting to avoid distribution biases. And they use anomalies to counter the effects of things like latitude.
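The gridding-and-area-weighting approach Stokes describes can be illustrated with a toy sketch (station values and grid cells invented for the example):

```python
import numpy as np

# Toy example: two stations in a southern grid cell, one in a northern cell.
# All values are invented for illustration.
stations = {
    # name: (grid cell, mean temperature, deg F)
    "south_a": ("cell_S", 50.0),
    "south_b": ("cell_S", 51.0),
    "north_a": ("cell_N", 42.0),
}

# A plain average over stations is pulled warm by the over-sampled south.
plain_mean = np.mean([temp for _, temp in stations.values()])

# Gridded average: mean the stations within each cell, then mean the cells,
# so each region counts once regardless of how many stations it has.
cells = {}
for cell, temp in stations.values():
    cells.setdefault(cell, []).append(temp)
gridded_mean = np.mean([np.mean(temps) for temps in cells.values()])

print(f"plain:   {plain_mean:.2f}")    # biased toward the south
print(f"gridded: {gridded_mean:.2f}")
```

Anomalies address the related latitude problem: a station’s departure from its own long-term mean is comparable across stations even when their absolute temperatures are not.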

John Slayton
April 30, 2014 3:47 am

Nick Stokes: People here have funny ideas about adjustments.
May well be true. Your description of adjustment would leave one with the impression that adjustments are only about groups of stations, and are applied uniformly across those groups. Things might be heading in that direction, but they haven’t been so restricted up to now, at least if we are to believe NCDC’s description of the procedure at http://www.ncdc.noaa.gov/oa/climate/research/ushcn/ushcn.html#QUAL (yeah, my bold):
The data for each station in the USHCN are subjected to the following quality control and homogeneity testing and adjustment procedures.
A quality control procedure is performed… to identify suspects (> 3.5 standard deviations away from the mean) and outliers (> 5.0 standard deviations). Until recently these suspects and outliers were hand-verified with the original records.
….
The homogeneity adjustment scheme described in Karl and Williams (1987) is performed using the station history metadata file to account for time series discontinuities due to random station moves and other station changes.
….
Currently all data adjustments in the USHCN are based on the use of metadata.

I find it hard to believe you are serious when you say adjustments are not ‘alterations to the historic record’ or that they are not ‘correcting the past.’ If I go to http://cdiac.ornl.gov/cgi-bin/broker?id=048839&_PROGRAM=prog.gplot_meanclim_mon_yr2012.sas&_SERVICE=default&param=TMAX&minyear=1893&maxyear=2012 and get the station history graph for Tejon Ranch TMAX, I get a very different history than if I pull down Tejon Ranch TMAXTOBS at http://cdiac.ornl.gov/cgi-bin/broker?id=048839&_PROGRAM=prog.gplot_meanclim_mon_yr2012.sas&_SERVICE=default&param=TMAXTOBS&minyear=1893&maxyear=2012 .
In the cited 1892 document, the head of the Weather Bureau explicitly says ‘corrections’ have been made that bring urban temperatures into line with surrounding areas. Are you suggesting he was mistaken? My intention here was not to imply that the current processes of systematic adjustment are in any way similar to his ‘corrections.’ I probably should have stated that in the first place, but I thought it was adequately suggested by the phrase ‘ad hoc basis by persons unknown.’

Gail Combs
April 30, 2014 3:55 am

John F. Hultquist says: April 29, 2014 at 10:16 pm
Gail Combs says “I caught them lying …”
Perhaps the NWS reported “Water Equivalent” rather than snow depth insofar as snows are greatly different in the former.
>>>>>>>>>>>>>>>>>>>
NO
Jeff Masters originally reported the February 12th readings as RAIN and the temperatures as above freezing. I called him a liar on his blog. He then changed it to a high above freezing and snow. When I called him a liar a second time, he changed the records to a high of 31°F and 0.37 inches of snow.
I wish I had gotten screen shots to prove what I am saying but I did not. One does not expect the past to change like it does in some Orwell novel.
Another example is the Norfolk City vs Norfolk International Airport I posted here on WUWT a few years ago.

…Here is a quick look at the only city & close by airport listed for North Carolina. The city is on the North Carolina/Virginia border and right on the ocean. Take a look at the city vs the airport! Norfolk City and
Norfolk International Airport
Here is a graph of the raw 1856 to 2009 Atlantic Multidecadal Oscillation. Amazing how the temperatures follow the Atlantic ocean oscillation as long as the weather station is not sitting at an airport, isn’t it?

The Norfolk International Airport was a straight line of ever increasing temperatures while Norfolk City followed the oscillations of the Atlantic Multidecadal Oscillation.
Now those charts are no longer available and the Norfolk International Airport chart matches Norfolk City and the Atlantic Multidecadal Oscillation.
What is also interesting is NONE of the NC ‘official weather stations’ are any further into the mountains than Chapel Hill in the middle of the state (Piedmont.)
Again from that older comment of mine.

I live in North Carolina and did a quick look at my state. It is VERY interesting.
There is nothing closer to the mountains than Chapel Hill which is just west of Raleigh. All the areas with longitudes further west are also further south and that puts them on the seacoast. Chapel Hill is on the plains. Seems the mountain areas are no longer part of the record, imagine that.
I also found that at Wunderground the Moncure NC station is no longer available; it flips you to the new Sanford NC airport. It was available the last time I looked. Asheville NC is the big city in the mountains, home of the Biltmore Estate (1895). Its weather station now only goes back to 2005, at the AIRPORT of course. The site also directs you to the “nearby” (80 miles) city of Greenville (downtown), which only goes back to 1970. That site has been declared “unofficial.” The “official” station is now the Greenville/Spartanburg, South Carolina (Airport)….

E. M. Smith in looking at the “Death of the Thermometers” found similar situations where the thermometers that were dropped were in the mountains.

GHCN – GIStemp Interactions – The Bolivia Effect
Alright Already, what is this Bolivia Effect?
Notice that nice rosy red over the top of Bolivia? Bolivia is that country near, but not on, the coast just about half way up the Pacific Ocean side. It has a patch of high cold Andes Mountains where most of the population live….
One Small Problem with the anomaly map. There has not been any thermometer data for Bolivia in GHCN since 1990.
None. Nada. Zip. Zilch. Nothing. Empty Set.
So just how can it be so Hot Hot Hot! in Bolivia if there is NO data from the last 20 years?
Easy. GIStemp “makes it up” from “nearby” thermometers up to 1200 km away. So what is within 1200 km of Bolivia? The beaches of Chile, Peru and the Amazon Jungle.
Not exactly the same as snow capped peaks and high cold desert, but hey, you gotta make do with what you have, you know? (The official excuse given is that the data acceptance window closes on one day of the month and Bolivia does not report until after that date. Oh, and they never ever would want to go back and add data into the past after a close date. Yet they are happy to fiddle with, adjust, modify, and wholesale change and delete old data as they change their adjustment methods…)

So THEY LIE and they have been caught at it repeatedly.

Mark Bofill
April 30, 2014 4:29 am

Ron House says:
April 29, 2014 at 6:17 pm
{…}

The first part is a critique of skeptics’ strategy

Yes, this part is what I was referring to. More specifically Steven’s argument that the temperature record isn’t the way to go.

Re (1), there is no central skeptical command directing strategy, so everyone does what they like, and good on them.

So who’s been sending me the checks? Sure, they’re signed ‘Big Oil’ but the check says the bank name is First Skeptic Central. 😉
No seriously, obviously we all know that. I’m pretty sure Steven knew it when he wrote the comment. I was just curious about Anthony’s take.
Thanks for your response.

John
April 30, 2014 5:07 am

How can these people sleep at night, knowing they’re introducing false temperature readings?

Nick Stokes
April 30, 2014 5:09 am

John Slayton says: April 30, 2014 at 3:47 am
“and get the station history graph for Tejon Ranch TMAX, I get a very different history than if I pull down Tejon Ranch TMAXTOBS”

There’s no mystery there. You can get the original data, or you can get data with TOBS correction applied. It’s clearly labelled. Are you saying that knowledge of the effect of TOBS should be suppressed?
As to the head of the Weather Bureau, he didn’t “explicitly say ‘corrections’ have been made that bring urban temperatures into line with surrounding areas” at all. He said, in your quote: “with the shelters now in use and considering the various corrections applied, the readings are substantially the same.”
He said, “the readings are substantially the same”. He didn’t say the corrections made them so, or that that was their purpose. I suspect it’s a reference to instrument calibrations. Incidentally, what “Weather Bureau” are we talking about?

Bob Kutz
April 30, 2014 5:57 am

Re Nick Stokes @5.09;
“Are you saying that knowledge of the effect of TOBS should be suppressed? ”
Are you saying that there is some new knowledge of TOBS that we did not previously possess and that this knowledge resulted in a unidirectional change to 100 year old data?
The fact that there have been a series of these corrections, and that they always go in the same direction, tells me more about the scientists than the data or our knowledge.

Nick Stokes
April 30, 2014 6:14 am

Bob Kutz says: April 30, 2014 at 5:57 am
“Are you saying that there is some new knowledge of TOBS that we did not previously possess and that this knowledge resulted in a unidirectional change to 100 year old data?”

It’s as old as USHCN itself. But data with and without TOBS correction has always been available. Do you think it should be otherwise?
TOBS is clearcut. Many observing times changed. The changes have been recorded. It was a systematic change from the previously set 5pm to a now common 9am or 10am. It affects readings in a predictable way. It’s hard to justify not correcting.

April 30, 2014 6:21 am

Nick Stokes says:
April 30, 2014 at 6:14 am

TOBS is clearcut. Many observing times changed. The changes have been recorded. It was a systematic change from the previously set 5pm to a now common 9am or 10am. It affects readings in a predictable way. It’s hard to justify not correcting.

Ah, no. This is an utter fabrication. The adjustments are not at all in line with any reasonable TOBS adjustment & you know it.

Latitude
April 30, 2014 6:32 am

OK….so who thinks you can adjust data…and get an anomaly?
..show of hands

Bob Kutz
April 30, 2014 7:14 am

Nick, your understanding of TOBS differs from mine in such a way that one of us is just making up a story that does not reflect reality.
From GHCN;
The TOB software is an empirical model used to estimate the time of observation biases associated with different observation schedules and the routine computes the TOB with respect to daily readings taken at midnight. Details on the procedure are given in, “A Model to Estimate the Time of Observation Bias Associated with Monthly Mean Maximum, Minimum, and Mean Temperatures.” by Karl, Williams, et al.1986, Journal of Climate and Applied Meteorology 15: 145-160.
SO . . . Mr. Stokes, since your story varies wildly from what the GHCN claims on their own website, maybe you ought to think about looking into things before making assertions.
“From the previously set 5pm to a now common 9am or 10 am.”???
Typical warmist sheep. You don’t bother looking into things and asking why. You just accept the gruel you are fed and defend those who provide it, to the point of just making things up as needed.
Nick, the data set without TOBS shows about half the warming of the data set with TOBS added. The most recent data set increases the difference. How is it that our understanding of temperature observations 100 years ago now allows us to more accurately adjust them ALL DOWN?
Doesn’t that even pique your curiosity, just a little bit???

John Slayton
April 30, 2014 7:34 am

Nick Stokes: Incidentally, what “Weather Bureau” are we talking about?
Ah, good, a question where we might find an agreed upon answer. It was, per the title page:
U.S. Department of Agriculture. Report of the Chief of the Weather Bureau for 1891.
This was the first report since the Weather Bureau was moved from the Signal Corps to the Ag Department. The Chief referred to was Professor Mark W. Harrington of the University of Michigan.
A short publication history and easy archive of Bureau reports is available at:
http://www.lib.noaa.gov/collections/imgdocmaps/reportofthechief.html
Well, it’s a new day in LA and it’s going to be hot. Been nice chatting with you.
: >)

David A
April 30, 2014 7:53 am

Nick Stokes says…
David A says: April 30, 2014 at 3:16 am
“One final point about the continuously active GHCN stations. If they were in error due to random station location, they would show either a warmer or cooler past and present, compared to your ‘well chosen’ stations. However the GHCN stations CONSISTENTLY show a warmer past, and a cooler present”
==============================================================
Nick…”Yes, of course. They aren’t located randomly. GHCN stations are predominantly in the south of the state. That’s why you get a warmer average. But SG attributes that to “adjustment”.
======================================================================
David…, that is assertion without evidence, and, as my post made abundantly clear, this is consistently true throughout the US. BTW, I notice you forgot to mention the cooler present, which I stated. If they are further south, why are they not warmer in the present?
=================
Nick says…You get more warmth in the past because that’s when GHCN was just collecting whatever it could. When it became an ongoing program in the mid 90s, that got rationalized, with a more even distribution.
=======================================
David…, Hmm? That fails basic logic. I again stated that a random distribution would produce random results. The warmist side has consistently been criticized because all their adjustments (cooling the past, warming the present) just happen to lead one to an increased perception of CAGW.
Once again, if those continuously active stations were consistently too warm in the past, they would be too warm today; but now those random, unbiased stations just happen to be too cool in the present. Sounds like you are trying to have your cake and eat it too.
Again, they are warmer in the past and cooler in the present: clear evidence of the artificial and likely wrong nature of the adjustments.
=========================================================
Nick says…People who do averages properly know all this. They use gridding and area weighting to avoid distribution biases. And they use anomalies to counter the effects of things like latitude.
===============================================================
David…, Nick, I am sorry if I do not bow at the feet of those who can decipher the holy book of CO2, but rudimentary logic precludes your assertion. By answering only one small part of my post, and very poorly at that, you are not convincing. Also, as I already stated: when you attempt to factor in THOUSANDS of station changes and just as many disparate siting issues at each station, plus a very uncertain science on UHI with disparate views among educated people, add in a legitimately questionable TOB adjustment, and then shake and bake the entire record into regional averaging and anomalies of ever-changing stations, the chances of ending up with a FUBAR record grow exponentially.
The truth is the station choices have a curious tendency to move toward airports, cool the past, warm the present, and, through algorithms and the elimination of many stations, spread their anomalies over large areas. Now, we are not talking about a major movement of whatever minor or nonexistent warming there is, but even 0.10 or 0.05 over ten or so years is a large swing on the graphs that are produced. As I said, the increasing divergence of RSS from the surface data sets is as great as the predicted warming trend. I also challenged you and Mr. Mosher to just explain the Iceland adjustments, and I linked you to a series of graphs by Mr. Goddard here, http://stevengoddard.wordpress.com/data-tampering-at-ushcngiss/, where your mix of data sets does not even happen for the most part. You and Mr. Mosher failed to respond.
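[Editor's note: the gridding and area weighting Nick Stokes describes in the exchange above can be sketched briefly. This is a minimal illustration in Python with invented station coordinates and temperatures, not any agency's actual method: stations are binned into lat/lon cells, cell means are combined with cosine-of-latitude area weights, so a dense southern cluster cannot dominate the average.]

```python
# Minimal sketch of gridded, area-weighted averaging. Station
# locations and temperatures below are invented for illustration.
import math
from collections import defaultdict

# (lat, lon, temperature_F): two clustered southern stations, one northern
stations = [
    (32.0, -86.5, 68.0),
    (32.5, -87.0, 67.5),
    (34.5, -86.8, 62.0),
]

def gridded_mean(stations, cell_deg=1.0):
    """Average stations within each lat/lon cell, then combine cell
    means weighted by relative cell area (proportional to cos(lat))."""
    cells = defaultdict(list)
    for lat, lon, t in stations:
        key = (math.floor(lat / cell_deg), math.floor(lon / cell_deg))
        cells[key].append((lat, t))
    num = den = 0.0
    for members in cells.values():
        mean_lat = sum(lat for lat, _ in members) / len(members)
        cell_mean = sum(t for _, t in members) / len(members)
        w = math.cos(math.radians(mean_lat))  # cell area shrinks with latitude
        num += w * cell_mean
        den += w
    return num / den

naive = sum(t for _, _, t in stations) / len(stations)
print(round(naive, 2))                   # simple mean, dominated by the southern pair
print(round(gridded_mean(stations), 2))  # gridded mean, each cell counted once
```

With this toy data the gridded mean comes out cooler than the simple mean, because the two southern stations collapse into a single grid cell. Anomalies (departures from each station's own baseline) would be the further step for removing the latitude effect itself.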

JustAnotherPoster
April 30, 2014 8:04 am

Stokes.
How can you compare and use a station that recorded a max temp of, let's say, 65 degrees at 5 pm fifty years ago with one recording 60 degrees at 9 am today?
And why do the adjustments ALWAYS go the same way…
This blog post on an SAP data-analytics site is still perhaps one of the best I've seen with regard to global warming.
http://scn.sap.com/community/hana-in-memory/blog/2013/11/06/big-data-geek–is-it-getting-warmer-in-virginia–noaa-hourly-climate-data–part-2 (and featured here a while back)
The pure unadjusted data is loaded in from raw CSV files, the methodology is open… and guess what: the data shows absolutely zero warming trend at all.
The biggest mystery of climate science is that when other clever people analyse the data, it shows flat trend lines; yet after NOAA and other agencies process the data, that becomes a warming trend.
It's one of the world's intriguing mysteries: non-climate scientists often struggle to come up with hockey sticks, warming trends, or any sort of “human fingerprint,” save for the odd UHI effect.
Once you start adjusting data, the data isn't a record of anything anymore; it's your interpretation of what the data should look like.
Again, just because you think the adjustments are valid, and they have peer-reviewed papers behind them, doesn't make them valid. That is your own opinion. I don't think they are valid, and many others don't either.
Peer review isn't a particularly strong backstop.
Raw data is what it is. Adjusting the data means it's not an observation anymore.
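[Editor's note: the time-of-observation question above has a concrete mechanism behind it, whatever one makes of the adjustment's size. A max/min thermometer is read and reset once a day; an afternoon reset can carry one hot afternoon into two successive daily maxima, while a morning reset cannot. A minimal sketch with invented hourly temperatures, not real data:]

```python
# Two days of invented hourly temperatures: day 1 peaks at 70 F at
# 3 pm, day 2 is uniformly 45 F. A max thermometer is read and reset
# once a day at `reset_hour`; a 5 pm reset "double counts" the hot
# afternoon, a 9 am reset does not.
temps = [50 + 20 * max(0.0, 1 - abs(h - 15) / 6) for h in range(24)]  # day 1
temps += [45.0] * 24                                                  # day 2

def daily_maxima(temps, reset_hour):
    """Maximum since the last reset, read once per day at reset_hour."""
    maxima, current = [], float("-inf")
    for h, t in enumerate(temps):
        current = max(current, t)
        if h % 24 == reset_hour:
            maxima.append(current)
            current = t  # reset: the thermometer restarts at the current temp
    return maxima

# 5 pm reads: day 2 "max" is ~63 F, although day 2 never exceeded 45 F,
# because the reset happened while day 1's afternoon was still warm.
print(daily_maxima(temps, 17))
# 9 am reads: the 70 F peak is counted once.
print(daily_maxima(temps, 9))
```

This is the bias the TOBS adjustment is meant to correct when stations moved their observation time from afternoon to morning; the sketch says nothing about whether the size of the applied correction is right, which is the point in dispute here.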

David A
April 30, 2014 8:09 am

Nick, here is one state, same stations raw compared to the same stations with “corrections”…
http://stevengoddard.wordpress.com/2014/04/26/noaa-blowing-away-all-records-for-data-tampering-in-2014/

herkimer
April 30, 2014 8:10 am

If we ignore for the moment the mechanics of the recent temperature adjustments, do the resulting warming rates make any sense? The data shows that the greatest warming rates (0.2 to 0.29 F/decade) are for the NORTH EAST, EAST NORTH CENTRAL and WEST NORTH CENTRAL, mostly states close to the Canadian border. This looks strange, as one would expect the SOUTH or CENTRAL parts to have warmed more. However, when one looks at the warming rates in Canada, things have warmed even more there. Canada as a whole has warmed 1.6 C since 1948, mostly due to the extra warming of the ARCTIC and the Atlantic Ocean. This equates to 0.432 F/decade, compared to 0.135 F/decade for the CONTIGUOUS United States. However, the CANADIAN regions next to the United States have warmed less. For example, the Great Lakes and St Lawrence Valley region shows an annual warming rate of only 0.9 C over about 64 years, or 0.25 F/decade. This is similar to the 0.2 to 0.29 F/decade for the US states close to the Canadian border under the new adjustments. So I cannot say that the new warming rates are high. The figure of 0.135 F/decade also equates to 0.74 C/century, which is similar to the warming rate for the globe over the past century.
What is interesting is that the annual and winter temperatures for the contiguous US have been declining since 1998, the last 17 years, and the greatest cooling is in the very states next to the Canadian border. So things have been changing recently from the 1895-2013 warming trend.
Winter temperatures in the United States have now been declining for 17 years, at about 1.78 F/decade, according to NCDC/NOAA CLIMATE AT A GLANCE data. As a matter of fact, in the United States, 8 out of 12 months of the year are cooling. Winter, spring [2 months] and fall are all cooling, while only 3 months, namely March, June and July, are still warming. ANNUAL US temperatures are declining (-0.36 F/decade) since 1998. So there is very little evidence of global warming in the United States over the last 16-17 years.
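[Editor's note: the unit conversions in the comment above check out arithmetically. A short sketch using the commenter's own input figures; the ~66-year span for the 1948-to-present Canadian number is an assumption, and small differences from the quoted 0.432, 0.25 and 0.74 come from rounding and the exact end year chosen.]

```python
# Verify the C-to-F-per-decade conversions quoted in the comment.
# Input figures are the commenter's; only the arithmetic is checked.
def c_total_to_f_per_decade(delta_c, years):
    """Convert a total Celsius change over `years` into F per decade."""
    return delta_c * 9.0 / 5.0 / years * 10.0

# Canada: 1.6 C since 1948, taken here as ~66 years
print(round(c_total_to_f_per_decade(1.6, 66), 3))   # ~0.436 F/decade
# Great Lakes / St Lawrence Valley: 0.9 C over ~64 years
print(round(c_total_to_f_per_decade(0.9, 64), 3))   # ~0.253 F/decade
# Contiguous US trend of 0.135 F/decade expressed as C/century
print(round(0.135 / 1.8 * 10, 2))                   # 0.75 C/century
```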