Contribution of USHCN and GISS bias in long-term temperature records for a well-sited rural weather station

Guest post by David W. Schnare, Esq. Ph.D.

When Phil Jones suggested that if folks didn’t like his surface temperature reconstructions, then perhaps they should do their own, he was right. The SPPI analysis of rural versus urban trends demonstrates the nature of the overall problem. It does not, however, go into sufficient detail. A close examination of the data suggests three areas needing attention. Two involve the adjustments made by NCDC (NOAA) and by GISS (NASA). Each makes its own adjustments, and these are typically applied serially, the GISS adjustments on top of the NCDC ones. The third problem is inherent in the raw data itself and has been highlighted by Anthony Watts in his Surface Stations project: the “micro-climate” biases introduced by poor station siting.

As Watts points out, while there are far too many biased weather station locations, there remain some properly sited ones. Examination of the data representing those stations provides a clean basis by which to demonstrate the peculiarities in the adjustments made by NCDC and GISS.

One such station is Dale Enterprise, Virginia. The Weather Bureau has reported raw observations and summary monthly and annual data from this station from 1891 through the present, a 119-year record. From 1892 to 2008, only 9 months of data are missing in this 1,404-month period, a missing-data rate of about 0.64 percent. The analysis below interpolates each missing value using an average of the 10 years surrounding it, rather than back-filling from other sites. This correction method avoids the uncertainties inherent in relying on other sites, for which there is no micro-climate guarantee of unbiased data.
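A minimal sketch of that interpolation is below, assuming the monthly record is held as a simple (year, month) lookup table; the data structure and the example year are hypothetical, and the actual procedure used for this analysis may differ in detail.

```python
import numpy as np

def fill_missing_month(monthly, year, month, window=5):
    """Fill one missing monthly value with the mean of the same calendar
    month in the `window` years on either side of the gap.

    `monthly` is assumed to be a dict mapping (year, month) -> monthly mean
    temperature; missing months are simply absent. This is a minimal sketch
    of the interpolation described above, not the author's actual code.
    """
    neighbors = [monthly[(yr, month)]
                 for yr in range(year - window, year + window + 1)
                 if yr != year and (yr, month) in monthly]
    if not neighbors:
        raise ValueError("no surrounding years available to interpolate from")
    return float(np.mean(neighbors))

# Hypothetical usage: back-fill a missing July with the mean of the
# surrounding ten Julys.
# monthly[(1923, 7)] = fill_missing_month(monthly, 1923, 7)
```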

The site itself is in a field on a farm, well away from buildings or hard surfaces. The original thermometer remains at the site as a back-up to the electronic temperature sensor that was installed in 1994.

The Dale Enterprise station is situated in the rolling hills east of the Shenandoah Valley, more than a mile from the nearest suburban-style subdivision and over three miles from the center of the nearest “urban” development, Harrisonburg, Virginia, a town of about 44,000.

Other than the shift to an electronic sensor in 1994, and the need to fill in the 9 months of missing reports, there is no reason to adjust the raw temperature data as reported by the Weather Bureau.

Here is a plot of the raw data from the Dale Enterprise station.

There may be a step-wise drop in reported temperature in the post-1994 period. Virginia has no other rural stations that operated electronic sensors over a meaningful period both before and after the equipment change at Dale Enterprise, nor is there publicly available data comparing the thermometer and the electronic sensor at this station. Comparison with urban stations would introduce a potentially large warm bias over the 20-year period from 1984 to 2004. This is especially true in Virginia, where most such urban sites are at airports whose aircraft equipment and pace of operations changed dramatically over this period.
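For readers who want to probe this themselves, the sketch below shows one simple way to check for a step at the 1994 sensor change, assuming the annual means are available as plain arrays; it is only an illustrative check, not a substitute for a side-by-side comparison with an unchanged reference instrument.

```python
import numpy as np
from scipy import stats

def step_change_check(years, annual_means, change_year=1994, span=10):
    """Compare the `span` years before and after the equipment change with a
    Welch two-sample t-test. A large shift with a small p-value is only
    suggestive of a step, since natural variability is not controlled for.
    Minimal sketch under the assumptions stated above.
    """
    years = np.asarray(years)
    temps = np.asarray(annual_means, dtype=float)
    before = temps[(years >= change_year - span) & (years < change_year)]
    after = temps[(years >= change_year) & (years < change_year + span)]
    shift = after.mean() - before.mean()
    t_stat, p_value = stats.ttest_ind(after, before, equal_var=False)
    return shift, p_value
```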

Notably, neither NCDC nor GISS adjusts for this equipment change. Thus, any bias due to the 1994 equipment change remains in the record for the original data as well as the NCDC and GISS adjusted data.

The NCDC adjustment

Although many have focused on the changes GISS made to the NCDC data, the NCDC “homogenization” is equally interesting and, as this example shows, far more difficult to understand.

NCDC takes the originally reported data and adjusts it into a data set that becomes part of the United States Historical Climatology Network (USHCN). Most researchers, including GISS and the Climatic Research Unit (CRU) at the University of East Anglia, begin with the USHCN data set. Figure 2 documents the changes NCDC made to the original observations and suggests why, perhaps, one ought to begin with the original data.

The red line in the graph shows the changes made to the original data. Considering the location of the Dale Enterprise station and the lack of micro-climate bias, one has to wonder why NCDC would make any adjustment whatever. The shape of the red delta line indicates these are not adjustments made to correct for missing data or for any other obvious bias. Indeed, with the exception of 1998 and 1999, NCDC adjusts the original data in every year! [Note: when a 62-year-old Ph.D. scientist uses an exclamation point, the statement deserves some extraordinary attention.]
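For reference, a year-by-year delta of this kind can be reproduced with something as simple as the sketch below, assuming equal-length arrays of raw and USHCN-adjusted annual means; this shows how the comparison is constructed in principle, not the code behind the actual figure.

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_adjustment_delta(years, raw_annual, ushcn_annual):
    """Plot the year-by-year difference between the USHCN-adjusted and the
    originally reported annual means (the red delta line discussed above).
    Inputs are assumed to be equal-length sequences in degrees C."""
    delta = np.asarray(ushcn_annual, dtype=float) - np.asarray(raw_annual, dtype=float)
    plt.plot(years, delta, color="red", label="USHCN minus original")
    plt.axhline(0.0, color="gray", linewidth=0.5)
    plt.xlabel("Year")
    plt.ylabel("Adjustment (degrees C)")
    plt.legend()
    plt.show()
    return delta
```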

This graphic makes clear the need to “push the reset button” on the USHCN. Based on this station alone, one can argue that the USHCN data set is inappropriate for use as a starting point by other investigators and fails to earn the self-applied moniker of a “high quality data set.”

The GISS Adjustment

GISS states that its adjustments correct for the urban heat island bias in station records. In theory, GISS adjusts stations based on the night-time luminosity of the area within which the station is located. This broad-brush approach appears to have failed with regard to the Dale Enterprise station. There is no credible basis for adjusting data from a station free of micro-climate bias and located on a farm more than a mile from the nearest suburban community, more than three miles from a town, and more than 80 miles from a population center of greater than 50,000, the standard definition of a city. Harrisonburg, the nearest town, has a single large industrial operation, a quarry, and is home to a medium-sized (but hard-drinking) university, James Madison University. Without question, the students at JMU have never learned to turn the lights out at night. Based on personal experience, I’m not sure most of them even go to bed at night. This raises the potential for a luminosity error we might call the “hard-drinking, hard-partying college kids” bias. Whether it is possible to correct for that in the luminosity calculations I leave to others. In any case, the layout of the town is traditional small-town America, dominated by single-family homes and two- and three-story buildings. The true urban core of the town is approximately six square blocks, and other than the grain tower, there are fewer than ten buildings taller than five stories. Even within this “urban core” there are numerous parks. The rest of the town is quarter-acre and half-acre residential, except for the University, which has copious open ground (for when the student union and the bars are closed).

Despite the lack of any basis for suggesting the Dale Enterprise weather station is biased by urban heat island conditions, GISS has adjusted the station data as shown below. Note that this is an adjustment to the USHCN data set; I show it because it discloses the basic nature of the adjustments, rather than their effect on the actual temperature data.

While only the USHCN and GISS data are plotted, the graph includes the (blue) trend line of the unadjusted actual temperatures.

The GISS adjustments to the USHCN data at Dale Enterprise follow a well-recognized pattern: GISS pulls the early part of the record down and mimics the most recent USHCN values, thus imposing an artificial warming bias. The trend lines are somewhat difficult to distinguish in the graphic. The trends for the original data, the USHCN data, and the GISS data are 0.24, -0.32, and 0.43 degrees C per century, respectively.
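For clarity, trend figures of this kind are ordinary least-squares slopes of the annual series, rescaled to a per-century rate; a minimal sketch, not necessarily the exact method used here, is below.

```python
import numpy as np

def trend_per_century(years, annual_means):
    """Ordinary least-squares linear trend of an annual temperature series,
    expressed in degrees C per century. `years` and `annual_means` are
    assumed to be equal-length sequences."""
    slope_per_year, _intercept = np.polyfit(np.asarray(years, dtype=float),
                                            np.asarray(annual_means, dtype=float), 1)
    return 100.0 * slope_per_year

# Applied to the raw, USHCN, and GISS annual series, this would yield trends
# comparable to the 0.24, -0.32 and 0.43 degrees C per century quoted above.
```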

If one presumes the USHCN data reflect a “high quality data set,” then the GISS adjustment does more than produce a faster rate of warming; it actually reverses the sign of the trend of this “high quality” data. Notably, compared to the true temperature record, the GISS trend nearly doubles the actual observed warming.

This presentation is only the beginning of an analysis of Virginia temperature records. The Center for Environmental Stewardship of the Thomas Jefferson Institute for Public Policy plans to examine the entire data record for rural Virginia in order to identify which rural stations can serve as the basis for estimating long-term temperature trends, whether local or global. Only a similar effort nationwide can produce a true “high quality” data set upon which the scientific community can rely, whether for modeling or for assessing the contribution of human activities to climate change.

David W. Schnare, Esq. Ph.D.

Director

Center for Environmental Stewardship

Thomas Jefferson Institute for Public Policy

Springfield Virginia

===================================

UPDATE: readers might be interested in the writeup NOAA did on this station back in 2002 here (PDF, second story). I point this out because initially NCDC tried to block the surfacestations project saying that I would compromise “observer privacy” by taking photos of the stations. Of course I took them to task on it when we found personally descriptive stories like the one referenced above and they relented. – Anthony

196 Comments
JustPassing
February 27, 2010 2:33 am

Panel Discussion on “Climategate” – Haas School
What Should We Learn from Climategate? Panel discussion with Maximilian Auffhammer, Associate Professor, Agricultural & Resource Economics; Bill Collins, Department Head, Climate Science, Lawrence Berkeley National Laboratory; Rich Muller, Professor of Physics, author, Physics for Future Presidents; Margaret Torn, Program Head, Climate and Carbon Sciences, Lawrence Berkeley National Laboratory on questions about the integrity of the peer review process for climate change research. Presented at the Haas School, UC Berkeley, by the Energy Institute at Haas, Berkeley Energy & Resources Collaborative, and Climate & Energy Policy Institute. Moderated by Severin Borenstein, Co-Director, Energy Institute at Haas. (January 26, 2010)

jaymam
February 27, 2010 2:37 am

Jerry Gustafson (19:57:21)
Yes it is important to know how the temperature is measured. That should be stated on graphs and almost never is.
At sites I’ve looked at, for some unexplained reason the daily mean temperature trends upward while the 9am temperature remains the same; i.e., if 9am temperatures are used there is no warming. Surely this is important and requires further study?

E.M.Smith
Editor
February 27, 2010 2:39 am

Ruhroh (01:02:41) :
Hey Cheif,

Assuming that is me (Chief though…)

What’s your take on that Raw vs. Raw blinker from rockyhigh66?

I don’t know to what this refers. I did a search of the page for rockyhigh66 and it is only in your posting. So I’m not sure what to look at…
Is it on a different thread?

rbateman
February 27, 2010 2:42 am

Speaking of California’s distant past:
http://www.kcbs.com/bayareanews/Scientists-Say-Storms-With-Devastating-as-Earthqua/6456044
The winter of 1861-62 was known as the “Inland Sea”. So much water came out of the hills so fast that it was months before the waters went down in the Central Valley. 300-400% of normal precip.
Which has exactly nothing whatsoever to do with CO2.

rbateman
February 27, 2010 2:52 am

E.M.Smith (00:18:54) :
Yes, and the faults in Calif. are in overdrive right now.

jmrSudbury
February 27, 2010 3:00 am

Looking at that satellite shot, I wondered if the green arrow pointed to the weather station and if that was a bush or a tree beside it. I thought I would check out the Dale Enterprise, Virginia location at the http://www.surfacestations.org/ site, but it says, “No web site is configured at this address.”
John M Reynolds

Adam Gallon
February 27, 2010 3:29 am

http://www.publications.parliament.uk/pa/cm200910/cmselect/cmsctech/memo/climatedata/uc3902.htm
“Memorandum submitted by the Institute of Physics (CRU 39)”
This may have been noted before, if so, please snip!

E.M.Smith
Editor
February 27, 2010 3:32 am

juan (23:32:33) : The two flags showing estimation are E and X. The record for Dale Evanston shows the following:
Ah, yes… one of the things that just drives me up a wall. That whole “what data is real and what is fabricated” problem.
BTW, I spent a long time learning that “raw” means “cooked” and “robust” means “close enough or guessed well”… Along the way I figured out that none of the data used by all the temperature series folks can possibly be “raw”.
It’s really fairly simple once you think about it. They all use a “monthly mean” that is created from summation((MIN+MAX)/2 over the days available), so by definition those monthly means are CONSTRUCTED and not “raw”. It’s also a bit unclear how many days can have missing values or estimated values and still give you a Monthly Mean value in the “raw” dataset.
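For the curious, that construction amounts to something like the little sketch below; the missing-day tolerance shown is purely illustrative, since the real threshold is exactly the unclear bit.

```python
def monthly_mean(daily_max, daily_min, max_missing_days=9):
    """'Raw' monthly mean as described: average (MAX + MIN)/2 over whatever
    days are available. None marks a missing daily reading. The number of
    missing days tolerated here is an arbitrary placeholder, not the
    documented rule."""
    daily_means = [(hi + lo) / 2.0
                   for hi, lo in zip(daily_max, daily_min)
                   if hi is not None and lo is not None]
    missing = len(daily_max) - len(daily_means)
    if missing > max_missing_days or not daily_means:
        return None  # treat the month as missing
    return sum(daily_means) / len(daily_means)
```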
Then it goes on to another step where some things are E Estimated and others are X that is a different kind of estimated. But in both of those cases the data is not flagged as “missing” since it is present, just fabricated….
FWIW, the NOAA guidelines allow observers to “estimate” a value if there is no reading for that day and just put it on the paper form. One hopes that one of the “Estimated” flags is telling you it was just made up on the spot, while others might tell you if it was made up after the fact by programmatic means like FILnet.
All this “data” is what gets fed into GIStemp, which uses it to “in-fill” missing values (that isn’t an estimate, it’s an in-fill, and that is different from the FILnet filling in of missing values, that one presumes is not an estimate, since it’s a fill-in … ) but that then eventually has some homogenizing done, which isn’t an estimate either (but can be based on estimated data), but also isn’t a FILnet… but does add data that are missing to segments that need it. And eventually this goes to the anomaly mapping step that adds sea surface anomalies from “optimal interpolations” that are not estimates, nor FILnet, nor infill, but just fill in the missing grid boxes…
Sadly, I must report that the prior paragraph is NOT humor, nor is it overstating things. In fact, I suspect I’ve forgotten a few steps of data fabrication…
And folks wonder why I don’t trust the 1/10 C place in GIStemp product…
At any rate, by the time the data are in the GISS web site, all that has been swept under the rug and all that is available on the “data” page is the data that is present at that STEP of GIStemp, with no provenance information about which bits were fabricated by various means in some prior step of the process. This gets worse the more STEPS you go through. In some cases I’ve seen what look like whole years of data made up from whole cloth for individual sites. Part of why I’d advise starting from as far upstream as possible for an analysis.
With all that said, I doubt that it has any impact on the analysis or the conclusions presented here. The fill in process looks to add ‘reasonable’ values with the major fault being that it clips peaks. That is, it will never estimate or fill in something like the 1998 hot spikes.
But yeah, the data are “holey data” even for ideal stations. Just leaves you to wonder how bad it is for the bad stations…
BTW, in making anomaly maps, GIStemp makes ‘seasonal means’ then uses them to make annual means. As I read the code, a seasonal mean can be missing one whole month and an annual mean can be missing one whole season. IIRC, with all the rules accounted for, you need a minimum of 7 months of data to make an annual mean, but you don’t have to have an actual winter season… So by the time you reach annual anomalies, the data have been stretched Waaaayyyy thin. Even the estimated FILnet filled in homogenized in-fill optimal interpolation “data”…
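In rough pseudo-code terms, the missing-data tolerance I’m describing looks something like the sketch below; the exact thresholds and season boundaries in the real code may differ, this just encodes the rule as I read it.

```python
def seasonal_mean(three_months):
    """Seasonal mean from three monthly values (None = missing); per the
    rule described above, at most one month may be missing."""
    present = [m for m in three_months if m is not None]
    if len(present) < 2:
        return None
    return sum(present) / len(present)

def annual_mean(twelve_months):
    """Annual mean built from four seasonal means; at most one season may
    be missing. With tolerances like these, an 'annual' value can rest on
    well under twelve real monthly readings."""
    seasons = [seasonal_mean(twelve_months[i:i + 3]) for i in range(0, 12, 3)]
    present = [s for s in seasons if s is not None]
    if len(present) < 3:
        return None
    return sum(present) / len(present)
```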
These folks:
http://diggingintheclay.blogspot.com/2010/02/of-missing-temperatures-and-filled-in.html
find that the percentage of missing data runs about 50% especially in recent decades for most regions of the world. It’s well worth a read. Just make sure the dishes are put away, the dog is out of the room, and you have a stiff drink handy…
But I think they only looked at the -9999 missing data flag too. And have not accounted for the E and X records as ‘missing’…
Now where did I leave that mug…

David
February 27, 2010 3:42 am

No Nick, the problem is that the outcome is utterly inconsistent with the apparent aim of the documentation, which by the way has not been updated in 13 years.

David
February 27, 2010 3:58 am

…and the explanation is completely generic, with no usable audit trail to individual sites, so to all intents and purposes the process is undocumented. If my business documentation looked like this, with a 13-year-old general statement of intent and no specific explanation of individual entries in my ledgers, I would be in jail.

David L
February 27, 2010 4:10 am

Is it common practice to apply statistical testing on the linear regression? I would be curious about the p-value on those trends.

CPT. Charles
February 27, 2010 4:45 am

Whoa. Mag 8.8 earthquake in Chile
http://kore.us/52ArTw
So, yesterday we had a quake off Okinawa, today in Chile.
That’s quicker than usual …. [the ‘Ring of Fire’ samba]

stephan
February 27, 2010 4:54 am

OT, but with the recent spate of earthquakes, Haiti and now Chile, betcha anything it’s due to solar status.

jack morrow
February 27, 2010 5:33 am

We know there is a seemingly criminal aspect to the reporting, and the “team” of scientists must have a “money” reason for doing it, but once again, show me the prosecuted ones. Mann, Hansen, IPCC, Jones (is he back yet?). Oh, and Gore. Until something happens to these guys we will still get the same old same old.
Money talks and right now the good ole boys are rolling in it and we are paying for their schemes. I’m frustrated. And angry. I despise being taken.

old construction worker
February 27, 2010 5:39 am

REPLY: We may be able to do this if NCDC will give me access to B44 forms which are top view site sketches and description of surroundings, but so far they have not made them available. -A
‘The Dale Enterprise………… electronic temperature sensor that was installed in 1994’
After reading about all the site problems, my construction experience kicks in. Knowing the government: 1) a letter went out to the sites informing personnel of the pending updates. 2) Asking the site personnel to submit a plot plan according to “code”. 3) After plot plan approval, asking site personnel to acquire X number of bids. 4) Award bid. 5) Install equipment. 6) Submit invoice for payment.
Questions
Did the contractor follow the site plan?
If changes were made, was the USHCN informed?
When did the USHCN start writing their adjustment programs?
Did the USHCN run a control study of “warming bias” from a cross section of sites?
I could go on and on with questions that need to be answered concerning equipment placement and the birth of “warming bias adjustment”.
I’m not saying there was any fraud. The whole situation reminds me of PPP.
PPP= Piss Poor Planning

maz2
February 27, 2010 6:10 am

“BBC tells the truth – shock horror! – iceberg not caused by global warming”
“So, why did they specifically rule out global warming in this report? Because the BBC is no longer unchallenged, is the answer. Along with Al Gore, the IPCC and those dedicated academics at the “University” of East Anglia and Pennsylvania State, they know that every claim they make is now scrutinised by experts and deconstructed virally on the Internet.
AGW sceptics are now connected globally. They know the websites to trust, they can draw on a huge team of specialists and experts, many of them better qualified than the scare-mongers in white coats. This network of sceptics has a global outreach far beyond the scope of the clapped-out BBC. In the case of the iceberg, the science was so basic and indisputable that even Auntie BBC, the wizened old crone who steals our money in the form of the licensing fee, dared not expose herself to world ridicule by making false claims. Instead, she opted to gain credit for honesty on this issue, while continuing to promote the AGW scam elsewhere.
In itself, it is a very small victory for the truth; but its implications are enormous. It tells us the scam merchants are on the back foot; they are in retreat; it will still require trench warfare for years to dislodge them, but the tide has turned. Just one sentence, almost a throwaway line, in a news report, but it signals an awareness that we are on their case. The AGW hysterics have irretrievably lost the battle for public opinion and now it is time to peel their layers of fabrication and falsehood like an onion.”
http://blogs.telegraph.co.uk/news/geraldwarner/100027663/bbc-tells-the-truth-shock-horror-iceberg-not-caused-by-global-warming/

Chris D.
February 27, 2010 6:39 am

Should check out what they’ve done to the Walhalla, SC record this time. Bizarre.
Meanwhile, they don’t touch the record for Saluda, SC at all. That MMTS is 10′ from the AC unit and right next to the parking lot. Came across that one by accident. I’ll upload a cell phone pic – need to return with a better camera.

Green Sand
February 27, 2010 6:41 am

O/T but I think of interest Times-online story University ‘tried to mislead MPs on climate change e-mails’
http://www.timesonline.co.uk/tol/news/environment/article7043566.ece
Even now they still cannot tell it as it is!

starzmom
February 27, 2010 6:46 am

This may be slightly off topic, but I want to thank Anthony and the moderators and everyone else who posts at this site. You all are a bunch of incredibly smart, educated and witty people. Most especially, you have given me the courage and the information to be much more willing to speak out about this issue in my daily life. Not only do I feel that I have learned a great deal, I have the tools in my brain to support my argument, and the confidence to carry it out.
THANK YOU!!

pyromancer76
February 27, 2010 6:49 am

@jothi85 (16:57:03) :
“If
1: this almost 1 deg C negative bias for pre 1990 temp records by the NCDC is common
and
2: this almost 1.5 deg C negative bias for pre 1900 temp records by the GISS is common
then the whole claim of AGW red herring.
once that is established…. follow the money. you will find the crooks & their enablers”
Yes, this is the essential next step. There must be no misplaced “mercy” without complete establishment of the principles of Transparency and Accountability or we can have no hope for Truth and Justice.
Those who lied, those who pushed draconian public policies, and those who aided and abetted these crimes, must be held accountable and punished. If the punishment fits the crime, then those in leadership positions of this conspiracy deserve not only to be fired but to be fined the amounts of the grants with which they carried out this nefarious business and to lose their pensions/retirement (from the public purse). Jail time is also indicated after proper adjudication.
How about all those so-called professional organizations? How about George Soros and the billions he has put into becoming global emperor through the “green movement” and “globalization”.? How about the money behind the frauds that put an unqualified president at the head of the USA. (This is not a “birther” argument. This is straight forward constitutional. Anyone who runs for president must be a Natural Born Citizen — born of two American citizens. No parent of a US president can be a foreign national, period. This is the Commander-In-Chief, after all).
Please note: EARTH DAY –April 22 — IS ON LENIN’S BIRTHDAY. Wonder how that coincidence happened? This year is the 40th anniversary of Earth Day. From earthday.com
“Forty years after the first Earth Day, the world is in greater peril than ever. While climate change is the greatest challenge of our time, it also presents the greatest opportunity – an unprecedented opportunity to build a healthy, prosperous, clean energy economy now and for the future.
Earth Day 2010 can be a turning point to advance climate policy, energy efficiency, renewable energy and green jobs. Earth Day Network is galvanizing millions who make personal commitments to sustainability. Earth Day 2010 is a pivotal opportunity for individuals, corporations and governments to join together and create a global green economy. Join the more than one billion people in 190 countries that are taking action for Earth Day. ”
Do we have our work cut out for us! Glenn Reynolds suggests we might be ready for another Great Awakening–an American tradition (as is Arbor Day, I think).

JerryB
February 27, 2010 7:08 am

GISS uses NCDC temperature adjustments only for USHCN station data, not for the data of any other stations.
The Dale Enterprise station times of observation were either sunset or 18:00 hours for most years. In either case a TOB (time of observation bias), relative to midnight readings, would occur. A study of several Virginia station data sets suggests that a TOB adjustment of at least 0.5 C would be appropriate for observations at 18:00 hours at such stations.

RockyRoad
February 27, 2010 7:09 am

Jack Morrow RE: Show me the prosecuted ones…
Consider the following (from my earlier post; although somewhat dated, it is just the beginning):
http://www.climategate.com/u-s-lawyers-get-their-legal-briefs-in-order
Problem is, the discovery process is being overwhelmed with evidence but that is a good thing. Not a day goes by but some new damning evidence is exposed. I suspect “homogenization” of temperature data as illustrated in this thread will be one key exhibit once the actual algorithms used are found. You’re going to see trial lawyers have a big hand in all this and fraudulent scientists and RICO targets will get taken down.
It was feared right after Climategate that the story wouldn’t have legs. Well, I believe it is running pretty fast right now and expanding daily. These are indeed exciting times!

Basil
Editor
February 27, 2010 7:16 am

David Schnare (20:29:56) :
Data sources for the analysis.
The “raw” data come through the NOAA Locate Station portal at:
http://www.ncdc.noaa.gov/oa/climate/stationlocator.html

Thank you for the response. If you used the monthly data, it is stated to have undergone some “quality control,” correct? I’m okay with that, and am not trying to be critical, but this is “raw” only by way of comparison to the subsequent processing the data is subjected to by NCDC or GISS. I’ve searched for an official explanation of what the “quality control” is that this monthly data has gone through, but have never found one. Would you happen to know what it means?
To others, the monthly data in these records (at least the ones I’ve examined), contain the following caveat:
“These data are quality controlled and may not be identical to the original observations”
Does anybody know what “quality controlled” at this stage of the data represents? Again, this is before all the NCDC/GISS type of adjustments.

Richard Garnache
February 27, 2010 7:17 am

Robert of Ottawa
This probably does need to be said to this group, but I have heard several questions about why they apply the UHI correction the way they do. Thank you for providing the correct answer. I’m ashamed I didn’t see that for myself.

MIke O
February 27, 2010 7:18 am

What is interesting about the UHI effect is that the suburban-area temperature lows are much lower than the urban ones. I live just outside Detroit (yes, it is still urban) and the nighttime temperature where I am can be 4 – 5 degrees cooler than in the city (or more). The interesting thing about this is that I live in a township-sized city (6 mi x 6 mi) with a population of 100,000 people, surrounded for 10 miles by similar communities and densities. This is in a county of 1.3 MM people. What would the difference be between where I am and a well-sited rural station (hmmm, I believe the Univ of Michigan maintains one of these not too far away)?
Bottom line, UHI is being significantly understated.