by John Goetz
On September 15, 2008, Anthony DePalma of the New York Times wrote an article about the Mohonk Lake USHCN weather station titled “Weather History Offers Insight Into Global Warming.” The article claimed, in part, that the average annual temperature at this station has risen 2.7 degrees in 112 years. What struck me about the article was the rather quaint description of the manner in which temperatures are recorded, which I have excerpted here (emphasis mine):
Mr. Huth opened the weather station, a louvered box about the size of a suitcase, and leaned in. He checked the high and low temperatures of the day on a pair of official Weather Service thermometers and then manually reset them…
If the procedure seems old-fashioned, that is just as it is intended. The temperatures that Mr. Huth recorded that day were the 41,152nd daily readings at this station, each taken exactly the same way. “Sometimes it feels like I’ve done most of them myself,” said Mr. Huth, who is one of only five people to have served as official weather observer at this station since the first reading was taken on Jan. 1, 1896.
That extremely limited number of observers greatly enhances the reliability, and therefore the value, of the data. Other weather stations have operated longer, but few match Mohonk’s consistency and reliability. “The quality of their observations is second to none on a number of counts,” said Raymond G. O’Keefe, a meteorologist at the National Weather Service office in Albany. “They’re very precise, they keep great records and they’ve done it for a very long time.”
Mohonk’s data stands apart from that of most other cooperative weather observers in other respects as well. The station has never been moved, and the resort, along with the area immediately surrounding the box, has hardly changed over time.
Clearly the data collected at this site is of the highest quality. Five observers committed to their work. No station moves. No equipment changes, according to Mr. Huth (in contrast to the NOAA MMS records). Attention to detail unparalleled elsewhere. A truly Norman Rockwell image of dedication.
After reading the article, I wondered what happened to Mr. Huth’s data, and the data collected by the four observers who preceded him. What I learned is that NOAA doesn’t quite trust the data meticulously collected by Mr. Huth and his predecessors. Neither does GISS trust the data NOAA hands it. Following is a description of what is done with the data.
Let’s begin with the process of getting the data to NOAA:

Mr. Huth and other observers like him record their data in a “B91 Form”, which is submitted to NOAA every month. These forms can be downloaded for free from the NOAA website. Current B91 forms show the day’s minimum and maximum temperature as well as the time of observation. Older records often include multiple readings of temperature throughout the day. The month’s record of daily temperatures is added to each station’s historical record of daily temperatures, which can be downloaded from NOAA’s FTP site here.
The B91 form for Mohonk Lake is hand-written, and temperatures are recorded in Fahrenheit. Transcribing the data to the electronic daily record introduces an opportunity for error, but I spot-checked a number of B91 forms – converting degrees F to tenths of a degree C – and found no errors. Kudos to the NOAA transcriptionists.
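For readers who want to spot-check a form themselves, the arithmetic is just the standard conversion, with the result stored as whole tenths of a degree C. Here is a minimal sketch in Python (the exact rounding convention NOAA uses is my assumption, chosen only for illustration):

```python
def f_to_tenths_c(temp_f):
    """Convert a Fahrenheit reading to tenths of a degree Celsius,
    rounding to the nearest whole tenth as the daily files store it."""
    return round((temp_f - 32.0) * 5.0 / 9.0 * 10.0)

# A 68 F reading is exactly 20.0 C, i.e. 200 tenths.
print(f_to_tenths_c(68))  # 200
```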
Next comes the first phase of NOAA adjustments.

The pristine data from Mohonk Lake are subjected to a number of quality-control and homogeneity-testing and adjustment procedures. First, the data are checked against a number of quality-control tests, primarily to eliminate gross transcription errors. Next, monthly averages are calculated from the TMIN and TMAX values. This is straightforward when both values exist for all days in a month, but in the case of Mohonk Lake there are a number of months early in the record with several missing TMIN and/or TMAX values. Nevertheless, NOAA seems capable of creating an average temperature for many of those months. The result is referred to as the “Areal data”.
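The monthly averaging itself is simple; the interesting question is what happens when readings are missing. A sketch of the straightforward case follows. How NOAA decides that a month with gaps still earns an average is not documented here, so the skip-missing-days rule below is only an assumption:

```python
def monthly_mean(tmins, tmaxs):
    """Average the daily midpoints (TMIN + TMAX) / 2 for one month,
    skipping any day where either reading is missing (None)."""
    mids = [(lo + hi) / 2.0
            for lo, hi in zip(tmins, tmaxs)
            if lo is not None and hi is not None]
    return sum(mids) / len(mids) if mids else None

# Three days, two of them incomplete: only the first day contributes.
print(monthly_mean([50, None, 54], [70, 72, None]))  # 60.0
```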
The Areal data are stored in a file called hcn_doe_mean_data, which can be found here. Even though the daily data files are updated frequently, hcn_doe_mean_data has not been updated in nearly a year. The Areal data also seem to be stored in the GHCN v2.mean file, which can be found here on NOAA’s FTP site. This is the case for Mohonk Lake.
Of course, more NOAA adjustments are needed.

The Areal data are adjusted for time of observation and stored as a separate entry in hcn_doe_mean_data. The TOB adjustment is briefly described here. Following the TOB adjustment, the series is tested for homogeneity. This procedure evaluates non-climatic discontinuities (artificial changepoints) in a station’s temperature record caused by random changes to a station, such as equipment moves and replacements. The version 2 algorithm looks at up to 40 highly correlated series from nearby stations. The result of this homogenization is then passed on to FILNET, which creates estimates for missing data. The output of FILNET is stored as a separate entry in hcn_doe_mean_data.
Now GISS wants to use the data, but the NOAA adjustments are not quite what they are looking for. So what do they do? They estimate the NOAA adjustments and back them out!

GISS now takes both v2.mean and hcn_doe_mean_data, and lops off any record before 1880. GISS also looks only at the FILNET data from hcn_doe_mean_data. Temperatures in F are converted to C and scaled to units of 0.1C.
This is where things get bizarre.
For each of the twelve months in a calendar year, GISS looks at the ten most recent years in common between the two data sets. For each month in those ten most recent years it takes the difference between the FILNET temperature and the v2.mean temperature, and averages them. Then, GISS goes through the entire FILNET record and subtracts the monthly offset from each monthly temperature.
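The procedure just described can be sketched in a few lines of Python. This is my reading of the STEP0 logic, not GISS's actual code; the two series are represented as dicts keyed by (year, month):

```python
def giss_offsets(filnet, v2mean, common_years, n=10):
    """For each calendar month, average (FILNET - v2.mean) over the
    n most recent years present in both series."""
    recent = sorted(common_years)[-n:]
    offsets = {}
    for month in range(1, 13):
        diffs = [filnet[(y, month)] - v2mean[(y, month)]
                 for y in recent
                 if (y, month) in filnet and (y, month) in v2mean]
        if diffs:
            offsets[month] = sum(diffs) / len(diffs)
    return offsets

def remove_offsets(filnet, offsets):
    """Subtract each month's offset from every FILNET value in the record."""
    return {(y, m): t - offsets[m] for (y, m), t in filnet.items()}
```

If FILNET ran, say, 0.1 warmer than v2.mean in recent Januaries, every January in the entire record gets cooled by 0.1.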
It appears to me that what GISS is attempting to do is remove the corrections done by NOAA from the USHCN data. Standing back to look at the forest rather than the trees, GISS appears to be trying to recreate the Areal data, failing to recognize that v2.mean is the Areal data, and that hcn_doe_mean_data also contains the Areal data.
Here is a plot of the difference between the monthly raw data from Mohonk Lake and the data GISS creates in GISTEMP STEP0 (yes, I am well aware that in this case it appears the GISS process slightly cools the record). Units on the left are 0.1C.
Even supposedly pristine data cannot escape the adjustment process.

Ric Werme (06:03:04), MarkW (05:37:56)
Hansen’s messianic tendencies
– oh dear, even if the direst predictions of AGW turn out to be true, the *planet* isn’t in any danger, nor is *creation*, which I assume means the majority of plant & animal life
– the only thing in any real danger is ‘human-civilisation’.
– which is pretty important to you & me, admittedly!
– but the planet isn’t in any danger from the natural gas that is CO2.
If the min-max temps are valid from the last TOB, the TOB shouldn’t make a lot of difference unless a min or max occurred right before the reading. What’s up with that adjustment?
Averaging many readings to ‘improve’ resolution works only if you have some random noise in the system, and assuming you can make an accurate reading to an integer value without any systematic errors. That is fine for measuring a fixed value, but temperature varies all over the place, day to day and season to season. I’m not convinced of the validity of this method. Long-term trends, perhaps. 0.1C? Perhaps not.
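The point about random noise is easy to demonstrate with a quick simulation. Both the whole-degree instrument resolution and the Gaussian noise model below are my assumptions, chosen only for illustration:

```python
import random

def averaged_reading(true_temp, n, noise_sd):
    """Average n readings of the same true temperature, where each
    reading is corrupted by Gaussian noise and then rounded to the
    nearest whole degree (the instrument's resolution)."""
    random.seed(0)  # deterministic for the example
    reads = [round(true_temp + random.gauss(0.0, noise_sd)) for _ in range(n)]
    return sum(reads) / n

# No noise: every reading rounds the same way, so averaging
# cannot recover the 0.4-degree fraction.
print(averaged_reading(70.4, 10000, 0.0))  # 70.0
# About a degree of random noise dithers the rounding, and the
# average converges toward the true value.
print(averaged_reading(70.4, 10000, 1.0))  # close to 70.4
```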
Is the calibration of the thermometer actually used to measure the temperature checked on a periodic basis?
this is a great thing for us.
Can someone explain why each station does not have both Celsius and Fahrenheit thermometers to avoid errors from conversion and reconversion?
REPLY: Because the COOP surface network was not designed to detect a climate change signal, it was an aid to forecast verification. – Anthony
Phil M (05:25:20) :
“Slightly off topic
– In September Satellite temps (lower troposphere, AMSU)
– it looks like this month is going to come out with an anomaly of around +0.2C
– the highest for this year…
It will be interesting to see if this coming winter is as cold as the last one
– or if we return to regular anomalies of around +0.2C….”
UAH shows most layers of the atmosphere cooler than at this time last year. The following link — http://igloo.atmos.uiuc.edu/cgi-bin/test/print.sh?fm=09&fd=23&fy=2007&sm=09&sd=23&sy=2008 — indicates rapid refreezing around the arctic basin and on the primary ice shield as well as significantly more snow cover in the Northern reaches of Canada and Russia than at the same time last year.
The chances of a warmer winter than last year don’t appear to be good — even if La Nina does not reactivate.
‘Averaging many readings to ‘improve’ resolution works only if you have some random noise in the system.’
The importance of this cannot be overstated. The concept “random” has very specific meaning. Human reading bias is not random, and placing parking lots, AC vents, cars, and buildings near a recording site does NOT provide for a random result. Assuming a Gaussian distribution on some data set where no such distribution exists, and then doing fancy math on the result, might win grant proposals, but the process has no value at all. Only faulty conclusions can be drawn.
Venturing slightly off topic, it is improper analysis of EXACTLY this sort that gave us the current mortgage mess. The same hubris, ignorance and political motivation applies in both cases.
B91 example:
here
Phil M,
“- the only thing in any real danger is ‘human-civilisation’.”
You are correct. If the laws that have been proposed to lower CO2 emissions are enacted, civilization is in very grave danger.
Harold, I also noted the circle edge of ice forming in the Arctic (all the way to the Strait and then around to the Russian side of the Arctic). I had to go to October of last year to find the event occurring. The temperature of the connecting point between land and water in the Arctic circle seems evenly colder today compared to last year. That would tell me that the entire area is colder this year than last. Only time will tell but I predict that ice extent and thickness of ice will be greater this year than last, indicating a possible tipping point back to a pre-warming state.
Richard Lindzen has just published a new paper which documents the full range of ways science is being “adjusted” to fit the correct “facts”, and the motivating issues behind such adjustments. I think it is worthy of its own column here, but it certainly explains why the issues John is investigating here happen in the first place. Here’s the link to the full paper:
http://arxiv.org/ftp/arxiv/papers/0809/0809.3762.pdf
Utilizing those reliable data processing standbys: Finagle Factor, Bougerre Factor and the Diddle Coefficient.
Blue Hill Observatory has also been using the same instruments since record keeping began there in 1885.
http://www.bluehill.org/
Maybe by “creation,” Hansen is referring to his own Cult of Global Warming.
I sometimes wonder if the warmists aren’t starting to get really worried that the Earth will go into a serious cooling phase. If that happens now, they are ruined. On the other hand, if they can convince the world to stabilize CO2 levels, and the Earth goes into a cooling phase, they can claim credit and victory. This might explain their sense of urgency.
Just speculating.
So the pristine data, albeit with +/- 0.5F uncertainty, is corrupted at the moment it is converted to Centigrade (I’m an old slide rule guy, too!) by magically “increasing” its accuracy to 0.18F (0.1C) and likely allowing downstream software (e.g. climate models) to massage any “bias” thus injected at the time of conversion? Nah! They wouldn’t do that…..!
Phil M,
”
“- the only thing in any real danger is ‘human-civilisation’.”
You are correct. If the laws that have been proposed to lower CO2 emissions are enacted, civilization is in very grave danger.”
– the ‘cure’ may be worse than the ‘disease’
BTW, I’m not saying I agree with the AGW scaremongering, only that even if it’s true, the planet, creation, life-on-earth will get along just fine
– humans may have to adjust….
– and to say otherwise is just more scaremongering, which, I think in itself, is interesting.
Phil M,
I agree that scaremongering is not helpful. I also agree that adjusting, or adapting to temperatures, whether up or down, is the correct strategy.
Mike Bryant
I mean, Mr H is (scaremongering ^ 2) to an irrational degree, which is interesting…
– if he really believes that stuff about destroying the planet…
I am a retired professional civil engineer, and majored in physics. When I took a heavy duty chemistry course back in 1953, the professor gave a pre-lab lecture, followed by a lab, all about how to read a mercury thermometer, and explained why we shouldn’t be worried if our experiments didn’t come out right. He explained how and why mercury thermometers weren’t reliable, particularly in mid-range readings. I don’t think that those mercury thermometers used then, whenever then was, have improved with age.
It is senseless and useless to attempt to “correct” or “adjust” those old thermometer readings. The differences in measurements are subject to a degree of inaccuracy well within the margin of error, which means that there aren’t really any significant differences that could or can be detected.
Figures don’t lie, but liars figure.
Interesting…I work with/for a guy who has built and sold a number of businesses. His philosophy is to always look first at the back door: how do I get out of this thing I create? I suspect proponents of our overheated demise are now checking doors.
Damn I like this place.
Slightly off topic, yet not…
http://www.americanthinker.com/blog/2008/09/corrupted_science_revealed.html
I’ll let you all decide.
Joe Black (09:44:16) :
Thank you for the look at a B91:
My comments would be:
1. The observer has apparently followed what has always been standard procedures (or at least were for years in the reading of weather instruments) and both read and recorded the instrument to the nearest 1/10 of marked intervals.
2. He has also recorded it in the smaller unit (F, not C) so that resolution is maximized.
3. Converting units and then rounding will almost always result in some distortion of the starting value, and I see no value in converting units to calculate means, highs or lows, or other simple statistics. Conversions to other units should occur AFTER such calculations, not during or before.
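Joe Black's point 3 is easy to show numerically. With three made-up high temperatures, rounding each converted value to 0.1 C before averaging gives a slightly different answer than averaging in Fahrenheit and converting once (a toy illustration, not NOAA's actual pipeline):

```python
def f_to_c(temp_f):
    return (temp_f - 32.0) * 5.0 / 9.0

highs_f = [71.3, 72.1, 70.8]  # hypothetical daily highs

# Convert each reading, round to 0.1 C, then average:
round_first = sum(round(f_to_c(f), 1) for f in highs_f) / len(highs_f)

# Average in the original units, then convert once:
convert_last = f_to_c(sum(highs_f) / len(highs_f))

print(round_first, convert_last)  # the two means differ slightly
```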
Thank you CPT. Charles for that excellent link!
Those who want to understand exactly how climate science has been hijacked and corrupted by the Greens/Leftists should read Prof. Richard Lindzen’s exposé: click
M.I.T.’s Dr. Richard Lindzen is one of the few really brave scientists who dare to point out what is happening in climate science. No doubt he will be attacked by the global warming contingent even more viciously than Gov. Palin.
After reading Prof. Lindzen’s critique, you will understand how the climate science community has been completely hijacked by the eco-environmentalist movement.
Lindzen’s paper is required reading in order to understand why certain posters on this site run their constant interference, using ad hominem attacks against anyone opposing Al Gore’s AGW/CO2/climate catastrophe hypothesis.
After reading Dr. Lindzen’s paper, you will understand what is really going on behind the scenes in the AGW/runaway global warming argument.
LarryOldTimer:
ANOTHER slide rule guy! I attended the same lecture, only a few years later. I keep harping about nearly the same problems with mercury thermometers that supposedly produced the data sets extending back into the 1700s(!) which claim to show temp variances out to 0.01 degrees. I too have wondered about the long-term calibration of mercury thermometers, since glass is actually plastic and “sags” over long periods of time due to gravity. Would not the tube change dimensions over two hundred years? How about 50 years? Would vaporized mercury permeate the glass, making less available in the measuring column? Does anyone know? Does anyone care?
Remember: With slide rules we went to the moon. With computers we haven’t been back!
Dave Dodd and LarryOldTimer. Never mind these new fangled devices such as slide-rules! Have you ever attempted to do temperature adjustments involving Fahrenheit to Centigrade and back using an abacus whilst penning your work with Roman Numerals?