Adjusting Pristine Data

by John Goetz

On September 15, 2008, Anthony DePalma of the New York Times wrote an article about the Mohonk Lakes USHCN weather station titled "Weather History Offers Insight Into Global Warming". The article claimed, in part, that the average annual temperature at this station has risen 2.7 degrees in 112 years. What struck me about the article was the rather quaint description of the manner in which temperatures are recorded, which I have excerpted here (emphasis mine):

Mr. Huth opened the weather station, a louvered box about the size of a suitcase, and leaned in. He checked the high and low temperatures of the day on a pair of official Weather Service thermometers and then manually reset them…

If the procedure seems old-fashioned, that is just as it is intended. The temperatures that Mr. Huth recorded that day were the 41,152nd daily readings at this station, each taken exactly the same way. “Sometimes it feels like I’ve done most of them myself,” said Mr. Huth, who is one of only five people to have served as official weather observer at this station since the first reading was taken on Jan. 1, 1896.

That extremely limited number of observers greatly enhances the reliability, and therefore the value, of the data. Other weather stations have operated longer, but few match Mohonk’s consistency and reliability. “The quality of their observations is second to none on a number of counts,” said Raymond G. O’Keefe, a meteorologist at the National Weather Service office in Albany. “They’re very precise, they keep great records and they’ve done it for a very long time.”

Mohonk’s data stands apart from that of most other cooperative weather observers in other respects as well. The station has never been moved, and the resort, along with the area immediately surrounding the box, has hardly changed over time.

Clearly the data collected at this site is of the highest quality. Five observers committed to their work. No station moves. No equipment changes according to Mr. Huth (in contrast to the NOAA MMS records). Attention to detail unparalleled elsewhere. A truly Norman Rockwell image of dedication.

After reading the article, I wondered what happened to Mr. Huth’s data, and the data collected by the four observers who preceded him. What I learned is that NOAA doesn’t quite trust the data meticulously collected by Mr. Huth and his predecessors. Neither does GISS trust the data NOAA hands it. Following is a description of what is done with the data.

Let’s begin with the process of getting the data to NOAA:

From Co-op to NOAA

Mr. Huth and other observers like him record their data in a “B91 Form”, which is submitted to NOAA every month. These forms can be downloaded for free from the NOAA website. Current B91 forms show the day’s minimum and maximum temperature as well as the time of observation. Older records often include multiple readings of temperature throughout the day. The month’s record of daily temperatures is added to each station’s historical record of daily temperatures, which can be downloaded from NOAA’s FTP site here.

The B91 form for Mohonk Lake is hand-written, and temperatures are recorded in Fahrenheit. Transcribing the data to the electronic daily record introduces an opportunity for error, but I spot-checked a number of B91 forms – converting degrees F to tenths of a degree C – and found no errors. Kudos to the NOAA transcriptionists.
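For readers who want to repeat the spot-check, the arithmetic is simple. Here is a minimal sketch in Python, assuming (as the archived values suggest) that temperatures are stored as whole tenths of a degree Celsius, rounded to the nearest tenth:

def f_to_tenths_c(temp_f):
    """Convert a whole-degree Fahrenheit reading to tenths of a degree
    Celsius, rounded to the nearest tenth."""
    return round((temp_f - 32.0) * 5.0 / 9.0 * 10.0)

# A B91 entry of 72 F should appear as 222 (i.e., 22.2 C) in the archive:
assert f_to_tenths_c(72) == 222
assert f_to_tenths_c(32) == 0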

Next comes the first phase of NOAA adjustments.

NOAA to USHCN (part I) and GHCN

The pristine data from Mohonk Lake are subject to a number of quality-control and homogeneity testing and adjustment procedures. First, the data are checked against a series of quality-control tests, primarily to eliminate gross transcription errors. Next, monthly averages are calculated from the TMIN and TMAX values. This is straightforward when both values exist for every day in a month, but in the case of Mohonk Lake there are a number of months early in the record with several missing TMIN and/or TMAX values. Nevertheless, NOAA seems capable of creating an average temperature for many of those months. The result is referred to as the “Areal data”.
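The monthly-average step can be sketched as follows. The tolerance for missing days is an assumption for illustration; NOAA's actual rule for how many missing observations still permit a monthly value is not documented here:

def monthly_mean(tmins, tmaxs, max_missing=9):
    """Monthly mean temperature as the average of daily (TMIN + TMAX) / 2.
    tmins and tmaxs are day-by-day lists, with None marking a missing
    reading. max_missing is an illustrative threshold, not NOAA's rule."""
    daily = [(lo + hi) / 2.0 for lo, hi in zip(tmins, tmaxs)
             if lo is not None and hi is not None]
    if not daily or len(tmins) - len(daily) > max_missing:
        return None  # too many gaps to report a monthly value
    return sum(daily) / len(daily)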

The Areal data are stored in a file called hcn_doe_mean_data, which can be found here. Even though the daily data files are updated frequently, hcn_doe_mean_data has not been updated in nearly a year. The Areal data also seem to be stored in the GHCN v2.mean file, which can be found here on NOAA’s FTP site. This is the case for Mohonk Lake.

Of course, more NOAA adjustments are needed.

USHCN (parts II and III)

The Areal data are adjusted for time of observation and stored as a separate entry in hcn_doe_mean_data. The TOB adjustment is briefly described here. Following the TOB adjustment, the series is tested for homogeneity. This procedure evaluates non-climatic discontinuities (artificial changepoints) in a station's temperature record caused by events such as equipment changes and station relocations. The version 2 algorithm looks at up to 40 highly-correlated series from nearby stations. The result of this homogenization is then passed on to FILNET, which creates estimates for missing data. The output of FILNET is stored as a separate entry in hcn_doe_mean_data.
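The pairwise homogenization algorithm is far more elaborate than anything that fits here, but the core idea, looking for a step change in the difference between a station and its well-correlated neighbors, can be sketched. The toy function below illustrates only that idea; it is not NOAA's algorithm:

def find_step(station, neighbor_mean, threshold=0.5):
    """Toy changepoint scan: build the station-minus-neighbors difference
    series, then find the split point that maximizes the jump between the
    mean before and the mean after. A large jump suggests a non-climatic
    discontinuity (e.g., an equipment change). Returns (index, jump) or None."""
    diffs = [s - n for s, n in zip(station, neighbor_mean)]
    best_idx, best_jump = None, 0.0
    for i in range(2, len(diffs) - 2):
        before = sum(diffs[:i]) / i
        after = sum(diffs[i:]) / (len(diffs) - i)
        if abs(after - before) > best_jump:
            best_idx, best_jump = i, abs(after - before)
    return (best_idx, best_jump) if best_jump > threshold else None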

Now GISS wants to use the data, but the NOAA adjustments are not quite what it is looking for. So what does GISS do? It estimates the NOAA adjustments and backs them out!

USHCN and GHCN to GISS

GISS now takes both v2.mean and hcn_doe_mean_data and lops off any record before 1880. GISS also looks only at the FILNET data from hcn_doe_mean_data. Temperatures in degrees F are converted to Celsius and stored in tenths of a degree (0.1C).

This is where things get bizarre.

For each of the twelve calendar months, GISS looks at the ten most recent years the two data sets have in common. For each of those ten years it takes the difference between the FILNET temperature and the v2.mean temperature for that month, then averages the ten differences into a monthly offset. GISS then goes through the entire FILNET record and subtracts the corresponding monthly offset from each monthly temperature.
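In code, the procedure just described looks something like the sketch below, assuming both records are held as dictionaries keyed by (year, month). This encodes the description above, not GISS's actual source:

def back_out_noaa_adjustments(filnet, v2_mean):
    """For each calendar month, average (FILNET - v2.mean) over the ten most
    recent years the two records have in common, then subtract that monthly
    offset from every FILNET value."""
    adjusted = {}
    for month in range(1, 13):
        common_years = sorted(year for (year, m) in filnet
                              if m == month and (year, m) in v2_mean)
        recent = common_years[-10:]  # the ten most recent years in common
        if not recent:
            continue  # no overlap for this calendar month
        offset = sum(filnet[(y, month)] - v2_mean[(y, month)]
                     for y in recent) / len(recent)
        for (year, m), temp in filnet.items():
            if m == month:
                adjusted[(year, m)] = temp - offset
    return adjusted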

It appears to me that what GISS is attempting to do is remove the corrections NOAA made to the USHCN data. Standing back to see the forest for the trees, GISS appears to be trying to recreate the Areal data, failing to recognize that v2.mean is the Areal data, and that hcn_doe_mean_data also contains the Areal data.

Here is a plot of the difference between the monthly raw data from Mohonk Lake and the data GISS creates in GISTEMP STEP0 (yes, I am well aware that in this case it appears the GISS process slightly cools the record). Units on the left are 0.1C.

Even supposedly pristine data cannot escape the adjustment process.

104 Comments
Admin
September 23, 2008 7:54 pm

Mr. Goetz has exceeded himself, let’s give him a round of applause! – Anthony

September 23, 2008 7:55 pm

Lord Kelvin weeps.

evanjones
Editor
September 23, 2008 7:59 pm

Yes, very good.
(And why on earth doesn’t GISS simply adjust NOAA raw data?)

George M
September 23, 2008 8:11 pm

That is a wonderful piece of detective work, showing that these climatic bit-heads can’t leave even the purest data alone, but it leaves several loose ends. I believe that in the original post, a copy of a form was shown which indicates the presence of an MMTS. Which data is so thoroughly massaged, MMTS or the Min/Max? If, indeed, Mr. Huth’s data is still used, what happens to the MMTS data? I would not expect it to be there if it is not used for something.
Also, I hope Mr. Goetz has sent a copy of this by registered mail to Anthony DePalma of the New York Times, pointing out how futile Mr. Huth and his predecessors’ work has been.
[Reply by John Goetz: There are several interesting mismatches between the B91 and NOAA MMS data, in addition to the MMTS issue. For one thing, Mr. Huth told the NYT that he recorded the temperatures at around 4:00 PM every day. However, every B91 I looked at signed by Mr. Huth indicated the time of observation was 5:00 PM. Maybe a nit, but hardly pristine.]

Dave Dodd
September 23, 2008 8:21 pm

Will someone explain to me (and perhaps other lurking newbies) whether “a pair of official Weather Service thermometers” can be read to a granularity of 0.1F? My experience with mercury thermometers leads me to believe that reading “accurately” to 0.1 degree involves a high degree of subjectivity. My old science teacher in HS would have required us to read an order of magnitude greater than that, or 0.01 degree, and then round back. Will someone please enlighten me?
[Reply by John Goetz: They are read to an accuracy of 1 degree F, but the conversion process to C is where 0.1 C “accuracy” comes into play. I have yet to see a B91 with a temperature recorded in anything other than full degrees. Not proof, of course, that they don’t exist.]
REPLY by Anthony: John, you are close but not quite correct. The thermometer reading is read in 0.1F resolution, but then rounded to the nearest degree F at the time the observer makes the reading and writes it down on the B91 form.

Jeff B.
September 23, 2008 8:24 pm

(And why on earth doesn’t GISS simply adjust NOAA raw data?)
Well that would give Hansen less opportunity to hide, and then find warming.

dearieme
September 23, 2008 8:46 pm

They were nincompoops before they were crooks.

Jeff Alberts
September 23, 2008 8:57 pm

REPLY by Anthony: John, you are close but not quite correct. The thermometer reading is read in 0.1F resolution, but then rounded to the nearest degree F at the time the observer makes the reading and writes it down on the B91 form.

Which means the margin of error is at least as much as the purported warming of the late 20th century. Yup, high quality.
Pardon me while I go puke.

Dave Dodd
September 23, 2008 9:07 pm

Read to 0.1F and record as an integer — GREAT! Simple math rules! My old science teacher will rest peacefully! However, if your B91 data are integers and you average 10,000 integers you still get an integer as the end result, even if you change to a different metric system! The oft-cited AGW temperature rise of 0.7C/century is mathematically incorrect, is it not? One often sees temps cited to 0.01 degree for data sets presumably from the nineteenth century. Somebody’s finagling!

Mike C
September 23, 2008 9:53 pm

John,
I’ve seen this before. The V2 file you are using is mislabeled. It is actually late V1 stuff… you get TOB, Homogeneity and Filnet adjustments, except the Homogeneity adjustment is actually the version 1 Homogeneity adjustment (SHAP, for documented discontinuities and Karl et al 1988 for urbanization effects). Since this station had no documented discontinuities, there should be no changes for SHAP. Then they employ Karl et al 1988 to adjust for urbanization, which is basically averaging the local USHCN stations.
Since Hansen uses his own urbanization scheme (night lights), he subtracts the Karl et al urbanization adjustment then applies his nightlight adjustment, which for this station will be no adjustment because this is a lights = zero station.
So let’s try to untangle here a little:
Raw Data
plus
TOB
plus
SHAP aka high frequency variation (actually a value of zero)
Plus
Karl et al 1988 (urbanization) aka low frequency variation
Plus
Filnet
Then Hansen takes over;
Minus
Karl et al
Plus
Hansen Night Lights adjustment (actually a value of zero)
Hansen can only use the V 1 homogeneity adjustment scheme because it separates the SHAP and Urbanization adjustments.
You see, the USHCN V2.0 is Hansen’s little problem because the V2.0 Homogeneity adjustment is sold by the NCDC as adjusting for both high and low frequency variation (adjustments for station discontinuities AND urbanization). Hansen cannot subtract any urbanization adjustment in V2.0 then add his night-lights scheme.
Now, here is where it gets really, really funky: Go to the KBSF home page here:
http://home.earthlink.net/~ponderthemaunder/index.html
flip down to the story about how NCDC wants Urban Heat Islands in the USHCN record. Then click on the link to Claud Williams’ powerpoint presentation at the AMS, Jan 2006, and tell me if you figured out why neither NCDC nor Hansen has a properly adjusted scheme.
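The hypothesized stack above can be written out as a simple composition. A sketch that encodes the description only, not any published NCDC or GISS code:

def ushcn_v1_value(raw, tob, shap, karl, filnet):
    """USHCN v1 monthly value per the stack above: raw data plus TOB, SHAP
    (zero here, since there are no documented discontinuities), the Karl et
    al. 1988 urbanization adjustment, and FILNET infilling."""
    return raw + tob + shap + karl + filnet

def giss_value(ushcn, karl, night_lights):
    """Hansen's step as described: back out the Karl urbanization term, then
    apply the night-lights adjustment (zero for a lights = 0 station)."""
    return ushcn - karl + night_lights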

evanjones
Editor
September 23, 2008 9:53 pm

Well, with oversampling I guess you can fine it down, but I do think the 0.1 claim is a bit tight.

Bobby Lane
September 23, 2008 9:59 pm

Are you kidding me? Back and forth and back and forth conversions? Estimates? Algorithms? Averages? I’m terrible at math myself, but this just screams for the existence of room for small inaccuracies that will add up over time to big changes. I mean, if each day was adjusted up one one-thousandth of a degree over the period of a year you get nearly 4 tenths of a degree rise (.001 x 365 = .365) in that time span. No doubt real warming is taking place at times because of the LIA recovery, but such an addition continuing over time becomes rather extreme. This process reminds me of a game of Telephone. What was said in the beginning and what comes out in the end is bound, by accident or purpose (or both), to be different. Talk about parallel dimensions!

Editor
September 23, 2008 10:35 pm

So, from the news before the “Hansen Returns” visit to Congress where “James Hansen, one of the world’s leading climate scientists, will today call for the chief executives of large fossil fuel companies to be put on trial for high crimes against humanity and nature”
It seems to me this poor, defenseless Mohonk data that never harmed no one has had barely speakable but high crimes committed against algorithms and nature. It may not be a martyr, but perhaps it can be a poster child.
So, do we have the raw Mohonk data? How about a graph?

JFA in Montreal
September 23, 2008 10:37 pm

And kudos for your choice of picture on that post.
Very à propos ! 🙂

crosspatch
September 23, 2008 10:39 pm

So if I take 100 readings with recorded temperatures in whole degrees, add them together and divide by 100, that gives me a number to a precision of two decimal places! Every year the data gets more “accurate”!
/sarcasm

Eric Anderson
September 23, 2008 10:47 pm

What Jeff Alberts said.

September 23, 2008 11:16 pm

As I recall from my 1960’s non-digital-era geodesy course, you can get accuracy greater than your precision with a sufficient number of measurements, but this is just the opposite, rounding to 0.5 degrees when your precision is 0.05 degree. Not quite ‘measure with a micrometer, cut with a chainsaw,’ but I suppose it met the requirements of the time.
Obviously back in 1896 they didn’t know that mere tenths of a degree could spell life or death for a penguin, polar bear, or buckeye tree.

Demesure
September 23, 2008 11:34 pm

Very nice presentation John Goetz, thank you.
“The quality of their observations is second to none on a number of counts,”
The observers should look at what the GISS has done to their “second to none” Mohonk lake data: no monthly data in the online database since 2007 (filled with 9999) !
2007 999.9 -6.8 1.4 6.8 16.0 999.9 999.9 999.9 999.9 999.9 999.9 999.9

Bobby Lane
September 24, 2008 12:01 am

I need a little help with something. I am reading elsewhere (i.e., not on this site) but I cannot make sense of a certain statement. Regarding the Eocene period, it is stated that:
“At the same time, however, equatorial temperatures were found to be about 4K colder than at present.”
I thought that might be four thousand at first, but then I remembered I was dealing with temperature. So I assumed the K is degrees Kelvin. Well, I googled a converter so I could find out what that meant in terms I could understand (Fahrenheit). It converted 4K to -452.5. But somehow reading the statement that “equatorial temperatures were found to be about 453F cooler than at present” is a bit difficult to stomach. Am I making a methodological error?
Here is the paragraph in that paper from which it comes, which is interesting reading in itself.
“In the first example, the original data analysis for the Eocene (Shackleton and Boersma, 1981) showed the polar regions to have been so much warmer than the present that a type of alligator existed on Spitzbergen as did florae and fauna in Minnesota that could not have survived frosts. At the same time, however, equatorial temperatures were found to be about 4K colder than at present. The first attempts to simulate the Eocene (Barron, 1987) assumed that the warming would be due to high levels of CO2, and using a climate GCM (General Circulation Model), he obtained relatively uniform warming at all latitudes, with the meridional gradients remaining much as they are today. This behavior continues to be the case with current GCMs (Huber, 2008). As a result, paleoclimatologists have devoted much effort to ‘correcting’ their data, but, until very recently, they were unable to bring temperatures at the equator higher than today’s (Schrag, 1999, Pearson et al, 2000). However, the latest paper (Huber, 2008) suggests that the equatorial data no longer constrains equatorial temperatures at all, and any values may have existed. All of this is quite remarkable since there is now evidence that current meridional distributions of temperature depend critically on the presence of ice, and that the model behavior results from improper tuning wherein present distributions remain even when ice is absent.” (from page 10)

Jon
September 24, 2008 12:07 am

A question came to mind: how does TOBS handle DST?

Bobby Lane
September 24, 2008 12:08 am

Nevermind, I got it. I put in, say 98 degrees F, converted it to K, subtracted 4 degrees from the K result, and converted it back to F. That makes more sense. The point was just that the equatorial regions were cooler than at present (which was stated) but not necessarily cold (which is what my first, and incorrect, calculation made me think).

Jan RH
September 24, 2008 1:40 am

John G. states: “yes, I am well aware that in this case it appears the GISS process slightly cools the record.”
But, correct me if I’m wrong, doesn’t the fact that Mohonk-GISS temp’s are going from positive to negative just mean that GISS temp’s are getting comparatively higher as time goes by?
Which means that the GISS process produces warming in the record.

Leon Brozyna
September 24, 2008 3:03 am

The words that come to mind — Rube Goldberg.

MattN
September 24, 2008 3:09 am

“This article claimed, in part, that the average annual temperature has risen 2.7 degrees in 112 years at this station.”
And 2.7 degrees appears to be exactly the amount of the overall adjustment since 1896….
Reply – However, surrounding stations don’t show that amount of increase and some even show a decrease. See Calling All Climate Sleuths for an example. – Dee Norris

September 24, 2008 3:31 am

It all started with the 2.7 Fahrenheit (?) per hundred years increase in the Mohonk Lakes data. Are the pristine data also giving that increase? How did NOAA change the increase, and, finally, what did GISS procedures do to it?

Chris H
September 24, 2008 3:43 am

“For one thing, Mr. Huth told the NYT that he recorded the temperatures at around 4:00 PM every day. However, every B91 I looked at signed by Mr. Huth indicated the time of observation was 5:00 PM. Maybe a nit, but hardly pristine.”
Maybe he really *does* take temp measurements at 4pm, but the time recorded in the database is adjusted for Summer Time? (Would make sense since Summer Time is a purely human convention, which might otherwise complicate analysis?)

Phil M
September 24, 2008 4:14 am

John RH
“John G. states: “yes, I am well aware that in this case it appears the GISS process slightly cools the record.”
But, correct me if I’m wrong, doesn’t the fact that Mohonk-GISS temp’s are going from positive to negative just mean that GISS temp’s are getting comparatively higher as time goes by?
Which means that the GISS process produces warming in the record.”
Yes – I’d agree with your reading of that
– as the (Mohonk – GISStemp) plot shows, the GISS process produces warming….surprise!

September 24, 2008 4:17 am

A very good and entertaining post; I knew about the adjustments previously, but hadn’t realised how much this resembles a kind of sausage factory for numbers. The long-winded process reminds me of the “think of a number” trick that we used to astound our friends with at school (at age 6 or thereabouts): “Add 100, then take away 5, then take away the number you first thought of…” Magic! No wonder it all somehow “adds up” to Global Warming. :o)

September 24, 2008 4:19 am

Bobby Lane
A degree Kelvin is the same size as a degree Centigrade. The C scale has zero at the freezing point of water, the K scale has zero at absolute zero. Your conversion method was correct.
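Worked out explicitly: an absolute temperature converts as °F = K × 9/5 − 459.67, but for a temperature difference the offsets cancel, leaving ΔF = ΔK × 9/5. So the paper’s “about 4K colder” is a difference of 4 °C, or 7.2 °F.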

Phil M
September 24, 2008 4:21 am

Rounding to integers (F)
– yes, it’s a pity that they do this
– which just goes to show that the temperature monitoring was never intended for the purpose for which it is now being used….
– but, by using large enough samples it *is* possible to recover the information that was lost in the rounding process.
– that’s the benefit of using a large sample set – the error introduced by the rounding can be effectively eliminated by using taking readings from many places
– although I do agree that there is an overall problem with the accuracy of the whole system, which Anthony has pointed out many times
– hence all the ‘correction’ factors that get applied….

Peanut Gallery(formerly know as the artist Tom in Florida)
September 24, 2008 4:38 am

This is a classic example of taking a simple thing, adding a large dose of government and presto:
awholebunchofstuffthatisallmixedupanddoesn’tdowhatwasoriginallyintended

Mike Bryant
September 24, 2008 4:47 am

Thanks John Goetz,
This is just mind-boggling. Of course, I’ve read of these adjustments, but to see them laid out like this…
I guess putting this data through all these acrobatics makes them oh so perfect.

September 24, 2008 4:55 am

Two words come to mind….
“Paralysis By Analysis”
Why in the world the information gets manipulated is beyond me…
http://www.cookevilleweatherguy.com

September 24, 2008 4:55 am

*LOL*…or was that 3 words?? Haven’t had enough coffee yet! 🙂

Mike Bryant
September 24, 2008 5:00 am

Like I said Phil, oh so perfect. Shame on you. I thought you were a scientist. Or was it sarcasm?

Phil M
September 24, 2008 5:25 am

Slightly off topic
– In September Satellite temps (lower troposphere, AMSU)
– it looks like this month is going to come out with an anomaly of around +0.2C
– the highest for this year…
It will be interesting to see if this coming winter is as cold as the last one
– or if we return to regular anomalies of around +0.2C….

MarkW
September 24, 2008 5:37 am

Is Hansen starting to lose it?
http://www2.ljworld.com/news/2008/sep/23/nasa_climate_expert_warns_dire_consequences_global/
“If we don’t get this thing under control we are going to destroy the creation,” said James Hansen,

Editor
September 24, 2008 5:49 am

Paul (04:19:39) :

Bobby Lane
A degree Kelvin is the same size as a degree Centigrade. The C scale has zero at the freezing point of water, the K scale has zero at absolute zero. Your conversion method was correct.

Yes, but your terminology is confused. Technically “degree Kelvin” shouldn’t be used. It may have happened when Centigrade was renamed Celsius (and cycles/sec became Hertz, etc.), but Kelvins were redefined to be used more like other units of measure.
The phrase “about 4K colder than at present” does have one confusion, i.e. K is for Kelvin and k is a prefix for a 1000 multiplier. The K in 4K isn’t a prefix, so it must be Kelvins. Consider “that pipe is 1m shorter than the old one.” If the old pipe was 1.618 meters long, you’d know the new pipe was 0.618 meters long. A melting ice cube’s temperature is 273 Kelvins or 0 degrees Celsius, not 273 degrees Kelvin. “About 4K colder than at present” is the same as “4 degrees C colder.”
In dog nights, it’s about one dog, i.e. a three dog night would be a four dog night.

Bill Marsh
September 24, 2008 5:53 am

Mark W,
“Starting to lose it”?

Editor
September 24, 2008 6:03 am

MarkW (05:37:56) :

Is Hansen starting to lose it?
http://www2.ljworld.com/news/2008/sep/23/nasa_climate_expert_warns_dire_consequences_global/
“If we don’t get this thing under control we are going to destroy the creation,” said James Hansen,

I think so. He seems to be becoming more and more messianic in his speeches and reaching out to large forums. There was a rock & environmental festival in the spring (that got mostly rained out) that he spoke at. From the coverage it seemed he was trying to expand his flock. As he seems to be losing his support from science as reality, he seems to be drawing more and more on a faithful following. Evan suggested that Hansen not be forced from NASA lest he become a martyr; my sense is the sooner the better. I do think that any Hansen watcher should look beyond the science to try to figure out where he’s going.

Editor
September 24, 2008 6:15 am

“science as reality”? I meant “science and reality,” though it seems to work either way. 🙂

Harold Ambler
September 24, 2008 6:21 am

I sent an e-mail to Benjamin Cook, the NOAA meteorologist whose data was used by the New York Times for its Mohonk House article, asking him about a year at the turn of last century shown to have a sub-freezing average temperature. Having lived in the Northeast for most of my adult life, I knew this was pretty unlikely! This is what he said:
“It turns out that the graphics people at the times converted between Celsius and Fahrenheit incorrectly, so the temperatures in the graph were way too cold (although the shape of the curve and the trends were the same). I’ve already notified them, and they said they would be fixing the online graphic.”
It was good of Benjamin to get back to me. Hopefully, the Times folks will do what they have promised.

Harold Ambler
September 24, 2008 6:29 am

P.S. The graph in the Times article, using the incorrectly converted figures, shows a 20-degree swing from the coldest annual temperature to the warmest. This also seems pretty surprising, and I have sent an e-mail to Benjamin asking him about it.

September 24, 2008 6:54 am

Successively rendering significant figures insignificant, then “creating” significant figures and adjusting them back into insignificance is mind boggling. At some point in this process, “data” disappears and is replaced by “number sets” which purportedly represent what the data sets shoulda/coulda/woulda looked like had they been collected timely from properly installed and calibrated instruments in the first place.
The suggestion that the denizens of this globe should invest more than $100 trillion to correct a “problem” projected to occur based on these “supple” number sets is laughable, at best.

Dan McCune
September 24, 2008 6:56 am

I just checked with http://www.surfacestations.org/ and this site has not been surveyed. Anyone nearby who could verify that the equipment is as reliable as the measurements?
Surveyed?: (blank)
Active?: (blank)
CRN Rating*: (blank)
USHCN ID: 305426
Station name: MOHONK LAKE
Distance from Post Office: 0.1, SE
State: NY
Lat: 41.77
Long: -74.15
GHCN ID: 72504006
Elev (ft): 1245
Location: MOUNTAIN HOTEL ON LAKE 4 MILES WNW OF PO AT NEW PALTZ, NY
MMS id: 20026
Reply – Ummm, Yes. See Calling All Climate Sleuths – Dee Norris

Bill Illis
September 24, 2008 7:06 am

Anthony’s trip to the NCDC this spring allowed us to see how much these adjustments alter the raw temperature records.
USHCN V2 has two adjustments: TOBS (increases the average trend by 0.2C) and the Homogeneity Adjustment (also increases the trend by 0.2C). So the adjustments increase the overall temperature trend (in the US) by 0.4C compared to the raw data.
These adjustments are shown in Slide 7 and Slide 16 of the powerpoint given to Anthony by the NCDC (not shown anywhere else on the net that I have seen).
(this link locks up sometimes)
http://wattsupwiththat.files.wordpress.com/2008/05/watts-visit.ppt#256,1,U.S. HCN Temperature Trends: A brief overview
Original can be found in this post by Anthony.
http://wattsupwiththat.com/2008/05/13/ushcn-version-2-prelims-expectations-and-tests/
Of course all the adjustments in USHCN V1 are shown in this chart (0.55F or 0.3C). So Version 2 increases the trend by a further 0.1C compared to Version 1.
http://www.ncdc.noaa.gov/img/climate/research/ushcn/ts.ushcn_anom25_diffs_urb-raw_pg.gif

Phil M
September 24, 2008 7:37 am

John Goetz
– sorry, I mis-read the graph title (several times!)
Mike Bryant
– me? I wasn’t being sarcastic
– you can get data back (statistically) that’s been ‘lost’ in rounding…
– you just need enough spatial or temporal samples…
– which is what the GISS software is trying to do by using thousands of sites…
– but it is interesting that the trend we’re looking for is of the same magnitude (per century) as the rounding error.
– presumably the results were rounded to 1F because it was felt that this was the useful limit of accuracy of the data, which is also interesting (i.e. the raw data isn’t very accurate)
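A quick toy simulation of the point, assuming day-to-day weather variation is large enough to act as natural dither (the non-random biases discussed elsewhere in this thread are another matter):

import random

random.seed(1)
true_mean = 60.3  # an assumed "true" mean, in degrees F
readings = [round(random.gauss(true_mean, 5.0)) for _ in range(10000)]
print(sum(readings) / len(readings))  # ~60.3, despite whole-degree rounding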

September 24, 2008 7:42 am

This is why I don’t trust the NOAA and GISTemp datasets. If they do this to temperature readings within the USofA, Lord knows what they do to temperature readings from the ROW. I still haven’t figured out what happens to Canadian data.

Phil M
September 24, 2008 7:55 am

Ric Werme (06:03:04), MarkW (05:37:56)
Hansen’s messianic tendencies
– oh dear, even if the direst predictions of AGW turn out to be true, the *planet* isn’t in any danger, nor is *creation*, which I assume means the majority of plant & animal life
– the only thing in any real danger is ‘human-civilisation’.
– which is pretty important to you & I, admittedly!
– but the planet isn’t in any danger from the natural gas that is CO2.

Retired Engineer
September 24, 2008 7:57 am

If the min-max temps are valid from the last TOB, the TOB shouldn’t make a lot of difference unless a min or max occurred right before the reading. What’s up with that adjustment?
Averaging many readings to ‘improve’ resolution works only if you have some random noise in the system. Assuming you can make an accurate reading of an integer value and you don’t have any systematic errors. Which is fine for measuring a fixed value. Temperature varies all over the place, day to day and season to season. I’m not convinced of the validity of this method. Long term trends, perhaps. 0.1C? Perhaps not.

John Cooper
September 24, 2008 7:58 am

Is the calibration of the thermometer actually used to measure the temperature checked on a periodic basis?

claudiomad
September 24, 2008 8:14 am

this is a great thing for us.

Paddy
September 24, 2008 8:45 am

Can someone explain why each station does not have both Celsius and Fahrenheit thermometers to avoid errors from conversion and reconversion?
REPLY: Because the COOP surface network was not designed to detect a climate change signal, it was an aid to forecast verification. – Anthony

Harold Ambler
September 24, 2008 9:34 am

Phil M (05:25:20) :
“Slightly off topic
– In September Satellite temps (lower troposphere, AMSU)
– it looks like this month is going to come out with an anomaly of around +0.2C
– the highest for this year…
It will be interesting to see if this coming winter is as cold as the last one
– or if we return to regular anomalies of around +0.2C….”
UAH shows most layers of the atmosphere cooler than at this time last year. The following link — http://igloo.atmos.uiuc.edu/cgi-bin/test/print.sh?fm=09&fd=23&fy=2007&sm=09&sd=23&sy=2008 — indicates rapid refreezing around the arctic basin and on the primary ice shield as well as significantly more snow cover in the Northern reaches of Canada and Russia than at the same time last year.
The chances of a warmer winter than last year don’t appear to be good — even if La Nina does not reactivate.

Frank Perdicaro
September 24, 2008 9:38 am

‘Averaging many readings to ‘improve’ resolution works only if you have some random noise in the system.’
The importance of this cannot be overstated. The concept “random” has a very specific meaning. Human reading bias is not random, and placing parking lots, AC vents, cars, and buildings near a recording site does NOT provide for a random result. Assuming a Gaussian distribution on some data set where no such distribution exists, and then doing fancy math on the result, might win grant proposals, but the process has no value at all. Only faulty conclusions can be drawn.
Venturing slightly off topic, it is improper analysis of EXACTLY this sort that gave us the current mortgage mess. The same hubris, ignorance and political motivation applies in both cases.

Joe Black
September 24, 2008 9:44 am

B91 example:
here

Mike Bryant
September 24, 2008 10:02 am

Phil M,
“- the only thing in any real danger is ‘human-civilisation’.”
You are correct. If the laws that have been proposed to lower CO2 emissions are enacted, civilization is in very grave danger.

Pamela Gray
September 24, 2008 10:29 am

Harold, I also noted the circle edge of ice forming in the Arctic (all the way to the Strait and then around to the Russian side of the Arctic). I had to go to October of last year to find the event occurring. The temperature of the connecting point between land and water in the Arctic circle seems evenly colder today compared to last year. That would tell me that the entire area is colder this year than last. Only time will tell but I predict that ice extent and thickness of ice will be greater this year than last, indicating a possible tipping point back to a pre-warming state.

George
September 24, 2008 10:40 am

Richard Lindzen has just published a new paper which documents the full range of how science is being “adjusted” to fit the correct “facts” and the motivating issues behind such adjustments. I think it is worthy of its own column here, but it certainly explains why the issues John is investigating here happen in the first place. Here’s the link to the full paper:
http://arxiv.org/ftp/arxiv/papers/0809/0809.3762.pdf

Ed Scott
September 24, 2008 11:25 am

Utilizing those reliable data processing standbys: Finagle Factor, Bougerre Factor and the Diddle Coefficient.

Josiah
September 24, 2008 12:00 pm

Blue Hill Observatory has also been using the same instruments since record keeping began there in 1885.
http://www.bluehill.org/

brazil84
September 24, 2008 12:28 pm

Maybe by “creation,” Hansen is referring to his own Cult of Global Warming.
I sometimes wonder if the warmists aren’t starting to get really worried that the Earth will go into a serious cooling phase. If that happens now, they are ruined. On the other hand, if they can convince the world to stabilize CO2 levels, and the Earth goes into a cooling phase, they can claim credit and victory. This might explain their sense of urgency.
Just speculating.

Dave Dodd
September 24, 2008 12:30 pm

So the pristine data, albeit with +/- 0.5F uncertainty, is corrupted at the moment it is converted to Centigrade (I’m an old slide rule guy, too!) by magically “increasing” its accuracy to 0.18F (0.1C) and likely allowing downstream software (e.g. climate models) to massage any “bias” thus injected at the time of conversion? Nah! They wouldn’t do that…..!

Phil M
September 24, 2008 12:47 pm

Phil M,

“- the only thing in any real danger is ‘human-civilisation’.”
You are correct. If the laws that have been proposed to lower CO2 emissions are enacted, civilization is in very grave danger.”
– the ‘cure’ may be worse than the ‘disease’
BTW, I’m not saying I agree with the AGW scaremongering, only that even if it’s true, the planet, creation, life-on-earth will get along just fine
– humans may have to adjust….
– and to say otherwise is just more scaremongering, which, I think in itself, is interesting.

Mike Bryant
September 24, 2008 1:08 pm

Phil M,
I agree that scaremongering is not helpful. I also agree that adjusting, or adapting to temperatures, whether up or down, is the correct strategy.

Phil M
September 24, 2008 2:54 pm

Mike Bryant
I mean, Mr H is (scaremongering ^ 2) to an irrational degree, which is interesting…
– if he really believes that stuff about destroying the planet…

LarryOldtimer
September 24, 2008 5:36 pm

I am a retired professional civil engineer, and majored in physics. When I took a heavy duty chemistry course back in 1953, the professor gave a pre-lab lecture, followed by a lab, all about how to read a mercury thermometer, and explained why we shouldn’t be worried if our experiments didn’t come out right. He explained how and why mercury thermometers weren’t reliable, particularly in mid-range readings. I don’t think that those mercury thermometers used then, whenever then was, have improved with age.
It is senseless and useless to attempt to “correct” or “adjust” those old thermometer readings. The differences in measurements are subject to a degree of inaccuracy well within the margin of error, which means that there aren’t really any significant differences that could or can be detected.
Figures don’t lie, but liars figure.

Ray
September 24, 2008 5:39 pm

Interesting…I work with/for a guy who has built and sold a number of businesses. His philosophy is to always look first at the back door: how do I get out of this thing I create? I suspect proponents of our overheated demise are now checking doors.
Damn I like this place.

CPT. Charles
September 24, 2008 6:14 pm

Slightly off topic, yet not…
http://www.americanthinker.com/blog/2008/09/corrupted_science_revealed.html
I’ll let you all decide.

Rod Smith
September 24, 2008 7:22 pm

Joe Black (09:44:16) :
Thank you for the look at a B91:
My comments would be:
1. The observer has apparently followed what has always been standard procedures (or at least were for years in the reading of weather instruments) and both read and recorded the instrument to the nearest 1/10 of marked intervals.
2. He has also recorded it in the smallest unit (F not C) so that accuracy is maximized.
3. Converting units and then rounding will almost always result in some distortion of the starting value, but I see no value in converting units to calculate means, calculate highs or lows, and other simple operations. Conversions to other units should occur AFTER such calculations, not during or before.
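A small numerical illustration of point 3 (the temperatures are made up):

temps_f = [61, 62, 62, 63]

def f_to_c(f):
    return (f - 32) * 5.0 / 9.0

# Rounding each converted value first, then averaging:
convert_then_average = sum(round(f_to_c(t), 1) for t in temps_f) / len(temps_f)
# Averaging in the original units, converting (and rounding) last:
average_then_convert = round(f_to_c(sum(temps_f) / len(temps_f)), 1)
print(f"{convert_then_average:.3f} vs {average_then_convert:.3f}")  # 16.675 vs 16.700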

September 24, 2008 8:00 pm

Thank you CPT. Charles for that excellent link!
Those who want to understand exactly how climate science has been hijacked and corrupted by the Greens/Leftists should read Prof. Richard Lindzen’s exposé: click
M.I.T.’s Dr. Richard Lindzen is one of the few really brave scientists who dare to point out what is happening in climate science. No doubt he will be attacked by the global warming contingent even more viciously than Gov. Palin.
After reading Prof. Lindzen’s critique, you will understand how the climate science community has been completely hijacked by the eco-environmentalist movement.
Lindzen’s paper is required reading in order to understand why certain posters on this site run their constant interference, using ad hominem attacks against anyone opposing Al Gore’s AGW/CO2/climate catastrophe hypothesis.
After reading Dr. Lindzen’s paper, you will understand what is really going on behind the scenes in the the AGW/runaway global warming argument.

Dave Dodd
September 24, 2008 8:16 pm

LarryOldTimer:
ANOTHER slide rule guy! I attended the same lecture, only a few years later. I keep harping about nearly the same problems with mercury thermometers that supposedly produced the data sets extending back into the 1700s(!) which claim to show temp variances out to 0.01 degrees. I too have wondered about the long-term calibration of mercury thermometers, since glass is actually plastic and “sags” over long periods of time due to gravity. Would not the tube change dimensions over two hundred years? How about 50 years? Would vaporized mercury permeate the glass, making less available in the measuring column? Does anyone know? Does anyone care?
Remember: With slide rules we went to the moon. With computers we haven’t been back!

F Rasmin
September 24, 2008 8:33 pm

Dave Dodd and LarryOldTimer. Never mind these new fangled devices such as slide-rules! Have you ever attempted to do temperature adjustments involving Fahrenheit to Centigrade and back using an abacus whilst penning your work with Roman Numerals?

Greg Smith
September 24, 2008 8:35 pm

Here’s the latest from NASA where they acknowledge urban heat islands. Must be a first.
http://www.nasa.gov/topics/earth/features/heat_wave_los_angeles.html

September 24, 2008 9:08 pm

Mr Smokey said (20:00:48) :
“Thank you CPT. Charles for that excellent link!
Those who want to understand exactly how climate science has been hijacked and corrupted by the Greens/Leftists should read Prof. Richard Lindzen’s expose”
If I might say so, Mr Smokey, you overstate the position. My interpretation of Professor Lindzen’s article is that he was pointing out one way in which the debate has been influenced. Just because some people whom I might consider to be a sandwich short of a picnic have argued for something does not, of itself, mean their position is wrong. Many a nutter is right, just for the wrong reasons. I don’t believe the tree hugging, touchy-feely, no-shampoo-for-me-thank-you mob to be right but whether they are or not is not determined by the amount of hand-chewed raffia clothing they wear but by the strength of the case they present. Equally, that I am urbane, sophisticated and have charm you could iron linen with does not mean I am right.
The value of Professor Lindzen’s article is that it promotes caution and objective analysis in an attempt to separate scientific investigation from political grandstanding.

denny
September 24, 2008 9:23 pm

F Rasmin (20:33:35) :
Dave Dodd and LarryOldTimer. Never mind these new fangled devices such as slide-rules! Have you ever attempted to do temperature adjustments involving Fahrenheit to Centigrade and back using an abacus whilst penning your work with Roman Numerals?
To which I might add….With a chisel…on stone tablets???

Bobby Lane
September 24, 2008 9:30 pm

Rick and Paul,
Thanks for your help. What I figured out was that the tropics at the time of the Eocene were about 8-10 degrees F cooler than they are now, according to studies of the Eocene period. I don’t know how it would have affected flora and fauna, apart from being certain that it would have, but it is interesting. It appears at that time that Minnesota could well have been warmer overall than Colombia, which is fascinating in all kinds of ways.

September 25, 2008 4:24 am

I got the impression that the weather station pictures that littered one episode of the BBC Iain Stewart series were taken from Anthony Watts’ superb records. Anthony – you said somewhere earlier you’d been contacted by the BBC. If so, surely this would be another misrepresentation a lot bigger than Wunsch’s complaints re Swindle, since your aim is to check the still-unresolved serious doubts about the records?
Thanks Fat Bigot for wise words about Prof Lindzen’s new paper – also visible at ICECAP. I too think it deserves a thread here or at CA but it’s extremely difficult not to get carried away with emotions when dealing with these issues of vulnerability to corruption in Climate Science – as the recent blog on Hansen here shows – and that helps nobody in the end.
But we all do it at times… as Steve would say, take a deep breath…

Retired Engineer
September 25, 2008 10:21 am

Converting F to C? No problem. C=(F-XXXII)*V/IX
Any measurement has an error budget, from a variety of sources, including the observer. Mercury thermometers are no exception. Assuming pure mercury (perhaps a WAG in itself) there should be no diffusion loss. The vapor pressure doesn’t come into play (an alcohol thermometer is another story). While glass does sag, that process is extremely slow.
The two big questions are: variations in capillary diameter and calibration. The thin hole in the measuring part does vary in size. How much is a good question. It affects linearity. How good is the match between this thermometer and the next? That’s the calibration part. Not easy to generate precise temperatures. 0 C is not too hard, anything else depends on a lot of factors.
So I have serious questions about “pristine” data, adjusted or not. I spent many years measuring stuff, temperature included. I also saw a lot of abuses, papers published out to four decimal places with equipment that had 1% accuracy at best. Lots of digits impress people. Gives the impression you know what you are doing, even if it is floobydust. (h/t to Robert Pease)
<1 degree C in a hundred years? That’s below the noise threshold.

September 25, 2008 11:31 am

FatBigot:
“The value of Professor Lindzen’s article is that it promotes caution and objective analysis in an attempt to separate scientific investigation from political grandstanding.”
That was my point. What other well known climatologist has been willing to author a paper that shows how the environmental lobby has inserted numerous partisan promoters of AGW/runaway global warming into positions in which they speak out on behalf of the entire organization?
Prof. Lindzen says explicitly what other climate scientists have only hinted at, and he names names. When an environmentalist advocate, who has no grounding in climatology or meteorology, and in some cases no scientific background at all, manages to become the spokesperson for an entire scientific organization, naturally the media will report what the partisan non-climatologist says. And that is exactly the purpose of these enviro-machinations.
The more people that see how the climate issue is being dishonestly manipulated by partisan advocates of a predetermined outcome, the better, IMHO.

September 25, 2008 2:18 pm

John,
Thanks for your excellent post.
Mr. DePalma’s reference to phenological records is interesting, especially the longest and “best” of these, the Kyoto cherry blossom record. A good summary w/ pictures and graphs of that record, with explanation of its validity as a temp record, is here:
http://arnoldia.arboretum.harvard.edu/pdf/articles/1893.pdf

The “big picture”, as far as I can see, is not whether it is warmer now than it was 100 years ago. I’m not even sure the data “tinkerers” can overcome the evidence that there were warm periods in the past prior to industrialism which equalled or exceeded the present-day warmth.

September 25, 2008 2:22 pm

A-a-n-n-d another block quote bites the dust. Here, in plain quotes, the Kyoto Cherry Blossom record summary:
“The calculations show that during the 11th through the 13th centuries, average temperatures were at their warmest averages, often as high as 8° C , as indicated by early dates of the cherry blossom festival. There were occasionally very cold years, as indicated by late flowering years, but on the whole this was the warmest average period.”


Harold K McCard
September 25, 2008 2:32 pm

John Goetz,
I’m sure that you know that the daily temperature data from the Mohonk Lake surface station can be found for the interval 05/1948 through 12/2005 at
http://cdiac.ornl.gov/cgi-bin/broker?_PROGRAM=prog.climsite.sas&_SERVICE=default&id=305426
I selected a random sample of ten completed B91 Forms for Mohonk Lake from the site that you referenced
http://www7.ncdc.noaa.gov/IPS/coop/coop.html?_page=2&state=NY&foreign=false&selectedCoopId=305426&_target3=Next+%3E
and verified that the values for TMAX(F) and TMIN(F), with one exception, were identical. The exception which occurred on 07/12/1956 is an obvious transcription error; the value (67°F) listed in the “At OBSN” column was recorded for TMIN(F) instead of the correct value (64°F). In addition to TMAX(F) and TMIN(F), the USHCN site also lists TAVE(F) which equals [TMAX(F) + TMIN(F)]/2 rounded to the next higher integer when either TMAX(F) or TMIN(F) is an odd integer. Therefore, I believe the data archived at the site that I referenced is the “RAW” daily temperature data for station 305426.
So far, so good …
I then calculated the monthly averages for TAVE(F), TMAX(F) and TMIN(F), for the interval 05/1948 through 12/1959 and compared the results with the corresponding monthly values that I downloaded from
http://cdiac.ornl.gov/cgi-bin/broker?_PROGRAM=prog.climsite_monthly.sas&_SERVICE=default&id=305426
How did they compare? Not so good!!
The average differences for the 139 month interval are (s.d. shown in parentheses):
RAW TAVE(F) – HCN TAVE(F) = 1.39°F (0.43°F)
RAW TMAX(F) – HCN TMAX(F) = 1.43°F (0.62°F)
RAW TMIN(F) – HCN TMIN(F) = 0.83°F (1.17°F)
What adjustments do you think were made by NCDC that caused these differences?
I may add to this post after I complete my examination of the 1948:2005 RAW data set.
Reply: Anthony’s spam bucket tends to grab stuff with a high link-to-text ratio. I think your post was above whatever that limit is. It looks like one of the moderators did find it and let it get through.
As for what you are seeing, I suspect it is homogenization, TOBS, and FILNET differences.
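The TAVE rounding rule described above is easy to encode; a sketch matching that description (not any published NCDC code):

def tave(tmax_f, tmin_f):
    """Daily TAVE per the description above: (TMAX + TMIN) / 2, rounded up
    to the next integer when the sum is odd (i.e., when exactly one of
    TMAX and TMIN is odd)."""
    total = tmax_f + tmin_f
    return total // 2 + total % 2  # integer average, rounding halves up

assert tave(67, 64) == 66  # 65.5 rounds up to 66
assert tave(70, 60) == 65  # even sum, no rounding needed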

Harold K McCard
September 25, 2008 2:44 pm

Moderator,
I tried to post a comment a short time ago but I didn’t receive the usual response “awaiting moderation.” I tried submitting it again but still no response.
Did you receive anything from me?
hmccard
Reply: No – Anne

More:
Nevermind, found them in the spam bucket – Anne

Frank Lansner /Denmark
September 26, 2008 12:35 am

Yes, fantastic writing!!
Here comes a stuuupid question, To Goetz or anyone:
Goetz you write:
“Here is a plot of the difference between the monthly raw data from Mohonk Lake and the data GISS creates in GISTEMP STEP0 (yes, I am well aware that in this case it appears the GISS process slightly cools the record). ”
Hmmm…
If GISS in 1880 is 0,4 degrees COLDER than Mohonk
– and in 2000 is 0,0 degrees COLDER than Mohonk
Yes, then the COOLING done by GISS is bigger in 1880 than 2000 ?
And then GISS has induced a WARMING TREND !?
– ups i have the feeling that i got something wrong, but i have to mention…
K.R. Frank
Reply by John Goetz: By cooling the record, I mean the trend from 1896 – 2006 in GISS is slightly less warm than the trend in the raw data. GISS actually warms the older temperatures and cools the later ones, thus reducing the overall trend.

Harold K McCard
September 26, 2008 12:33 pm

John Goetz,
Thanks, I’m a novice in this field and do not fully understand NCDC’s homogenization, TOBS, and FILNET adjustments. Here are a few of my comments and observations:
Homogenization: I really don’t know anything about that aspect.
TOBS: TOBS is a mystery to me. I read BarryW’s September 24th, 2008 at 12:50 pm comment to your post on CA and I have read the general discussion of the subject on the USHCN website. Referencing all station data to midnight may be logical, but it is not apparent to me how that adjustment is made.
FILNET: Surprisingly, the 1948:2005 daily RAW data set with 3X21062 data points has only 14 missing data points; two days where TAVE and TMAX are both missing, two days where TAVE and TMIN are both missing and two days where TAVE, TMAX and TMIN are all missing. On each of these six days, the values for TMAX and TMIN were listed on the B91 Forms. Therefore, it doesn’t appear to me that FILNET was of much use.
As regards Mohonk Lake, I examined the 12 B91 Forms for 1951 and noted that 4 PM was the Hour of Observation on all forms. For 1951, the monthly differences between RAW and HCN temperatures (∆Ts) are:
MONTH ∆TAVE ∆TMAX ∆TMIN
JAN 1.33°F 1.03°F 1.08°F
FEB 1.64°F 1.56°F 1.30°F
MAR 1.19°F 1.01°F 0.85°F
APR 1.32°F 1.23°F 0.88°F
MAY -1.05°F -3.21°F 0.73°F
JUN 1.05°F 1.09°F 0.69°F
JUL 1.04°F 1.16°F 0.54°F
AUG 1.25°F 1.32°F 0.76°F
SEP 1.53°F 1.66°F 0.93°F
OCT 1.39°F 1.13°F 1.18°F
NOV 1.42°F 1.32°F 1.22°F
DEC 1.06°F 0.88°F 0.91°F
AVE 1.10°F 0.85°F 0.92°F
I chose 1951 to review because of the negative values for ∆TAVE and ∆TMAX in May. It turns out that the value of 77°F recorded on the B91 Form for that month was transcribed in the RAW data set as -77°F. It appears to have been corrected by NCDC before including it in the HCN data set. The reasons for the other differences are not apparent to me.
Reply by John Goetz: The homogenization algorithm does catch the type of outlier you describe (the transcription error). However (and I have not seen the code), I do not believe it will necessarily take a -77°F value and change it to 77°F. It might change it to a 63 or 84 or some other number that is based on the values of the “closely correlated” nearby stations.
Ideally, NOAA would spit out a list of the outliers and have a transcriptionist go back and manually double-check the B91. They just need to do that once, and then once a month after that on all newly-added data.
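That gross-error pass could be as simple as a bounds check. An illustrative sketch (the limits are made up, not NOAA’s actual QC thresholds):

def flag_gross_errors(daily_f, low=-60.0, high=130.0):
    """Return (day_index, value) pairs for physically implausible readings,
    e.g. a -77 F that was really 77 F, so a transcriptionist can recheck
    them against the original B91 form."""
    return [(i, t) for i, t in enumerate(daily_f)
            if t is not None and not (low <= t <= high)]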

Harold K McCard
September 26, 2008 3:19 pm

Re: My 09/26/08 (12:33:15) Post
The second sentence in the last paragraph should have read: “It turns out that the value of 77°F recorded on the B91 Form for second day of that month…”

Harold K McCard
September 28, 2008 4:18 pm

John Goetz,
Thanks again for your insights. I have commented previously on this blog that I have been puzzled by what appeared to me to be step-wise adjustments that NCDC makes to surface station data. As I’m sure you know, WRDC archives surface station data, which I compared to USHCN data for several surface stations, observing the step-wise differences. I wondered whether similar step-wise differences might be observed between NCDC’s daily and monthly data sets for Mohonk Lake, NY (305426). I have completed my analysis and observed similar step-wise differences between the daily and monthly data sets.
I use Excel but, unfortunately, I haven’t learned how to export Excel graphics to WordPress. It would be much easier to convey what I have observed if I could do so. If you permit me, I’ll try to briefly explain what I have observed.
First, I refer to the HCN data set as the RAW data set. I updated the RAW data set by correcting the -77°F transcription error and using the B91 Forms to fill the data cells for the missing six days. While doing that, I realized that the entire B91 Form for April 1965 had not been transcribed properly. Therefore, I incorporated the B91 Form data.
I then calculated the monthly averages for TAVE(F), TMAX(F) and TMIN(F), for the interval 05/1948 through 12/2005 and compared the results with the corresponding monthly values that I downloaded from
http://cdiac.ornl.gov/cgi-bin/broker?_PROGRAM=prog.climsite_monthly.sas&_SERVICE=default&id=305426
I defined the following adjustments:
1. Adj RAW TMAX = RAW TMAX – HCN TMAX
2. Adj RAW TMIN = RAW TMIN – HCN TMIN
3. Adj RAW TMED = Adj RAW TMAX – Adj HCN TMAX
4. Adj RAW TAVE = Adj RAW TMED + RAW del TAVE – HCN del TAVE
Where
1. RAW del TAVE = RAW TAVE – RAW TMED
2. HCN del TAVE = HCN TAVE – HCN TAV3
Briefly, here are some of my observations:
1. Step-wise adjustments occurred in 1954-56 and in 1990.
2. Step-wise changes in Adj RAW TAVE:
Average 1949:1954 = 0.39°F
Average 1956:1980 = 0.63°F
Average 1991:2005 = 0.03°F
3. Step-wise changes in Adj RAW TMAX:
Average 1949:1954 = 1.27°F
Average 1956:1980 = 1.85°F
Average 1991:2005 = 1.20°F
4. Step-wise changes in Adj RAW TMIN:
Average 1949:1954 = 0.92°F
Average 1956:1980 = 1.02°F
Average 1991:2005 = 1.62°F
So … does this make any difference from a climate change perspective? I don’t know. For the 1949:2005 interval, simple linear regression shows the following:
RAW TAVE = 1.79°F
HCN TAVE = 1.55°F
RAW TMAX = 2.56°F
HCN TMAX = 2.97°F
RAW TMIN = 1.01°F
HCN TMIN = 0.19°F
I guess it depends upon your perspective.
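[A minimal sketch of this kind of trend comparison, on the reading that the figures quoted above are the fitted change over the whole interval (slope times span) — an assumption here, not something stated in the comment:

import numpy as np

def total_change(years, values):
    """Least-squares slope in degrees F per year, scaled to the full span."""
    slope, _intercept = np.polyfit(years, values, 1)
    return slope * (years[-1] - years[0])

# e.g. compare total_change(years, raw_tave) with total_change(years, hcn_tave)
# for annual means over 1949:2005; the names are placeholders for the
# series described above.]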

Harold K McCard
September 28, 2008 5:57 pm

Re: My last post
HCN del TAVE = HCN TAVE – HCN TAV3
should have read
HCN del TAVE = HCN TAVE – HCN TMED
Sorry about that …

Mike Bryant
September 28, 2008 6:41 pm

I am glad you clarified that, Harold.

Harold K McCard
September 29, 2008 8:24 am

Re: My 09/28 (16:18:10) Post
I should have noted that under “…some of my observations:” the step-wise changes in Adj RAW TAVE, Adj RAW TMAX and Adj RAW TMIN were with respect to the average annual temperature.
Also, another correction:
3. Adj RAW TMED = Adj RAW TMAX – Adj HCN TMAX
should have read
3. Adj RAW TMED = [Adj RAW TMAX + Adj RAW TMIN]/2
Again, sorry about the confusion. I guess I was in too much of a hurry to go to dinner.
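[Putting the corrected definitions together, a minimal sketch, assuming the monthly RAW and HCN series are aligned numpy arrays in °F:

import numpy as np

def adjustments(raw_tmax, raw_tmin, raw_tave, hcn_tmax, hcn_tmin, hcn_tave):
    adj_tmax = raw_tmax - hcn_tmax                    # 1. Adj RAW TMAX
    adj_tmin = raw_tmin - hcn_tmin                    # 2. Adj RAW TMIN
    adj_tmed = (adj_tmax + adj_tmin) / 2.0            # 3. Adj RAW TMED (corrected form)
    raw_del = raw_tave - (raw_tmax + raw_tmin) / 2.0  # RAW del TAVE
    hcn_del = hcn_tave - (hcn_tmax + hcn_tmin) / 2.0  # HCN del TAVE (corrected form)
    adj_tave = adj_tmed + raw_del - hcn_del           # 4. Adj RAW TAVE
    return adj_tmax, adj_tmin, adj_tmed, adj_tave

The function names and array inputs are assumptions; the arithmetic follows the numbered definitions as corrected in the follow-up comments.]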

willem van aerschot
October 1, 2008 1:36 pm

Solution for the Greenland Gulf Stream slowdown
http://www.guardian.co.uk/flash/0,,1656541,00.html
Clean water, clean energy, and a solution for the Greenland Gulf Stream slowdown:
Put to sea an array of windmills combined with electric boilers, preheated by solar collectors and mirrors. The steam they produce drives steam engines, which in turn produce electricity, salt, and, when the steam leaving the engines is condensed, fresh water. The intake is salt water from the warm Gulf Stream, and the leftover salt from the steam engines is released into the cold stream. The cold stream thus becomes saltier again and will not mingle with the warm Gulf Stream; melting glaciers had made it less salty, and because less salty water is lighter, that was slowing the warm Gulf Stream down. This could help prevent a possible ice age.
The fresh water can then be used to make hydrogen, so if, for example, there is a lot of wind and the electricity is not all needed, it can be stored and used later. The rest of the steam is vented to the air. The oxygen left over from turning water into hydrogen can be diffused into the sea to clean the water and make it more suitable for marine life.
So you produce electricity and/or hydrogen.

October 10, 2008 7:03 am

Am I missing something, or does one have to pay to access “raw” USHCN data?
REPLY: Yes and no: you can get the absolutely raw B91 data forms free, but you have to hand-transcribe them. If you want a CD of the data, then yes, you have to pay NCDC for it. – Anthony

October 10, 2008 11:45 am

Aw geez… From Environment Canada it’s an easy (if big) download, and current data for all stations is viewable and can be cut and pasted into Excel.

hmccard
October 19, 2008 1:56 pm

John Goetz,
TOBS continues to be a mystery to me. I compared the areal data with the COBS data in the hcn_doe_mean_data file for the 1900:2006 interval and was perplexed by the results. Significant step-wise changes in the difference between the areal and COBS data occurred in 1909 and 1955. Between those changes, all difference values were constant, as listed below:
MONTH 1900-08 1909-55 1956-2006
JAN 1.10°F 1.10°F 1.40°F
FEB 1.10°F 1.50°F 1.80°F
MAR 0.70°F 0.90°F 1.30°F
APR 0.60°F 1.10°F 1.50°F
MAY 0.50°F 1.30°F 1.60°F
JUN 0.30°F 0.88°F 1.20°F
JUL 0.30°F 0.80°F 1.00°F
AUG 0.50°F 1.00°F 1.20°F
SEP 0.70°F 1.31°F 1.60°F
OCT 0.80°F 1.11°F 1.50°F
NOV 1.30°F 1.31°F 1.70°F
DEC 0.90°F 0.90°F 0.90°F
AVE 0.72°F 1.10°F 1.41°F
DJF 1.00°F 1.18°F 1.43°F
MAM 0.58°F 1.09°F 1.47°F
JJA 0.37°F 0.90°F 1.13°F
SON 0.94°F 1.23°F 1.60°F
I also compared the areal data with the COBS data in hcn_doe_max_data and hcn_doe_min_data files for the 1900:2006 interval and observed the same step-wise pattern, although the values were different.
I have also made the same comparisons for other surface stations and observed similar perplexing differences between areal and TOBS data. For example, Fort Collins, CO (053003) for the 1900:2006 interval displayed significant step-wise changes in TAVG, TMAX and TMIN in 1905 and 1940, but there was no difference between the areal and TOBS data between 1905 and 1940.
Can anyone explain to me how NCDC calculates TOBS?
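[One way to surface step-wise changes like these, as a sketch (the 0.2°F threshold and the year-keyed inputs are assumptions, not NCDC’s method):

def step_changes(areal, cobs, threshold=0.2):
    """Scan annual areal-minus-COBS differences for jumps larger than
    `threshold` degrees F; `areal` and `cobs` map year -> annual mean."""
    years = sorted(set(areal) & set(cobs))
    diffs = {y: areal[y] - cobs[y] for y in years}
    steps = []
    for prev, cur in zip(years, years[1:]):
        if abs(diffs[cur] - diffs[prev]) > threshold:
            steps.append((cur, round(diffs[cur] - diffs[prev], 2)))
    return steps

# For Mohonk Lake this scan should flag 1909 and 1955, where the difference
# jumps and then stays constant until the next step.]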

hmccard
October 19, 2008 2:19 pm

Re: My last message
I should have indicated that the first part of my message pertained to Mohonk Lake.

hmccard
October 20, 2008 1:25 pm

John,
Thanks for directing me to the CA thread. I found a copy of Karl, et al, 1986 in Hu McCulloch’s comment #110:
http://ams.allenpress.com/archive/1520-0450/25/2/pdf/i1520-0450-25-2-145.pdf
I read the article hurriedly but I need to give it more thought. My first impression is that TOBS may introduce considerable distortion to the areal data. For example, the month-to-month variability for Mohonk Lake listed in my 10/19 message, especially in the 1956-2006 interval, doesn’t seem logical to me. This variability may be due to the spatial patterns of δ shown in Fig. 6 of the article. The gradients for δ are quite steep during DJF in the Mohonk Lake region. The iso-contours also shift quite rapidly.
I found nothing in the TOBS article that explains the significant step-wise changes in the difference between the areal and COBS data that occurred in 1909 and 1955 for Mohonk Lake. I checked several monthly B91 Forms before and after 1909 and 1955, and there was no significant change in Time of Observation. (In 1954 and 1956, the time listed on the B91 forms was 4:00 PM; in 1908 and 1910, the time listed was either 6:00 PM or “about sunset.”)
In the Fort Collins example that I mentioned where there was no difference between the areal and TOBS data between 1905 and 1940 … I guess the TOBS “adjuster” must have been “OFF” during that interval!!
Needless to say, I am still puzzled by the inexplicable step-wise changes that I have observed. I’ll examine some more sites when I have time and see if a similar pattern persists.
BTW, I noticed that many of the comments on the CA thread referred to COBS as though it was used to adjust the daily TMAX and TMIN data. Since the B91 Forms list only the time of observation for the month, it seems clear to me that COBS is a monthly adjustment.
Thanks, also, for the guidance on how to export a graphic from Excel to WordPress. I’ll try using it to display some of the step-wise changes that have been puzzling me.

November 13, 2008 2:44 pm

[…] is there even a temperature increase? The ‘average’ temperature on Earth is calculated from such unbelievably incorrect, inaccurate, estimated, made up, averaged temperature readings that are adjusted with […]

December 16, 2008 6:45 am

[…] See, the “bold action” in the quote above refers to Climate Change, and if something has nothing to do with science (or facts for that matter) and everything to do with politics, it’s Climate Change (formerly: Global Warming). GW is a movement based entirely upon junk science, incomplete computer models and incredibly inaccurate temperature readings. […]

January 25, 2009 10:16 am

[…] the other hand, with a plethora of issues with GISS data, including adjustments to pristine data, failing to catch obviously corrupted data, significant errors in splicing and reporting pointed […]