Record highs? – NOAA staffers are beginning to doubt the accuracy of the measurement system

While Joe Romm squawks about record highs being “obscenely hot” over at Think-Climate Progress, there’s a quiet bit of questioning going on within NOAA about the veracity of the surface temperature measurements, particularly those from ASOS stations at airports, which account for a significant number of the record high temperatures set in the USA this June.

Below is an extraordinary exchange between Dr. Roger Pielke Sr. and Greg Carbin of the NOAA Storm Prediction Center.

Discussion Of June All-Time Record Maximum Temperatures In The United States Since 1950 And Possible Effect Of Instrument Changes

Figure and Data Analysis Provided by Greg Carbin of NOAA

I received the e-mail below from Conrad Ziegler that alerted me to a really important analysis of siting issues with respect to surface temperature data. The relevant part of Conrad’s e-mail, and the e-mails on this figure by Greg Carbin and Richard Grumm are reproduced below (with their permission).

Date: Thu, 30 Jun 2011 08:44:32 -0500

From: Conrad Ziegler <xxxxxx>

To: pielkesr@xxxxxx
Cc: gregory.carbin@xxxxx, richard.grumm@xxxxxxx, Conrad Ziegler <conrad.ziegler@xxxxxx>

Subject: Fwd: [Map Discussion] All-time maximum temperatures broken or tied in June


Hello Roger!

I hope this message finds you well!

I thought you’d be interested in the correspondence below between Rich Grumm and Greg Carbin about US record high temperatures since 1950. I’ve attached Greg’s interesting figure that accompanies his message below (Greg, I hope you don’t mind my sharing this). I know you have long advocated the need for adequate siting of surface operational meteorological sensors worldwide, and have written on the subject of the effects of poor siting on temperature time series. Rich’s message raises the bar, so to speak: the instrumentation itself needs to be properly designed and the sensors aspirated correctly. One would have hoped this was at minimum a standard throughout the US, though reading Rich’s message I’m no longer so sure.

Best Regards,


Following is Greg’s initial e-mail on his analysis

From: gregory.carbin@xxxxx
Date: Wednesday, June 29, 2011 10:37 pm

Subject: All-time maximum temperatures broken or tied in June

To: HWT Map Discussion Listserv <discussion@xxxxxx>, SUNY Map Listserv <map@xxxxxx>


I was wondering how the 41 all-time max temperature records broken or tied in the U.S. during June 2011 compared to other Junes of the past. So, I wrote a script this evening to grab the all-time annual maximum temperatures that were broken or tied during the month of June, for all Junes, back to 1950. The resulting chart is attached. June 2011 has had the most all-time maxes broken or tied (16 broken/32 tied through June 29) since June 1998, when 89 all-time max temperatures were either broken (31) or tied (58). However, the month of June 1994 really bakes the cake! That’s when a total of 229 all-time max temperatures were either broken (112) or tied (117). June 1990 was also another hot one with a lot of records either tied or broken. See the chart for more details.


Greg Carbin

Warning Coordination Meteorologist

NOAA-NWS Storm Prediction Center

National Weather Center

Norman, Oklahoma 73072-7268
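The tallying Carbin describes (pull all-time record events, keep the June ones, count broken versus tied per year) can be sketched roughly as below. The input format here is purely an assumption for illustration; the actual script and its NCDC data source are not shown in the email.

```python
from collections import defaultdict

# Hypothetical input: one tuple per all-time record event, as
# (year, month, kind) where kind is "broken" or "tied".
events = [
    (1994, 6, "broken"), (1994, 6, "tied"),
    (1998, 6, "tied"),
    (2011, 6, "broken"), (2011, 6, "tied"), (2011, 6, "tied"),
    (2011, 7, "broken"),  # non-June events are ignored
]

def tally_june_records(events):
    """Count all-time max records broken or tied in June, per year."""
    counts = defaultdict(lambda: {"broken": 0, "tied": 0})
    for year, month, kind in events:
        if month == 6:
            counts[year][kind] += 1
    # Sort years by total (broken + tied), descending, like the chart.
    return sorted(
        ((y, c["broken"], c["tied"], c["broken"] + c["tied"])
         for y, c in counts.items()),
        key=lambda row: row[3], reverse=True,
    )

for year, broken, tied, total in tally_june_records(events):
    print(f"{year}: {broken} broken / {tied} tied / {total} total")
```

With the real per-station NCDC records in place of the toy `events` list, this is all the chart needs: a filter, a per-year counter, and a sort.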

Richard Grumm of NOAA replied

From: Richard.Grumm@xxxxx
Date: June 30, 2011 7:49:12 AM CDT

To: gregory.carbin@xxxx
Cc: SUNY Map Listserv <map@xxxxxxx>, HWT Map Discussion Listserv <discussion@xxxxxx>

Subject: Re: [Map Discussion] All-time maximum temperatures broken or tied in June


Nicely done. Kind of warms the soul.

I ponder if the sensors used in ASOS, massively deployed circa 1991, played some role in this, that is, [a] downward trend, though most of this would be COOP site data. Did we go through some program to improve those sites in the 1990s?

As for ASOS sites, if used, we had one really badly placed sensor at KMDT when we opened in State College. The ASOS was aspirated and on grass, not a roof. It ran cooler in parallel to our older unit (was it an unaspirated HO83?). We did not officially begin using the ASOS until AFTER your 1994 spike. Hitting 100F at KMDT has been extremely difficult since the new sensor. Being on grass and on the ground helps.

Additionally, we had 2 very warm ASOS units, and I know of an office not too far west of me that did too. It turns out how the fans are installed is critical: one does not want to pull hot air up into the system. We had a sensor that was really warm last year, and in the July heat wave (KSEG) it was often an eastern US hot spot. New fans led to local cooling.

How much of the 1980s was badly placed sensors? Did we improve COOP sites in the 1990s? How much of this was the pattern? Some of those 1980s years were hot but how much did sensors contribute to this too? Gotta find the eggs for this Duncan Hines cake mix.



The relevant part of my reply to Conrad’s e-mail is reproduced below.

Date: Thu, 30 Jun 2011 08:20:54 -0600 (MDT)

From: Roger A Pielke Sr <pielkesr@xxxxx>

To: Conrad Ziegler <conrad.ziegler@xxxxx>

Cc: Roger A Pielke Sr. <>, gregory.carbin@xxxx, richard.grumm@xxxx

Subject: Re: Fwd: [Map Discussion] All-time maximum temperatures broken or tied in June

Hi Conrad

Thank you for sharing the information on the surface temperature measurements! I have a question and a list of some papers on this issue below.

First, Greg/Rich – do I have your permission to post your e-mails and figure on my weblog?

Second, we have been examining the effect of siting quality on the long term surface temperature record, with our most recent paper

Fall, S., A. Watts, J. Nielsen-Gammon, E. Jones, D. Niyogi, J. Christy, and R.A. Pielke Sr., 2011: Analysis of the impacts of station exposure on the U.S. Historical Climatology Network temperatures and temperature trends. J. Geophys. Res., in press. Copyright (2011) American Geophysical Union.

We also have other papers on this subject; e.g. see

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841.

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2010: Correction to “An alternative explanation for differential temperature trends at the surface and in the lower troposphere, J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841”. J. Geophys. Res., 115, D1, doi:10.1029/2009JD013655.

Steeneveld, G.J., A.A.M. Holtslag, R.T. McNider, and R.A Pielke Sr, 2011: Screen level temperature increase due to higher atmospheric carbon dioxide in calm and windy nights revisited. J. Geophys. Res., 116, D02122, doi:10.1029/2010JD014612.

Pielke Sr., R.A., C. Davey, D. Niyogi, S. Fall, J. Steinweg-Woods, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, R. Mahmood, S. Foster, R.T. McNider, and P. Blanken, 2007: Unresolved issues with the assessment of multi-decadal global land surface temperature trends. J. Geophys. Res., 112, D24S08, doi:10.1029/2006JD008229.

Parker, D.E., P. Jones, T.C. Peterson, and J. Kennedy, 2009: Comment on “Unresolved issues with the assessment of multidecadal global land surface temperature trends” by Roger A. Pielke Sr. et al. J. Geophys. Res., 114, D05104, doi:10.1029/2008JD010450.

Pielke Sr., R.A., C. Davey, D. Niyogi, S. Fall, J. Steinweg-Woods, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, R. Mahmood, S. Foster, R.T. McNider, and P. Blanken, 2009: Reply to comment by David E. Parker, Phil Jones, Thomas C. Peterson, and John Kennedy on “Unresolved issues with the assessment of multi-decadal global land surface temperature trends”. J. Geophys. Res., 114, D05105.


Hanamean, J.R. Jr., R.A. Pielke Sr., C.L. Castro, D.S. Ojima, B.C. Reed, and Z. Gao, 2003: Vegetation impacts on maximum and minimum temperatures in northeast Colorado. Meteorological Applications, 10, 203-215.

Pielke Sr., R.A., T. Stohlgren, L. Schell, W. Parton, N. Doesken, K. Redmond, J. Moeny, T. McKee, and T.G.F. Kittel, 2002: Problems in evaluating regional and local trends in temperature: An example from eastern Colorado, USA. Int. J. Climatol., 22, 421-434.

We are currently looking at the instrument change issue as a follow-on to our Fall et al. 2011 paper. Your input on our paper would be very valuable.

This is really important and influential work and I am glad you have shared it!

Best Regards


Last evening when Greg sent me the latest figure with the June 2011 data, he also updated the information with the e-mail below.

On Thu, 30 Jun 2011, gregory.carbin@xxxx wrote:


The updated chart is attached to this e-mail. It may be trivial to add one more record for the month, but it was a new record (not a tie) and it was also the only new all-time record high broken in Kansas during the month. There were all-time max temperatures in Kansas that were tied during the month, but only this one, on the last day of the month, was a new record. The other states with stations breaking all-time high temperature records in June were: FL, NM, OK, and TX. Again, there were 17 new all-time maximum temperature records set in June 2011, and 25 all-time maxes tied during June 2011, according to the data available from NCDC as of late this evening.

The top three years and numbers from the chart, if you need them handy, are:

Year, # Tied, # Broken, Total

1994, 117, 112, 229

1990, 95, 62, 157

1998, 58, 31, 89

I will look forward to your post.



From Anthony: I have documented serious and uncorrected problems with the NOAA ASOS system at airports. One notable train wreck of false high temperatures (which still remain in the climate record today) occurred in Honolulu, HI, at the airport. Just look at the difference.

The data from the two Oahu stations, 3.9 miles apart on the south shore, was telling when plotted side by side. I marked missing data, the record high events, and when the ASOS was repaired.

Graph of PHNL (ASOS) and PTWC (COOP) station data for June 2009 – click for larger image

The ASOS sensor read several degrees high, setting a string of all-time new temperature records. It seems to be a recurring problem; see my analysis over several posts:

This is your Honolulu Temperature. This is your Honolulu Temperature on ASOS. Any questions?

More on NOAA’s FUBAR Honolulu “record highs” ASOS debacle, PLUS finding a long lost GISS station

NOAA: FUBAR high temp/climate records from faulty sensor to remain in place at Honolulu

How not to measure temperature, part 88 – Honolulu’s Official Temperature ±2

Guest Weblog: Ben Herman Of The University Of Arizona – Maximum Temperature Trends and the HO83

Key West, FL sets new subzero “record low” temperature – Update: now snowing!

Hot Air in Washington DC- More ASOS Failures?


I have more coming on this; look for another post on the ASOS/airport issue related to high temperatures.

July 7, 2011 5:07 pm

Imagine that. One of the things that I find remarkable is that there are a few LIG sites that continued recording after new digital instruments were installed. I don’t remember many, but Ft Collins, I think, was actually available online, and I suspect there are a good many more where the information was collected but not digitized. That would be a fun forensic project, digging up stubborn station keepers’ records to compare.

July 7, 2011 5:19 pm

I got my lesson in inaccurate temperature reporting when I was recently working in Alaska for a few months, and my thermometer never matched the local official thermometer even though I was only a short distance away from it,
which has the weirdest anomaly; the Fahrenheit temperature is never in the middle of the range, such as 60.4, 60.5, 60.6 – always the decimal is zero, or one tenth or nine tenths. How can this be?

July 7, 2011 5:28 pm

Game, set, match.
A cherry-picker’s dream, our dataset.
Good thing the rest of the world doesn’t have our problems, or there might be some doubt about the temperature record.

Gary Mount
July 7, 2011 5:34 pm

Just a guess but if the temps are recorded in Celsius then it could be a rounding error in the conversion to F temps.

July 7, 2011 5:39 pm

Frank says on July 7, 2011 at 5:19 pm
which has the weirdest anomaly; the Fahrenheit temperature is never in the middle of the range, such as 60.4, 60.5, 60.6 – always the decimal is zero, or one tenth or nine tenths. How can this be?

Appears to be an artifact of C to F conversion.
The C readings for the decimal digit seem to be distributed randomly enough …
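The conversion-artifact diagnosis is easy to demonstrate. The sketch below assumes readings recorded in whole degrees Celsius, which restricts the converted Fahrenheit value's tenths digit to an even value (steps of 1 °C are steps of 1.8 °F). The exact .0/.1/.9 pattern Frank reports would depend on the station's actual recording precision and rounding scheme, which we don't know; this only shows the general mechanism by which conversion constrains the decimals that can appear.

```python
# If a station records temperature in whole degrees Celsius and the
# Fahrenheit value is derived by conversion, only certain decimal
# digits can ever appear in the converted reading.
def c_to_f(c):
    return c * 9 / 5 + 32

# Collect the tenths digit of the converted value for a range of
# whole-degree Celsius readings.
decimals = {round(c_to_f(c) * 10) % 10 for c in range(-40, 51)}
print(sorted(decimals))  # → [0, 2, 4, 6, 8]
```

A station recording in half degrees or tenths of a degree Celsius would produce a different restricted set, but the telltale sign is the same: some decimal digits simply never show up.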

Matthew W.
July 7, 2011 5:43 pm

Outstanding that other people are taking a serious look at the siting issue !!

July 7, 2011 5:44 pm

It turns out how the fans are installed is critical. One does not want to pull hot air up into the system.

I don’t know about the first sentence, but I think the second sentence could be added to a policy manual without going through a peer reviewed research project. 🙂

One would have hoped this was at minimum a standard throughout the US, though reading Rich’s message I’m no longer so sure.

Oh my, so many promising candidates for Quote Of The Week.

Did we improve COOP sites in the 1990s?

A lot of sites reviewed by the Surface Stations project sure look like they hadn’t been improved. Especially the ones that now have a MMTS on a short leash.
I look forward to updates from Dr. Pielke. As for Conrad, Greg, and Rich, all I can say is “Welcome – be sure to read some of the other posts here – nothing gets past WUWT any more.”

July 7, 2011 5:45 pm

The current line from los warmistas is that the SurfaceStations project did not show that the degradation of instrumental accuracy imposed upon the various temperature monitoring systems over the past seven or eight decades (the deletion of higher-latitude, higher-altitude monitoring posts’ recent archival and present readings from the databases; local heat island effects, i.e. “siting the thermometer next to a lamp”; deliberately slurring station readings showing artifactual high temperatures over contiguous areas in which surface station data was no longer being directly reported; and so forth) was significant, and that there has still been alarming global warming going on over the past decade and more.
And yet I can’t find support for that contention of statistically significant global warming anywhere except in los warmistas‘ online masturbation fantasies. No objective evidence anyplace.
What the hell is going on? Are the surface temperature datasets blown all to hellangone in their accuracy and reliability or not?

George E. Smith
July 7, 2011 6:01 pm

Maybe there is a lot more Weber barbecuing going on these days.
How NOAA can hold a straight face and admit that that abomination outside the Environmental Sciences building at Tempe AZ is on their list of observation stations, is completely beyond my comprehension.
Your Team’s exposé of the real situation is slowly bearing fruit, Anthony. The very idea that someone would want to know the temperature on a runway in the interest of safe aircraft operations, and could care less about hokey climate machinations, does not seem to have sunk in with NOAA. And Hansen is quite happy to use those silly results as valid for some place 1200 km away.
That’s worse than silly.

July 7, 2011 6:03 pm

What’s the problem? Aren’t people like James Hansen adjusting these stations so they give the correct data? /sarc

July 7, 2011 6:12 pm

Time for another surface stations project.

July 7, 2011 6:13 pm

I hope someone checks KGVL. The Gainesville airport seems to be a lot warmer than what I experience about 10 miles west. It may be that I haven’t cut down all the trees, or that I-75 just east causes convective showers, but the airport seems warmer in both winter and summer. In fact, if you compare it to surrounding instrument readings, it runs warmer than they do as well. Of course, having the sensors in the middle of two runways might have an effect.

July 7, 2011 6:14 pm

Would also like to see that new chart comparing back to the 1930s.
Either way… nice work, Roger P.

July 7, 2011 6:16 pm

This has got to be easy to check. Anthony, can’t you get people in a specific city that has a NOAA station at an airport to record temps for a couple of months and check them against the official site?

Mac the Knife
July 7, 2011 6:20 pm

A bit of quid pro quo…. a welcomed and applauded sharing amongst professionals on both sides of the issue!

Brian H
July 7, 2011 6:23 pm

Is this passage a spoof? It reads as though they were trying to site the station badly (on a roof) in order to get higher readings. Grumm:

As for ASOS sites, if used, we had one really badly placed sensor at KMDT when we opened in State College. The ASOS was aspirated and on grass, not a roof. It ran cooler in parallel to our older unit (was it an unaspirated HO83?). We did not officially begin using the ASOS until AFTER your 1994 spike. Hitting 100F at KMDT has been extremely difficult since the new sensor. Being on grass and on the ground helps.

Maximizing records broken by hook or by crook? Is this for real?

July 7, 2011 6:27 pm

Would it really be that difficult to set up some old mercury thermometers in original spec Stevenson screens to run in parallel with the electronics? It would not be that hard to develop the correct mass to mount with the new devices to simulate the old reaction times, etc.
For that matter, you could completely automate and optically digitize a mercury thermometer without too much difficulty, with a simple cognex vision system to read it from a distance. As long as you kept the small (probably milliwatt) camera away, there should be no heat impact. Heck you could even turn it off, power up, hit it with a quick LED flash to take a picture once an hour, and turn it off again. You could probably set it up for under $5k each, internet posting of results included. More if you wanted to make it solar, satellite transmit, etc. OK, $50k if the government is doing the spending. Might have a little parallax error to deal with as well, probably nothing major.
Or, you could use micro LEDs to transmit light momentarily on one side, and receive on the other, as a scan up or scan down (last one off is the temperature on a scan up). As long as you only had it on for a few microseconds per hour, I don’t think you could reasonably argue that it was affecting the temperature. You could mount it far enough away to not affect the thermal mass. Boom, a digital thermometer, for real.
I’m starting to think it’s time to go back to the original thermometers and read them electronically. Then run both in parallel for a decade in a large sample of good and poorly sited stations, then attempt to correct the historical record.
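The scan-up readout described above ("last one off is the temperature") amounts to finding the highest beam-blocked position along the column and mapping that index onto the thermometer's scale. A minimal sketch of that decoding step, with wholly hypothetical scale values and resolution:

```python
def read_column(beam_blocked, scale_min_f, deg_per_step):
    """Decode a scan of LED/photodiode pairs along a mercury column.

    beam_blocked[i] is True where mercury interrupts the beam.  On a
    scan upward, the last blocked position marks the top of the
    column, i.e. the temperature reading.
    """
    top = -1
    for i, blocked in enumerate(beam_blocked):
        if blocked:
            top = i
    if top < 0:
        raise ValueError("column below scale: no beam blocked")
    return scale_min_f + top * deg_per_step

# Example: 0.5 degF per sensor pair, scale starting at 20 degF;
# mercury blocks the first 101 positions.
column = [True] * 101 + [False] * 99
print(read_column(column, scale_min_f=20.0, deg_per_step=0.5))
```

The resolution is set entirely by sensor spacing, and since the electronics are only powered for the instant of the scan, the readout adds essentially no heat to the instrument, which is the whole point of the proposal.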

Phil's Dad
July 7, 2011 6:29 pm

Is there an equivalent for lows? I suspect there may be many more of those to come as we fall off the top of the upwave.

Pat Frank
July 7, 2011 6:39 pm

Let’s notice the earliest email date, establishing about when they noticed the problem at NOAA: Wednesday, June 29, 2011. What is that, four years and two months after Anthony began drawing attention to the pervasive siting problem?
I have Roger Pielke Sr.’s 2002 paper about siting problems in eastern Colorado. His paper was followed by an accompanying response from NOAA scientists. That response is very revealing of a certain pervasive attitude. I’ll post an extract later.

Interstellar Bill
July 7, 2011 7:08 pm

32 years ago I personally did more ASHRAE 93-77 thermal tests (hundreds) of solar water-heaters than anyone else in the country. This test involved using platinum resistance thermometers (PRTs) to measure water and air temps within 0.1 deg C. There were detailed calibration protocols using half-meter NTS-traceable mercury-column thermometers and a $5K datalogger. I went through many rounds of improvements to measure air temperature with the NTS thermometer-shed and surrounding white gravel, with solar-powered aspiration through a radiation-neutral inlet. I also had a pyrgeometer to measure sky infrared, which varied seasonally but, interestingly, not with changes in cloud cover. These temperatures were vital to giving each collector model an official DOE performance curve. It turned out that their ratings thresholds were easily crossed by a delta-temp changing by 0.1C, a typical display of perennial bureaucratic ignorance.
I took far more trouble to be accurate than 99.99% of today’s stations, just to get an rms of 0.1 C.
Pardon me as I laugh at the Alarmists’ absurd claims of surface-temperature accuracy, let alone the thermodynamic meaninglessness of any number purported to be an average global temperature or its ‘anomaly’ (They probably picked that term for its scariness connotations — did the Warmistas originate this usage?).

July 7, 2011 7:10 pm

Gee Whiz!!! There are siting issues with thermometers??? Who’da thought that?

July 7, 2011 7:11 pm

Yes, but, but, it’s the trend, the trend that matters………

July 7, 2011 7:14 pm

I’m beginning to think the good old mercury thermometers are still more reliable than the newfangled electronic albatrosses.

July 7, 2011 7:32 pm

don’t know if this has been posted or if it is relevant:
6 July: Herald Tribune: Kate Spinner: Sarasota bucks warming trend
A new analysis of the past three decades of temperatures shows Florida — especially Sarasota — bucking a pronounced warming trend nationwide.
At the Sarasota-Bradenton weather station, average high temperatures fell every month of the year and in some months by nearly two degrees. As a result, Sarasota’s average high temperature for July will drop from 91.3 degrees to 89.8 degrees…
“It doesn’t seem like a big deal, but when you integrate that over a whole year in terms of your heating bill or your crops, it really adds up,” said Anthony Arguez, a physical scientist who managed the reanalysis of the nation’s normal temperatures for the National Oceanic and Atmospheric Administration…
Intrigued by Florida’s departure from the rest of the country in a warming scenario, Martin Hoerling, also of NOAA, compared the last decade of temperatures across the nation to the 30-year record from 1971 to 2000.
Every state in the contiguous U.S. except Florida saw a temperature increase over the period. Florida saw a cooler winter, a cooler spring, a warmer summer and little change in the fall. On a yearly basis, the temperatures average no change…
But Florida’s slowness to warm is not likely to last, said Aiguo Dai, a climate scientist with the National Center for Atmospheric Research in Colorado. Dai said short-term weather fluctuations likely buffered Florida’s warming over the past decade, and long-term patterns caused by sea surface temperature changes may have moderated the state’s temperature over the past 30 years…

David Ball
July 7, 2011 8:17 pm

Anthony was spot on from the beginning when he first started the surfacestations project. Any reasonably intelligent person can see the issues with sighting, unless that person’s income depends on them not understanding.

Brian H
July 7, 2011 8:31 pm


the issues with sighting,

— the siting isn’t too great, either!

July 7, 2011 8:56 pm

Thank you Anthony, your work has maybe allowed those blinded by AGW hysteria and money to see. Time will tell how Diogenes does in the progressive new world.

Leon Brozyna
July 7, 2011 9:33 pm

More on ASOS high temp issues? I don’t think that anything could surpass that Honolulu fiasco, made all the worse by the fact that they’ve kept those “record” high temps, when it was obvious that the equipment was showing temps at a minimum of 3° too high. Bet a record low temp would get a much different treatment.

Bill in Vigo
July 7, 2011 9:43 pm

Wonder what is next. Perhaps it is time for some changes at the notable NASA think tank. I am glad that NOAA is noticing.
Bill Derryberry

Mike McMillan
July 7, 2011 9:54 pm

DR says: July 7, 2011 at 7:14 pm
I’m beginning to think the good old mercury thermometers are still more reliable than the newfangled electronic albatrosses.

Obviously you are mistaken, or why would they have had to revise the early (LIG) USHCN temperatures down so much for USHCNv2? </sarc>

Pat Frank
July 7, 2011 10:00 pm

Following up from my previous comment, my memory failed me a little. It was the 2005 BAMS paper of Christopher Davey and Roger Pielke Sr. [1] that attracted NOAA’s attention.
Davey-Pielke 2005 is a classic critical exposé of poor siting. They visited 57 stations in eastern Colorado, documenting the exposure of the temperature sensors. Their BAMS paper is replete with color photographs worthy of the SurfaceStations project, showing sensors next to buildings, in parking lots, surrounded by brush, near an air-conditioning unit, or near mixed ground of varying albedo.
The worst one, with photographs in their Figure 12, was the Las Animas station. This one belongs on the wall along with Anthony’s best classics of bad siting. It’s worth quoting Davey and Pielke Sr. at length. The last sentence is a paragon of understatement (my emphasis added):
The Las Animas site had, by far, the poorest exposure for the USHCN sites we visited (Fig. 12). The temperature sensor is set up over a gravel surface at the southeast corner of the main building of the Las Animas Power Plant. The sensor was moved in the 1980s from an open field about 50 m northwest of the power plant building to its current location on the southeast side. The 10-m building is only about 2 m west of the sensor. It blocks ventilation for the sensor in all directions north and west. Additionally, an exhaust vent is only 2 m north of the sensor; any air discharges from this vent would very likely affect temperature readings. There are also three short stacks between 3 and 10 m north of the sensor. To the north and east, about 10–20 m away, are several sheds with metal siding, along with several other metal storage features. In summary, this is a very poor site for measuring air temperatures.
Moved from an open field to sit next to a power plant, 2 meters from an exhaust vent. Perfect. Davey and Pielke Sr. finished their paper by observing, “Similar variability in the climate observing sites in the worldwide dataset of land-surface temperature trends would raise questions concerning the use of the historical record to assess regional and even global temperature changes.”
And how did NOAA and the NCDC respond? As good public servants, stewards of important data, and professional committed scientists, they responded constructively, right?
Well, their response was the very next paper in BAMS [2]. What a coincidence. After a lot of blah about how they are dedicated to providing the very best data and the highest-quality instrumentation, and how they take care in choosing site locations, Vose & co. get to the nitty-gritty of what they mean to do about the problems Davey and Pielke Sr. revealed.
It’s worth quoting them extensively, too:
“Although undesirable exposures add uncertainty to trend estimates, Davey and Pielke (2005) did not demonstrate that USHCN siting problems systematically bias long-term temperature trends at the regional level. For instance, they did not quantify the temperature bias produced by the exposure problems, nor did they show that those problems actually resulted in spurious temperature trends at any station. Furthermore, their analysis was a static assessment of site exposures over a relatively small part of the country, an area within which station exposures varied considerably. In other words, their results do not show that a large number of USHCN stations have a comparable exposure problem, much less exposures that bias temperature trends over large areas.” (emphasis added)
That is, Vose & co. including Tom Karl, head of the NCDC, after seeing a report that the USHCN may be in disarray, respond with a peremptory dismissal.
Any concerned scientist would have taken alarm and begun a comprehensive survey of the remaining sites, or even just of a representative selection of sites, to determine the extent of the problem. Nope, not a bit of that. The problem is minimized and Davey and Pielke are faulted for not surveying farther afield and for demonstrating that bad siting produces bad data.
Note that Vose & co. do not argue from knowledge. They argue a speculative polemic. That which was not shown to be bad is implicitly asserted to be good. Here’s a plain language restatement of their points: Bad siting not shown to bias temperatures don’t bias temperatures. Siting problems found to be pervasive in bounded regions do not call into question siting in larger regions. Temperatures from poorly sited sensors, not shown to be biased, are not biased. Even if all the examined stations prove badly sited, this does not mean that any of the remaining unexamined stations are badly sited.
A hand-waving dismissal is not what one expects from a professionally competent scientist. Under normal circumstances, a defensive polemic like that would be widely recognized as signaling incompetence.
[1] C. A. Davey and R. A. Pielke Sr. (2005) “Microclimate Exposures of Surface-Based Weather Stations” Bull. Am. Meteor. Soc. 86(4) 497-504.
[2] R. S. Vose, D. R. Easterling, T. R. Karl and M. Helfert, (2005) “Comments on “Microclimate Exposures of Surface-Based Weather Stations”, Bulletin of the American Meteorological Society, 86(4), 504-506.

Pat Frank
July 7, 2011 10:04 pm

Oops the end of paragraph 3 from the bottom should be ‘… and for not demonstrating that bad siting produces bad data.’

John Brookes
July 7, 2011 10:08 pm

So we’d better stick to the satellite record instead?

July 7, 2011 10:16 pm

I live in Hawaii, windward side of Oahu. It’s almost unheard of here for temperatures to top 90 F (32 C) yet that seems the norm at Honolulu Airport. The PTWC temperatures mesh with my day to day expectations while the PHNL thermometer must have been directly behind the hot exhaust of departing aircraft. An 8 F (4.4 C) difference is just not credible.

July 7, 2011 10:46 pm

Mariss, there was a hilarious episode on Channel 2 Honolulu where the weatherman at the 2 meteorological stations at/adjacent to Honolulu Airport said that the rise in temperature was entirely due to asphalt, exhaust, and the lack of vegetation. I believe the tape was posted here. But I saw it on TV.
There is another side though. My temperatures in Kekaha regularly exceed the reported temperatures by NOAA by 5 degrees. I do not believe that is abnormal. In fact I believe these temperatures have been the norm in this desert for thousands of years. I believe it has always been true and has been under-reported because NOAA is guessing and the military simply does not give a damn about AGW.

D. King
July 7, 2011 11:07 pm

Maybe they’ll be able to get rid of that pesky super El Nino.
Remember, they started adding heat to the new electronic thermistors in the 90’s.
Scan video forward to 3:00 and play.

Scottish Sceptic
July 7, 2011 11:44 pm

What I find incredible is that if they had admitted their problems decades ago and spent on proper instrumentation even a small percentage of the money doled out on wind energy, there is no way on earth we would be in this mess.
Instead, because the temperature readings were showing the result the “save planet earth” money-grabbers wanted, they had not the slightest interest in correcting the many and obvious problems of the temperature series.
Worse still, like at the “News of the World”, the rot spread: a culture of deceit, of sweeping criminality under the carpet, and of promoting those who “got the results” over those who “got some results, honestly”.
OK, Murdoch also owns Fox News, and Fox News is among the few who report the other side of global warming, but ….
Perhaps if Fox News hadn’t been so vociferously anti-global-warming, based on knee-jerk editorial bias, the other MSM might not have been so anti-Fox: perhaps they would have done their job and reported the news accurately, and not just followed a left-wing gut instinct to react against anything that Fox News supports.

July 8, 2011 1:01 am

Who’d a thought that badly-sited instruments could skew temps upward. How convenient for the Warmista!
All credit to Anthony for identifying this issue years ago, but why has it taken so long for at least some of the so-called ‘scientific’ community to wake up and catch up?

The Ghost Of Big Jim Cooley
July 8, 2011 1:29 am

It does annoy me that the US is stuck using F when C just makes better sense! Although we here in the UK use C, it’s really funny listening to people talk. When it’s cold they talk C, but as soon as it gets hot they start talking F!

Ian W
July 8, 2011 2:21 am

It is quite simple – NOAA is not a quality agency. In any other area of science or engineering, finding errors of this magnitude would be cause for immediate remedial action and removal of the bad data from any records. Staff could expect to be reprimanded or dismissed for their failure. But with NOAA we see a slapdash, unconcerned and often defensive approach. This is the mark of an agency that is unconcerned with the quality of its work and deliverables.
All NOAA records (not just temperatures) should now have an ‘asterisk’ added, marking them as not to be trusted or used until a formal technical assessment and quality audit is passed by each site and its data handling. If NOAA happily accepts invalid data and poor quality where they can easily be found, NOAA’s work in other areas must be equally suspect.
What kind of government is it that has formal congressional hearings on baseball players that may have taken performance enhancing drugs, but has no reaction to an agency generating invalid metrics on which the government is basing fundamental shifts in economic policy?

July 8, 2011 2:41 am

Hmmm. If you look at the temperature data for Ross-on-Wye, which the UK Met Office has kindly provided here:-
And you then plot the figures for Tmax in Excel or similar for the years 1930 to 2011, you find something strange. The average temperatures jump up by 2 Celsius after 1985. Yet the monitoring station has been in the same suburban location since 1930, and it’s nowhere near an airport.
Could it be something to do with the re-opening of the weather monitoring station by goggle-eyed TV weatherman Iain McKaskill in 1985?
I think so.
Full history of this monitoring station is shown here:-
The conclusion could be that changing from mercury thermometers to new-fangled electronic thermometers puts yer temp readings up by 2 Celsius? And maybe this was done all over the Western world between 1980 and 2000? And all those CRU graphs just show an amalgamation of how electronic thermometers read higher than mercury thermometers, together with airports getting busier?

July 8, 2011 2:48 am

“So we’d better stick to the satellite record instead?”
Great idea. Measure the temperature accurate to 0.1 Celsius using a satellite a hundred miles up in the sky, with equipment on board that only claims to be accurate to 0.5 Celsius. That seems to be a reasonable approach for climate scientists, but as an engineer I would dispute how this could ever give you sensible results.
What Team-AGW should be doing is pushing for better, more reliable monitoring stations. I wonder why they aren’t?

Michael Schaefer
July 8, 2011 2:49 am

It’s worse than we thought!
So placement of sensors HAS an impact on the measurement of global temperatures?
Who would have thought…
This gives “Man-Made Global Warming” a whole new meaning, doesn’t it…?

Don K
July 8, 2011 4:44 am

Nifty chart. Really.
Two things though.
1. Doesn’t the probability of setting a new high at any given site fall off with the length of the temperature record for that site? What I’m thinking, perhaps naively, is that if the temperature record is ten years long and everything is random, the probability of setting a new high (or low) in year eleven would be one in ten. Whereas if the record were 100 years long, the chances of setting a new high in year 101 would be one in a hundred.
2. If I’m right about point 1, doesn’t the data plot have entirely the wrong shape for temperatures being random? If they were random, the chart would sweep down to the x axis from a peak on the left and sort of approach the x axis asymptotically? That would suggest that there is a pattern to the data caused by some combination of temperature increases, changing site biases, and possibly other factors?
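Don K’s point 1 is easy to check numerically: for independent, identically distributed draws, the chance that the latest of n+1 values is the largest is exactly 1/(n+1), which falls off just as he describes. A minimal simulation sketch (illustrative only, not anyone’s actual analysis):

```python
import random

def p_new_record(n_hist, trials=200_000, seed=0):
    """Estimate the chance that year n_hist+1 sets a new high,
    given n_hist prior years of independent random 'temperatures'."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        xs = [rng.random() for _ in range(n_hist + 1)]
        hits += xs[-1] == max(xs)  # last year beats all prior years
    return hits / trials

print(p_new_record(10))   # close to 1/11
print(p_new_record(100))  # close to 1/101
```

So for a purely random climate the count of new records in any given year should thin out as station histories lengthen, which is the shape Don K expects the chart to have.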

July 8, 2011 4:48 am

They continue to rely on fans for aspiration? How often do these get checked? Mechanical devices such as fans fail regularly. — John M Reynolds

July 8, 2011 5:29 am

Thanks Pat, the background is helpful. It is hard to understand the hubris that emanates from NCDC.

Bruce Cobb
July 8, 2011 5:36 am

We have an electronic (La Crosse Technology) thermometer set up in the back yard, under a tree in back of our south-facing house. One thing I’ve noticed is that our daily high temps are usually about 5 – 7° F lower than those recorded in Concord, NH, roughly 10 or 12 miles west-southwest of us. I believe the thermometer there is at the local airport (surprise, surprise). Additionally, the highs predicted by the local TV station are almost always 5 – 10° F higher than our actual highs.

Mike Bentley
July 8, 2011 7:30 am

Ian W,
Answer in short form – we have gone to a “bread and circuses” approach in government. Give the people a good show and forget the details. While I’m not a fan of wind power, you have to admire the technology when those huge blades are spinning…
But maybe, just maybe there is some light shining in the outhouse…

Doug Proctor
July 8, 2011 8:08 am

Assuming the site problem as suggested, what is the probable magnitude of the error, how many stations are you talking about, and what would be the impact on the US average? Or is this just a local problem, though it leads to local “records”, that really doesn’t impact the US annual/monthly averages?
Detail or substance?

Rod Everson
July 8, 2011 8:34 am

Regarding the comment about all the F readings ending in .9, .0 and .1: it is apparently quite a fluke. The C readings, as pointed out, appear reasonably randomly distributed, so if one assumes those are the original readings and calculates the resulting F readings, you do indeed get the numbers shown.
Between 10 and 22 degrees C, 30% of the converted F readings would end in .9, .0 or .1, so there’s a 7/10 probability of getting a F conversion in the .2 to .8 decimal range. The chance of getting 24 readings in a row without a .2 to .8 decimal is .7 to the 24th power or about 1 in 5,000, which is why it stood out. (The odds of getting any 3-decimal range (.3,.4,.5 say) would be 10 times that, or 1 in 500.)
This is a decent example of the result of doing data mining when doing research. Apparent conclusions crop up that weren’t proven. The thread a long ways back about research results reverting to the mean over several followup studies also relates to this example. Only the research results that are anomalous get published, but it turns out that many of them are just getting the occasional random outlier that, as illustrated here, will crop up regularly, if not frequently. Do enough studies and some will certainly yield outliers. Then the outliers get published because they seem to indicate a new reality. And if no followup replication studies are done, that new reality can stick around for a long time.
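Rod Everson’s figures check out and can be reproduced exactly with integer arithmetic. A quick sketch, following his 10–22 C window and assuming C readings reported to 0.1 degree:

```python
def f_tenths_digit(c_tenths):
    """Tenths digit of F = 1.8*C + 32 after rounding to 0.1 F,
    with C supplied as an integer count of tenths of a degree."""
    f_hundredths = 18 * c_tenths + 3200           # exact: no float error
    return ((f_hundredths % 100) + 5) // 10 % 10  # round half up, keep tenths digit

digits = [f_tenths_digit(t) for t in range(100, 221)]   # 10.0 C .. 22.0 C
share = sum(d in (9, 0, 1) for d in digits) / len(digits)
print(round(share, 2))  # 0.3: about 30% of converted readings end in .9/.0/.1

odds = 1 / 0.7 ** 24
print(round(odds))      # ~5220, i.e. roughly "1 in 5,000" for 24 such readings in a row
```

The 30% share and the roughly 1-in-5,000 run probability both match the comment’s arithmetic.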

July 8, 2011 11:06 am

At 7:08 PM on 7 July, Interstellar Bill discusses his personal experience with thermal tests in solar water-heaters using various instruments, emphasizing the need for precise calibration and explicitly assessed limits of accuracy, as well as damned exacting measures undertaken to ensure that the effects of variables outside the effects being evaluated were themselves known with commensurate precision so that compensation could be incorporated in the analysis of the data obtained. He writes:

I took far more trouble to be accurate than 99.99% of today’s stations, just to get an rms of 0.1 C.
Pardon me as I laugh at the Alarmists’ absurd claims of surface-temperature accuracy, let alone the thermodynamic meaninglessness of any number purported to be an average global temperature or its ‘anomaly’ (They probably picked that term for its scariness connotations — did the Warmistas originate this usage?).

Precisely what I’ve never been able to find in all the alarmist propaganda. I don’t know about other readers, but one of my required undergraduate courses was in Instrumental Analysis, and from 9th Grade a fixed topic in all my high school science courses was limits of instrumental accuracy.
At all times henceforth and to this day, nothing of objective evidence in my professional life has been considered valid without conveying information on the “wiggle room” imposed by compounding error factors and other variables affecting the reliability of data collection.
How the hell do los warmistas get away with fraction-of-a-degree assertions of precision in local and worldwide temperature prevalence figures (and predictions of very long-range future developments) without stipulating the ranges of error necessarily imposed by confounding factors?
When the climatology quacks began pushing their “Cargo Cult Science” as the basis for political policy – in other words, violent force imposed upon peaceable people at gunpoint, because that’s the sum total of everything government anywhere, at any time, ever undertakes – they decided to keep responsible assessments of error out of the picture in order to give the impression of precision and inexorable authority by precluding reasoned understanding of just how nebulous and indefinite their “predictions” really were.
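The technical point buried in the rhetoric above, random read error versus systematic siting/instrument bias, can be shown in a few lines: averaging many readings shrinks the random component like 1/sqrt(N), but a fixed bias passes straight through to the mean. A toy sketch with all numbers illustrative:

```python
import random
import statistics

rng = random.Random(42)
TRUE_T = 25.0
BIAS = 0.5    # fixed siting/instrument offset (illustrative)
SIGMA = 0.5   # random read error, 1-sigma (illustrative)

readings = [TRUE_T + BIAS + rng.gauss(0, SIGMA) for _ in range(10_000)]
mean_err = statistics.fmean(readings) - TRUE_T

# Random error has shrunk by ~1/sqrt(10000) = 1/100; the bias has not.
print(round(mean_err, 2))  # about 0.5
```

This is why a stated 0.1 C precision means little unless the systematic terms are independently characterized and removed.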

Mark T
July 8, 2011 11:23 am

Except that Fox is not “anti global warming,” not even a bit. They report the same AP news on the subject as everyone else, they just happen to throw in a few skeptical stories from time to time for balance. I wonder if people like you, scottish skeptic, actually read/watch Fox, or simply rely on what the rest of the left-leaning media says about them? Or, are you one of those that cannot fathom the difference between news reports and opinion pieces done by analysts? Either way, it comes across as ignorance.

July 8, 2011 12:24 pm

The Ghost Of Big Jim Cooley says on July 8, 2011 at 1:29 am
It does annoy me that the US is stuck using F when C just makes better sense!

On that basis, why not make the jump straight to Kelvin? No more messing around with a 273 degree ‘bias’ up the scale …
My suggestion would be to re-scale the Fahrenheit scale from zero at water’s freezing point up to the present 100 degree F mark, to give a better ‘instant’ recognition of the effects on plants, animals, humans and equipment. This would better emphasize the safe ‘working range’ for the majority of life-forms, which are 90-some percent water and have internal core body temperatures around that 100 deg F point. Doesn’t it get confusing remembering which deg C mark represents a comfortable room temperature, or what your core temperature (98.6 deg F) should nominally measure? Common fever temperatures for us in the States measure something over normal, in the 101 to 102 degrees F range, for instance …
As I write this I notice 106 degrees is presently showing on the external thermometry here in north central Tejas … HOT!

Pat Frank
July 8, 2011 5:43 pm

Bernie, my inclination was to respond that nowhere else is there such a shameless violation of scientific standards as in climate science. If politically driven, it’s Lysenko-esque. I’ve written another Skeptic-style climate-science critical article that ends with that observation, but I’m having trouble getting it published.
But anyway, after thinking about the attack on scientific ethics under the AGW umbrella, I realized that the uncritical support given to this cadre of climate scientists by the AMS, the AGU, the NAS, and even the APS shows that organizational scientists who should know better have given their institutional support to the destruction of their own discipline.
It’s as though Congress, the Office of the President, and the Attorney General issued press releases supporting activity that is objectively criminal by existing legal standards. Given this institutional support, the criminals themselves are free to cavalierly dismiss the evidence delivered in protest by offended citizens. The criminals are encouraged in their feeling of righteousness (which even violent criminals often have), by the institutionally fostered normative false morality. Their undisturbed delusion justifies the dismissals of wrongdoing. What is an arrest, after all, but a jarring wake-up call to a person self-deluded into believing the life and property of others actually belongs to him?
So, Vose & co. can submit their peremptory dismissal with feelings of justified irritation that Davey and Pielke Sr. could dare to criticize where institutional icons have granted approval.
As I see it, therefore, the AMS, AGU, NAS and APS have colluded in the violation of science. They have constructed a culture where to be normatively moral is to be ethically vacant. As official guardians of the standards of science, like guardians of the rule of law, they are more guilty of scientific malfeasance than the individual scientists who have actively written fake science.

Alan D McIntire
July 9, 2011 6:00 am

Don K says:
July 8, 2011 at 4:44 am
1. Doesn’t the probability of setting a new high at any given site fall off with the length of the temperature record for that site?
You’re right. The expected number of records grows roughly with the logarithm of the time records have been kept. For example, with 10 years of records (log 1), it would take roughly 100 years in total (log 2) before you’d expect another record, and after that you’d be waiting on the order of 1000 years for the next. Of course, in the US there are hundreds of sites, 365 days per year, and records for both highs and lows, so there are thousands of records in play; a few broken each day is not that remarkable.
I became aware of this problem when I first attended high school. There were stories in our local paper about school athletes breaking records every few weeks. Our school was only 4 years old when I started attending, so NATURALLY there would be more school records set than at a school with 30 or more years of records.
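Alan’s log intuition matches a standard result: for independent draws, the expected number of records in n years is the harmonic number H_n = 1 + 1/2 + … + 1/n, which is approximately ln(n) + 0.577. A quick simulation check (illustrative sketch only):

```python
import math
import random

def mean_record_count(n_years, trials=5_000, seed=1):
    """Average number of new record highs over n_years of i.i.d. draws."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        best = float("-inf")
        for _ in range(n_years):
            x = rng.random()
            if x > best:   # new record high
                best = x
                total += 1
    return total / trials

for n in (10, 100, 1000):
    harmonic = sum(1 / k for k in range(1, n + 1))
    print(n, round(mean_record_count(n), 2),
          round(harmonic, 2), round(math.log(n) + 0.5772, 2))
```

The simulated counts track H_n closely, confirming that records in a stationary climate thin out logarithmically, exactly the four-year-old-school effect Alan describes.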

kadaka (KD Knoebel)
July 9, 2011 6:12 am

Up above, there are complaints related to the ASOS stations. At the NOAA’s new normals post there’s a link to this announcement from NOAA. But while searching for data I found this June 2011 “State of the Climate: National Overview” report with some different wording:

Beginning August 1, 2011, the U.S. ASOS maps and the monthly mean maximum and minimum temperature anomaly maps will be using the newly updated 1981-2010 Normals to calculate anomalies. Other 30-year Normals anomaly-based monitoring products will not be affected by the Normals transition until early 2012.

Does it mean anything that they’re changing the normal for the ASOS maps ahead of others? Perhaps “hide the incline”?

bob gregg
July 9, 2011 10:40 am

ASOS measurements are a joke. They are put in for pilots’ safety, not climate data. Here in the west, Reno, Palm Springs and Las Vegas temperatures are unreliable; it is the average minimums that are way up. What else would you expect when the ASOS is put near a runway, taxiway or asphalt parking area? When asked about the problem, the San Diego, Reno and Las Vegas NWS offices simply reply, “that’s the way it is”.

July 9, 2011 12:51 pm

The huge spike of record highs in (especially) 1994 sparks memories for me. I have been in electronic manufacturing for over 25 years now. Many electronic chips require fan cooling. The fans we adopted in the early 1990s had a huge failure rate after a couple of years. It turns out that in cost cutting, they made bearings that didn’t last well. The graph of record highs looks much like our product-return graphs.
So, we have a massive shift circa 1990 to smaller ASOS sensor enclosures that require active convection to get good airflow instead of the passive convection of the earlier larger enclosures. A few years later, there is a huge spike in record high temperatures. Does anyone have access to the repair records for these things?
REPLY: Funny you should mention fans, as they are part of the problem. See this article of mine

July 9, 2011 1:53 pm

If you go back to 1880 looking for the all-time record highs set in each state of the USA, you will find 25 of those 50 records were set in the 1930s.
Going back only as far as 1950 is disingenuous and lame.

Dave A
July 9, 2011 5:25 pm

The number of NOAA weather stations that generate rubbish as data is appalling.
I am currently baselining this through 2011 as I try to get an independent measure of the vagaries of Earth’s temperature.
This is simply the average temperature of roughly 2,680 NOAA weather stations (some of the stations don’t “make the grade” every hour), taken every hour.
I started with a database of ~7,500 stations. After I took out the ones whose whereabouts are undisclosed, those that don’t produce any temperature measurement, those not producing temperatures around the clock, those stuck on one temp, and those regularly keybouncing 26C into 226C, you are left with these 2,680.
Being land-based, with most of the land in the Northern Hemisphere, it’s getting warmer 😉
At the end of 2011 I’ll have a look and see if any of the 2,680 have “gone rogue” and strip them out of the data.
This I will be able to do yearly, so we are comparing apples with apples and not bananas.
By the 1st Jan 2011 I’ll have a full solar-annual temp taken every hour… then we will be able to see if it goes up or, given the Sun’s inactivity and a year of volcanic note, heaven forbid… down.
Don’t worry, I have all the raw data ready for the FOI request and reproducibility 🙂
Backed up daily off site. No “dog ate my homework” excuses required 🙂
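Dave A’s screening steps (drop stations with no usable readings, incomplete round-the-clock coverage, a stuck sensor, or obvious keybounce such as 26 C entered as 226 C) can be sketched as a simple filter. All names and thresholds here are illustrative, not his actual code:

```python
def usable_stations(readings, max_plausible_c=60.0):
    """readings: dict mapping station_id -> list of (hour, temp_c) pairs,
    with temp_c of None for a missing report. Returns stations passing the screens."""
    kept = {}
    for sid, obs in readings.items():
        temps = [t for _, t in obs if t is not None]
        hours = {h for h, t in obs if t is not None}
        if len(hours) < 24:                # not reporting around the clock
            continue
        if len(set(temps)) <= 1:           # stuck on a single value
            continue
        if max(temps) > max_plausible_c:   # keybounce: 26 C keyed in as 226 C
            continue
        kept[sid] = obs
    return kept
```

Re-running such screens at year end, as Dave proposes, is what keeps the station set comparable from one year to the next.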

Dave A
July 9, 2011 5:46 pm

Make that By the 1st Jan 2012
See how easy it is to make a typo 😉
