NOAA: leaving bad data stand in Laredo

From Dr. Roger Pielke Sr.'s Climate Science blog comes the story of a poorly sited weather station, and of bad data left in the record even after it was known to be bad. I've located the Laredo airport AWIS; more photos below.

Guest Post By Richard Berler Chief Meteorologist KGNS TV Laredo, TX

Richard Berler, Chief Meteorologist of KGNS TV in Laredo, Texas, has kindly allowed me to post his quite informative e-mail to me of October 6, 2011. It is reproduced below.

Dear Roger,

I e-mailed you several weeks ago concerning “bad” readings from the Laredo, TX AWIS instrumentation administered by the FAA. This has been a topic of interest to me for a number of years. In 2007, in a talk at the 14th Symposium on Meteorological Observation and Instrumentation, I noted (this was not the focus of my presentation) that the AWIS at the Laredo airport consistently read about 2F above the MMTS that I use as an NCDC cooperative observer. This bias was present at all hours of the day, even with a very well mixed atmosphere. I am about 4 miles from the AWIS and about 70 feet lower in elevation. I did note that this would be climatologically significant, as it would make the AWIS site warmer than any other Texas location by 2F on an annual basis, warmer than Death Valley, and warmer than any Florida station with the exception of the Florida Keys.

Recently (May 2011), the daytime maximum temperatures from the AWIS jumped to well over 4F above my MMTS. The NWS office responsible for our zone forecasts agreed that the readings appeared to be too high. They have a vested interest in this, as they forecast to match the AWIS numbers and verify their forecasts against those numbers. The AWIS numbers are also what the public is exposed to, as they are generated each hour and picked up by media outlets such as The Weather Channel. With zero interest and cooperation from the FAA, the NWS came down to Laredo, and on camera we approached the AWIS instrumentation, made our own temperature measurements, and verified that the AWIS was running close to 5F too high during the afternoon hours. The heat from the NWS and a television news story finally prompted the FAA folks to put a replacement unit into operation at the AWIS site. The impact was immediate, and it also confirmed that the day-and-night 2F bias I had suspected earlier was real. Here are the monthly differences this year, given as Tmax (AWIS-MMTS), Tmin (AWIS-MMTS), in °F (July not shown, as the replacement unit was deployed mid-month): Jan 2.2, 2.4; Feb 1.9, 2.4; Mar 2.2, 2.1; Apr 2.3, 1.7; May 4.3, 1.9; Jun 4.7, 1.6; Aug 0.6, 0.0; Sep 0.5, -0.1.
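[Editor's note: the monthly differences above can be quickly sanity-checked by averaging them. This is an illustrative Python sketch built only from the figures in the e-mail; the grouping of months into "before" and "after" periods is my own.]

```python
# Monthly AWIS-minus-MMTS differences (°F) reported in the e-mail above.
# July is omitted because the replacement sensor was deployed mid-month.
diffs = {  # month: (Tmax difference, Tmin difference)
    "Jan": (2.2, 2.4), "Feb": (1.9, 2.4), "Mar": (2.2, 2.1),
    "Apr": (2.3, 1.7), "May": (4.3, 1.9), "Jun": (4.7, 1.6),
    "Aug": (0.6, 0.0), "Sep": (0.5, -0.1),
}

def mean_bias(months):
    """Average (Tmax, Tmin) difference over the given months, in °F."""
    tmax = sum(diffs[m][0] for m in months) / len(months)
    tmin = sum(diffs[m][1] for m in months) / len(months)
    return round(tmax, 2), round(tmin, 2)

# Before the daytime fault worsened (Jan-Apr) vs. after the replacement (Aug-Sep):
before = mean_bias(["Jan", "Feb", "Mar", "Apr"])
after = mean_bias(["Aug", "Sep"])
print(before)  # → (2.15, 2.15): the suspected ~2F bias, day and night
print(after)   # → (0.55, -0.05): bias largely gone with the new unit
```

The averages line up with the claims in the letter: roughly 2F at both Tmax and Tmin before the replacement, and near zero after.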

Remarkably, when the NWS office wanted to go in and eliminate the myriad of record high temperatures that the AWIS had generated during the time it was commissioned/certified as official in 2009 (and especially the May and June data from 2011), the southern region office in Fort Worth told them to leave it alone and let it stand. I am astounded at this attitude. The experience over the last 7 or more years of poor cooperation from various governmental agencies (FAA and NWS), and the current lack of interest by the southern region office in correcting verified systematic errors of significance, is quite a disappointment. It doesn't reflect well on the integrity of the observational program.

On a different subject, I noted a reference in your recent paper that documented poorly sited thermometers. I do feel as if there should be a distinction made between thermometers used for climate change studies and those used for applied or operational purposes. With model grid boxes becoming smaller, a thermometer located over grass with no man-made surfaces within 100 feet may not be representative of a grid box that is urban in nature. Likewise, such an ideal site may not be as useful to energy companies, architectural interests, etc. in an urban setting. I noticed that a poor exposure (a #5 type of site) was associated with temperature errors of >9F. I did see such numbers in the literature from a study conducted in Turkey. My experience has been quite different. On a late June 2009 day with an 86 degree sun angle, I measured a surface temperature of 143F on a blacktop parking lot with an infrared thermometer. Three-eighths of an inch (3/8″) above the parking lot surface, an UNSHIELDED thermometer in full sun read 105F. My MMTS at the edge of the parking lot was reading 92F (the airport, with its 2F bias, was 94F). I note also that J. Basara found a slight daytime urban COOL island in Oklahoma City; he spoke on this at the 14th Symposium on Meteorological Observation and Instrumentation and at the 2011 AMS Broadcast Meteorology Conference that I co-chaired in Oklahoma City. Do you see any evidence supporting a systematic error as large as 9F (day or night) from the many #5 quality stations that you have studied?

Sincerely,

Richard “Heatwave” Berler, CBM

Chief Meteorologist

KGNS TV

Laredo, TX

======================================================

I’ve located the AWIS station at 27.551058, -99.461244, right next to the electronics building for the Instrument Landing System (ILS), which transmits the glide path radio beam down the runway. It is fairly common to put weather stations near the ILS at many airports, since it offers “one stop shopping”: maintenance in one location with one access road.

Here’s the view from Bing Maps, looking west:

Weather stations and A/C heat exchanger exhaust vents – like tornadoes and trailer parks.

UPDATE: Using NCDC MMS metadata lat/lon, I located the other station referenced in the e-mail. Here is the MMTS COOP station Richard Berler mentions, next to the TV station parking lot. Image from Google Earth Street View; the white dot on the pole is it.

A similar situation occurred in Honolulu. See: FUBAR high temp/climate records from faulty sensor to remain in place at Honolulu

Steve Keohane
October 11, 2011 9:17 am

If you’re maintaining a database to represent a system, it is necessary to remove data points that are influenced by effects that are not part of the system, in this case climate. Refusal to do so means the data does not represent the system and is therefore meaningless. You might as well go home.

Ted
October 11, 2011 9:18 am

This is anecdotally interesting, but the important question is whether there’s an impact on the broader temperature signal.
Would this type of information be incorporated into the Berkeley Earth Surface Temperature analysis? It was mentioned at Climate Etc. that a few papers were about to be submitted. From the AGU Fall Meeting abstracts, it would appear that the sensitivity of the overall signal to these issues really isn’t that big.
ABSTRACT FINAL ID: GC43B-0908
“Further, we automate the process of assessing station reliability to allow data of unknown reliability to be included in the analysis”

“Applying the Berkeley Earth techniques, we broadly confirm the temperature histories presented by prior groups. However, the improved methodology allows the uncertainties to be reduced (often by 50%) and also has allowed the instrumental temperature record to be extended back to 1800.”
ABSTRACT FINAL ID: GC44B-01
“We calculate the effect of poor station quality, as documented in the US by the team led by Anthony Watts by estimating the temperature trends based solely on the stations ranked good (1,2 or 1,2,3 in the NOAA ranking scheme). We avoid issues of homogenization bias by using raw data; at times when the records are discontinuous (e.g. due to station moves) we break the record into smaller segments and analyze those, rather than attempt to correct the discontinuity.

“The results we obtain are compared to those published by the groups at NOAA, NASA-GISS, and Hadley-CRU in the UK”

bill
October 11, 2011 9:29 am

If we can’t measure local temperature accurately, we can’t construct a meaningful global temperature; if we don’t have a long-term meaningful global temperature, we don’t know if in fact the globe is warming; and while we don’t know that, Arrhenius’ theory must remain just that, a theory, not a basis on which to build public policy.

NetDr
October 11, 2011 9:30 am

Dallas-Ft. Worth airport was a cow pasture prior to 1977. Since then, tons of concrete and dozens of heat-emitting buildings have been built, and a small city of 30,000 inhabitants has been constructed around it.
I believe there is an Urban Heat Island effect around the weather station there. It has been shown that going from 100 people to 30,000 people causes more UHI than an equal increase in a larger city.
Such obvious UHI must be corrected lest we fool ourselves.

John F. Hultquist
October 11, 2011 9:35 am

This is the sort of issue for which the asterisk was invented. Wikipedia . . .
http://en.wikipedia.org/wiki/Asterisk
. . . reports that some folks call it a splat. The problem I see is that if you don’t actually know what the readings should have been, to what do you change them? Yes, you could subtract 2 from all the entries, but that only makes them different. It does not make them correct.
So, the best response to this issue is to put a big splat on each of these readings and a researcher could apply whatever correction deemed appropriate for the use.
[For the young, look up the false report of an asterisk on Roger Maris’ home run record.]

Leon Brozyna
October 11, 2011 9:42 am

Sounds like Hawaii all over again … letting the records stand.
I wonder … if instrument failures had resulted in record low temperatures, would those record low temperatures be allowed to stand?

wws
October 11, 2011 10:15 am

I would bet anyone the amount of the Porkulus Package that if that weather station consistently read 2F *lower* than any other weather station, it would have been rated a Code Red Emergency at the highest levels of the NOAA, and the situation would have been “rectified” within 24 hours.
Still, they probably would have “rectified” it by simply writing in a +2.5F algorithm for all future readings from the station.
More and more, I think the entire Warmist approach towards Data Integrity comes down to, “why waste time and expense on the actual readings when we can just paper-whip it?”
And the great advantage of Paper-Whipping is that you *always* come up with the answer that best suits your purposes. No nasty surprises there!

Mike Davis
October 11, 2011 10:29 am

Unknowns cannot be corrected. It is known that there are and were errors, but the extent on any given day is not known, so any “correction” would just magnify the error. Discard the record for any site that has had an adjustment applied.

Dave Springer
October 11, 2011 10:33 am

I’m about 235 miles NNE of Laredo. My guess is the weather station there either melted or dried up and blew away.

David A. Evans
October 11, 2011 10:37 am

Which part of “temperature alone is meaningless” is still being missed?
When even the temperature is wrong…?
DaveE.

TERRY46
October 11, 2011 10:42 am

Positioned next to an A/C vent. It’s like they put the A/C vent next to the temperature station on purpose. We have the same problem in my area with the temperature station at the airport. Our temperatures are almost always warmer than the surrounding area, and I live in the northern foothills of N.C. I e-mailed Van Denton, with FOX 8, and he stated he has seen what appears to be a warm bias at the Mt. Airy site as well.

Bloke down the pub
October 11, 2011 10:57 am

Look on the bright side. The 2°C drop in temperatures being recorded now will boost the cooling trend.

Olen
October 11, 2011 10:59 am

I disagree about bad data not being useful. Bad data is useful to a regulation and tax hungry politician.

More Soylent Green!
October 11, 2011 11:19 am

Ted says:
October 11, 2011 at 9:18 am
GIGO

October 11, 2011 11:28 am

“Applying the Berkeley Earth techniques, we broadly confirm the temperature histories presented by prior groups. However, the improved methodology allows the uncertainties to be reduced (often by 50%) and also has allowed the instrumental temperature record to be extended back to 1800.”
Wow! I’ll bet those guys taking the daily temperature readings from 1800 to 1920 or so (when RECORDING DEVICES started to be used) were REALLY accurate about the reading, the time of day, etc. No, wait, it was a voluntary or military duty… to put SOMETHING into a record, with NO quality assurance at all.
Yes, I run all my 0.1 C judgements that way. Yeah, sure…I’ve go a bridge to sell you!

October 11, 2011 11:29 am

“GOT” a bridge to sell. DARNED INTERNET DUPLEX SIGNAL DROPS THINGS WHEN I AM MOVING AT “LIGHT SPEED”.

MarkW
October 11, 2011 11:32 am

Bloke down the pub says:
October 11, 2011 at 10:57 am
Look on the bright side. The 2°C drop in temperatures being recorded now will boost the cooling trend.

You wish. In all likelihood, they will add two degrees to the current temperature to make it consistent with the past.

FerdinandAkin
October 11, 2011 11:36 am

Let the record stand as is.
Do not give these people any more license to apply ‘corrections’ to the past temperature record!
Let the record stand.

henrythethird
October 11, 2011 11:37 am

12.Bloke down the pub says:
October 11, 2011 at 10:57 am
Look on the bright side. The 2°C drop in temperatures being recorded now will boost the cooling trend.
And that will be the thing to watch – when the anomalies are computed, and they show a cooling trend, will they go in and say the past temps or the current temps are faulty? If so, what adjustments will they use?

Stilgar
October 11, 2011 11:40 am

What should the correction be? 2 degrees in some cases, 4 degrees? If the number it is off by is not known, how can you correct it?
It would seem to me that whatever the difference, unless you can prove an external source was randomly changing the readings (car parked beside it one day and not the next) then in general the overall bias should be the same. If you are looking for a difference in the rate of change, as long as the bias is the same, the rate will be the same. In this case making a correction could mess up that calculation.
On the other hand, if you want to know things like record highs and such, then the absolute temps need to be adjusted.
I agree with the above poster JFH, flag the data as known to be off and allow the people who use the data to determine if a correction is needed.
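[Editor's note: Stilgar's point, that a constant bias shifts the level of a temperature series but leaves its rate of change untouched, is easy to verify numerically. This sketch uses entirely hypothetical numbers, not the Laredo record.]

```python
def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = list(range(2000, 2012))
# Hypothetical record with a 0.05 °F/yr warming trend...
true_temps = [85.0 + 0.05 * (y - 2000) for y in years]
# ...and the same record as seen through a sensor with a constant +2 °F bias.
biased_temps = [t + 2.0 for t in true_temps]

# The two slopes are identical: a constant offset shifts the level, not the trend.
print(ols_slope(years, true_temps), ols_slope(years, biased_temps))
```

As Stilgar notes, this only holds while the bias stays constant; a bias that changes over time (as happened in Laredo in May 2011) contaminates the trend as well as the absolute values.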

Steve from rockwood
October 11, 2011 11:44 am

The care you take with your raw data is the same care you take with your published results.

Lance
October 11, 2011 11:48 am

I seem to recall that several years ago Maine recorded a record low, and then, after much careful consideration, they disallowed it? Perhaps someone may recall better than I what the story was.
I believe it was even a WUWT topic…?

Mike Davis
October 11, 2011 11:51 am

More Soylent Green:
I call it the GOZENTA Effect!
What Gozenta determines what Comzouta! GIGO, while true, does not remove the need to evaluate the quality of the Garbage being input, or how much perfume has been applied in the form of “Corrections”.

Genghis
October 11, 2011 11:57 am

Actually, though, won’t this ultimately hurt the CAGW records? If the sensor is replaced with a more accurate device, it will show declining temperatures.

Frank K.
October 11, 2011 12:04 pm

wws says:
October 11, 2011 at 10:15 am
“More and more, I think the entire Warmist approach towards Data Integrity comes down to, why waste time and expense on the actual readings when we can just paper-whip it?”
Actually the warmist approach to data integrity is as follows:
(1) If the data supports global warming, then it is incontrovertible evidence of impending doom.
(2) If the data does not support global warming, then it “doesn’t matter” (like the UHI effect).
Of course, one can argue that the temperature readings at any one location don’t matter since they represent only .0000000001% of the earth’s surface. (That, by the way, is a Hansen-ism).