NOAA: letting bad data stand in Laredo

From Dr. Roger Pielke Sr.'s Climate Science blog, a story of a poorly sited weather station whose bad data was left in the record even after it was known to be bad. I've located the Laredo airport AWIS; more photos below.

Guest Post by Richard Berler, Chief Meteorologist, KGNS TV, Laredo, TX

Richard Berler, Chief Meteorologist of KGNS TV in Laredo, Texas, has given me permission to post his quite informative e-mail of October 6, 2011. It is reproduced below.

Dear Roger,

I e-mailed you several weeks ago concerning "bad" readings from the Laredo, TX AWIS instrumentation administered by the FAA. This has been a topic of interest to me for a number of years. In 2007, in a talk at the 14th Symposium on Meteorological Observation and Instrumentation, I noted (this was not the focus of my presentation) that the AWIS at the Laredo airport consistently read about 2F above the MMTS that I use as an NCDC cooperative observer. This bias was present at all hours of the day, even with a very well mixed atmosphere. I am about 4 miles from the AWIS and about 70 feet lower in elevation. I did note that this would be climatologically significant, as it would make the AWIS site warmer than any other Texas location by 2F on an annual basis, warmer than Death Valley, and warmer than any Florida station with the exception of the Florida Keys.

Recently (May 2011), the daytime maximum temperatures from the AWIS jumped to well over 4F above my MMTS. The NWS office responsible for our zone forecasts agreed that the readings appeared to be too high. They have a vested interest in this, as they forecast to match the AWIS numbers and verify their forecasts off of those numbers. The AWIS numbers are also what the public is exposed to, as those numbers are generated each hour and are picked up by media outlets such as The Weather Channel. With zero interest and cooperation from the FAA, the NWS came down to Laredo, and on camera we approached the AWIS instrumentation, made our own temperature measurements, and verified that the AWIS was running close to 5F too high during the afternoon hours. The heat from the NWS and a television news story finally prompted the FAA folks to put a replacement unit into operation at the AWIS site. The impact was immediate, and it also confirmed the day-and-night 2F bias that I had suspected earlier. Here are the monthly differences, Tmax (AWIS − MMTS) and Tmin (AWIS − MMTS), this year (July not shown, as the replacement unit was deployed mid-month):

Jan: 2.2, 2.4
Feb: 1.9, 2.4
Mar: 2.2, 2.1
Apr: 2.3, 1.7
May: 4.3, 1.9
Jun: 4.7, 1.6
Aug: 0.6, 0.0
Sep: 0.5, -0.1
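
A few lines of code make the May–June jump and the post-replacement drop easy to see in those numbers. This is purely illustrative; the flag threshold is arbitrary:

```python
# Monthly mean differences (AWIS - MMTS, deg F) from the e-mail above.
# July is omitted: the replacement unit went in mid-month.
diffs = {
    "Jan": (2.2, 2.4), "Feb": (1.9, 2.4), "Mar": (2.2, 2.1),
    "Apr": (2.3, 1.7), "May": (4.3, 1.9), "Jun": (4.7, 1.6),
    "Aug": (0.6, 0.0), "Sep": (0.5, -0.1),
}

BASELINE_BIAS_F = 2.0   # the long-suspected year-round bias
THRESHOLD_F = 1.0       # arbitrary margin for flagging, illustration only

for month, (tmax, tmin) in diffs.items():
    # Flag months whose daytime difference runs well past the familiar ~2F bias
    flag = "  <-- anomalous" if tmax > BASELINE_BIAS_F + THRESHOLD_F else ""
    print(f"{month}: Tmax {tmax:+.1f}F  Tmin {tmin:+.1f}F{flag}")
```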

Remarkably, when the NWS office wanted to go in and eliminate the myriad of record high temperatures that had been generated by the AWIS during the time it had been commissioned/certified as official in 2009 (and especially the May and June data from 2011), the southern region office in Fort Worth told them to leave it alone, let it stand. I am astounded at this attitude. The experience over the last 7 or more years of lack of cooperation from various governmental agencies (FAA and NWS), and a current lack of interest by the southern region office in correcting verified systematic errors of significance, is quite a disappointment. It doesn't reflect well on the integrity of the observational program.

On a different subject, I noted a reference in your recent paper that documented poorly sited thermometers. I do feel as if there should be a distinction made between thermometers used for climate change studies and those used for applied or operational purposes. With model grid boxes becoming smaller, a thermometer located over grass with no man-made surfaces within 100 feet may not be representative of a grid box that is urban in nature. Likewise, such an ideal site may not be as useful to energy companies, architectural interests, etc. in an urban setting. I noticed that a poor exposure (a #5 type of site) was associated with temperature errors of >9F. I did see such numbers in the literature from a study conducted in Turkey. My experience has been quite different. On a late June 2009 day with an 86-degree sun angle, I measured a surface temperature of 143F with an infrared thermometer on a blacktop parking lot. Three eighths of an inch (3/8″) above the parking lot surface, an UNSHIELDED thermometer in full sun read 105F. My MMTS at the edge of the parking lot was reading 92F (the airport, with its 2F bias, was 94F). I note also that J. Basara found a slight daytime urban COOL island in Oklahoma City; he spoke on this at the 14th Symposium on Meteorological Observation and Instrumentation and at the 2011 AMS Broadcast Meteorology Conference that I co-chaired in Oklahoma City. Do you see any evidence supporting a systematic error as large as 9F (day or night) from the many #5 quality stations that you have studied?

Sincerely,

Richard “Heatwave” Berler, CBM

Chief Meteorologist

KGNS TV

Laredo, TX

======================================================

I've located the AWIS station at 27.551058, -99.461244, right next to the electronics building for the Instrument Landing System (ILS), which transmits the glide-path radio beam down the runway. It is fairly common to put the weather station near the ILS at airports for the "one-stop shopping" ease of maintenance: one location, one access road.

Here's the view from Bing Maps, looking west:

Weather stations and A/C heat exchanger exhaust vents – like tornadoes and trailer parks.

UPDATE: Using NCDC MMS metadata lat/lon, I located the other station referenced in the e-mail: Richard Berler's MMTS COOP station, next to the TV station parking lot. Image from Google Earth Street View. The white dot on the pole is it.

A similar situation occurred in Honolulu. See: FUBAR high temp/climate records from faulty sensor to remain in place at Honolulu

Steve Keohane
October 11, 2011 9:17 am

If you're maintaining a database to represent a system, it is necessary to remove data points that are influenced by effects which are not part of that system (in this case, climate). Refusal to do so means the data does not represent the system and is therefore meaningless. You might as well go home.

Ted
October 11, 2011 9:18 am

This is anecdotally interesting, but the important question is whether there's an impact on the broader temperature signal.
Would this type of information be incorporated into the Berkeley Earth Surface Temperature analysis? It was mentioned at Climate Etc. that a few papers were about to be submitted. From the AGU Fall Meeting abstracts, it would appear that the sensitivity of the overall signal to these issues really isn't that big.
ABSTRACT FINAL ID: GC43B-0908
“Further, we automate the process of assessing station reliability to allow data of unknown reliability to be included in the analysis”

“Applying the Berkeley Earth techniques, we broadly confirm the temperature histories presented by prior groups. However, the improved methodology allows the uncertainties to be reduced (often by 50%) and also has allowed the instrumental temperature record to be extended back to 1800.”
ABSTRACT FINAL ID: GC44B-01
“We calculate the effect of poor station quality, as documented in the US by the team led by Anthony Watts by estimating the temperature trends based solely on the stations ranked good (1,2 or 1,2,3 in the NOAA ranking scheme). We avoid issues of homogenization bias by using raw data; at times when the records are discontinuous (e.g. due to station moves) we break the record into smaller segments and analyze those, rather than attempt to correct the discontinuity.

“The results we obtain are compared to those published by the groups at NOAA, NASA-GISS, and Hadley-CRU in the UK”
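
For readers unfamiliar with the segmenting ("scalpel") approach the abstract describes, here is a minimal sketch with made-up data and a hypothetical documented station move; nothing here is Berkeley Earth's actual code:

```python
import numpy as np

# Made-up annual mean temperatures containing an artificial 1.5-degree step
# at a hypothetical documented station move in 1980.
years = np.arange(1950, 2011)
temps = 0.01 * (years - 1950) + np.where(years >= 1980, 1.5, 0.0)

# The "scalpel": rather than correcting across the discontinuity, cut the
# record at the documented breakpoint and analyze each segment on its own.
breakpoints = [1980]
edges = [int(years[0])] + breakpoints + [int(years[-1]) + 1]

for start, end in zip(edges[:-1], edges[1:]):
    seg = (years >= start) & (years < end)
    slope, _ = np.polyfit(years[seg], temps[seg], 1)
    print(f"{start}-{end - 1}: trend = {slope * 10:.2f} deg/decade")
```

A single fit across the whole record would fold the 1.5-degree step into the trend; the segmented fits recover the underlying 0.10 deg/decade on both sides of the move.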

bill
October 11, 2011 9:29 am

If we can't measure local temperature accurately, we can't construct a meaningful global temperature; if we don't have a long-term meaningful global temperature, we don't know if in fact the globe is warming; and while we don't know that, Arrhenius's theory must remain just that, a theory, not a basis on which to build public policy.

NetDr
October 11, 2011 9:30 am

Dallas-Fort Worth airport was a cow pasture prior to 1977. Since then, tons of concrete and dozens of heat-emitting buildings have been built, and a small city of 30,000 inhabitants has been constructed around it.
I believe there is an Urban Heat Island effect around the weather station there. It has been shown that going from 100 people to 30,000 people causes more UHI than an equal increase in a larger city.
Such obvious UHI must be corrected lest we fool ourselves.

John F. Hultquist
October 11, 2011 9:35 am

This is the sort of issue for which the asterisk was invented. Wikipedia . . .
http://en.wikipedia.org/wiki/Asterisk
. . . reports that some folks call it a splat. The problem I see is that if you don't actually know what the readings should have been, to what do you change them? Yes, you could subtract 2 from all the entries, but that only makes them different. It does not make them correct.
So the best response to this issue is to put a big splat on each of these readings, and a researcher could apply whatever correction is deemed appropriate for the use.
[For the young, look up the false report of an asterisk on Roger Maris' home run record.]
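
A minimal sketch of that splat-and-leave-it idea (record layout and field names invented for illustration):

```python
# Keep the raw reading untouched, attach a quality "splat", and leave the
# choice of correction to the researcher using the data.
readings = [
    {"date": "2011-06-15", "tmax_f": 104.0, "flag": "*"},   # known-bad sensor
    {"date": "2011-08-15", "tmax_f": 99.0,  "flag": None},  # after replacement
]

def usable(record, accept_flagged=False):
    """Filtering policy belongs to the data user, not the archive."""
    return accept_flagged or record["flag"] is None

print([r for r in readings if usable(r)])
```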

Leon Brozyna
October 11, 2011 9:42 am

Sounds like Hawaii all over again … letting the records stand.
I wonder … if instrument failures had resulted in record low temperatures, would those record low temperatures be allowed to stand?

wws
October 11, 2011 10:15 am

I would bet anyone the amount of the Porkulus Package that if that weather station consistently read 2F *lower* than any other weather station, it would have been rated a Code Red Emergency at the highest levels of NOAA, and the situation would have been "rectified" within 24 hours.
Still, they probably would have "rectified" it by simply writing a +2.5F algorithm into all future readings from the station.
More and more, I think the entire Warmist approach towards Data Integrity comes down to, "why waste time and expense on the actual readings when we can just paper-whip it?"
And the great advantage of Paper-Whipping is that you *always* come up with the answer that best suits your purposes. No nasty surprises there!

Mike Davis
October 11, 2011 10:29 am

Unknowns cannot be corrected. It is known that there are and were errors, but the extent on any given day is not known, so any "correction" would just magnify the error. Discard the record for any site that has had an adjustment applied.

Dave Springer
October 11, 2011 10:33 am

I’m about 235 miles NNE of Laredo. My guess is the weather station there either melted or dried up and blew away.

David A. Evans
October 11, 2011 10:37 am

Which part of "temperature alone is meaningless" is also being missed?
When even the temperature is wrong…?
DaveE.

TERRY46
October 11, 2011 10:42 am

Positioned next to an A/C vent. It's like they put the A/C vent next to the temperature station on purpose. We have the same problem in my area with the temperature station next to the airport. Our temperatures are almost always warmer than the surrounding area, and I live in the northern foothills of NC. I e-mailed Van Denton with FOX 8, and he stated he has seen what appears to be a warm bias at the Mt. Airy site as well.

Bloke down the pub
October 11, 2011 10:57 am

Look on the bright side. The 2°F drop in temperatures being recorded now will boost the cooling trend.

Olen
October 11, 2011 10:59 am

I disagree about bad data not being useful. Bad data is useful to a regulation- and tax-hungry politician.

More Soylent Green!
October 11, 2011 11:19 am

In reply to Ted (October 11, 2011 at 9:18 am):
GIGO

October 11, 2011 11:28 am

“Applying the Berkeley Earth techniques, we broadly confirm the temperature histories presented by prior groups. However, the improved methodology allows the uncertainties to be reduced (often by 50%) and also has allowed the instrumental temperature record to be extended back to 1800.”
Wow! I'll bet those guys taking the daily temperature readings from 1800 to 1920 or so (when RECORDING DEVICES started to be used) were REALLY accurate about the reading, the time of day, etc. No, wait, it was a voluntary or military duty… to put SOMETHING into a record, with NO quality assurance at all.
Yes, I run all my 0.1 C judgements that way. Yeah, sure… I've go a bridge to sell you!

October 11, 2011 11:29 am

“GOT” a bridge to sell. DARNED INTERNET DUPLEX SIGNAL DROPS THINGS WHEN I AM MOVING AT “LIGHT SPEED”.

MarkW
October 11, 2011 11:32 am

Bloke down the pub says:
October 11, 2011 at 10:57 am
Look on the bright side. The 2°F drop in temperatures being recorded now will boost the cooling trend.

You wish. In all likelihood, they will add two degrees to the current temperature to make it consistent with the past.

FerdinandAkin
October 11, 2011 11:36 am

Let the record stand as is.
Do not give these people any more license to apply ‘corrections’ to the past temperature record!
Let the record stand.

October 11, 2011 11:37 am

Bloke down the pub says:
October 11, 2011 at 10:57 am
Look on the bright side. The 2°F drop in temperatures being recorded now will boost the cooling trend.
And that will be the thing to watch – when the anomalies are computed, and they show a cooling trend, will they go in and say the past temps or the current temps are faulty? If so, what adjustments will they use?

Stilgar
October 11, 2011 11:40 am

What should the correction be? 2 degrees in some cases, 4 degrees? If the number it is off by is not known, how can you correct it?
It would seem to me that whatever the difference, unless you can prove an external source was randomly changing the readings (car parked beside it one day and not the next), then in general the overall bias should be the same. If you are looking for a difference in the rate of change, as long as the bias is the same, the rate will be the same. In this case making a correction could mess up that calculation.
On the other hand, if you want to know things like record highs and such, then the absolute temps need to be adjusted.
I agree with the above poster JFH: flag the data as known to be off and allow the people who use the data to determine if a correction is needed.
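
Stilgar's rate-of-change point can be written down directly. For an ordinary least-squares trend, a constant bias b drops out:

```latex
% OLS trend of a biased series T'(t) = T(t) + b, with b constant:
\hat{\beta}' \;=\; \frac{\operatorname{Cov}(t,\,T+b)}{\operatorname{Var}(t)}
\;=\; \frac{\operatorname{Cov}(t,\,T) + \operatorname{Cov}(t,\,b)}{\operatorname{Var}(t)}
\;=\; \hat{\beta},
\qquad \text{since } \operatorname{Cov}(t,\,b)=0 \text{ for constant } b.
```

If the bias jumps partway through the record, as it did here in May 2011, Cov(t, b) is no longer zero and the computed trend is contaminated, which is exactly the caveat about corrections above.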

Steve from rockwood
October 11, 2011 11:44 am

The care you take with your raw data is the same care you take with your published results.

Lance
October 11, 2011 11:48 am

I seem to recall that several years ago Maine recorded a record low, and then after much careful consideration, they disallowed it? Perhaps someone recalls better than I what the story was.
I believe it was even a WUWT topic…?

Mike Davis
October 11, 2011 11:51 am

More Soylent Green:
I call it the GOZENTA Effect!
What Gozenta determines what Comzouta! GIGO, while true, does not obviate the need to evaluate the quality of the garbage being input, or how much perfume has been applied in the form of "corrections".

Genghis
October 11, 2011 11:57 am

Actually, though, won't this ultimately hurt the CAGW records? If the sensor is replaced with a more accurate device, it will show declining temperatures.

Frank K.
October 11, 2011 12:04 pm

wws says:
October 11, 2011 at 10:15 am
"More and more, I think the entire Warmist approach towards Data Integrity comes down to, 'why waste time and expense on the actual readings when we can just paper-whip it?'"
Actually the warmist approach to data integrity is as follows:
(1) If the data supports global warming, then it is incontrovertible evidence of impending doom.
(2) If the data does not support global warming, then it “doesn’t matter” (like the UHI effect).
Of course, one can argue that the temperature readings at any one location don't matter, since they represent only 0.0000000001% of the earth's surface. (That, by the way, is a Hansen-ism.)

October 11, 2011 12:06 pm

Why is it that these people have no instinct for empirical observation? Everything they do, each step they take, has the effect of obfuscating the empirical observations. It seems that we have statisticians who have been hog-tied and forced to observe something that they loathe.
Now that computers are in widespread use, there is no excuse for not insisting on accurate measurement and recording of the raw observations.

Gail Combs
October 11, 2011 12:29 pm

FerdinandAkin says:
October 11, 2011 at 11:36 am
Let the record stand as is.
Do not give these people any more license to apply ‘corrections’ to the past temperature record!
Let the record stand.
___________________________
But do not forget to include an increase in the error. (snicker)
Speaking of error, AJ Strata's look at the error in the temperature measurements is enlightening: http://strata-sphere.com/blog/index.php/archives/11420

Jean Parisot
October 11, 2011 12:33 pm

Shouldn't all airport stations be biased hot, to ensure density altitude and altimeter settings err on the safe side?
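
The rule of thumb behind that reasoning is roughly 120 feet of density altitude per degree C above ISA. A sketch, illustrative only and not for flight planning, using a pressure altitude near Laredo's field elevation:

```python
def density_altitude_ft(pressure_alt_ft, oat_c):
    """Rule-of-thumb density altitude: ~120 ft per deg C above ISA.
    Illustrative only -- not for actual flight planning."""
    isa_temp_c = 15.0 - 2.0 * (pressure_alt_ft / 1000.0)  # standard lapse rate
    return pressure_alt_ft + 120.0 * (oat_c - isa_temp_c)

# A sensor reading ~2.8 C (5 F) too hot inflates density altitude by ~336 ft,
# which errs on the safe side for aircraft performance calculations.
print(density_altitude_ft(500.0, 38.0) - density_altitude_ft(500.0, 35.2))
```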

Phil
October 11, 2011 12:44 pm

“Applying the Berkeley Earth techniques, we broadly confirm the temperature histories presented by prior groups. However, the improved methodology allows the uncertainties to be reduced (often by 50%) and also has allowed the instrumental temperature record to be extended back to 1800.”

There is no temperature record back to 1800. The Berkeley Earth effort is a reconstruction. The Central England Temperature (CET) “record” originally published by Manley is a reconstruction. Today’s Global Average Temperature is a construction.
None of the reconstructions or constructions are data; they should not be confused with data, nor should their uncertainties be calculated like those of actual data. There is an uncertainty due to the various methodologies used to make the reconstructions and constructions that should be taken into account. The methodology uncertainty should be added to that of the data; it does not replace data uncertainty.
In most other disciplines, when a sensor used in an experiment is discovered to be uncalibrated or miscalibrated, the data obtained is usually not used, as calculating the uncertainty associated with the uncalibrated or miscalibrated sensor becomes very difficult if not impossible. To me, the most important result of the surface station survey is that it showed conclusively that virtually none of these sensors were subjected to even the most minimal quality controls in their entire history. Almost without exception (and I don't recall a specific exception), these sensors were never calibrated or recalibrated, their calibration was never otherwise checked, and calibrations were never evaluated taking siting into account. Everything is theoretical, IIRC.
The uncertainty issues associated with temperature data over the past century are numerous. Attempts to minimize these uncertainties by asking a miracle of one or more statistical saints is and should remain an academic argument unless and until it can be shown that these miracles have actually been granted in full. Usually that is done by repeating an experiment with properly located and calibrated sensors, but the “experiment” of collecting temperature data over the past century cannot be repeated.
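
Phil's "added to" can be made concrete. If the data and methodology uncertainties were independent, standard error propagation would combine them in quadrature; a more conservative treatment simply adds them. This is a sketch of the standard formulas, not a claim about any group's actual error model:

```latex
\sigma_{\text{total}} \;=\; \sqrt{\sigma_{\text{data}}^{2} + \sigma_{\text{method}}^{2}}
\quad \text{(independent errors)}
\qquad \text{or} \qquad
\sigma_{\text{total}} \;=\; \sigma_{\text{data}} + \sigma_{\text{method}}
\quad \text{(conservative bound)}.
```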

wws
October 11, 2011 12:49 pm

“Now that computers are in widespread use, there is no excuse for not insisting on accurate measurement and recording of the raw observations.”
The “excuse” is painfully obvious – doing that might not give them the numbers they want!

nutso fasst
October 11, 2011 12:54 pm

Was the email read before being reported on here? It’s not “a story of a poorly sited weather station.” In a nutshell, the story is of a defective FAA-owned instrument that was recently shown to be reading almost 5F too warm. The FAA management was advised but didn’t care, and it took pressure from the local NWS office along with a network news story to finally prompt the FAA to replace the defective instrument. And then, when the local NWS office asked to delete the record high temperatures generated by the faulty instrument, the southern region NWS office refused.

REPLY:
Yes but did you look at the photos I added before commenting? Poorly sited in addition to all that. – Anthony

Andrew Harding
Editor
October 11, 2011 12:56 pm

Stilgar says:
October 11, 2011 at 11:40 am
"What should the correction be? 2 degrees in some cases, 4 degrees? If the number it is off by is not known, how can you correct it?
It would seem to me that whatever the difference, unless you can prove an external source was randomly changing the readings (car parked beside it one day and not the next), then in general the overall bias should be the same. If you are looking for a difference in the rate of change, as long as the bias is the same, the rate will be the same. In this case making a correction could mess up that calculation"
Stilgar, wouldn't temperatures be affected disproportionately? For example, during the night the buildings and asphalt on the ground would radiate heat, because they absorb significantly more heat than natural surfaces such as grass, trees, and even rocks with a high albedo, such as dolomite, limestone, and white sand. The result would be higher night temperatures than would be expected. The air-conditioners' night-time on/off status would also need to be known. This is pure speculation on my part, and I would be very grateful if someone more knowledgeable would correct me or not as they see fit. If what I said is not the case, then the temperature differential is the important factor and should show either warming or not.

Steve Oregon
October 11, 2011 12:57 pm

"Climate records from faulty sensors."
Is that a real science professor conducting climate research?
Dr. Faulty Sensor, PhD

crosspatch
October 11, 2011 1:18 pm

I suspect the problem with faulty sensors is more widespread than people realize. I also believe that faulty readings are much more common with electronic than with mechanical thermometers. The main inaccuracy with mechanical devices would be a reading error by the person recording the measurement. I believe a better solution is a hybrid: a mechanical thermometer that is read by electronic means.

KnR
October 11, 2011 1:33 pm

In one way this is not a "badly sited" or "badly functioning" station; it is in fact OK for the job it's meant to do: giving weather information for the airport for use in air traffic movements.
The trouble is that it's being used for something else, where these issues are a real problem. But that is the issue with airport-based stations: they were designed to provide information for use in a certain way, not designed to have their results pimped out as the "word of god" for something else.

jbunt
October 11, 2011 1:56 pm

I disagree with the folks who say that you cannot "correct" the data because you do not KNOW the correct number with certainty. Okay, so if it said 300 degrees, you would just let it stand? Probably not. So the answer is to either delete it or make a reasonable "adjustment," but do not call it a "correction." Footnote it as an adjustment of faulty data from faulty equipment.

Gail Combs
October 11, 2011 2:16 pm

Given that the US government requires corporate labs, such as those in drug and aircraft manufacture, to calibrate their equipment, including balances and thermometers, on a monthly basis in house, with an outside calibration on a yearly basis, it is rather appalling to think the taxpayer is picking up the tab for such shoddy practices.
It seems that it is more important for tax-paid employees like Hansen and Mann to "get out the message" than it is to do creditable science. There are plenty of grad students available (and on the government dole) to do the calibration site checks, so there is absolutely no excuse.
If the thermometers are not calibrated when installed, not calibrated yearly, and not sited correctly, then how the heck can these buffoons claim to measure temperature accurately enough to say that "global warming is now 0.6°C in the past three decades and 0.8°C in the past century" (J. Hansen 2006, http://www.pnas.org/content/103/39/14288.long) when this thermometer is off by at least 2 to 5 degrees? Statistics cannot add accuracy to flawed data with no quality checks.
Thermometer Calibration
…The validation, verification reassessment section of the Hazard Analysis and Critical Control Point (HACCP) system stated in the Code of Federal Regulations (9CFR 3:417.4) specifies that
instruments used for monitoring critical control points must be calibrated….
The National Institute of Standards and Technology (NIST) is an agency within the U. S. Department of Commerce which provides certification and calibration for thermometers and other precision instruments. When purchasing a thermometer check for the “NIST” label. The NIST label indicates a certified instrument that will maintain accuracy within specified limits, for at least one year. Each year, an NIST thermometer must be recertified to assure accuracy….”

http://www.ncagr.gov/meatpoultry/pdf/Thermometer%20Calibration.pdf

Gail Combs
October 11, 2011 4:59 pm

crosspatch says:
October 11, 2011 at 1:18 pm
I suspect the problem with faulty sensors is more widespread than people realize…..
__________________________________
Having done calibrations on equipment, including thermometers, I can tell you that you are correct. Unless the thermometer was NIST certified, it was a real crap shoot as to whether the blasted things were even close to accurate. All the thermometers in the lab had calibration papers in the storage tubes, and very few had zero as the correction.
We also ran into the problem of someone splicing additional wire onto a pyrometer so it would reach the top of the oven, and that really screwed up the readings.
No calibration and no routine QC??? The data is probably only good to ±3 to 5 C, if you are lucky.
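
The calibration papers she describes are just per-instrument correction tables, and applying one takes only a few lines, which makes skipping the step harder to excuse. A sketch with invented certificate values:

```python
# Corrections from a (hypothetical) calibration certificate: at each
# reference point, add the listed offset to the instrument reading.
correction_table_f = {32.0: +0.4, 70.0: +0.9, 100.0: +1.3}

def corrected(reading_f):
    """Apply the correction from the nearest calibration reference point."""
    nearest = min(correction_table_f, key=lambda ref: abs(ref - reading_f))
    return reading_f + correction_table_f[nearest]

print(corrected(95.0))  # 95.0 + 1.3 -> 96.3
```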

October 11, 2011 7:12 pm

The NWS reaction in this case has parallels in other branches of government. When I found a serious error in surface ozone measurements in San Antonio that nearly breached the EPA's bizarre +/- 20% tolerance for such data, the EPA required that the defective data be retained. The data then became part of regional modeling that attempts to forecast future ozone levels. When a contractor was asked to include measured aerosol optical depth in EPA-mandated modeling, I was told that the EPA would not allow the real data and required that their default value be used, which was far too low. Possibly all these kinds of issues should be appealed under the Federal Data Quality Act.

David in Georiga
October 12, 2011 6:59 am

Maybe I'm alone in thinking that it's a good thing that the temperature records are all manipulated and higher than reality. The measurements they are making only matter to their failed science, and when we continue to see the "global temperature anomaly" rise without seeing any effects from the apparent rise in temperature, people will begin to realize that either a) the rise in temperature is not a bad thing, or b) the temperature isn't really rising as much as they are claiming.
I am pretty sure that the global temps are flat (within some natural variability) and will stay flat for several years, or even decline slightly. It makes perfect sense to me that we are right at about the same temperature that we were before the LIA cooled everything off, and maybe even a bit cooler than the MWP. Since the MWP and the LIA were not caused by our "out of control burning of fossil fuels" and they were quite obviously global events, the natural variability of the Earth is larger than the AGW crowd would like to believe.
We didn't all die during the glacial periods, and we didn't die out during the early part of the Holocene when the temps were clearly much higher than today, so I don't think that we're in any danger of global collapse now. When the global collapse doesn't happen, all the climate scientists will look like Chicken Little, and the skeptics will have been proven right.
I know that a lot of you guys are concerned about policy changes (and things like the carbon tax that Australia just passed), but I doubt these kinds of policies will make a large dent in our productivity or economies in the long run. We've got bigger problems than that already, due to our politicians' love of spending money they don't have. A little "carbon trading" won't make much difference.

October 12, 2011 10:31 am

I love the satellite view that you put up there. Beautiful place in my opinion. Definitely gotta visit Laredo!

woodNfish
October 12, 2011 12:30 pm

Richard "Heatwave" Berler: "It doesn't reflect well on the integrity of the observational program."
Mr. Berler you need to realize that the government has no credibility. None. Not in any of its departments, agencies, employees, politicians, justices, nowhere, no one. The government (at all levels) lies, cheats, steals, murders, spreads propaganda, and gets away with all of it.
You trust that they have any credibility at your own risk.

October 12, 2011 12:49 pm

Gail Combs says October 11, 2011 at 2:16 pm

Thermometer Calibration
…The validation, verification reassessment section of the Hazard Analysis and Critical Control Point (HACCP) system stated in the Code of Federal Regulations (9CFR 3:417.4) specifies that
instruments used for monitoring critical control points must be calibrated….

Well, here's the exculpatory clause (the assumption here seems to be that HACCP applies to _all_ aspects of _any_ process using thermometry _anywhere_, and that is not true):

Hazard analysis and critical control points, or HACCP, is a systematic preventive approach to food safety and pharmaceutical safety that addresses physical, chemical, and biological hazards as a means of prevention rather than finished product inspection. HACCP is used in the food industry to identify potential food safety hazards, so that key actions can be taken to reduce or eliminate the risk of the hazards being realized. The system is used at all stages of food production and preparation processes including packaging, distribution, etc.

You might be surprised to learn that thermometry and ‘control processes’ exist outside the food processing and preparation industry and that other standards and requirements for the accuracy of instrumentation exist, outside the area of your expertise.
Metrology – Metrology is the science of measurement. Metrology includes all theoretical and practical aspects of measurement. Metrology is a very broad field and may be divided into three subfields: (brief excerpt below)

1) Scientific or fundamental metrology – concerns the establishment of quantity systems, unit systems, units of measurement, the development of new measurement methods, realisation of measurement standards and the transfer of traceability from these standards to users in society.
2) Applied or industrial metrology – concerns the application of measurement science to manufacturing and other processes and their use in society, ensuring the suitability of measurement instruments, their calibration and quality control of measurements.
3) Legal metrology – concerns regulatory requirements of measurements and measuring instruments for the protection of health, public safety, the environment, enabling taxation, protection of consumers and fair trade.

A core concept in metrology is (metrological) traceability, … Traceability is most often obtained by calibration, establishing the relation between the indication of a measuring instrument and the value of a measurement standard. These standards are usually coordinated by national metrological institutes: National Institute of Standards and Technology, National Physical Laboratory, UK, Physikalisch-Technische Bundesanstalt, etc.


Owen
October 12, 2011 2:56 pm

If I am in a heavily laden flying machine landing at an airport, I would rather my lift calculations were made with T a few degrees too high than a few degrees too low. T too high means a long roll-out from the higher landing speed used to compensate; T too low can mean a broken airplane from a stall on short final causing a hard impact. (OK, we usually calculated a safety margin, but some of the shorter fields were a little dicey.) I'd rather hit the long barrier at 15 mph than the ground doing 150.
That being said, I wish they would not use those for climate data unless they put them out with the outer marker beacon.