How not to measure temperature, part 83: No smoking please

The USHCN Climate station of record in Milton-Freewater, Oregon. Note the beige smoking stand.

The casual way that NOAA treats quality control of the measurement environment of the surface network has been evident for some time. The above photo is of course just one of many examples. Now before anyone jumps to a conclusion thinking that I’m suggesting heat from cigarettes might affect the temperature reading, let me be clear: I am not.

But a couple of guys hanging around the temperature sensor on a cold day, shooting the bull and puffing? Maybe. Body heat carried by wind to the nearby MMTS sensor “could” be an issue in making Tmax just a bit higher than it might normally be.

But that is likely swamped by the larger local signal near the temperature sensor –


– the waste heat from the sewage treatment plant.



69 Comments
Jon H
February 20, 2009 1:40 pm

GLOBAL WARMING – Smokers got moved outside in the 90s.
GLOBAL COOLING – far fewer smokers now than 20 years ago.

February 20, 2009 1:59 pm

RE: Peter (12:03:43) :
Since the butt container is right there by the door, I would guess it more likely that it would be smokers, not workers, who would be in proximity, Peter.
And we “less analytical” folks didn’t jump to conclusions like you did. We only “asked” if it was, and observed that it very well might be. We also readily admit our error, if error it be. Perhaps you should read more carefully and conclude a bit less.
And we “less analytical” folks have seen all the quantitative data presented here and at Surface Stations on these thoroughly discredited stations. Data that indicate, for the most part unambiguously, that the placement, relocation, and maintenance of these stations are serious factors in the accuracy of the data.
Oh, and BTW, the absolute data from these stations are affected (as has been demonstrated by both experiment and a huge Surface Stations database), and those effects definitely show up in the “trends” resulting from the data.
That’s the whole point.

jorgekafkazar
February 20, 2009 2:06 pm

Holy…er…waste treatment, Batman!
But seriously, folks, the asphalt is probably the major fly in the punchbowl here. Any warming trend, from whatever cause, will be amplified by the low-albedo surface. UHI is NOT a static phenomenon; the root causes of UHI interact with other trends. The asphalt’s convective cooling rate scales with the temperature differential ΔT, while its radiative exchange scales as T^4.
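For reference, the two standard textbook heat-transfer scalings behind that remark (generic relations, not site-specific calculations):

    % Newton's law of cooling: convective flux is linear in the temperature differential
    q_{conv} = h A (T_s - T_\infty)
    % Stefan-Boltzmann: radiative emission goes as the fourth power of absolute temperature
    q_{rad} = \varepsilon \sigma A T_s^4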
Okay, guys. Don’t forget; on March 1st, everybody moves their BBQ another foot closer to the sensor.

Tim Clark
February 20, 2009 2:20 pm

Peter (12:03:43) :
Maybe the butt container is there so that anyone working near the gauge will extinguish their smokes. As far as large, immobile objects influencing readings goes, as long as they are always present, they may influence absolute readings but the trends from the data would be unaffected.

That’s the problem, Peter. Do you know how long it’s been there? When they dump it, do they return it to the same place? Look at all the other junk scattered around: pieces of molding, a trash container, a fertilizer buggy (organic fertilizer?), a white sign or something next to the butt disposal unit, etc. How long have they been there? Granted, they won’t influence the temp as much as the sewage unit, but it’s not proper. Think about this: see how the unit is leaning? I’ll bet you anything the workers lean against it while puffing, maybe even blow smoke rings in it for a laugh. Can that account for some error?

D Caldwell
February 20, 2009 2:43 pm

Peter,
The point is that this station was probably sited somewhere in the past that would not affect the readings the way a wastewater treatment plant does. That kind of environment change would, indeed, produce a spurious warming trend. Wastewater treatment is a rather recent development in a long-term temperature record.
This kind of siting issue could also affect the trend if many well-sited rural locations dropped out of the recent record, leaving sites like this to continue reporting. The aggregate trend would increase artificially.
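A minimal sketch of that dropout effect, with made-up numbers (purely illustrative, not real station data):

    # Illustrative only: synthetic values, not real station records.
    # Two well-sited rural stations drop out halfway through the record;
    # one warm-biased site keeps reporting, so the simple network average jumps.
    rural_a = [10.0] * 20   # flat rural record, stops reporting after year 10
    rural_b = [11.0] * 20   # flat rural record, stops reporting after year 10
    biased  = [14.0] * 20   # warm-biased site (e.g. near a treatment plant)

    def network_mean(year):
        readings = [biased[year]]
        if year < 10:  # rural stations only report for the first 10 years
            readings += [rural_a[year], rural_b[year]]
        return sum(readings) / len(readings)

    series = [network_mean(y) for y in range(20)]
    print(series[:10])  # ~11.67 while all three stations report
    print(series[10:])  # 14.0 once only the biased site remains

No individual station warmed at all, yet the aggregate appears to.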

kagiso
February 20, 2009 2:44 pm

I suspect both body heat and cigarette heat would be negligible compared to the heat coming off those substantial percolating filter beds.
Not only is the incoming sewage close to room temperature, percolating filters actually generate heat when the bugs are munching well.
http://books.google.co.uk/books?id=5lfTnwR1HEgC&pg=PA102&lpg=PA102&dq=trickling+filters+heat+generation&source=web&ots=-Xk-CiudoA&sig=a95jcoF9KvkInZCkz_SVf2F2aMA&hl=en&ei=FDKfSZGbCtrG-Qa_u9m1Dg&sa=X&oi=book_result&resnum=1&ct=result
Percolating filters are naturally well ventilated, so the local microclimate will be both significantly warmer and way more humid than the general surrounding area.
A very, very poor place to site a weather station.

Ed Scott
February 20, 2009 2:48 pm

Warning: This paper has been rejected by Science and Nature. It may be hazardous to your research grant.
Satellite Data Show No Warming Before 1997. Changes Since Not Related to CO2
http://anhonestclimatedebate.wordpress.com/2009/02/20/satellite-data-show-no-warming-before-1997-changes-since-not-related-to-co2/
Fatal computer errors in IPCC climate models derive from the fact that none of the abrupt warmings and coolings on the record, especially since 1998, can be attributed to the greenhouse effect. Hence, all IPCC models purporting to predict (project??) climate a hundred years into the future are invalid, and their predictions/projections must be discarded. To summarize: existing theory used by the IPCC can neither explain the observed climate nor predict the future. Carbon dioxide warming has been shown to be non-existent in the eighties and nineties, and the warming since 1998 is not carbonaceous in origin. It follows that quixotic carbon dioxide policies like the Kyoto Protocol and the cap-and-trade laws should be abandoned.

Hank
February 20, 2009 3:00 pm

As the town grows, so does the throughput of the waste treatment plant. I would say that the waste treatment plant not only corrupts absolute temperatures but also trends.

novoburgo
February 20, 2009 3:57 pm

Peter (12:03:43) :
Actually, Peter, the standard rain gauge can make a great, albeit temporary, heater: remove the funnel, then the inner tube, pour in a cup of kerosene and voilà, at least 10 minutes of comfort (kids, don’t try this at home!).

Paul Penrose
February 20, 2009 4:21 pm

I don’t know if it makes a difference or not, but generally I don’t think it’s a good idea to encourage people to stand near the temperature sensor. Just another example of how this system was never intended as a high-quality climate monitor – just local weather. And it’s still treated that way.

climatebeagle
February 20, 2009 4:32 pm

Thanks for pointing out it’s a rain gauge. I visit sites like this to understand more about alleged AGW, and I learn something new with every article.
As a new question: is there any writeup on how global temperature is calculated from all the ground stations? I found Hansen’s original document but didn’t see details there, and found one paper that said a global temp is a bogus concept. My real question, from living in the SF Bay area: is the area over which a ground station might represent an accurate temperature taken into account? For example, I would guess that the station in Berkeley represents a valid temp for less than ten square miles around it (temps can differ by 20F in under two miles here), where a station in the Central Valley might represent several hundred square miles. Thus a simple average of the two would be meaningless; an area-weighted one might have more value related to a global measure of energy in the system.
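A minimal sketch of the distinction being asked about here, using made-up temperatures and the representative areas guessed at in the comment (nothing official about either):

    # Hypothetical stations: (name, temp in F, sq. miles the station is taken to represent)
    stations = [("Berkeley", 60.0, 10.0), ("Central Valley", 90.0, 500.0)]

    # Simple mean: every station counts equally, regardless of coverage
    simple_mean = sum(t for _, t, _ in stations) / len(stations)

    # Area-weighted mean: each station counts in proportion to the area it represents
    total_area = sum(a for _, _, a in stations)
    area_weighted = sum(t * a for _, t, a in stations) / total_area

    print(simple_mean)    # 75.0
    print(area_weighted)  # ~89.4, dominated by the station covering more area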

A.Syme
February 20, 2009 6:21 pm

This brings up a point that has been made before. In the past the temp reading itself was not so critical. The reading at the airport was critical for the power settings on jet aircraft; that was the most important factor for the location. Forty years ago, if you had told a weatherman that we would be arguing over 0.01 degrees in a temperature average, he would have given you a strange look.
Computer modeling has changed all that; the historical records are not that accurate and were never intended to be.

Earle Williams
February 20, 2009 6:34 pm

climatebeagle,
You are right about averaging temps between stations. I apologize if the following explanation is something you already know, but I’ll put it out there just in case. The global mean temperature that gets calculated doesn’t just average temperatures, though; it averages what is called the ‘temperature anomaly’. This means that for each temp station the monthly average is compared against the average of all prior monthly averages. The difference is the anomaly. So if the average January temp for a given site is 24.0 degrees F, and the average for last January was 24.8 F, then the anomaly for the station is calculated to be 0.8 F.
If a site inland has a historic average January temp of 44.0 and the temp there last month was 44.5, then the anomaly for the inland station would be 0.5 F.
It’s not an ideal way to measure the ‘global temperature’ but it is in common use.
The GISTEMP metric takes anomaly data and applies some ‘corrections’ to account for urban heat island effects. The code that is presumably used to calculate that metric is available at the GISTEMP web site.
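A minimal sketch of that anomaly arithmetic, reusing the numbers from the two examples above:

    def anomaly(observed_monthly_mean, baseline_mean):
        # Station anomaly: this month's mean minus the station's own long-term mean
        return observed_monthly_mean - baseline_mean

    coastal = anomaly(24.8, 24.0)  # 0.8 F, the first example above
    inland  = anomaly(44.5, 44.0)  # 0.5 F, the second example above

    # Stations with very different absolute temperatures can be combined,
    # because each is measured against its own history.
    print(coastal, inland)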

juan
February 20, 2009 6:55 pm

Does anyone know how that (settling?) pond is managed? We have a lot of percolation ponds around here to manage the water table. I think they are alternately filled and drained to interrupt mosquito breeding. In that case the thermal effect of the pond could be highly variable.

juan
February 20, 2009 7:08 pm

Second thought. Could it be an aeration pond? _That_ could have some interesting effects.

climatebeagle
February 20, 2009 7:55 pm

Thanks Earle, that’s useful. I’ve had my own issues with trying to get a single value from a set of results, back when I was trying to automatically determine whether a software product had a performance regression based upon nightly results. It’s a similar problem: coming up with a single value for whether the product is overall faster or slower, based upon N performance tests compared to their previous values.

Neil Crafter
February 20, 2009 8:04 pm

Interesting how a thread like this attracts very little attention from the merry band of AGWers who visit here, such as Joel Shore, Simon Evans, Foinavon, Flanagan, Mary Hinge et al, who are usually all over the other topics. Siting a weather/climate station adjacent to the settling ponds of a wastewater treatment plant cannot help but add a warming bias to the temperatures recorded here. I think that rather than argue the unarguable (which they seem fairly proficient at), they simply skip these threads.
Well done to Anthony and his team of contributors. The tally of poorly sited stations continues to mount. Anthony, is this a category 4 or a 5, do you think?

February 20, 2009 8:29 pm

Where’s the barbeque?

Mike Bryant
February 20, 2009 8:36 pm

The cigarette disposal vessel was placed near the MMTS pole and rain gauge so smokers could lean against the pole and rain gauge, which are now leaning to the left. That’s killing two birds with one stone! No need to buy a chair… 🙂

climatebeagle
February 20, 2009 8:43 pm

Earle:

This means that for each temp station the monthly average is compared against the average of all prior monthly averages.

I assume that, since historically the ground stations have recorded daily max & min, the monthly average is really (at worst):
monthly arithmetic mean of daily (min + max)/2
or possibly
monthly arithmetic mean of daily smartfunction(min, max)
and better
monthly arithmetic mean of the daily mean of 24 hourly readings
Though I did find this link, which indicated that (min+max)/2 is close to the mean of 24 hourly readings overall.
http://www.engr.udayton.edu/weather/source.htm
Sorry if this is covering basics, but I haven’t been able to find a clear definition of this.
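A minimal sketch of the first definition above, with made-up daily values:

    # Made-up daily minima and maxima for part of a month, in deg F
    daily_min = [41.0, 39.5, 44.0, 42.5]
    daily_max = [58.0, 61.0, 63.5, 60.0]

    # Daily mean as (min + max) / 2, then the arithmetic mean over the month
    daily_means = [(lo + hi) / 2 for lo, hi in zip(daily_min, daily_max)]
    monthly_mean = sum(daily_means) / len(daily_means)
    print(monthly_mean)  # 51.1875 for these made-up numbers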

Leon Brozyna
February 20, 2009 9:29 pm

Wonder how the MMTS reacts to regular jarring from smokers leaning on it or ‘falling’ into it on their backsides as they relax and puff away. Probably still not as much as the aromatic bubble of heat such facilities normally generate.

Pamela Gray
February 20, 2009 9:30 pm

This is near water! It even has a flotation device (and floaters, based on the travel brochure). Where is the silver boat turned upside down nearby??? Out of compliance. tsk tsk.

Editor
February 20, 2009 10:50 pm

climatebeagle (16:32:45) : As a new question: is there any writeup on how global temperature is calculated from all the ground stations? I found Hansen’s original document but didn’t see details there, and found one paper that said a global temp is a bogus concept.
The first problem is exactly that: what does a global average mean? There is a high and a low for the day that are averaged together (that, I suppose, tells us a little bit about the temperature for the day, but 60F and fog all day is not the same as 30 at night and 90 at 4pm under full sun…).
Then these ‘averages’ are averaged over larger areas with unequal distributions of thermometers. What does it mean when you average 4 New York City stations with 3 rural Ohio stations? If we add 2 more stations to Ohio, did the temperature change? (The average did… see the sketch at the end of this comment.) When the jet stream has New York getting gulf air, but Ohio a Canada Express, does the one with the most thermometers ‘count’ more? (It does to the average…)
My real question, from living in the SF Bay area: is the area over which a ground station might represent an accurate temperature taken into account? For example, I would guess that the station in Berkeley represents a valid temp for less than ten square miles around it (temps can differ by 20F in under two miles here), where a station in the Central Valley might represent several hundred square miles. Thus a simple average of the two would be meaningless; an area-weighted one might have more value related to a global measure of energy in the system.
BINGO! And then some… That Central Valley thermometer can be used to ‘adjust’ the Berkeley thermometer, based on the notion that the CV station is ‘rural’ and can adjust out the ‘Urban Heat Island’ in Berkeley. This is so clearly broken that the typical person in California would laugh themselves silly, but that is what is done.
But wait, there’s more: stations up to 1000 km away are considered ‘nearby’ for purposes of rewriting station data…
But wait, there’s even more: an ‘offset’ is calculated based on whatever part of the last 10 years is available. This is applied to all history for all time, so the temperature in the Central Valley from 1998-2008 (when massive growth was happening) can be used to ‘adjust’ the temperatures in Berkeley in 1889…
This is taken from reading the GIStemp code directly. I can provide more detail if anyone has the stomach for it. (I don’t advise it…)
GIStemp’s pasteurized, homogenized, processed data food product is useless (IMHO, of course). It contains critical failures such as the above. It would be far better to simply take the USHCN and Antarctic data sets and merge them (as is done in the first half-step of GIStemp) and skip all the rest of their data fantasy processes.
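The promised sketch of the ‘add two Ohio stations’ point, with made-up readings (nothing real about these numbers):

    # 4 New York City stations and 3 rural Ohio stations, same day
    nyc  = [68.0, 68.2, 67.9, 68.1]
    ohio = [61.0, 60.8, 61.2]

    before = sum(nyc + ohio) / len(nyc + ohio)

    # Two new Ohio stations come online; no temperature actually changed
    ohio += [61.1, 60.9]
    after = sum(nyc + ohio) / len(nyc + ohio)

    print(before, after)  # ~65.03 vs ~64.13: the 'average' cooled by ~0.9 F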

Editor
February 20, 2009 11:15 pm

Earle Williams (18:34:15) : The global mean temperature that gets calculated doesn’t just average temperatures though, it averages what is called the ‘temperature anomaly’. This means that for each temp station the monthly average is compared against the average of all prior monthly averages.
Long before the zonal averaging of anomalies, the temperature data have been so manipulated that they have lost much contact with reality. Before the anomaly stage, the data have had deletions, infills, offset slides, and other things done to them. And I don’t remember GIStemp averaging all prior temps… but I’ve only done a rough cut at the zonal steps.
The GISTEMP metric takes anomaly data and applies some ‘corrections’ to account for urban heat island effects.
This is far too charitable. The notion used is that a ‘nearby’ ‘rural’ station can be used to correct for the UHI at an urban location, ignoring the fact that the way urban and rural are picked is fraught with error… This ‘reference station method’ is based on an overgeneralization, implemented badly.
The assertion is that a ‘correlation’ exists between urban and nearby rural stations that lets you compute an offset. This offset is then used to ‘adjust’ the urban station data.
The problems (a small sketch of the offset step in points 1 and 2 follows below):
1) The ‘offset’ is calculated as a simple subtraction. The idea of ‘correlation’ has mutated into ‘linear offset’. This is broken.
2) USHCN and GHCN data are compared. If no GHCN data exist, the USHCN data are accepted. If GHCN data exist, an ‘offset’ to USHCN is calculated and applied, and this resultant record is used (tossing both the original USHCN and GHCN data). The ‘offset’ is calculated from the most recent 10 years of data (whatever portion is available), so it may reflect, for example, recent changes of equipment or time of observation. This ‘offset’ is then applied to all prior data from 1880 onward. (For some unexplained reason, all data prior to 1880 are tossed out.) What does a change of thermometer in 2005 have to do with UHI in Berkeley in 1885?
3) A ‘reference station’ can be up to 1000 km away. What does Reno have to do with Berkeley? How about Lodi? Fort Bragg? Pismo Beach?
4) Only after all this are the anomalies calculated, and the zonal averages averaged and offset-adjusted some more. In this step, the ‘reference stations’ can be up to 1500 km away. Furthermore, you may have applied the Reference Station Method 3 or 4 times. What possible rationale is there for this recursion? Is there any peer-reviewed literature for repetitive application? Or is it just a case of bald assertion that it’s OK to keep passing the data through the grinder because ‘nearby’ station data were shown to ‘correlate’ as raw data?
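A minimal sketch of the offset step described in points 1 and 2, with hypothetical numbers (this follows the comment’s description, not the actual GIStemp source):

    # USHCN and GHCN versions of one station over the recent overlap years
    ushcn_recent = [15.2, 15.4, 15.1, 15.6]
    ghcn_recent  = [15.0, 15.1, 14.9, 15.2]

    # Point 1: 'correlation' reduced to one constant, a plain mean subtraction
    offset = sum(u - g for u, g in zip(ushcn_recent, ghcn_recent)) / len(ushcn_recent)

    # Point 2: that single recent-decade constant is applied to the
    # station's entire history, back to 1880
    history = {1885: 14.0, 1920: 14.3, 1970: 14.8}
    adjusted = {year: t + offset for year, t in history.items()}

    print(offset)    # 0.275 for these made-up numbers
    print(adjusted)  # every historical year shifted by the same constant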

Glenn
February 20, 2009 11:20 pm

Mike D. (20:29:26) :
“Where’s the barbeque?”
Right behind the cigarette disposal in the first picture.