How not to measure temperature, part 87: Grilling in the Cornhusker State

One of the common themes seen with the surfacestations.org project has been the proximity of BBQ grills to official NOAA thermometers used in the United States Historical Climate Network (USHCN). Despite now having surveyed over 77% of the 1,221-station network, some truths continue to be self-evident.

USHCN climate station of record, Hartington, NE

This station was photographed by our prolific volunteer, Eric Gamberg. The proximity to the concrete patio earns this station a CRN4 rating; it may be a CRN5 when they wheel the BBQ out away from the house. But who knows? The grilling schedule is not part of the metadata.

But fear not, NASA GISS adjusts for such problems of concrete and BBQ grills. Consider the following blink comparator:

Notice how the past is adjusted cooler, increasing the trend

Source: NASA GISS

USHCN RAW:

http://data.giss.nasa.gov/cgi-bin/gistemp/gistemp_station.py?id=425744450020&data_set=1&num_neighbors=1

GISS Homogenized:

http://data.giss.nasa.gov/cgi-bin/gistemp/gistemp_station.py?id=425744450020&data_set=2&num_neighbors=1

I’m not sure why the hinge point is 1978, perhaps that’s when the homeowner acquired the BBQ? Sure, that is an absurd claim, but certainly no more absurd than the GISS homogenization adjustment itself. Adjusting the past increases the overall positive slope of the temperature trend.
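To see why cooling only the early part of a record steepens its trend, consider a toy series. This is a minimal sketch with made-up numbers (not the actual Hartington data), and the `slope` helper is mine:

```python
def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

years = list(range(1950, 2009))
raw = [12.0] * len(years)  # perfectly flat record: zero trend

# Cool every year before the 1978 hinge by 0.5 degrees, the direction
# of adjustment seen in the blink comparator above.
adjusted = [t - 0.5 if y < 1978 else t for y, t in zip(years, raw)]

print(slope(years, raw))       # 0.0 -- no trend in the raw data
print(slope(years, adjusted))  # small positive slope appears from nowhere
```

A record with no trend at all acquires a positive slope purely from the step applied to its past.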

For those new to the whole concept of USHCN stations, the NOAA thermometer is the white slatted object on the post in the center of the photo. It is known as an MMTS thermometer, and a cable goes from it into the home, where the volunteer observer will write down the high and low into the B91 logbook and send in the report once a month to the National Climatic Data Center (NCDC). There are more photos of this station which you can see in my online station database.

The Gallery of photos can be seen here


77 Comments
Derek Walton
May 29, 2009 7:06 am

Just a thought….
Here in the UK, the Met Office has told us to prepare for a summer with frequent BBQs. I wonder if the same advice has been offered to the Volunteer Observers in the USHCN? That’ll help the summer temperatures climb.

May 29, 2009 7:09 am

The “hinge point” for ALL of Hansen’s (GISS) “adjustments” to the historical temperature records IS the decade between 1968 and 1978:
Look at the (older) WUWT chart of how many times the individual year’s “average” have been re-calculated from 1860 through 2008. (Last year, June’s archives.) Records before 1968 were changed an AVERAGE of 36 times, with almost ALL of the historical temperatures for each year changing between 36 and 46 times.
Records AFTER 1978 were changed as well, but even more frequently.
The 1935-1945 years, previously at +0.4 to +0.5 degrees above the arbitrary zero point of 1973 (right at the hinge point => coincidence? => NOT !) are now substantially cooler by 1/4 degree, which CREATES the entire global warming hysteria.
Of course, if 1935-1945 were as hot as actually recorded, then the entire premise of Hansen’s global warming is destroyed, and the government loses the opportunity to get 1.3 trillion in taxes.

May 29, 2009 7:12 am

I thought we agreed this photo needed a “big red arrow” – that is now clearly and distinctly a “small black arrow” ….
Nebraska fans are disappointed.

John F. Hultquist
May 29, 2009 8:27 am

Every one of these reports strikes like the slow clang of a bell as family and friends watch the body of a loved one lowered into the ground. It is time to move on and use the current satellite technology for climate studies. Local sensors such as these shown in this series ought not to be used for anything other than locally in newspapers, radio, and television reports. In such venues the interest is whether it will be hot or cold, windy or calm, and rain, snow, or not. All to within a few percent accuracy. Anyone, anywhere, who shows any digits to the right of the decimal point when reporting such numbers should be publicly whacked with a rubber facsimile of an MMTS.
Also, I think one of the folks that has made these temperature corrections (adjustments, fixes, whatever!?) ought to be required to explain what they are doing and why? If the assumption is that the previous sensor was biased on the high side then maybe it makes sense to “adjust” those numbers down. If there is an urban heat effect then the recent numbers (since growth began) could be gradually brought back down. In the first case a simple reduction for all the readings could be done if the two instruments were operated side-by-side for a few months. In the second case the adjustment would have to be on a sliding scale based on the rate of urban (heat producing) growth. How? Population growth. Area of concrete and asphalt and black roofs. Number of vehicles registered. If a sensor location is moved – what then? An increase in elevation or a decrease adds another intractable issue. Combine several of these problems and the data becomes useless for larger scale climate analyses.
I now toss the first shovel full of dirt on to the coffin of this departed friend.
Clang.

David Segesta
May 29, 2009 8:45 am

Is Hansen’s adjustment method explained anywhere? If not has anyone filed a FOIA to force him to release it?

cbone
May 29, 2009 9:11 am

OT, but I love the google ad for grills right after this entry. How appropriate. Perhaps Mr. Watts could develop a line of NOAA approved grills with heat deflectors to be used in proximity to MMTS temperature sensors!

May 29, 2009 9:26 am

The only thing I could find in the “station history” file was a move of 0.8 miles South on May 1, 1983, which resulted in a change of -12 ft in elevation.
Another move of x miles on 11/1/43 resulted in no elevation change.
http://cdiac.ornl.gov/ftp/ushcn_monthly/station_history

Jeff Alberts
May 29, 2009 9:29 am

Bill seems to (willfully?) forget that such microsite biases can change the anomaly on any given day/week/month/year, thus skew(er)ing the resulting data.
What if they had a party, lots of people standing around the BBQ and MMTS, you’d surely get some strange readings as the ambient air temp did some strange things for several hours.
I can’t help but think the fence affects things too. Notice the unmelted snow right at the base of the MMTS and all along the fence. But the MMTS itself is in the sunlight. What does that do to the ambient air temp being recorded?

AnonyMoose
May 29, 2009 9:30 am

I think the tilt of the MMTS pole is a good bit of engineering. I’m sure it helps keep the MMTS off the ground each time that the fence collapses onto it.
And for you British chaps unfamiliar with American architecture, the style of the house has been popular since the 1960s. The slightly more complex shape around the garage increases the chance that it is younger than the 1970s. The large house across the street is probably less than 25 years old. This house has probably been built on site with 2×4 lumber framing on a concrete slab foundation or cinder block basement. This is prairie country on glacial till, so basements are not hard to dig. Some houses a block north, closer to downtown, look like they are from the 1930-1940s, so this area was probably built as individual houses from 1960-1980. Yup, pretty meager when we can only describe houses by the decade rather than the century.

May 29, 2009 9:41 am

As I scrolled past the advertisement block, what did my wondering eyes behold? GrillsDirect.com! Ahhhh, the irony is as delicious as a T-bone grilled to perfection beneath a NOAA thermometer.

Keith
May 29, 2009 9:54 am

“I’m not sure why the hinge point is 1978…”
Perhaps because starting in 1979 there is satellite data available, so you can’t fudge data since it would be obvious against the independent satellite measurements. Prior to 1979, there is no independent check. I’ve often thought that we need to be less concerned with what GISS is doing with current data than with what they are doing with data prior to 1979 to inflate global temperature increase estimates.
REPLY: That might make sense, except that GISS doesn’t use satellite data, only surface data. – Anthony

May 29, 2009 9:57 am

I love the Google Automatic Advertising.
Did anyone else note the advertisement for “Outdoor Grills” that showed up with this article? Aren’t robots wonderful!
Funny they don’t advertise weather stations/temperature sensors.
Or better yet statistical analysis packages, such as MiniTAB (TM) or the like.
Mark Hugoson

E.M.Smith
Editor
May 29, 2009 10:07 am

I’m not sure why the hinge point is 1978, perhaps that’s when the homeowner acquired the BBQ? Sure, that is an absurd claim, but certainly no more absurd than the GISS homogenization adjustment itself. Adjusting the past increases the overall positive slope of the temperature trend.
Well, it ought to be 1979 if I’m reading the code right; then again, it is a somewhat adaptive program and might have “decided” that 1979 was OK. A “hinge point” on or about 1979 in GIStemp comes from STEP0 where they “unoffset” the data.
Basically, they mix together USHCN and GHCN in an absurd way. The two datasets are compared from 1980 forward and an “offset” is computed. This “offset” is then used to rewrite all prior history. The “notion” is to remove the “adjustments” from the data they want to use (that has NOAA adjustments in it) by comparing it to an un-adjusted series to get an offset.
This is seriously broken and, IMHO, accounts for much of their bogus warming trend by making history colder.
The idea is that if the equipment changed in 1990 and NOAA adjusted for that, we want to remove the NOAA adjustment. The reality is that the new equipment adjustment is applied to all history for the site. Exactly what would a new thermometer in 1990 have to do with temperatures recorded in 1940? It is just a seriously broken step and ought to be attacked by someone with academic credentials with rigor and relish.
From:
http://chiefio.wordpress.com/2009/03/01/gistemp-step0-the-process/
Begin Quote:
The script then runs dump_old.f on the GHCN US-only data (to create a 1980 and newer subset for use in calculating an ‘offset’ between GHCN and USHCN), then dif.ushcn.ghcn.f (creates that odd ‘offset’ between GHCN and USHCN), cmb2.ushcn.v2.f (to ‘find the offset caused by adjustment’ in the original data and to remove it via the ‘offset’ from the prior step), then runs hohp_to_v2.f and cmb.hohenp.v2.f to add in a better version of the data for hohenpeissenberg (from input_files). It then creates the directories to_next_step and work_files, moves files into them, and finishes.
That whole offset un-offsetting process just seems strange to me. Why not just use the non-adjusted data series OR use the already adjusted series? Why blend 1/3 of one with 1/3 of the other with 1/3 being unoffsetterized? This just looks like a great chance to introduce to the series that which is not in the original data sets.
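For readers who don’t speak FORTRAN, here is a rough Python sketch of what the offset step described above appears to do. This is my simplified reading of dif.ushcn.ghcn.f for a single station; the function and variable names are mine, not GIStemp’s:

```python
MISSING = -9999

def monthly_offsets(ushcn, ghcn):
    """For each calendar month, average (USHCN - GHCN) over up to the
    ten most recent post-1979 years where both values are present --
    a simplified sketch of GIStemp STEP0's dif.ushcn.ghcn.f.
    ushcn, ghcn: dicts mapping year -> list of 12 monthly values."""
    years = sorted(set(ushcn) & set(ghcn), reverse=True)  # newest first
    offsets = []
    for m in range(12):
        diffs = []
        for y in years:
            u, g = ushcn[y][m], ghcn[y][m]
            if u > -9000 and g > -9000:      # skip missing-data sentinels
                diffs.append(u - g)
                if len(diffs) == 10:         # cap at ten usable years
                    break
        offsets.append(sum(diffs) / len(diffs) if diffs else MISSING)
    return offsets
```

With a toy pair of series where USHCN runs a constant 0.5 warmer than GHCN from 1980 on, every monthly offset comes out 0.5; the next program in the chain then subtracts that 0.5 from every year of the record, however old.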

Rod Smith
May 29, 2009 10:12 am

I wish no one would think of such sites as “weather stations.” All they “measure,” if we can use that term loosely, are surface temperatures.
Weather, and climate for that matter, are described by a few more parameters than surface temperature.
I wonder what “homogenized” weather or climate would be like, and where it might exist?

May 29, 2009 10:36 am

Rod Smith, precipitation is also measured.

George E. Smith
May 29, 2009 10:58 am

Well I like the blink comparator more than I like the owl box.
I find the 1978 jitter point to be interesting, because it is somewhere around there when they put out the first ocean buoys that measured sea air and water temperatures simultaneously; which showed that all the sea water data taken before then was useless for constructing lower troposphere air temperatures.
So I don’t believe any “proxy” temperature data before the 1980 time frame, and those buoys showed that sea water temperature from some arbitrary ocean depth is not a satisfactory proxy for lower troposphere temperatures, at 60 inches from the concrete; or wherever the owl boxes are sited.
I still believe that UHI are not a problem for temperature data gathering; after all the planet lives with UHIs all the time, without any problems at all.
The problem is that the data modellers insist on using those UHI temperature measurements far away from where they have any validity; in other words UHIs are a problem of Nyquist sampling errors; they are actually good for cooling the planet, since if they are at a higher temperature, then they radiate infra-red at a faster clip, thereby cooling the planet faster than a colder place does.
That south facing natural wood picket fence looks like another local adjustment knob too.
George

E.M.Smith
Editor
May 29, 2009 11:02 am

bill (05:02:15) : you are not trying for absolute accuracy –
Well some of us are. Most kinds of tomatoes don’t set fruit below about 50F. I don’t care, nor do they care, if you are reporting 55 F when it isn’t. Both my tomatoes, and I, care about the real temperature and is it really over 50F.
In the real world the real people who use this “product” for real decisions want real temperatures because real things behave in real ways. Only anomalies are anomalous.
just difference. It could be placed at the centre of a 10km diameter section black tarmac. The temperature will be wrong – the max and min and average will be wrong – but the changes in the max/min will reflect changes in temperature.
Unless it changes not just the offset, but the slope of the temperature curve as well. It just isn’t as simple as you make it out to be. A large “heat storage system” of tons of black tarmac will make night time temperatures significantly higher in sunny places. Under snow, not so much. You now have seasonal discontinuities. A green field will maintain a much more nearly constant temperature due to the plants homeostasis. Concrete, not so much…
Can you prove that the adjustment is always one way ?
Yes.
From GIStemp STEP0 program dif.ushcn.ghcn.f
Notice that mo0 is the USHCN data while mo is the GHCN data (cryptic eh?) and that these are always subtracted in the same direction. Recent “changes” always create a subtracted offset. This is then used to create a “difference” dif that in the next step gets subtracted.
C**** get last years of USHCN data (1980-last available year)
25 if(iyr0.gt.last_yr.or.id0.ne.id) go to 30
do m=1,12
mo0(iyr0-1979,m) = itmp0(m)
end do
read(1,'(i3,i9,i4,12i5)',end=30) icc0,id0,iyr0,itmp0
go to 25
C**** got yrs 1980-last_available_year of USHCN - now get GHCN-data
30 id0=id
31 if(iyr.gt.last_yr.or.id0.ne.id) go to 40
do m=1,12
mo(iyr-1979,m) = itmp(m)
end do
read(2,'(i3,i9,i4,12i5)',end=100) icc,id,iyr,itmp ! read GHCN
go to 31
C**** compare data
40 id=id0
do m=1,12
nok=0 ; di=0
do n=last_yr-1979,1,-1
if(mo0(n,m).gt.-9000.and.mo(n,m).gt.-9000) then
nok=nok+1
di=di+(mo0(n,m)-mo(n,m))
end if
if(nok.eq.10) go to 50
end do
50 dif(m)=di/nok
if(nok.lt.1) dif(m)=-9999
end do
write(10,'(i10,12f7.0)') id,dif
backspace (1)
backspace (2)
go to 10
c**** No more USHCN data - copy data for remaining non-USHCN stations
100 write(*,*) 'done with ushcn'
200 stop
From: cmb2.ushcn.v2.f (the next program in the process)
C*** replace GHCN by USHCN data if present
if(iddiff.ne.id0) stop 'wrong iddiff'
do m=1,12
if(itmp0(m).gt.-9000) itmp0(m)=itmp0(m)-dif(m)
end do
write (12,'(i3,i9.9,i4,12i5)') icc0,id0,iyr0,itmp0
go to 10
Notice that the “dif” is always subtracted.
This step, which hinges at about 1979, always subtracts from the past.
Now the theory is that NOAA has put in UHI adjustments (negative) in the recent data and that this ought to be subtracted it to “unadjust” it.
The effect is that all adjustments (TOB, equipment, you name it) get subtracted. But NOT just from the times they existed, FROM ALL HISTORY.
This, IMHO, is where the axe ought to be lowered first on GIStemp.
(I will provide free programming consulting to any academic who would like to make a paper out of this but needs a FORTRAN guy to translate the code.)
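To make the “FROM ALL HISTORY” point concrete, here is a hedged Python sketch of the subtraction in cmb2.ushcn.v2.f as described above (simplified to one station; function and variable names are mine):

```python
def remove_offset(record, offsets, missing=-9999):
    """Subtract the per-month offset from every year of a station
    record, as the cmb2.ushcn.v2.f step described above does -- note
    that a 1940 value is shifted by an offset computed entirely from
    1980-onward data.  record: dict mapping year -> 12 monthly values."""
    out = {}
    for year, months in record.items():
        out[year] = [v - offsets[m] if v > -9000 else missing
                     for m, v in enumerate(months)]
    return out

# Toy record: identical readings in 1940 and 1990, offset of 0.3 everywhere.
record = {1940: [10.0] * 12, 1990: [10.0] * 12}
cooled = remove_offset(record, [0.3] * 12)
# Both years are shifted, even though 1940 predates the offset window.
```

The 1940 and 1990 values come out identically cooled, which is exactly the “what would a new thermometer in 1990 have to do with 1940?” objection in code form.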

Bernie
May 29, 2009 11:14 am

I am having a discussion on another website about mountain pine beetles and the threat to the grizzly bear population in the Greater Yellowstone Ecosystem. (see http://switchboard.nrdc.org/blogs/mskoglund/global_warming_dead_forests_im_1.html ) The author sent me this presentation ( http://www.fs.fed.us/psw/cirmount/meetings/agu/pdf2007/redmond_talk_agu2007.pdf ) which prompted me to go look at weather stations in the Western States. It includes some charts I am not very comfortable with. I went to http://www.surfacestations.org/USHCN_stationlist.htm which is very handy for checking the distribution of weather stations in the Western States, particularly those above 5000 feet. Unfortunately I couldn’t find an explanation for the columns labeled trend_raw, trend_tobs, and trend_filnet. I can guess, but it would help if somebody could provide official definitions, including start and end years for the trend calculation if it is not the full record.
Also if anyone is an expert on mountain pine beetles any detailed references on their sensitivity to temperature would be very helpful.
Many thanks

Jeff Alberts
May 29, 2009 11:26 am

Tom, I don’t think the MMTS measures anything except temp (maybe humidity, don’t know). A separate rain gauge has to be installed to measure precip…

Curt
May 29, 2009 11:45 am

Any possibility of a time-of-observation (TOBS) change circa 1979? A switch from early morning to early evening at that time could (possibly) justify this type of correction. I can’t find anything that records this type of change.

E.M.Smith
Editor
May 29, 2009 11:48 am

Robert A Cook PE (07:09:52) : The “hinge point” for ALL of Hansen’s (GISS) “adjustments” to the historical temperature records IS the decade between 1968 and 1978:
As noted in the prior technical (code) listing, the inflection point 1979 / 1980 is hard coded into the program. The only thing it can do is “hinge” at that point to greater or lesser degree. (The degree depending only and entirely on the difference between the GHCN and USHCN data. A simple “delta” of the two data sets ought to give you an immediate guide to exactly which sites will be “hinged” the most and which will be most “unhinged” 😉 )
Look at the (older) WUWT chart of how many times the individual year’s “average” have been re-calculated from 1860 through 2008. (Last year, June’s archives.) Records before 1968 were changed an AVERAGE of 36 times, with almost ALL of the historical temperatures for each year changing between 36 and 46 times.
This is an essential artifact of the structure of the code. Every single time the code is run with added data, the “dif” changes. New data (a new month or year) means the “dif” is calculated over a different chunk of time (1980 to “present”, and “present” keeps changing). So every single time the code is run, it changes the past in a new way.
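That moving-window effect can be shown with a tiny self-contained sketch (one month only, made-up numbers; a toy version of the STEP0 offset, not the real code):

```python
def offset(ushcn, ghcn):
    """Mean USHCN-minus-GHCN over the ten most recent overlapping
    values -- a one-month toy version of the STEP0 offset."""
    pairs = list(zip(ushcn, ghcn))[-10:]
    return sum(u - g for u, g in pairs) / len(pairs)

# Made-up series: the two sets agree until a +1 adjustment enters the
# last three USHCN years.
ghcn  = [15.0] * 12
ushcn = [15.0] * 9 + [16.0] * 3

run_1 = offset(ushcn, ghcn)                    # this year's run
run_2 = offset(ushcn + [16.0], ghcn + [15.0])  # rerun after one new year

print(run_1, run_2)  # 0.3 then 0.4: the offset moves with every rerun
```

Nothing in the historical data changed between the two runs; one new year slid the ten-year window forward, the offset moved, and so every pre-1980 value in the “unoffset” record would be rewritten again.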
Of course, if 1935-1945 were as hot as actually recorded, then the entire premise of Hansen’s global warming is destroyed, and the government loses the opportunity to get 1.3 trillion in taxes.
And this, IMHO, is the nub of it. A clear decision was made in writing the code to rewrite the entire past history every time the code is run. Any programmer with half a brain (right half! 😉 ) would resist that. A clear decision was made to alter all of past history to remove a small bit of recent “adjustment” – on the face of it that is bogus and any decent programmer would point out the absurdity of it.
(Though I must admit that as a contract programmer I’ve taken such things to “the boss” and been told, basically, “just code it to the spec, OK?” … so fault more likely goes to “management” than to the coder / analyst / programmer.)
Oh, FWIW, the blink comparisons that have some bits moving and some bits not have an incomplete data set for use. The program replaces GHCN by “unadjusted” USHCN unless it doesn’t have any to work with, then it just accepts whatever it’s got… Could be GHCN. Could be un-un-adjusted USHCN. Toss what you’ve got into the mulligan stew…
So when you ponder why some sites, like this one, have a complete hinge, while others have little bits bounce and others hold stable, it is this same chunk of code just putting different bits of “specialty meats” in the sausage… (Don’t eat the yellow/green bits! 8-0 )

Gary Plyler
May 29, 2009 11:57 am

There is nothing magic about 1979 being a shift point. Different stations have been assigned different shift or tilt points for one reason and one reason only: to rewrite the record so that 1934 no longer is the highest annual temperature in the continental U.S.
Shame on positioning, shame on changing the past 1985 style. He who controls the past controls the future.

Gary Plyler
May 29, 2009 12:10 pm

Sorry, meant that to read “1984 style”

Gary Plyler
May 29, 2009 12:12 pm

Winston, please correct and drop the older version down the “memory hole”.

E.M.Smith
Editor
May 29, 2009 12:14 pm

Keith (09:54:48) :Perhaps because starting in 1979 there is satellite data available, so you can’t fudge data since it would be obvious against the independent satellite measurements.
Interesting point… I have wondered why they chose that date in the code.
REPLY: That might make sense, except that GISS doesn’t use satellite data, only surface data. – Anthony
To fend off the inevitable troll who will assert that the sea temperature anomaly map is based on satellite data: the last time this came around I went down the rabbit hole and sorted it out. It uses a hypothetical synthetic sea surface temperature anomaly product based on ships, boats, buoys, oh and a bit of satellites including estimates of polar ice (broken satellite sensors anyone?) used to make synthetic polar temperatures that are then interpolated and projected… not satellite data.
http://chiefio.wordpress.com/2009/03/05/illudium/
http://chiefio.wordpress.com/2009/02/28/hansen-global-surface-air-temps-1995/