Guest post by John Goetz
The GISS temperature record, with its various adjustments, estimations, and re-estimations, has drawn my attention since I first became interested in the methods used to measure a global temperature. In particular, I have wondered how the current global average can even be compared with that of 1987, which was produced using between six and seven times more stations than are used today. Commenter George E. Smith accurately noted that it is a “simple failure to observe the standard laws of sampled data systems.” GISS presents so many puzzles in this area that it is difficult to know where to begin.
My recent post on the June 2009 temperature found that the vast majority of temperatures were taken from airports and urban stations. This would be cause for concern if the urban heat island (UHI) effect were not accounted for at those stations. GISS does attempt to filter UHI out of urban stations by using “nearby” rural stations – “nearby” meaning anything within 1,000 km. No attempt is made to filter UHI from airports that are not explicitly listed as urban.
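To make that selection rule concrete, here is a minimal sketch – my own illustration, not GISS code, and the station fields are hypothetical – of what “any rural station within 1,000 km is a candidate” amounts to:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def rural_neighbors(urban, stations, radius_km=1000.0):
    """All rural stations within radius_km of `urban`.

    Stations are dicts with hypothetical 'lat', 'lon', 'rural' keys.
    """
    return [s for s in stations
            if s["rural"] and haversine_km(urban["lat"], urban["lon"],
                                           s["lat"], s["lon"]) <= radius_km]
```

With a radius that generous, a single rural station can fall inside the circles of hundreds of urban stations – which is exactly the multiple use counted below.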
If stations from far, far away can be used to filter UHI, then it stands to reason some stations may be used multiple times as filters for multiple urban stations. I thought it would be amusing to list which stations were used the most to adjust for UHI. Fortunately, NASA prints that data in the PApars.statn.use.GHCN.CL.1000.20 log file.
The results were as I expected – amusing. Here are the top ten, ranked in order of the number of urban stations they help adjust:
| Usage | Station Name | Location | From | To | Note |
|---|---|---|---|---|---|
| 251 | BRADFORD/FAA AIRPORT | PA / USA | 1957 | 2004 | Airport |
| 249 | DUBOIS/FAA AIRPORT | PA / USA | 1962 | 1994 | Airport |
| 249 | ALLEGANY STATE PARK | PA / USA | 1924 | 2007 | Admin Building |
| 246 | PHILIPSBURG/MID-STATE AP | PA / USA | 1948 | 1986 | Airport |
| 243 | WELLSBORO 4SSE | PA / USA | 1880 | 2007 | Various Farms |
| 243 | WALES | NY / USA | 1931 | 2007 | Various Homes |
| 241 | MANNINGTON 7WNW | WV / USA | 1901 | 2007 | Various Homes |
| 241 | PENN YAN 8W | NY / USA | 1888 | 1994 | Various Homes |
| 237 | MILLPORT 2NW | OH / USA | 1893 | 2007 | Various Farms |
| 235 | HEMLOCK | NY / USA | 1898 | 2007 | Filtration Plant |
Unfortunately, having three of the top four stations located at airports was the sort of thing I expected.
Looking a little further, it turns out all of the top 100 stations are in either the US or Canada, and none of those 100 stations have reported data since 2007. (By the way, #100 is itself used 147 times.) Several of the top-100 stations have been surveyed by surfacestations.org volunteers who have documented siting issues, such as the following:
- Mohonk Lake, N.Y. (197 times) – much too close to ground, shading issues, nearby building
- Falls Village, Conn. (193 times) – near building and parking lot
- Cornwall, Vt. (187 times) – near building
- Northfield, Vt. (187 times) – near driveway, building
- Enosburg Falls, Vt. (180 times) – adjacent to driveway, nearby building
- Greenwood, Del. (171 times) – sited on concrete platform
- Logan, Iowa (164 times) – near building, concrete slabs
- Block Island, R.I. (150 times) – adjacent to parking lot and aircraft parking area
The current state of a rural station, however, is an insufficient criterion for deciding to use it to adjust the history of one or more other urban stations. The rural station’s history must be considered as well, with equipment record and location changes being two of the most important considerations.
Take for example good ‘ole Crawfordsville, which came in at #23, having been used 219 times. As discussed here, Crawfordsville’s station lives happily on a farm, and does seem to enjoy life in the country. However, up until 16 years ago the station lived in the middle of Crawfordsville, spending over 100 years at Wabash College and at the town’s power plant.
GISS missed its calling. It should have been a massage parlour!
Send a link to this article to the UK Met Office here: enquiries@metoffice.gov.uk. I have!
quick question John. How did you determine which stations were rural?
Reply: I left it up to GISS. Their report lists the rural stations they used to adjust urban stations.
Just in case we have forgotten – June 2009 saw the warmest ocean temperatures on record. They were just 0.1C cooler than the land. The Southern Hemisphere was the warmest on record (and the rather unpopulated Antarctic had thumping anomalies).
That’s from your good folks at NOAA.
PS Anthony, you’ve got 12 months to save face and join the consensus. Odds are that 2009 will come in with a thumping annual global anomaly, and 2010 – well, that’s too horrible to think about. Sea level will spike sharply in the coming months and will easily surpass the peaks associated with recent El Niño events.
REPLY: One would think that somebody from BoM would be aware that an adjustment change was recently put in place by “the good folks at NOAA,” but hey, don’t question confirmation bias there. It would not be a fitting scientific thing to do for BoM’s worst alarmist. Please note our policy page. – Anthony
OT: Fish size and global warming !!!!!! [Now on Drudge]
http://www.breitbart.com/article.php?id=CNG.d672f9d7f0f64fefdf0b21e696b41e21.7a1&show_article=1
But where is that warming ???????
tty
“As for 16th century maps of Antarctica, may I point out that nobody visited Antarctica until 1820.”
Oh I do like people who can prove a negative.
The archaeological remains of the Vikings’ time in Greenland are there to study, but STILL UNDER THE PERMAFROST. And back then it was supposed to have been only ‘slightly warmer’?
This all seems like a smokescreen to divert attention away from the fact that their interpretation of June’s data actually seems to be spot on, judging by the particularly large (and curiously unreported story on this blog) increase in near-surface and lower-atmosphere temperatures as recorded by AMSU. http://discover.itsc.uah.edu/amsutemps/execute.csh?amsutemps
Maybe someone can explain why this particular story hasn’t been highlighted?
REPLY: Maybe you can explain why you can’t use the search box to find it yourself?
Here’s the “curiously unreported story on this blog”: http://wattsupwiththat.com/2009/07/17/pielke-sr-hypothesis-on-daily-uah-lt-records/
Typical alarmist. Denounce first, ask questions later. Bugger off. – Anthony
Jim Papsdorf (02:30:23) :
Al Gore wrote in “Earth in the Balance” that salmon and rabbits in Patagonia were going blind from the effects of more UV through the hole in the ozone layer. I suspect that there were fewer than a half dozen instrumental UV recorders giving anything close to continuous UV flux records on the whole Earth at the time he wrote. I could not find any records then. Certainly, no trend had been established for Patagonia.
It’s not even certain that salmon are blinded by UV light in the course of a normal day. They might just live a bit deeper. And rabbits are more often out and about by night.
Silly bugger.
Geoff Sherrington (18:46:45) :
Agreed, but remember the days before continuous recording. The exhaust of an aircraft would produce a transient high on a max-min thermometer, a spike that might be filtered out these days.
Just saw a book that was on a desk next to a cup holding a magnifying glass/reader. There was a wonderfully neat slice through the cover, which was slightly open, and a brown mark on page one. About 2 inches long. Too close to the sunlight from the window. Theoretical question. Did the magnifying glass (repeated n0000 times to make the figures seem less trivial) cause any global warming? Do solar farms create overall global warming, or does everything cancel? Not a trick question, I really do not know the answer.
I don’t understand why you good people in America put up with this sort of crap.
Here is a noble institution called NASA that has a unit called the Goddard Institute for Space Studies, run by an environmental activist of the worst kind, whose job includes creating and managing the GISTEMP temperature record.
But to accommodate the corruption of the data by UHI, and to help incorporate stations that stopped reporting in the past, they use about 100 so-called “reliable” stations to calibrate against.
But now it turns out that these calibration stations are also tainted by UHI and other anomalies – the effect of which is to raise the temperatures recorded.
Why do you put up with such blatant manipulation and incompetence?
It’s the same as Madoff, the GFC and Algore – they all get found out in the end, but it’s the damage they do on the way through that’s the concern.
John F. Hultquist (21:26:21) :
……………………
“those that sailed and landed there were called ‘greenlanders’. This comes from a book from one of the last captains of the ships of the British tea trade and I’ve loaned it to a friend in another state. Best I can do at the moment.”
Please, they were not British! Therefore not using English idioms!
And then —of course— there’s the Piri Reis map which predates all other known maps.
http://www.uwgb.edu/dutchs/PSEUDOSC/PiriRies.HTM
Gavin Menzies 1421: http://en.wikipedia.org/wiki/1421_Hypothesis
I have Charles Hapgood’s book and the claims of accuracy for the ancient maps are unsustainable when the maps are inspected.
Far too many people are far too certain about far too many things.
FWIW, I’ve got an update at: http://chiefio.wordpress.com/gistemp/
that gives some detail on the fact that, as of about 2 hours ago, I got all of the GIStemp code to compile on a Linux box. After a great deal of slogging, I’m at a point where I can start to:
a) Test it.
b) Characterize it.
c) Fix it.
There was one systematic issue (a use of a non-standard “extension” to the language that some systems allow, but most do not) and there were a couple of cases of “type mismatch” in parameters. A couple of those were just parameters being passed to “write” statements, so they ought not to cause too much mischief. Another was more worrying in that a REAL data type was passed to a significant zonal subroutine with an INTEGER type for the variable.
That might be a significant bug, and certainly is a terrible coding practice. I’ll have to take a couple of hours to work it through, though, before tossing rocks at it (sometimes that “technique” can be an obscure feature… most of the time it’s a flat out error…)
While making it “go”, I took the opportunity to clean up the structure a bit. The “source code” (programs people read and write, like FORTRAN) now lives in source code repository directories. The executables (what the computer actually runs – binaries) are in a separate directory as well. I’ve also written “Make” files that generate the binaries from the source (and removed the “in line compile / run / delete” from the scripts…)
So now it’s at least structurally cleaner and much easier to follow the flow of what is going on.
I expect in the next week or so to have it simplified even a bit more and to have run some test data through it “end to end”. (I also made a script to ftp the data to the box so you don’t need to fetch it “longhand” quite so much…)
I’m not changing the “logic” or data processing any at this time. Just making it do what it does in a cleaner and clearer way that’s easier for folks to follow (and use). At some future point, I’ll translate any bits that really need it into a better language and / or fix any “bugs” I find. For now it’s just things like putting the “scratch” or “temp” files in somewhere other than the source code “archive”…
Frankly, the hardest bit so far was getting a FORTRAN 90 or newer compiler to run on my older Linux box 😉 Along the way I got to compile the gcc tool chain too, as g95 (the free FORTRAN compiler) needed some libraries from a newer gcc to run… If you started with a newer box with the compiler already there, it’s not hard to make go at all…
Carl Yee (15:29:27) : Another question is how far back did they start using that station for correction? Back 80 years ago (for example) it might have been a good reference station to adjust others, but from 19xx it might have been so mutated as to be useless for that purpose.
There is no selection of stations based on their great character. If a station is missing data, it is “filled in” based on what’s available, and not much more “thought” than that. This, IMHO, is one of the two great “bogosities” of GIStemp. The other is the fact that the most recent 10 years or so of “difference” between GHCN and USHCN for a given station is used to rewrite all the past history of that station to “uncorrect” the corrected data… when NOAA gives you the choice of corrected or uncorrected data to download in the first place… If an equipment change in 1970 lowered temps 2 F, and then in 1980 it was fixed, raising them 2 F, that +2 F will be subtracted from all history prior to 1980. This is right how?!?…
“Why? Don’t ask why. Down that path lies insanity and ruin. – emsmith”
It is what it is. ONLY the 10 most recent years with data from the period starting in 1980 are used to calculate the “offset” between corrected and uncorrected data; then that “offset” is applied to ALL HISTORY. Not very bright, but now you know why history keeps changing.
Want to change the temperature in Kansas in 1880 to 1980? Update the equipment today… Every year for the next 10 years, the history will slowly change more and more as the new equipment “offset” adds to the 10 year average “offset” that changes the past…
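A toy version of that process, as I read it – the function names and data structures here are mine, not GIStemp’s:

```python
def recent_offset(corrected, uncorrected):
    """Mean corrected-minus-uncorrected difference over the 10 most
    recent years (1980 or later) present in both series -- per the
    description above, the only years used."""
    recent = sorted((y for y in corrected if y >= 1980 and y in uncorrected),
                    reverse=True)[:10]
    return sum(corrected[y] - uncorrected[y] for y in recent) / len(recent)

def rewrite_all_history(series, offset):
    """Apply that single modern offset to EVERY year on record --
    which is how an equipment change today reaches back to 1880."""
    return {year: temp + offset for year, temp in series.items()}
```

Change the instrument now, and the offset keeps drifting for a decade as new years roll into that 10-year window, dragging the whole past record along with it.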
So, when does a station change a “nearby” station? Whenever there is data missing from the target station. The code just goes looking for whatever is handy to “fill in” via the “reference station method”… It doesn’t matter if the station with missing data is in the middle of the (nearly desert, and darned hot in summer) Central Valley of California and the reference station is on the (almost always inversely related, cool-to-cold) coast. It will be used.
I *think* that the “magic sauce” for combining data from one site into another happens in STEP1/comb_records.py
But that’s a Python script and I’m only now learning python. This is a fragment from it, and as you can see, it ranks stations based on a list of attributes, ending with “UNKNOWN”. So basically the code tries to use a station ranked higher based on MCDW vs USHCN vs … but will settle for anything if that’s all it’s got…
From comb_records.py
```python
def get_best(records):
    # Rank the candidate sources for a station; higher number = preferred.
    ranks = {'MCDW': 4, 'USHCN': 3, 'SUMOFDAY': 2, 'UNKNOWN': 1}
    best = 1
    rids = records.keys()
    rids.sort()  # Python 2 idiom: keys() returns a sortable list
    for rec_id in rids:
        record = records[rec_id]
        source = record['dict']['source']
        rank = ranks[source]
        if rank > best:
            best = rank
            best_rec = record
            best_id = rec_id
    if best > 1:
        return best_rec, best_id
    # fragment ends here -- the quoted excerpt stops before the fallback case
```
END PROGRAM QUOTE
For what it’s worth, the “reference station method” is applied several times in several bits of the program… Yes, it just keeps smearing what little data it has around in an attempt to get global coverage out of data with massive holes in it. In some parts the limit is 1000 km. In others 1500 km. And in the Anomaly phase it’s measured in degrees of arc… I think it was up to 10 degrees of arc, but I ought to check that.
To the best of my knowledge, no one has EVER evaluated the validity of “the reference station method” when applied recursively like that. (Or perhaps it should be called “applied serially”.) A technique may be valid done once, but invalid done repeatedly… (One sleeping pill is fine; a dozen…)
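Here is a toy demonstration of the worry – my own construction, not GIStemp code. B’s gap is filled from A; C is then filled from the partly synthetic B, so C’s “data” is an estimate of an estimate:

```python
def fill_from_reference(target, reference):
    """Fill gaps (None) in `target` from `reference`, shifted by the
    mean difference over the months both stations actually reported."""
    common = [m for m, t in target.items() if t is not None and m in reference]
    bias = sum(target[m] - reference[m] for m in common) / len(common)
    return {m: (t if t is not None else reference[m] + bias)
            for m, t in target.items()}

a = {1: 10.0, 2: 11.0, 3: 12.0}                          # complete record
b = fill_from_reference({1: 9.0, 2: None, 3: 11.5}, a)   # gap filled from real data
c = fill_from_reference({1: None, 2: None, 3: 13.0}, b)  # filled from synthetic B
```

Do that across 1000 km hops a few times in a row and the final number can be mostly infill, with nothing in the output admitting how little of it was ever measured.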
So, while I’m fairly certain that GIStemp is substantially useless – since at best it gives you numbers dancing in the (very wide) error bars of its calculations – I’m doing the work of porting it “for the masses”. When I’m done, you ought to be able to run it yourself, should you wish, and find for yourself where it does “odd things”…
Hey, just think of it as a weather video game… without the video 😎
Geoff Sherrington (18:46:45) : Does it make much difference if an airport is used for jets or pistons? Two aircraft of about the same weight would throw out about equal heat on takeoff. The question might be whether jet use implies bigger aircraft and more frequent flights.
You got it! Jets are very big, and jetports have lots of traffic.
Piston aircraft (at least since about 1970) have been mostly very small and used much less frequently.
Reply: I think it is more an issue of whether or not the measurement instrument is close to the tarmac or not. Year round, that black surface will have a warmer temperature than the surrounding ground. Some may believe it is small … fractional. But in this (now settled) debate, we are arguing about fractional degrees. – John
And don’t forget that the “standard” commercial jet runway is about 10,000 feet long while a private piston runway is more like 1,000 to 2,000. (I’ve seen folks take off on 300 feet!). You also tend to have one runway in a rural setting, but several with loads of taxiways in the larger commercial jet facilities. Heck, one airport I used some years back was a few hundred feet of grass(!). (Glider port… it’s a LOT nicer to land a glider with a skid on grass… seems scary the first time, but it’s really rather comfortable…)
So the UHI from an airport will, IMHO, increase geometrically as the typical traffic goes from Gliders, to Piston private planes, to commercial Jets; mostly based on the squared or more function of tarmac surface area… but the added cars, carparks, hangars, etc. will add to the mix…
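For rough scale – these dimensions are typical figures of my own choosing, not measurements of any particular airport:

```python
# Paved area in square feet (illustrative, typical dimensions only)
jet_runway   = 10000 * 150   # one commercial jet runway
piston_strip = 2000 * 50     # small private piston strip
print(jet_runway / piston_strip)  # 15.0: ~15x the tarmac from one runway alone
```

And that factor only grows once the multiple runways, taxiways, aprons, and carparks of a commercial field are counted.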
For people who are concerned that the evidence for climate warming is just an artifact of the surface temperature measurements, I think that much of the best evidence for warming comes from lake temperatures and data on the duration of ice on lakes.
Check this on Google Scholar. Quite a few of these studies are available as PDFs. Some US lakes, such as Lake Mendota (WI), have data on the date of ice cover and ice-out since about 1850. One study, cited below, has long-term data on ice duration for 62 lakes in the Great Lakes region. These studies of lakes in the US, and also in Europe and Asia, show a trend toward later ice formation and earlier ice melting that has accelerated in the last 40 years. In general, these changes in ice duration match the warming trends in the surface temperature record very well.
[PDF] Spatial analysis of ice phenology trends across the Laurentian Great Lakes region during a recent warming period – …, B.J. Benson, J.J. Magnuson, V.M. Card, M.N. …, Limnology and Oceanography, 2007 (PDF at fw.msu.edu). The records cover 62 lakes plus one bay of Lake Superior.
It appears even CO2 researchers have similar sensor-location issues, based on this picture and the associated project. Will they use the collected temperature data to provide any meaningful correlation?
http://www.ldeo.columbia.edu/outr/LACOP/aboutlacop.html
Surprised you are getting so tetchy and have misquoted me – note I said “highlighted”, not “reported”. When using quotations, please try to keep them accurate. Presumably you know what ‘bugger’ actually means; if not, then read its literal meaning and try not to insult people.
The AMSU story is very significant whichever side of the divide you are on, especially in the context of the ongoing attempted assassination of GISS. My point, which you haven’t answered during your red-mist moment, is that it looks like, on satellite evidence (i.e. no UHI up there!), GISS are absolutely spot on, and the large increase in surface temperature anomalies they recorded last month is now showing up in the lower-atmosphere temperatures.
Your rational and hopefully polite answer to this would be appreciated.
Bill D (06:32:29) : It’s been said before, but I’ll say it again … evidence of warming is not the same thing as evidence that man made CO2 is responsible for that warming. It also is not an established fact that warming is bad. To take it even further, I would say cooling is bad. In fact, cooling is much, much worse than warming. So, what’s your point?
If the Allegany State Park one is what I think it is, it’s in NY instead of PA (suggested by the spelling of Allegany/Allegheny), near the parking lot & shaded by trees.
Have to look it up on surface stations
EM Smith – Thank you for the service(s)!
“There was one systematic issue (a use of a non-standard “extension” to the language that some systems allow, but most do not) and there were a couple of cases of “type mismatch” in parameters. A couple of those were just parameters being passed to “write” statements, so they ought not to cause too much mischief. Another was more worrying in that a REAL data type was passed to a significant zonal subroutine with an INTEGER type for the variable.
That might be a significant bug, and certainly is a terrible coding practice. ”
—
If I recall my FORTRAN properly, this would “replace” the numeric (eight digit powers of ten format) result from the calculation of the various input numbers with an “integer” (single digit precision) in the output.
Could be significant. Could be insignificant. Could be trivial. Could be disastrous.
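One way to see the danger: under classic pass-by-reference FORTRAN, the subroutine simply reads the caller’s four bytes as an INTEGER – bit reinterpretation, not rounding or truncation. A quick Python sketch of the effect (my illustration, nothing from GIStemp):

```python
import struct

def real_seen_as_integer(x):
    """Pack a value as a 32-bit float, then read the same four bytes
    back as a 32-bit int -- what an unchecked REAL actual argument
    bound to an INTEGER dummy amounts to."""
    return struct.unpack("<i", struct.pack("<f", x))[0]

print(real_seen_as_integer(15.7))  # a huge, meaningless integer, not 15 or 16
```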
—
A few fundamental questions: We now have GISS’s “official” list of 100 “rural” (uncontaminated!) stations.
1) What are their (uncorrected!) temperature histories?
2) What are their (contaminated (er, corrected)) temperature histories?
3) Since these 100 stations are supposedly correcting for UHI in thousands of other heat-affected stations, why does he (Hansen) not just simply use these stations by themselves – no area adjustments at all?
4) Why “fill in” data for missed days, months, years – which will invisibly contaminate previously valid rural station data with the nearest URBAN data for the missing days – and not just plot and use what is actually present in the record?
5) Why backdate old records when the fill-in algorithm has continuously and erroneously adjusted literally hundreds of years of historical records with continuously adjusted new data? (Likewise the continuously increasing time-of-observation bias that always adds fractions to every temperature record, regardless of when any measurement was taken. Neither process can be logically or scientifically supported by the numbers, the methods, the process, or any fictional scenario of ancient station keepers crawling out to record minimum thermometer readings in the freezing snow at 11:59 each night.)
Bradford & Allegany (Salamanca) aren’t that far apart (10–15 mi?)
OT: Scrolling thru the “archiv” on the North of 80 temps in the sidebar, I note that 2009 is in fact the coolest temperature in the series from 1952 to the present. Also back to no sunspots
Mary Hinge (07:27:35):
From my on-line dictionary:
Bugger (vulgar slang, chiefly Brit.), noun: a contemptible or pitied person, typically a man; a person with a particular negative quality or characteristic.
Is it an insult if it’s true?
Ah, the same old smokescreen, Smokey!
Still talking the same bar-room talk and still not actually getting to the meat. So I put it to you: is GISS right? AMSU says they are… or do you blame UHI or even soot for that?