Guest post by John Goetz
The GISS temperature record, with its various adjustments, estimations, and re-estimations, has drawn my attention since I first became interested in the methods used to measure a global temperature. In particular, I have wondered how the current global average can even be compared with that of 1987, which was produced using between six and seven times more stations than are used today. Commenter George E. Smith noted accurately that it is a “simple failure to observe the standard laws of sampled data systems.” GISS presents so many puzzles in this area that it is difficult to know where to begin.
My recent post on the June 2009 temperatures found that the vast majority of temperatures were taken at airports and urban stations. This would cause some concern if the urban heat island (UHI) effect were not accounted for at those stations. GISS does attempt to filter UHI out of urban stations by using “nearby” rural stations – “nearby” meaning anything within 1,000 km. No attempt is made to filter UHI from airports not strictly listed as urban.
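As a rough illustration of that “nearby” screen (this is my own sketch, not the GISS code, and the station records – dicts with 'lat'/'lon' keys – are hypothetical), a 1,000 km test using the haversine great-circle distance looks like this:

from math import radians, sin, cos, asin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    # Great-circle distance via the haversine formula (mean Earth radius ~6371 km)
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def nearby_rural(urban, rural_stations, limit_km=1000.0):
    # Keep only the rural stations within the 1,000 km "nearby" radius
    return [r for r in rural_stations
            if distance_km(urban['lat'], urban['lon'], r['lat'], r['lon']) <= limit_km]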
If stations from far, far away can be used to filter UHI, then it stands to reason some stations may be used multiple times as filters for multiple urban stations. I thought it would be amusing to list which stations were used the most to adjust for UHI. Fortunately, NASA prints that data in the PApars.statn.use.GHCN.CL.1000.20 log file.
The results were as I expected – amusing. Here are the top ten, ranked in order of the number of urban stations they help adjust (a short sketch of how such a tally might be produced from the log file follows the table):
Usage | Station Name | Location | From | To | Note |
251 | BRADFORD/FAA AIRPORT | PA / USA | 1957 | 2004 | Airport |
249 | DUBOIS/FAA AIRPORT | PA / USA | 1962 | 1994 | Airport |
249 | ALLEGANY STATE PARK | PA / USA | 1924 | 2007 | Admin Building |
246 | PHILIPSBURG/MID-STATE AP | PA / USA | 1948 | 1986 | Airport |
243 | WELLSBORO 4SSE | PA / USA | 1880 | 2007 | Various Farms |
243 | WALES | NY / USA | 1931 | 2007 | Various Homes |
241 | MANNINGTON 7WNW | WVa / USA | 1901 | 2007 | Various Homes |
241 | PENN YAN 8W | NY / USA | 1888 | 1994 | Various Homes |
237 | MILLPORT 2NW | OH / USA | 1893 | 2007 | Various Farms |
235 | HEMLOCK | NY / USA | 1898 | 2007 | Filtration Plant |
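For anyone who wants to reproduce the tally, here is a minimal sketch. The parsing is an assumption on my part – I am guessing each adjustment line in the log begins with the rural station’s ID – so adapt the field index to the actual layout of PApars.statn.use.GHCN.CL.1000.20:

from collections import Counter

usage = Counter()
with open('PApars.statn.use.GHCN.CL.1000.20') as log:
    for line in log:
        fields = line.split()
        if fields:                 # skip any blank lines
            usage[fields[0]] += 1  # assumed: rural station ID is the first field

for station_id, count in usage.most_common(10):
    print(count, station_id)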
Unfortunately, having three of the top four stations located at airports was the sort of thing I expected.
Looking a little further, it turns out all of the top 100 stations are in either the US or Canada, and none of those 100 stations have reported data since 2007. (By the way, #100 is itself used 147 times.) Several of the top-100 stations have been surveyed by surfacestations.org volunteers who have documented siting issues, such as the following:
- Mohonk Lake, N.Y. (197 times) – much too close to ground, shading issues, nearby building
- Falls Village, Conn. (193 times) – near building and parking lot
- Cornwall, Vt. (187 times) – near building
- Northfield, Vt. (187 times) – near driveway, building
- Enosburg Falls, Vt. (180 times) – adjacent to driveway, nearby building.
- Greenwood, Del. (171 times) – sited on concrete platform
- Logan, Iowa (164 times) – near building, concrete slabs
- Block Island, R.I. (150 times) – adjacent to parking lot and aircraft parking area.
The current state of a rural station, however, is an insufficient criterion for deciding to use it to adjust the history of one or more urban stations. The rural station’s history must be considered as well, with equipment changes and location moves being two of the most important considerations.
Take for example good ol’ Crawfordsville, which came in at #23, having been used 219 times. As discussed here, Crawfordsville’s station lives happily on a farm, and does seem to enjoy life in the country. However, up until 16 years ago the station lived in the middle of Crawfordsville, spending over 100 years at Wabash College and at the town’s power plant.
Maybe it’s time for a major global mapping project of heat sinks (concrete, roads, highways, air-conditioning units, buildings, skyscrapers, the works) to find out how much of the global surface warming of the last century was caused by the creation of such sinks and other artificial surfaces that absorb and release heat, along with their fallout regions (where wind carries the heat those surfaces release). Such a project would go far beyond the surfacestations effort.
Over the past 100 years, thousands of square miles of land have been paved and urbanized. In some areas the UHI effects can be huge – China, for example, has cities like Chongqing with nearly 30 million people. The study mentioned above may be needed to get the full picture of warming over the last century.
What a mish-mash of malarkey. Are those top three big airports for jets?
Another question is how far back did they start using that station for correction? Back 80 years ago (for example) it might have been a good reference station to adjust others, but from 19xx it might have been so mutated as to be useless for that purpose. However, knowing how gov’t works, they most likely just blindly kept using it until the present. IOW, good for one time, good for any time or year. Considering the sorry state of the great majority of stations, I doubt anyone at NASA has the faintest idea what the actual conditions are at these “golden” stations being used for reference and adjustments.
1000 km would mean that John o’ Groats could “correct” Land’s End or even Swedish stations.
DaveE.
I don’t understand. None of the top 100 have reported data since 2007? Is this right?
So to get rid of any UHI contamination that may have been present at existing sites, and to make use of the data that came from stations shut down in the 1980s, they use stations that are themselves contaminated.
Does it get any dumber?
How this organisation put men on the moon beggars belief – and they now want to go to Mars!!!!
I wouldn’t send them down to the local to get some milk.
Adam from Kansas (15:22:35) : The ironic thing is that UHIs have been known to spawn clouds. They will help dissipate the UHI-generated heat, just as (I suppose) clouds do elsewhere. I have to wonder if there will be a net effect on the global average temperature. Hey! Maybe we could model it!!
Say what?! The top 100 “rural” stations that are being used to adjust 100s of urban stations for the UHI effect haven’t reported since 2007?
Can someone explain or provide a link as to how the rural stations are selected and how these data can be used effectively to adjust for urban stations 2 years later? Do they simply create a formula that represents the differences between past rural and urban temps and apply that formula as the adjustment into the future?
Are the same rural stations used to adjust for the same urban stations forever or can they be changed? If so, what criteria are used to change and who gets to decide when to apply it?
Sounds like yet another area that could be subjected to monkey-business as usual.
“File name PApars…” Given the locations of the first five stations, the file-naming convention makes sense.
Note that I only count 9 stations listed.
Reply: A big Homer Simpson “Doh!” I fixed and added the tenth station. – John
Anthony already made perfectly clear what the real problem is here:
“How not to measure temperature!”
It would be great if we came to a really clean, unbiased temperature map.
I think that really would be the end of AGW/Climate Change and all the planned policies.
The current AGW doctrine is based on “Junk Science”.
Questions – maybe they can be answered by filtering the data.
What is the most heavily sampled CURRENT station that is used to adjust for others (stations ending in 2009)?
Others might want to filter this data, ranked by FROM and TO dates, by location (what area of the US had the heaviest sampling), and even bumping this list against the surfacestations.org files.
Are those heavily used stations CRN 1/2, or CRN 3,4,5?
The not-so-interesting thing about these stations is that they are in a part of the US that has towns about 15 miles apart from each other, and sometimes much closer. Thus, I doubt that the 1,000 km limit is ever reached.
An interesting question is what has happened to these stations, such as Bradford and Dubois, where airports still operate (I think – haven’t been to either in years)? Are the airport monitors gone, just dropped from usage, or just no longer using that name? On this last thing, I’m thinking of the American versus English spelling discussed recently when a station got ‘lost’.
Reply: I am working on figuring that one out, but I will point out that there are roughly 2000 stations in the US that have ever been a part of the GISS record, most of them not rural, so a station that is used to adjust 250 non-rural stations is reaching pretty far. – John
This is so bad that you just won’t have to look at GISS anymore.
Weather station coordinates for these two places (from fuzzy photos taken from Google Earth):
Bradford: 41.798260 N, 78.635408 W
Dubois: 41.179509 N, 78.893184 W
http://www.ibdeditorials.com/IBDArticles.aspx?id=291423153272209
At least the business press, such as IBD, will give the skeptics a bit of support. I think Willis should be asked to provide a graph of the Argo buoy data for the last five years, which, he admits, shows no warming. Considering the thousands of buoys scattered over the oceans, the data should be hard to deny.
Six or seven times as many stations in 1987. Top 100 “best” stations shut down. What happened? Was the funding cut off?
Or has the funding skyrocketed in recent years because of the Pending Doom of Global Warming?
The whole thing smacks of deliberate monkey wrenching. Sorry, but when the magnitude of failure reaches this level, I have a hard time blaming incompetence. What does the Director of GISS have to say about this situation? Oh yeah, that would be James E. Hansen …
OK, so we risk our economy on two-bit tinhorn stations with known unreliable siting and data. What’s the problem? It’s only a few trillion dollars.
You would have really hoped someone other than a bunch of rank amateurs was doing this, because if they aren’t, then what are they trying to do? The last thing that comes to mind is science.
A simple question: What do they do with the money they get?
So this is manufactured warming for Copenhagen.
I don’t think I’ve ever read a darker comedy.
“Looking a little further, it turns out all of the top 100 stations are in either the US or Canada, and none of those 100 stations have reported data since 2007.”
OK, please let me know if I understand this correctly Anthony. In order to adjust properly, they use stations that do not report data, and so these stations actually ‘report’ the data that is filled into them, and then this data is used to adjust for UHI? Is that correct?
Reply: Some of the stations continue to report data, but it is not currently captured by NOAA’s GHCN V2 record and therefore does not get reported to GISS. – John
What a curious choice by GISS! The Crawfordsville record has more gaps than nearby Rocksville, and is only marginally longer. Might the choice be related to the much shallower dip seen in 1979? The attenuation of that dip–which produced the lowest average temperatures of the 20th century in the USA–seems to be a persistent feature of GISTEMP analysis.
And a fair number of the “best” stations were tarmac bakeovens anyway. The whole system stinks of crock-itude. And we are supposed to sacrifice $trillions based on this snake oil? $Trillions that the “universal consensus” admits will have no effect on global temperatures?
These are the times that try men’s souls. The summer soldier and the sunshine patriot will, in this crisis, shrink from the service of their country; but he that stands it now, deserves the love and thanks of man and woman. Tyranny, like hell, is not easily conquered…. If there be trouble, let it be in my day, that my child may have peace. ~Thomas Paine
This is an issue I often deal with in my role as an engineer. People often use automated algorithms, SQL queries, etc., to process large data sets automatically. The problem is a computer cannot solve problems like a human, and as soon as you start to delve into it, the problems start appearing everywhere.
The interesting thing is what happens when these stations stop reporting – how is the UHI then adjusted for? Linear extrapolation of the 1978–1998 trend? Hmmm.
I think I’ve figured this out. The people who run GISS obviously work for the Yu Wan Mei Amalgamated Salvage Fishery and Polymer Injection Co. http://www.yuwanmei.com/ . Who, btw, just bought out The Onion.
So Anthony – Now that you are over 80%, what will it take to get a usable temperature record, based on raw data from the CRN 1 and 2 stations?
David (17:12:51) :
‘OK, please let me know if I understand this correctly Anthony. In order to adjust properly, they use stations that do not report data, and so these stations actually ‘report’ the data that is filled into them, and then this data is used to adjust for UHI? Is that correct?’
Bingo We Have A Winner.
As a taxpayer, I am all for spending money on climate research. I want to thank all of you who have spent your own time and dime doing your own research.
The tip is in the jar.
I hope that everyone will fax this article to their Senators.
You are getting close; the details are more hideous than most of us expected. It’s hard to imagine that people spend their lives working on ground temperatures and don’t know the details.
I wonder why Jones at Hadcrut won’t come clean.
Does it make much difference if an airport is used for jets or pistons? Two aircraft of about the same weight would throw out about equal heat on takeoff. The question might be whether jet use implies bigger aircraft and more frequent flights. But, in any case, I’ve been unable to find a difference between airport and non-airport locations in a subset of stations from Australia. It’s a rural subset, so the big internationals are not included. It’s also a fairly small set and it does not resolve the questions. It just does not show a difference I can see. (Until the adjusters get to it).
Reply: I think it is more an issue of whether or not the measurement instrument is close to the tarmac. Year round, that black surface will be warmer than the surrounding ground. Some may believe the effect is small … fractional. But in this (now settled) debate, we are arguing about fractional degrees. – John
Does this mean that, say, the 243 stations are adjusted by the same amount, or do they all get corrected to become parallel with the reference? Here I thought the existence of microclimates made for differences of even a few degrees within a few hundred metres. What if you have a stationary cold front between the reference station and a number of the adjusted stations? I’ve driven from winter in Canada, across the snow belt south of the Great Lakes, and into spring weather within 1000 km in a day.
I guess if you can adjust the data any way you like you can be an expert on “what’s going on in the depth of the oceans, the lands, the ice caps and the upper limits of the stratosphere” or whatever it is they say they are knowledgeable about.
Reply: The amount of correction (or the amount of a station’s “influence”) is inversely proportional to the distance of the rural station from the urban station it is correcting. – John
It’s ironic that GISS has greatly reduced the number of stations in its calculation process. Subsequent to 1940 (the mid-century high as well as the start of increasing CO2 concentrations), the number of stations was increased from ~3,000 to ~6,000 (http://data.giss.nasa.gov/gistemp/station_data/). In a sense, these additional stations were taken out of context; that is to say, their period of record was incomplete. Out of those 3,000 I found only 400+ stations outside the US with reasonable-looking records. Their composite temperature increase from the 1940 high to 2005 was only 0.3 deg C – and that without any UHI correction. The US increase from USHCN was the same. My conclusion: the only significant net temperature increase occurred before CO2 rose.
Reply: Well … on the one hand GISS does not “collect data”. They use data collected from other sources, primarily NOAA, and they tend to take a transparent attitude that their analysis is only as good as the data collected by their “subcontractors”. This somehow seems to work for them but not for the corporate world.
My feeling now is – don’t even bother collecting temperature records other than for local interest. There are just too many problems with trying to take the temperature from everywhere and have it mean something. You just end up with junk information.
Geoff Sherrington (18:46:45) :
Here in Taranaki, New Zealand, we moved the weather station used at our airport for the district temperatures, because it was felt that it measured too low and was adverse to domestic tourism. LOL. It was only about twenty yards from the gas fire training facility… Anthony would have loved it! 😉 I think local topography can override other factors.
Again, I find more and more that the debate about TEMPERATURES is meaningless! The ONLY thing that counts. The ONLY thing that is important is RADIATIVE BALANCE.
And the ONLY reliable information on that comes from SATELLITES!
Heaven help us from the people who do not understand the OXYMORON of “average temperature”.
Reply: Some of the stations continue to report data, but it is not currently captured by NOAA’s GHCN V2 record and therefore does not get reported to GISS. – John
Oh. Now I am confused. The stations are active, but not included in GISS – I got that part – but does GISS check the values on the stations or just fill them in with the algorithm? And if they are valuable enough to use to correct stations, why not include them in the record in the first place?
Reply: I was going to say “GISS does not fill in missing data” but then stopped myself.
After a station “stops reporting” (which means NOAA has a back-level copy), GISS will not create more recent entries. There are cases, however, where up to six months of a twelve month record might be missing and GISS will magically create an estimate.
Example: A station reports data from 1940 to 1990, but in 1985 the station curator lost his job and was preoccupied. The curator missed reporting data for Feb / Mar / Apr / May / Jun and Oct. GISS is capable of calculating an annual average nonetheless for that station. – John
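In sketch form, that tolerant averaging looks something like this (an illustration only, assuming an annual value is produced whenever at least six of the twelve months are present; the actual GISTEMP code presumably estimates missing months more cleverly, by season and anomaly):

def annual_mean(monthly, min_months=6):
    # Average whatever months exist; None marks a missing report.
    # Returns None when fewer than min_months values survive, mirroring
    # the "up to six missing months" tolerance described above.
    present = [t for t in monthly if t is not None]
    if len(present) < min_months:
        return None
    return sum(present) / len(present)

# The 1985 example: Feb-Jun and Oct missing, six months remain.
# Note the bias risk: with winter and spring gone, a plain mean runs warm.
year_1985 = [1.2, None, None, None, None, None, 22.5, 21.9, 16.4, None, 7.8, 2.1]
print(annual_mean(year_1985))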
“I have wondered how the current global average can even be compared with that of 1987, which was produced using between six and seven times more stations than today.”
I would have thought that if they were serious about getting a good measure of ‘average global temperature’ – which, when you think about it, is quite a concept in itself – they would have used more temperature measuring stations and not thrown away the vast majority of them.
When you combine the information you have reported here with your surfacestation project statistical analysis you’ll come up with quite a tangled web. My brain hurts just to think about it all.
It seems the influential stations are centered on … central Pennsylvania … home of Penn State … and Michael Mann …
As much as I do not believe in conspiracy theory, this is just too funny !!
I’m puzzled by the number of stations this appears to indicate. If the 100 most used reference stations average about 200 stations each that are adjusted to them, many more than 20,000 stations (probably more than 40,000) are monitored, unless the adjusted stations rely on several reference stations which would be an even greater nightmare. If there are indeed 30 or 40 thousand stations that are being monitored by the weather service, then an idea for an alternative array of stations would be to select at random, say, 2,000 stations and use these as a base for monitoring trends. Possibly one could reject the worst located ones and end up with 1200 (the number used by GISS).
Reply: Multiple rural stations are “averaged” together before they are used to adjust an urban station. And that average depends on how far away the rural stations are from the urban station.
As an example, a station A in Connecticut and a Station B in New Jersey can be used to adjust both New York City and Philadelphia. However, A will have a larger influence on NYC than B, while the opposite is true on Philadelphia.
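In sketch form, with a linear taper to zero at 1,000 km as one plausible reading of “inversely proportional” (the true GISS weighting function, and the distances below, are assumptions for illustration):

def weight(distance_km, cutoff_km=1000.0):
    # Influence falls off linearly with distance, reaching zero at the cutoff
    return max(0.0, 1.0 - distance_km / cutoff_km)

def combined_rural(pairs):
    # pairs: (anomaly, distance_km) for each rural station vs. one urban station
    total = sum(weight(d) for _, d in pairs)
    return sum(a * weight(d) for a, d in pairs) / total

# Stations A (CT) and B (NJ) adjust both cities, but with different weights:
print(combined_rural([(0.3, 100.0), (0.5, 400.0)]))  # NYC: A weighs more
print(combined_rural([(0.3, 400.0), (0.5, 100.0)]))  # Philadelphia: B weighs more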
Now that we have a station survey which rates each according to the CRN standards, perhaps NOAA and GISS will factor that into their correction algorithms.
Right, that’s gonna happen!
So – at the risk of being redundant – UHI is adjusted using sites that themselves are susceptible to UHI? Priceless. Is this a case where these people never leave their cubbies to see what the field really looks like?
Statistically – if the population of sites keeps shrinking, and the sites themselves are contaminated by their proximity to heat sinks/radiators, I can’t see anything of value coming from this data. The adjustments themselves stink to high heaven.
Mark Hugo (18:56:30) :
“Again, I find more and more that the debate about TEMPERATURES is meaningless! The ONLY thing that counts. The ONLY thing that is important is RADIATIVE BALANCE.”
RB may be more interesting to atmospheric physicists, but surely you are not saying that if the average temperature were to fall 10 C over the next 10 years, that this would not be interesting. Heck, it’s a perfect proxy for radiative balance. This kind of viewpoint among sceptics is why the AGW crowd have been winning the all-important public relations war despite bad science.
The exploration of global climate change is of necessity the domain of scientists with an appropriate background, armed with accurate data to synthesize and support their hypotheses. However, the current debate is often dry and lacks historical depth. Information has been uncovered that places global warming square in the middle of startling revelations grounded in fact.
Have you wondered about the origin of the name Greenland? Wikipedia admits that ice-core studies in Greenland reveal rapid shifts in the climate of the Northern Hemisphere going back 100,000 years, but chooses to frame the information in the context of catastrophic global warming. This paragraph, which appears to contradict mainstream AGW, survives:
“Scientists who probed two kilometers (1.2 miles) through a Greenland glacier to recover the oldest plant DNA on record said the planet was far warmer hundreds of thousands of years ago than is generally believed… That view contrasts sharply with the prevailing one that a lush forest of this kind could only have existed in Greenland as recently as 2.4 million years ago.”
Wikipedia claims that the name Greenland was chosen to entice people to settle there; however, other sources indicate that the region was quite warm during the period from 900 to 1200 AD, and named appropriately:
“…The next migration came from the east, following “Erik the Red” Thorwaldsson’s exploration of the southern coast of Greenland between 982 and 985 AD…The climate at this time was very warm, much warmer than it is today, and crops were able to do well. It seems likely that the name “Greenland” was given to the country, not just as wishful thinking, but because it was a climatic fact at that time.”
http://explorenorth.com/library/weekly/aa121799.htm
Far stranger is the case of the map of Oronteus Finaeus. A good picture of it is available here:
http://www.anomalies-unlimited.com/Finaeus_Map.html
“This map was found in the Library of Congress, Washington DC in 1960 by Charles Hapgood. It was drawn by Oronteus Finaeus in 1531. As with the Piri Reis map, Antarctica is shown to be ice free with flowing rivers, drainage patterns and clean coastline. Some of the mountain ranges shown were only discovered recently. The deep interior didn’t show any rivers or mountains which some believe means it was already covered in ice at the time. The Oronteus Finaeus map is more accurate than any other map of the same time. In fact, it is more accurate than any map made anywhere up to the year 1800.
Another tidbit of proof is the Ross sea. Today huge glaciers feed into it, making it a floating ice shelf hundreds of feet thick. Yet this map and the Reis map show estuaries and rivers at the site.
In 1949 coring was done to take samples of the ice and sediment at the bottom of the Ross Sea. They clearly showed several layers of stratification, meaning the area went through several environmental changes. Some of the sediments were of the type usually brought down to the sea by rivers. Tests done at the Carnegie Institute in Washington DC, which date radioactive elements found in sea water, dated the sediments at about 4000 BC, which would mean the area was ice free with flowing rivers up until that time – exactly what is recorded on the Reis and Finaeus maps.”
http://www.timstouse.com/EarthHistory/Antarctic/oronteusfinaeus.htm
Much information regarding the map is available on the Internet. I stumbled upon it at the skeptical website Pete’s Place, Ancient map Disproves Global Warming by Allen Quist July 17, 2009 http://petesplace-peter.blogspot.com/
Many have tried to dispute the map of Oronteus Finaeus, but there is no definitive argument to disprove it. How it achieved such accuracy remains a mystery.
It is often claimed by proponents of AGW that we are entering “uncharted territory” with current levels of CO2. The geologic record shows that levels of CO2 were much higher in the past, and other records give weight to climatic swings beyond what we are currently experiencing, all without anthropogenic influence. And hey, the lousy GISS temperature record doesn’t help any.
OT but I just saw this posted at CA under “Christy et al 2009: Surface temperature….”
http://www.agu.org/pubs/crossref/2009/2009GL038777.shtml
I actually bought the paper and it sent my BS meter a-clanging. Any chance Anthony that this could be discussed on WUWT?
Brandon Dobson (20:28:27) : “Have you wondered about the origin of the name Greenland?”
There is some evidence that the voyage to what is called Greenland was not pleasant and seasoned sailors preferred not to go. Thus the crews were mainly young and inexperienced, that is to say ‘green’ and those that sailed and landed there were called ‘greenlanders’. This comes from a book from one of the last captains of the ships of the British tea trade and I’ve loaned it to a friend in another state. Best I can do at the moment.
We can officially change the name to GAS.
Goddard Airport Studies.
Seriously, they use light measurements to choose the rural stations that adjust out the urban heat island.
That might also make them in charge of GASlighting.
Svante Arrhenius mused a lot;
Is the Earth warmed, or not,
By carbonic acid in the cloud?
Would that, in physics, be allowed?
Was this the way the ice retreated?
Could such a thing be repeated?
If not by Nature, then, by Man.
And so, the scientist began
To think about how he could best
Perform the perfect climate test.
To start, he said, we must be sure
About the present temperature
Of everywhere around the Earth
From Nunavut, clear through to Perth.
But, sad to say, upon reflection,
He was consumed by dejection;
The task was greater than he’d thought
To find the numbers that he sought
And though he really, really tried
He’d made no progress when he died.
There have been some others, since,
Whose attempts have made us wince;
Perhaps, there is no way to say:
Two hundred eighty-seven K
The picture I get is that the worse the urban station’s heat problem, and the more erratic the reporting, the more GISS uses it to overwrite rural stations that give the correct temperature.
GISS: just when you thought they couldn’t be that bad, it gets worse.
Dead sensor, no problem.
Are there surfacestations.org reports on these multiply-used stations? Whatever they have wrong is multiplied hundreds of times over through their use to correct other stations.
As for why Greenland is called Greenland it is told in the Icelandic sagas (which are the only contemporary sources) that the name was chosen by Erik the Red to attract colonists (sounding decidedly better than Iceland).
And yes, it was warmer there during the MWP, and yes, it was possible (though marginally profitable) to grow barley there (according to the Konungs Skuggsjá, a very reliable 13th century Norwegian source).
As for 16th century maps of Antarctica, may I point out that nobody visited Antarctica until 1820. And as for it having been ice-free then, forget it. In the Ross Sea area, for example, it is possible to date the fluctuations in the ice cover (and there have been such) by dating the deep-frozen remains of penguin colonies, which go back many thousands of years. Wouldn’t they have rotted if temperatures had ever gone much over freezing during that time?
Mike d,
Yes, there were 6 or 7 times more weather stations reporting in 1987, before the East Bloc collapsed. Now consider this: what if a terrorist organization arranged to destroy all the weather stations north of the Mason–Dixon line in Maryland, and the only reports that you continued to receive were from places south of it, like Florida, Texas, Arizona and Southern California? Do you think a plot of temperatures just might show it a smidgen warmer? Wouldn’t you say that the entire time series was destroyed, and rendered useless?
Well, that is EXACTLY what Herr Hansen and his cronies do, and more as well. Global warming based on weather station data is a complete farce. Plus, extrapolations of the weather stations are used to create spuriously precise climate data. If I have a thermometer that reads accurately to a degree, then averaging a bundle of such measurements and reporting changes to a few hundredths of a degree from the computed averages is pure nonsense. Yet global warming is measured in just such imprecise changes.
tty,
I suggest you acquire Gavin Menzies’ “1421”, which records the maritime expeditions of the Ming dynasty, including their mapping of the entire world – Africa, the east and west coasts of the Americas, Australia, New Zealand, the Antarctic, the Arctic Ocean, and a mapping of all of Greenland that could only have been accomplished by circumnavigation in an ice-free environment. This was done around 1420, at the middle-to-end of the Medieval Warm Period, after all the ice had melted. Maps of the Ming expeditions circulated in Europe via Arab traders. Indeed, Magellan had a copy of a Chinese map that showed the “Strait of Magellan” before he sailed from Europe.
Been checking this map daily and today it seems like the temperatures went down all over the north.
http://www.intelliweather.net/imagery/intelliweather/tempcity_nat_320x240.jpg
GISS missed its calling. It should have been a massage parlour!
Send a link for this article to the UK Met Office here: enquiries@metoffice.gov.uk I have!
Quick question, John: how did you determine which stations were rural?
Reply: I left it up to GISS. Their report lists the rural stations they used to adjust urban stations.
Just in case we have forgotten – June 2009 saw the warmest ocean temperatures on record. They were just 0.1C cooler than the land. The Southern Hemisphere was the warmest on record (and the rather unpopulated Antarctic had thumping anomalies).
That’s from your good folks at NOAA.
PS Anthony, you’ve got 12 months to save face and join the consensus. Odds are that 2009 will come in with a thumping annual global anomaly, and 2010 – well, that’s too horrible to think about. Sea level will spike sharply in the coming months and will easily surpass the peaks associated with recent El Niño events.
REPLY: One would think that somebody from BoM would be aware that an adjustment change was recently put in place by “the good folks at NOAA”, but hey, don’t question confirmation bias there. It would not be a fitting scientific thing to do for BoM’s worst alarmist. Please note our policy page. – Anthony
OT: Fish size and global warming !!!!!! [Now on Drudge]
http://www.breitbart.com/article.php?id=CNG.d672f9d7f0f64fefdf0b21e696b41e21.7a1&show_article=1
But where is that warming ???????
tty
“As for 16th century maps of Antarctica, may I point out that nobody visited Antarctica until 1820.”
Oh I do like people who can prove a negative.
The archaeological remains of the Vikings’ time in Greenland are there to study, but STILL UNDER THE PERMAFROST. And yet it was supposed to have been only ‘slightly warmer’?
This all seems like a smokescreen to divert attention away from the fact that their interpretation of June’s data actually seems to be spot on, judging by the particularly large (and curiously unreported story on this blog) increase in near-surface and lower-atmosphere temperatures as recorded by AMSU. http://discover.itsc.uah.edu/amsutemps/execute.csh?amsutemps
Maybe someone can explain why this particular story hasn’t been highlighted?
REPLY: Maybe you can explain why you can’t use the search box to find it yourself?
Here’s the “curiously unreported story on this blog”: http://wattsupwiththat.com/2009/07/17/pielke-sr-hypothesis-on-daily-uah-lt-records/
Typical alarmist. Denounce first, ask questions later. Bugger off. – Anthony
Jim Papsdorf (02:30:23) :
Al Gore wrote in “Earth in the Balance” that salmon and rabbits in Patagonia were going blind from the effects of more UV through the hole in the ozone layer. I suspect that there were fewer than a half dozen instrumental UV recorders giving anything close to continuous UV flux records on the whole Earth at the time he wrote. I could not find any records then. Certainly, no trend had been established for Patagonia.
It’s not even certain if salmon are blinded by UV light in the course of a normal day. They might just live a bit deeper. And rabbits are more often out and around by night.
Silly bugger.
Geoff Sherrington (18:46:45) :
Agreed, but remember the days before continuous recording. The exhaust of an aircraft would produce a transient high on a max-min thermometer, a spike that might be filtered out these days.
Just saw a book that was on a desk next to a cup holding a magnifying glass/reader. There was a wonderfully neat slice through the cover, which was slightly open, and a brown mark on page one. About 2 inches long. Too close to the sunlight from the window. Theoretical question. Did the magnifying glass (repeated n0000 times to make the figures seem less trivial) cause any global warming? Do solar farms create overall global warming, or does everything cancel? Not a trick question, I really do not know the answer.
I don’t understand why you good people in America put up with this sort of crap.
Here is a noble institution called NASA that has a unit called the Goddard Institute for Space Studies being run by an environmental activist of the worst kind, whose job includes creating and managing the GISTEMP temperature record.
But to accommodate the corruption of the data by UHI, and to help incorporate stations that stopped functioning in the past, they use about 100 so-called “reliable” stations to calibrate against.
But now it turns out that these calibration stations are also tainted by UHI and other anomalies – the effect of which is to raise the temperatures recorded.
Why do you put up with such blatant manipulation and incompetence?
It’s the same as Madoff, the GFC and Al Gore – they all get found out in the end, but it’s the damage they do on the way through that’s the concern.
John F. Hultquist (21:26:21) :
……………………
“those that sailed and landed there were called ‘greenlanders’. This comes from a book from one of the last captains of the ships of the British tea trade and I’ve loaned it to a friend in another state. Best I can do at the moment.”
Please, they were not British! Therefore not using English idioms!
And then —of course— there’s the Piri Reis map which predates all other known maps.
http://www.uwgb.edu/dutchs/PSEUDOSC/PiriRies.HTM
Gavin Menzies 1421: http://en.wikipedia.org/wiki/1421_Hypothesis
I have Charles Hapgood’s book and the claims of accuracy for the ancient maps are unsustainable when the maps are inspected.
Far too many people are far too certain about far too many things.
FWIW, I’ve got an update at: http://chiefio.wordpress.com/gistemp/
that gives some detail on the fact that, as of about 2 hours ago, I got all of the GIStemp code to compile on a Linux box. After a great deal of slogging, I’m at a point where I can start to:
a) Test it.
b) Characterize it.
c) Fix it.
There was one systematic issue (a use of a non-standard “extension” to the language that some systems allow, but most do not) and there were a couple of cases of “type mismatch” in parameters. A couple of those were just parameters being passed to “write” statements, so they ought not to cause too much mischief. Another was more worrying in that a REAL data type was passed to a significant zonal subroutine with an INTEGER type for the variable.
That might be a significant bug, and certainly is a terrible coding practice. I’ll have to take a couple of hours to work it through, though, before tossing rocks at it (sometimes that “technique” can be an obscure feature… most of the time it’s a flat out error…)
While making it “go”, I took the opportunity to clean up the structure a bit. The “source code” (programs people read and write, like FORTRAN) now lives in source code repository directories. The executables (what the computer actually runs – binaries) are in a separate directory as well. I’ve also written “Make” files that generate the binaries from the source (and removed the “in line compile / run / delete” from the scripts…)
So now it’s at least structurally cleaner and much easier to follow the flow of what is going on.
I expect in the next week or so to have it simplified even a bit more and to have run some test data through it “end to end”. (I also made a script to ftp the data to the box so you don’t need to fetch it “longhand” quite so much…)
I’m not changing the “logic” or data processing any at this time. Just making it do what it does in a cleaner and clearer way that’s easier for folks to follow (and use). At some future point, I’ll translate any bits that really need it into a better language and / or fix any “bugs” I find. For now it’s just things like putting the “scratch” or “temp” files in somewhere other than the source code “archive”…
Frankly, the hardest bit so far was getting a FORTRAN 90 or newer compiler to run on my older Linux box 😉 Along the way I got to compile the gcc tool chain too, as g95 (the free FORTRAN compiler) needed some libraries from a newer gcc to run… If you started with a newer box with the compiler already there, it’s not hard to make it go at all…
Carl Yee (15:29:27) : Another question is how far back did they start using that station for correction? Back 80 years ago (for example) it might have been a good reference station to adjust others, but from 19xx it might have been so mutated as to be useless for that purpose.
There is no selection of stations based on their great character. If a station is missing data, it is “filled in” based on what’s available, and not much more “thought” than that. This, IMHO, is one of the two great “bogosities” of GIStemp. The other is the fact that the most recent 10 years or so of “difference” between GHCN and USHCN for a given station is used to rewrite all the past history of that station to “uncorrect” the corrected data… when NOAA gives you the choice of corrected or uncorrected data to download in the first place… If an equipment change in 1970 lowered temps 2 F, then in 1980 it was fixed, raising them 2 F, that +2 F will be subtracted from all history prior to 1980. This is right how?!?…
“Why? Don’t ask why. Down that path lies insanity and ruin. – emsmith”
It is what it is. ONLY the 10 most recent years with data from the period starting in 1980 are used to calculate the “offset” between corrected and uncorrected data; then that “offset” is applied to ALL HISTORY. Not very bright, but now you know why history keeps changing.
Want to change the temperature in Kansas in 1880 to 1980? Update the equipment today… Every year for the next 10 years, the history will slowly change more and more as the new equipment “offset” adds to the 10 year average “offset” that changes the past…
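In other words, something like the sketch below – a Python paraphrase of the offset procedure just described (the sign convention and exact overlap rule are guessed from the description, not taken from the GIStemp source):

def unadjust_history(ghcn, ushcn, window=10):
    # ghcn, ushcn: dicts of year -> annual mean for the same station
    overlap = sorted(y for y in ghcn if y in ushcn and y >= 1980)
    recent = overlap[-window:]  # only the 10 most recent years with data
    offset = sum(ushcn[y] - ghcn[y] for y in recent) / len(recent)
    # One offset, computed from the recent decade, rewrites ALL history:
    return {year: temp - offset for year, temp in ushcn.items()}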
So, when does a station change a “nearby” station? Whenever there is data missing from the target station. The code just goes looking for whatever is handy to “fill in” via the “reference station method”… Doesn’t matter if the station with missing data is middle of the (nearly desert and darned hot in summer) Central Valley of California and the reference station is on the (almost always inversely related cool to cold) coast. It will be used.
I *think* that the “magic sauce” for combining data from one site into another happens in STEP1/comb_records.py
But that’s a Python script and I’m only now learning python. This is a fragment from it, and as you can see, it ranks stations based on a list of attributes, ending with “UNKNOWN”. So basically the code tries to use a station ranked higher based on MCDW vs USHCN vs … but will settle for anything if that’s all it’s got…
From comb_records.py:

def get_best(records):
    # Rank each candidate record by the trustworthiness of its source
    ranks = {'MCDW': 4, 'USHCN': 3, 'SUMOFDAY': 2, 'UNKNOWN': 1}
    best = 1
    rids = records.keys()
    rids.sort()                 # deterministic order (Python 2 list of keys)
    for rec_id in rids:
        record = records[rec_id]
        source = record['dict']['source']
        rank = ranks[source]
        if rank > best:         # keep the highest-ranked record seen so far
            best = rank
            best_rec = record
            best_id = rec_id
    if best > 1:
        return best_rec, best_id
    # (excerpt ends here; if no source outranks UNKNOWN the fragment falls
    # through -- the full file presumably handles that case)

END PROGRAM QUOTE
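For what it’s worth, here is a hypothetical call showing what the fragment selects (Python 2, to match the excerpt; the record layout – a dict of record id to {'dict': {'source': …}} – is inferred from the fragment, not from the full source):

records = {
    '425000010001': {'dict': {'source': 'SUMOFDAY'}},
    '425000010002': {'dict': {'source': 'USHCN'}},
}
best_rec, best_id = get_best(records)
print best_id  # '425000010002' -- USHCN (rank 3) outranks SUMOFDAY (rank 2)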
For what it’s worth, the “reference station method” is applied several times in several bits of the program… Yes, it just keeps smearing what little data it has around in an attempt to get global coverage out of data with massive holes in it. In some parts the limit is 1000 km. In others 1500 km. And in the Anomaly phase it’s measured in degrees of arc… I think it was up to 10 degrees of arc, but I ought to check that.
To the best of my knowledge, no one has EVER evaluated the validity of “the reference station method” when applied recursively like that. (Or perhaps it should be called “applied serially”.) A technique may be valid done once, but invalid done repeatedly… (One sleeping pill is fine; a dozen…)
So, while I’m fairly certain that GIStemp is substantially useless since at best it gives you numbers dancing in the (very wide) error bars of its calculations, I’m doing the work of porting it “for the masses”. When I’m done, you ought to be able to run it yourself, should you wish, and find for yourself where it does “odd things”…
Hey, just think of it as a weather video game… without the video 😎
Geoff Sherrington (18:46:45) : Does it make much difference if an airport is used for jets or pistons? Two aircraft of about the same weight would throw out about equal heat on takeoff. The question might be whether jet use implies bigger aircraft and more frequent flights.
You got it! Jets are very big, and jetports have lots of traffic.
Piston aircraft (at least since about 1970) have been mostly very small and used much less frequently.
Reply: I think it is more an issue of whether or not the measurement instrument is close to the tarmac. Year round, that black surface will be warmer than the surrounding ground. Some may believe the effect is small … fractional. But in this (now settled) debate, we are arguing about fractional degrees. – John
And don’t forget that the “standard” commercial jet runway is about 10,000 feet long while a private piston runway is more like 1,000 to 2,000. (I’ve seen folks take off on 300 feet!). You also tend to have one runway in a rural setting, but several with loads of taxiways in the larger commercial jet facilities. Heck, one airport I used some years back was a few hundred feet of grass(!). (Glider port… it’s a LOT nicer to land a glider with a skid on grass… seems scary the first time, but it’s really rather comfortable…)
So the UHI from an airport will, IMHO, increase geometrically as the typical traffic goes from Gliders, to Piston private planes, to commercial Jets; mostly based on the squared or more function of tarmac surface area… but the added cars, carparks, hangars, etc. will add to the mix…
For people who are concerned that the evidence for climate warming is just an artifact of the surface temperature measurements, I think that much of the best evidence for warming comes from lake temperatures and data on the duration of ice on lakes.
Check this on Google Scholar. Quite a few of these studies are available as PDFs. Some US lakes, such as Lake Mendota (WI), have data on the date of ice cover and ice-out since about 1850. One study, cited below, has long-term data on ice duration for 62 lakes in the Great Lakes region. These studies of lakes in the US, and also Europe and Asia, show a trend toward later ice formation and earlier ice melting that has accelerated in the last 40 years. In general, these changes in ice duration match the warming trends in the surface temperature record very well.
[PDF] Spatial analysis of ice phenology trends across the Laurentian Great Lakes region during a recent warming period – B.J. Benson, J.J. Magnuson, V.M. Card, et al., Limnology and Oceanography, 2007 (fw.msu.edu). The records include 62 lakes and one bay of Lake Superior.
It appears even CO2 researchers have similar sensor location issues, based on this picture and the associated project. Will they use the collected temperature data to provide any meaningful correlation?
http://www.ldeo.columbia.edu/outr/LACOP/aboutlacop.html
Surprised you are getting so tetchy and have misquoted me; note I said “highlighted”, not “reported”. When using quotations, please try to keep them accurate. Presumably you know what ‘Bugger’ actually means; if not, then read its literal meaning and try not to insult people.
The AMSU story is very significant whichever side of the divide you are on, especially in the context of the ongoing attempted assassination of GISS. My point, which you haven’t answered during your red-mist moment, is that it looks like, on satellite evidence (i.e. no UHI up there!), GISS are absolutely spot on, and the large increase in surface temperature anomalies they recorded last month is now showing up in the lower atmospheric temperatures.
Your rational and hopefully polite answer to this would be appreciated.
Bill D (06:32:29) : It’s been said before, but I’ll say it again … evidence of warming is not the same thing as evidence that man made CO2 is responsible for that warming. It also is not an established fact that warming is bad. To take it even further, I would say cooling is bad. In fact, cooling is much, much worse than warming. So, what’s your point?
If the Allegany State Park one is what I think it is, it’s in NY instead of PA (suggested by the spelling of Allegany/Allegheny), near the parking lot and shaded by trees.
Have to look it up on surfacestations.org.
EM Smith – Thank you for the service(s)!
“There was one systematic issue (a use of a non-standard “extension” to the language that some systems allow, but most do not) and there were a couple of cases of “type mismatch” in parameters. A couple of those were just parameters being passed to “write” statements, so they ought not to cause too much mischief. Another was more worrying in that a REAL data type was passed to a significant zonal subroutine with an INTEGER type for the variable.
That might be a significant bug, and certainly is a terrible coding practice. ”
—
If I recall my FORTRAN properly, this would “replace” the numeric (eight digit powers of ten format) result from the calculation of the various input numbers with an “integer” (single digit precision) in the output.
Could be significant. Could be insignificant. Could be trivial. Could be disastrous.
—
A few fundamental questions: We now have GISS’s “official” list of 100 “rural” (uncontaminated!) stations.
1) What are their (uncorrected!) temperature histories?
2) What are their (contaminated (er, corrected)) temperature histories?
3) Since these 100 stations are supposedly correcting for UHI in thousands of other heat-affected stations, why does he (Hansen) not just simply use these stations by themselves – no area adjustments at all?
4) Why “fill in” data for missed days, months, years – which will invisibly contaminate previously valid rural station data with the nearest URBAN data for the missing days – and not just plot and use what is actually present in the record?
5) Why backdate old records, so that the algorithm for filling in data continuously and erroneously adjusts literally hundreds of years of historical records with continuously adjusted new data? (Like the continuously increasing time-of-observation bias that always adds fractions to every temperature record, regardless of when any measurement was taken – neither process can be logically or scientifically supported by the numbers, the methods, the process, or any fictional scenario of ancient station keepers crawling out to record minimum thermometer readings in the freezing snow at 11:59 each night.)
Bradford & Allegany (Salamanca ) aren’t that far apart (10-15 mi?)
OT: Scrolling thru the “archiv” on the North of 80 temps in the sidebar, I note that 2009 is in fact the coolest temperature in the series from 1952 to the present. Also back to no sunspots
Mary Hinge (07:27:35):
From my on-line dictionary:
Bugger vulgar slang, chiefly Brit; noun: A contemptible or pitied person, typically a man. A person with a particular negative quality or characteristic.
Is it an insult if it’s true?
Ah, the same old smokescreen, Smokey!
Still talking the same bar-room talk and still not actually getting to the meat. So I put it to you: is GISS right? AMSU says they are… or do you blame UHI or even soot for that?
Mary Hinge (07:27:35) :
Are you one of the famous Spoonerisms? (The other being ‘Betty Swollocks’.)
It looks like Allegany has not been surveyed; I wonder about the others. I’ll have to take a stab at it while I’m there in August.
Take a temperature reading – record it
Add 37
Multiply by 8
Divide by 12
Add 16
Throw the result away
Pick a number that fits your theory – report it
Call it science
Mary Hinge (08:05:32),
I simply provided the dictionary definition of “bugger.” That’s what you asked for, isn’t it? So why the emotional response? Was it due to hormones?
Mary Hinge (03:29:38) : The current uptick in AMSU may or may not be meaningful, but that does not negate the fact that GISS shows a much greater uptrend over the past several years than does either RSS or UAH. It also does not validate the GISS methodology. Anyway, warm is better than cold any day. Warmists are wrong-headed on so many levels.
@Mary Hinge,
What makes you think satellites would not be sensing effects from UHI?
E.M.Smith (04:41:58) :
“There was one systematic issue (a use of a non-standard “extension” to the language that some systems allow, but most do not) and there were a couple of cases of “type mismatch” in parameters. A couple of those were just parameters being passed to “write” statements, so they ought not to cause too much mischief. Another was more worrying in that a REAL data type was passed to a significant zonal subroutine with an INTEGER type for the variable.”
This is interesting. I’ve been retired for nearly two decades, but all the Federal Government software contracts during my working days specified that compilers used must meet ANSI standards. Surely the extensions you noticed don’t meet ANSI standards.
I remember vividly that compilers for code developed for NASA had to meet that specification.
Smokey, and moderators:
This ‘Mary Hinge’ character is just trying to be funny. His/her ‘name’ is a Spoonerism. If you ‘undo’ the spoonerism, you’ll see what I mean, with the ‘name’ being UK slang for an intimate part of the female anatomy; a part which is not mentioned in polite discussion.
I’d get him/her to come up with a real name to prove he/she is a serious contributor.
“Check this on Google scholar. Quite a few of these studies are available as PDFs. Some US lakes, such a Lake Mendotae (WI) have data on the date of ice cover and ice out since about 1850. One study, cited below, has long term data on ice duration for 62 lakes in the Great lakes region( see below). ” – Bill D.
Gosh….
That’s what a lot of people thought when they headed to Northern Minnesota for the fishing opening last year, only to find the lakes solid with ice in mid-May.
This year, we are wearing jackets in July.
When is the media going to recognize and acknowledge that climate changes from decade to decade?
The “average” for freeze-up and ice-out includes ’98, a very odd year.
Jim (07:28:15) :
Bill D (06:32:29) : It’s been said before, but I’ll say it again … evidence of warming is not the same thing as evidence that man made CO2 is responsible for that warming. It also is not an established fact that warming is bad. To take it even further, I would say cooling is bad. In fact, cooling is much, much worse than warming. So, what’s your point?
Jim:
Of course, I agree that evidence for climate warming and the cause of climate warming are separate issues. The main concern in this particular post is whether temperature increases indicated by surface stations in the US are real, or are artifacts due to poor siting and UHI effects. All that I am saying is that data such as lake temperatures and studies of ice-out dates on lakes (in the US and worldwide) are fully supportive of the temperature increases suggested by the surface station data. The ice-out data are especially good for documenting warming during winter and early spring.
You provided one definition but not the original meaning, but by the by…
I was actually hoping you might have something relevant to say…still waiting!
Thanks for your response, at last an actual response to the original point.
Can you tell me and others here the source for this greater uptrend in the past years? I can offer you this link to the superb site Wood for Trees, and I think you will find that there is no difference in trends at all. http://www.woodfortrees.org/plot/gistemp/from:1980/normalise:+2.4/plot/uah/from:1980/plot/wti/from:1980
In the context of recent articles written here the current AMSU uptick is very meaningful. It fully supports GISS’s figures from last month and shows that the now seemingly feeble attempts to discredit GISS are not worth the blogosphere bits they are written on.
The coolest July in Kentucky history is on the way. Hats off to Roy Spencer – the clouds roll in every afternoon. We are saving money and using less electricity, Mr. Gore; no AC needed, and it’s July! WooHoo, I love your global warming! Can you turn it off in October, however? We don’t want snow that soon.
Robert A Cook PE (07:30:02) :
EM Smith – Thank you for the service(s)!
You are most welcome! (Back after 5 hours sleep… hoping to get the first STEP0 run / debugged shortly…)
“Another was more worrying in that a REAL data type was passed to a significant zonal subroutine with an INTEGER type for the variable. ”
If I recall my FORTRAN properly, this would “replace” the numeric (eight digit powers of ten format) result from the calculation of the various input numbers with an “integer” (single digit precision) in the output.
Thanks for that. You’ve saved me a bit of time. I thought it was something like that, but your statement joggled a few 30 year old memories and it’s easy enough to test with a stub.
Could be significant. Could be insignificant. Could be trivial. Could be disastrous.
And that pretty much sums up GIStemp.
From the odd coding style (scribbling temp data in with the source files; in-line compile, run, delete…) to the strange algorithmic choices (why ‘unadjust’ via the last 10 years’ “offset” when NOAA has an ‘unadjusted’ dataset available? why re-write history? why fill in missing data with fantasies based on something that happened 1500 km away, ignoring Nyquist… how much does Fargo really reflect Dallas?) to the lack of basic math skills (exactly HOW do you get 1/100 C precision out of 1 F data?), it is just full of such “stuff”.
—
A few fundamental questions: We now have GISS’s “official” list of 100 “rural” (uncontaminated!) stations.
Realize that the “list” is not official; it is an artifact of the particular “run” of the program. It will change each time GIStemp is run: perhaps slowly, perhaps fast. If some stations are deleted from the record, or some added, or some artifact of this year’s data changes the ranking, the list will change based on the new data in the run.
These are not well-selected “uncontaminated” or “rural” stations. They are selected based on the character of the data fed into GIStemp, and AFTER the earlier steps of GIStemp have already partly re-imagined the data. So, for example, the GHCN – USHCN blending / “de-offsetterizing” step gets done in STEP0 before the “reference station method” gets applied the first time, IIRC (the first of several…). So it’s a computer-selected list, partly based on things the computer has already done to the data, and it changes with each run…
1) What are their (uncorrected!) temperature histories?
2) What are their (contaminated (er, corrected)) temperature histories?
You can directly download this data from NOAA:
GHCN = Global Historical Climate Network (NOAA)
USHCN = US Historical Climate Network (NOAA)
Basic data set: GHCN – ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v2
v2.mean.Z (data file)
v2.temperature.inv.Z (station information file)
For US: USHCN – ftp://ftp.ncdc.noaa.gov/pub/data/ushcn
hcn_doe_mean_data.Z
station_inventory
Note that GHCN also has v2.mean_adj.Z (and max and min) datasets available. And USHCN has doe_min and doe_max sets too.
Since the USHCN data are reflected in the GHCN set, it really is a bit murky why we need to go through the machinations of “unadjusting” the data in the first place. It just introduces a big change in valid historical data for no good reason, IMHO.
BTW, for most browsers you can just paste the “ftp://ftp…” line from above into the browser and get the directory presented for drag / drop copy of the data.
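If you want to poke at v2.mean yourself, the records are fixed-format: a 12-character station ID (including the duplicate digit), a 4-digit year, then 12 monthly means in tenths of a degree C, with -9999 for missing. Here is a minimal reader sketch, going from my reading of the GHCN v2 README (verify the column layout against the README before trusting it):

      PROGRAM RDGHCN
C     Sketch of a GHCN v2.mean reader. Column layout is per the
C     GHCN v2 README as I recall it; check it before relying on it.
      CHARACTER*12 ID
      INTEGER IYR, T(12), I
   10 READ(*, '(A12, I4, 12I5)', END=99) ID, IYR, T
      DO 20 I = 1, 12
C       Values are tenths of a degree C; -9999 means missing.
        IF (T(I) .NE. -9999) PRINT *, ID, IYR, I, REAL(T(I)) / 10.0
   20 CONTINUE
      GO TO 10
   99 STOP
      END

Feed it the uncompressed v2.mean on standard input.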
3) Since these 100 stations are supposedly correcting for UHI in thousands of other heat-affected stations, why does he (Hansen) not just simply use these stations by themselves – no area adjustments at all?
IMHO, the “game” being played here is all about making up data where there are none, the coverage of thermometers being too sparse both in time and in space. Disjoint bits of data for a given station are glued together from USHCN and GHCN, missing chunks are made up based on “nearby” stations, “anomaly” maps are made with zones whose content is made up from whatever is available within 1500 km (or more), and with the ocean boxes filled in via an “anomaly map” that is already based on gluing together several sets of measurements (ships, buoys, an already-processed satellite anomaly map, and more), etc.
All of this to give the impression that we have a global coverage for 100 years when the reality is that we have a lot of coverage for the U.S.A. and Western Europe for fragmented chunks of time, and darned near nothing for most of the planet for most of time / history.
http://chiefio.wordpress.com/2009/02/24/so_many_thermometers_so_little_time/
It is all an attempt to ignore the fact that they violate Nyquist and their results are meaningless because of that.
4) Why “fill in” data in missed days, months, years – which will invisibly contaminate previously valid rural station data with the nearest URBAN data for the missing days – and not just plot and use what is actually present in the record?
I think I covered that above. I suspect, but can never prove, that they started with plotting the real data. Then discovered that they didn’t have the data needed to make any conclusions. So they headed down the path of “filling in the gaps with guesses” and believe that their guesses are valid (when they are not).
So we need to blend data sets with different “adjustment” histories (make up a way to conform these to each other, but ignore the side effects). We need to fill in missing blocks of data (guess based on what you do have, and don’t look too closely at the quality of the guesses…). We need to fill in long spans of history for large parts of the globe (make “zone boxes” that are really big, then blend whatever data you have in the box over the whole thing; ignore that Kona may have little to do with Hilo and the rainy side of Kauai, or that Kilauea may be dramatically different from both; just use them to make up data for the surrounding 1500 km radius of ocean…).
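To make the flavor of that blending concrete, here is a minimal sketch of a distance-weighted fill-in, with a linear taper to zero at 1500 km. The stations, distances, and taper are illustrative only; this is not the GIStemp algorithm, just the shape of the idea:

      PROGRAM INFILL
C     Illustrative only: estimate a 'missing' monthly anomaly at
C     a target site from 'nearby' stations, each weighted linearly
C     down to zero weight at 1500 km. NOT the actual GIStemp code.
      INTEGER NSTN, I
      PARAMETER (NSTN = 3)
      REAL ANOM(NSTN), DKM(NSTN), W, WSUM, ASUM
C     Anomalies (deg C) and distances (km) of the donor stations
      DATA ANOM / 0.3, 1.1, -0.2 /
      DATA DKM / 120.0, 800.0, 1400.0 /
      WSUM = 0.0
      ASUM = 0.0
      DO 10 I = 1, NSTN
        W = MAX(0.0, 1.0 - DKM(I) / 1500.0)
        WSUM = WSUM + W
        ASUM = ASUM + W * ANOM(I)
   10 CONTINUE
      IF (WSUM .GT. 0.0) PRINT *, 'Filled-in anomaly: ', ASUM / WSUM
      END

The arithmetic is trivial; the point is that the “filled-in” number is dominated by whichever donor happens to be closest, however unrepresentative it may be.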
5) Why backdate old records, if the fill-in algorithm continuously (and erroneously) adjusts literally hundreds of years of historical records with continuously adjusted new data?
This, IMHO, is a very important place to “stick a fork in it, Pablo!”
Simply take a GISSified station, blink it with the GHCN adjusted data, and ask Why?
All the difference is from the GIStemp magic sauce, and nothing more. They start with GHCN (which gets its data for US stations from USHCN, as I understand it), then “adjust it” in strange and wondrous ways. So, want to know how much GISS is “making up”? Just compare the two sets.
On my “someday” goal list is to take the GIStemp code and shut off parts of the “homogenized processed data food product” process one at a time to find the sensitivity of the output to the different parts of the GIStemp blender.
I’m well on my way (now that it compiles) but it will likely take me a few more months to get there. But, I have to make money some time or other or my kids don’t eat… so I work it in around the edges. (If any big oil company would like to hire me to work on GIStemp, I’d love to do it, but that magic money spigot the AGW crowd keeps saying is buying folks off just never seems to be running when I’m around 8-})
Just goes to show – some averages are more average than others.
Rod Smith (09:09:19) : This is interesting. I’ve been retired for nearly two decades, but all the Federal Government software contracts during my working days specified that the compilers used must meet ANSI standards. Surely the extensions you noticed don’t meet ANSI standards.
I remember vividly that compilers for code developed for NASA had to meet that specification.
I believe the compiler has to provide all the features specified in ANSI, but can have “extensions” to it that go beyond the standard. Programmers are “encouraged” not to use those non-ANSI extensions, but folks use them anyway.
In this case, it’s just an easy way to preload an array with data.
The standard is:
INTEGER FOO(3)
DATA FOO / 1, 2, 3 /
The extension is:
INTEGER FOO(3)/ 1, 2, 3 /
so you can see how folks would take the “shortcut”. Once I figured it out, it didn’t take long to “fix” even if it was in about 1/3 of the programs… The g95 compile error messages pointed me right at it in each program.
Judging by the names and locations, I’d expect to find most of these airports are for small, generally private planes, are grassy, don’t see a lot of activity or generate a lot of heat, and would qualify as rural or semi-rural (quite different from the big-city airports we use to get around this great country).
My question is more basic and you have probably already addressed it somewhere. Is there any validity at all to these “adjustments”? Is there any reason to believe that taking a temperature reading and then smoothing it out based on other area temperature readings will get you a more meaningful (or even just meaningful) number?
Mary Hinge (10:10:56) : That was a nice try, but typical of warmers, you did a shake and bake on the data. You can’t use normalize only on GISS … we’ve been down this road before.
Normalise – Scales and offsets all samples so they fall into the range 0..1
See the fairly rendered chart instead:
http://www.woodfortrees.org/plot/gistemp/from:1980/plot/uah/from:1980/plot/wti/from:1980
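For the record, here is a minimal sketch of what that normalise step does to a series (my own illustration, not the woodfortrees code): it rescales each series by its own min and max, so applying it to one series and not the others changes that series’ apparent slope relative to theirs.

      SUBROUTINE NORMLZ(X, N)
C     Illustration of a 0..1 rescale as defined above; not the
C     woodfortrees implementation. Each series gets scaled by its
C     OWN min and max, which is why normalising only one of
C     several plotted series distorts the comparison.
      INTEGER N, I
      REAL X(N), XMIN, XMAX
      XMIN = X(1)
      XMAX = X(1)
      DO 10 I = 2, N
        IF (X(I) .LT. XMIN) XMIN = X(I)
        IF (X(I) .GT. XMAX) XMAX = X(I)
   10 CONTINUE
      DO 20 I = 1, N
        X(I) = (X(I) - XMIN) / (XMAX - XMIN)
   20 CONTINUE
      END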
E. M. Smith:
This question has to be asked: have you found anything in GISTEMP that is done right?
Mary Hinge (13:40:59):
Let’s look closely at the June global anomalies: UAH = .001, RSS = .075, GISS = .640, GISS (land) = .730. Satellites cover virtually the entire globe; land stations and SST reports from ships of opportunity do not. So where do you get the unhinged notion that “GISS was right”?
My posts are based in science and my own personal interpretations. Perhaps, rather than slaps, you could deal with the substance.
Sure it was fun being a “sceptic” while the La Nina and solar minimum coincided (and the Eurasian snow storms of Jan 2008 were a well timed freak event on which to anchor silly stories of ice ages), but all the data are going in the wrong direction and fast.
REPLY: And people like yourself are driving it there, fast. Take a look at all the adjustments being made in the POSITIVE direction. No complaints from you about those; why is that? Confirmation bias. It’s OK then, by your view, to remove datasets that go against what you believe? NCDC thinks so.
http://www.ncdc.noaa.gov/oa/climate/research/sst/papers/merged-product-v3.pdf
“In the ERSST version 3 on this web page we have removed satellite data from ERSST and the merged product. The addition of satellite data caused problems for many of our users (WHO???). Although, the satellite data were corrected with respect to the in situ data as described in reprint, there was a residual cold bias that remained as shown in Figure 4 there. The bias was strongest in the middle and high latitude Southern Hemisphere where in situ data are sparse. The residual bias led to a modest decrease in the global warming trend and modified global annual temperature rankings.”
if you were a scientist of integrity you’d
1- use your name here instead of hiding behind BoM IP addresses while at work
2- take the issue of dataset removal seriously.
But you won’t do either of those things. Feel free to prove me wrong. – Anthony Watts
To tty, AllanM et al, regarding medieval maps and the colonization of Greenland:
If any significant research is done on the topic, the statement “As for 16th century maps of Antarctica, may I point out that nobody visited Antarctica until 1820.”
has no more weight than Columbus’ claim that he “discovered America”, even though there were clearly long-established Native Americans prior to his landing. Such attitudes reflect the ignorance of Europeans in the Dark Ages, and the widely held belief that only white-skinned Christians have any valid claims to discovery. A thorough discussion of the medieval maps can be found at:
http://www.saudiaramcoworld.com/issue/198001/piri.reis.and.the.hapgood.hypotheses.htm
The alleged “discovery” of Antarctica in 1820 is, of course, the main point of contention. Quoting from the above website:
“The cartography of the Age of Discovery, for instance, often seems to have been independent of the voyages themselves; that is, certain early maps of America contain features before their supposed date of discovery.
The most notable example of this is the map of America made by Glareanus, a famous Swiss poet, mathematician and theoretical geographer, in the year 1510. This map, which was probably based on the 1504 de Canerio map, clearly shows the west coast of America 12 years before Magellan passed through the strait that bears his name. In other words, Piri Reis was not the only one to include anachronous information.
The map of Glareanus, furthermore, was reproduced in Johannes de Stobnicza’s famous 1512 Cracow edition of Ptolemy and is unquestionably similar to the map of Piri Reis. Did Piri Reis have a copy of this early printed edition of Ptolemy before him when he drew his map? Is this what Piri Reis meant by “maps drawn in the time of Alexander the Great”?
Aware that ideas that deviate from traditional scientific beliefs get short shrift in the scientific community – as did, for instance, Wegener’s theory of continental drift, now widely accepted – Hapgood, therefore, pointed out in Maps of the Ancient Sea Kings that civilizations have vanished before. No one knew where Sumer, Akkad, Nineveh and Babylon were until 19th-century archeologists dug them up. And as late as 1970 – only 10 years ago – no one even suspected the existence of a civilization called Ebla (See Aramco World, March–April 1978). It had existed. It was real. But it vanished without a trace. Why then, argue Hapgood advocates, couldn’t there have been other civilizations that vanished?
The same is true of Hapgood’s unspecified advanced technology. Greek fire – something like napalm – was developed in the ninth century but its composition has never been duplicated. Arab scientists of the Golden Age were able to perform delicate eye surgery – using advanced instruments – but these skills were later lost. And in 1900, according to Scientific American, archeologists discovered an astoundingly advanced gearing system in a Greek navigational instrument. It dated back to 65 B.C. and its existence had never been suspected.
Although unquestionably an amateur theoretician, he did do his homework and had it thoroughly checked by professionals. The U.S. Air Force SAC cartographers, for example, worked with him for two years and fully endorsed his conclusions about Antarctica.
Furthermore, the Hapgood team identified 50 geographical points on the Finaeus map, as re-projected, whose latitudes and longitudes were located quite accurately, some of them quite close to the pole. “The mathematical probability against this being accidental,” says Hapgood, “is astronomical.””
Scientific studies have been done in the Ross Sea, which show the effects of river-borne sedimentation:
“In 1949 coring was done to take samples of the ice and sediment at the bottom of the Ross Sea. They clearly showed several layers of stratification, meaning the area went through several environmental changes. Some of the sediments were of the type usually brought down to the sea by rivers. Tests done at the Carnegie Institute in Washington DC, which date radioactive elements found in sea water, dated the sediments at about 4000 BC, which would mean the area was ice free with flowing rivers up until that time – exactly what is recorded on the Reis and Finaeus maps.”
If science means anything anymore, these findings cannot be dismissed by armchair theorists.
Whether Greenland’s naming was to promote colonization, or as other sources claim, as an apt description, the point is moot because no one disputes that Greenland’s climate was warmer around 1000 AD, and that farming once supported colonists. The ruins of Hvalsey Church are clearly visible, and are one of the best-preserved signs of Middle Age settlements in Greenland.
And I should point out that the central theme of WUWT, that climate change is natural, cyclical, and not necessarily human-caused, is a “deviation from traditional scientific beliefs” at this period of time. To dismiss anything that is outside the realm of consensus undermines the whole purpose of finding the truth.
To calm down the gentleman who calls himself the m-word, the b-word is only indecent when it is used alone or in combination with -you or -up. In combination with -OFF it simply means “go away”.
There is a new report that provides evidence connecting the earth’s magnetic field to warming: http://www.appinsys.com/GlobalWarming/index.htm?nn=2009072101
You can download two PDF Files, the original report and the latest magnetic field map.
The next cold wave:
http://www.accuweather.com/news-story.asp?partner=rss&article=5
Is this going to be included in the publication you won’t talk about, Anthony?
What is the difference between entire-network trends, and class 1,2 station only trends?
IOW – does this matter at all to the analysis? You imply here that this is FUBAR, you have the data and say you are doing the analysis to show whether it is FUBAR. Is it FUBAR?
The CRN ratings of airports average almost a full point better than non-airports. But in the last 30 years, airports have warmed significantly faster than the better-sited non-airport stations.
What is the difference between entire-network trends, and class 1,2 station only trends?
A disproportionate number are out west and north-west/central where there has been a far greater natural warming over the last century than in the rest of the country (the Southeast has cooled significantly). This is going by raw (and TOBS-adjusted) data.
Also, all but three CRN1 stations and a full third of CRN2 stations are located at airports.
GISS showed a large anomaly last month; AMSU is showing this anomaly in the near-ground/lower atmosphere now. http://discover.itsc.uah.edu/amsutemps/execute.csh?amsutemps
That heat must have come from the surface, as recorded by GISS. Are you saying they are wrong despite this totally conclusive evidence?
Mary Hinge (01:40:19) : You are confusing (weak) correlation with causation.
I suggest we start a “Fantasy Weather League.”
Like Fantasy Baseball, we could all draft different sites from the Surfacestations project. 5’s on rooftops next to air conditioning condensers would rate the highest cost. Then the “team” with the greatest temperature anomaly wins each season.
Like baseball, the league would have records and even doped-up players.
With the first pick, I want to draft Marysville, CA and bring the old dog out of retirement.
E.M.Smith (05:02:15) : E.M.Smith (04:41:58) : E.M.Smith (11:48:38) :
America, there’s a man working behind the veiled secrecy of the internet, seeking truth where there is none. From the AGW-cooled back deck of his home near San Francisco, overlooking his non-producing garden, he diligently seeks to find the gimmicks inappropriately introduced in Gisstemp by agenda-driven government coders. Yes, it’s “Mr. cobbled together, ancient computer language code debugging guy”. Ignoring the slings and arrows of psychologically broken and scientifically challenged AGW fearmongers, he squanders his leisure hours in the shady netherworld of electronic iniquity, arduously dissecting unbounded filaments of defectively programmed code, passionately seeking to discern where it all went appallingly unethical. America, we all owe him a debt of gratitude. So hats off to “Mr. cobbled together, ancient computer language code debugging guy”. This foreign-owned Bud’s for you.
Tim Clark (06:24:21) : So this is YOUR contribution? I would say E.M. Smith has you on the floor and handcuffed already. He actually has some interesting things to say, as opposed to your “humorous” piece. Maybe you should contract out to Bill Maher. He likes that kind of stuff.
Calm down Jimbo. I think EM knows where I’m coming from; a tongue-in-cheek compliment. We’re cut from the same mold, and he’s got a sense of humor. And actually, I should contract out as an advertising writer. I thought it would make a pretty good Bud commercial, or maybe you’ve never seen one.
Mary Hinge (1:40:19):
You seem to think that the GISS monthly anomalies are predictors of NEXT month’s satellite anomalies. I doubt that you can demonstrate that on any consistent basis. Until you do, all you’ve shown is how little it takes for true believers of AGW to claim “conclusive evidence.”
BTW, I’m willing to bet that July’s GISS anomaly will again be more than ~.15K above the satellite results. (That figure represents the offset introduced by the different “base periods,” according to GISS’ own annual series.) Are you willing to bet against me?
Tyler (06:15:53):
With the second pick, I take Mohonk Lake, before Gavin grabs it.
Tim Clark (06:24:21) :
E.M.Smith (05:02:15) : E.M.Smith (04:41:58) : E.M.Smith (11:48:38) :
America, there’s a man working behind the veiled secrecy of the internet, seeking truth where there is none. […] Yes, it’s “Mr. cobbled together, ancient computer language code debugging guy”.
What a hoot!
Yup, that’s me, Intellectual Hip Waders pulled up to my armpits, trudging through the muck… “Remember that a journey through the ocean of most men’s code will scarcely get your feet wet!”
Jim (07:29:51) :
Tim Clark (06:24:21) : So this is YOUR contribution? I would say E.M. Smith has you on the floor and handcuffed already. He actually has some interesting things to say, as opposed to your “humorous” piece.
Thanks for the compliment, Jim! Tim’s humor was subtle enough that it took me about halfway through to “wake up and smell the satire”… And that is just the way I like it! (Mom was from England, so I have that love of humor just on the edge of ambiguous that Americans don’t seem to quite understand. Part of the goal is to have as much humor as possible while still being deliciously ambiguous to the very last minute…)
Tim Clark (09:20:28) : I think EM knows where I’m coming from, a tongue in cheek complement. We’re cut from the same mold, and he’s got a sense of humor.
Yup. I “got it”… Caped Crusader (bath towel safety pinned around neck) fighting the daemons of Linux from his Throne of Power (WiFi in “little room”) 😎
BTW, I’ve now run all the code up through STEP3. That just leaves STEP4_5 to go. STEP4 seems to be an “update SBBX.HadR2” process (finally found where to get SBBX.HadR2, working on finding the monthly update files) while STEP5 looks like it does the “add in sea anomaly map to land anomaly map”. I’m in the home stretch.
I’m now left with the conundrum of how to prove proper function of code whose basic function I think is broken ;-0 but that’s for another day. For now, I’m assuming that if it compiled, it works as advertised. (The changes I made to make it go were fairly trivial and ought not to impact operation; and g95 looks like a fairly well-done compiler.)
From a prior thread (about last April?) where comments are closed (and where I’d not realized there was a question asked of me…)
John Galt (08:03:10) :
@E.M. Smith:
We’re constantly told how complex the climate models are. Complexity is subjective, but how about lines of code? How big is the codebase? Does it really require a supercomputer to run?
GIStemp is a data aggregator, fabrication, and anomaly map making program. Don’t know if that counts as a “model”, but I think so. At minimum, it feeds into some of the other models.
I’ve put up answers to code detail questions at the link:
http://chiefio.wordpress.com/gistemp/
The code base is about 7000 lines (once duplicates and hand tools are discounted). It’s not all that big.
Now, especially in the context of the “cobbled together ancient computer” comment, don’t snicker too much about what high performance platform I’m running it on. I wanted to be “period correct” (as all “re-enactors” will appreciate…) My Equipment:
An old box that started life as a 486 PC about 20 years ago, but had a motherboard upgrade about a decade+ ago. Presently running on a 400 MHz AMD chip, with 64 Meg of SDRAM (100 MHz!) and 48 Meg of SIMM memory (of some slow speed; they made a transition board years ago that took both kinds of memory). With RedHat 7.x for the OS and an ancient 10 GB IDE hard disk. GIStemp has sucked up about 1 GB of it as it endlessly copies the original data, duplicating it a couple of times in each step. The code itself is very small…
FWIW, the only thing that has challenged the box so far was the build of gcc that I needed to do to get the 4.x libraries (I’m on 2.x in the box… I think). That was memory-limited, and swap usage showed that 256 Meg of memory would be overkill…
Running GIStemp is fairly quick (10 minutes?) for most stages. It’s mostly I/O limited, especially in the Python steps. So “any old PC” with a fast disk would be fine.
FWIW, the speed of the Mac laptop I use for a remote terminal is vastly better than this (recovered from the depths of the garage) dedicated development platform, AND on a par with the speed of the Cray supercomputer I managed 15-ish years ago. The “hot PCs” of today are pretty darned strong compute engines, at least after you get MS Windoze out of your way.
CO2 Realist (08:17:35) :
Frank K. (05:44:04) writes: E.M.Smith (03:02:13) :
Thanks for the link! Your analysis of GISTEMP is extraordinary and worthy of a post here at WUWT (perhaps in several installments). How about it, Anthony? :^)
I added my vote at CO2 Realist (07:20:39) and Anthony replied:
REPLY: Convince him to pack it up into a single document and I’ll have a look. – Anthony
So E.M.Smith, what do you say? I think it would be educational for many here.
I’m putting together a “GIStemp for the masses” summary. It’s not quite ready for prime time yet. I’ve just started making the “data flow graph” of the hundred and one spotted files that GIStemp keeps creating. For now, I have mostly “techie” stuff along with specific critiques at the link above. When I have a decent (i.e., Anthony Quality, rather than Me Quality) summary of GIStemp, I’ll give Anthony the right of first refusal on it.
For now, I’m just happy to have it compiled and basically runnable.
If anyone else wants to “make it go”, holler at me at the link above and I’ll put up the working version I’ve got. (Anyone have a public ftp site?…)
Basically, at this point, I’ve got a working version, ported to Linux. Anyone wants it, I’m willing to share.
Over the next few weeks I intend to do basic function testing (my conundrum returns 😉) and then do a “modest rewrite” of some parts, just to stamp out confusing and redundant bits where reasonable. Add some comments, maybe pull some “10 line scripts” into the parent that calls them. Clean up the constant trail of breadcrumb files it leaves behind itself. That kind of thing. So you can have a copy now, or a better copy a bit later.
And at some point in that process, it will be “clean enough” to more easily say just what it really does. That’s when a simple, non-programmer, summary of it can be done… without loss of any more of my thinning hair…
BTW, while I don’t turn down any beer, Bud (though drinkable) isn’t my favorite… I’d rather folks raised a nice Pilsner or Gordon Biersch Marzen in honor of the milestone of “GIStemp running on a PC”. 😉
E.M.Smith (13:22:13) :
You do need a specific compiler to get that code to run unaltered.
I’ll see what I can find & let you know.
DaveE.
Tim Clark (09:20:28) : My apologies. I’ve grown to like E. and think he’s on to something interesting with the GISS code.
E.M.Smith (13:22:13)
If you get all five steps running, it would be interesting to add in the “missing” data. It may take a little of my time to compile, but it would be interesting.
Also might be interesting to run it just on rural stations, although the US would be a big blank post mid-2007.
I can think of lots of experiments 🙂
SPPI report on NCDC Surface Station talking points: click
EM… check out the work of the guys at Clear Climate Code,
for those interested in getting GIStemp working, along with some corrections, compiler issues, etc.:
http://clearclimatecode.org/
Thanks John. Then it uses nightlights, a proxy for being rural.
E.M.Smith (13:22:13) :
Now, especially in the context of the “cobbled together ancient computer” comment, don’t snicker too much about what high performance platform I’m running it on. I wanted to be “period correct” (as all “re-enactors” will appreciate…)
Cut from the same mold also means similar experience. Since I started in computers with Fortran using punch cards, I appreciate the effort you’re putting in, and I’m eagerly anticipating the results, especially on TOBS.
BTW, while I don’t turn down any beer, Bud (though drinkable) isn’t my favorite… I’d rather folks raised a nice Pilsner or Gordon Biersch Marzen in honor of the milestone of “GIStemp running on a PC”. 😉
The best beer I’ve had is ice cold and purchased by someone else, but I prefer wine. So the next time I’m in CA I’ll let you give me a tour of Napa Valley vineyards. ;~P
sky (12:44:18) :
In the context of the recent discussions, of course it is valid that the high current lower-atmospheric temperatures have followed on from the high anomalous temperatures GISS recorded last month. Like every other science there is never a straight correlation, for example in this particular case higher anomalous Arctic temperatures have resulted in very rapid ice melt (so far 2nd only to 2007), and the resulting release of ocean heat (it is no longer being insulated by the ice) has contributed to the high lower-atmospheric temperatures.
The basic premise has been that GISS’s trend has been much higher than the satellite records recently; this is of course false, as can be seen in the recent record from 1999 to 2008 (1999 chosen as being after the El Niño of 1998, and 2008 chosen as the end point as the last complete year).
http://www.woodfortrees.org/plot/gistemp/from:1999/to:2008/trend/plot/uah/from:1999/to:2008/trend/offset:0.275/plot/rss/from:1999/to:2008/trend/offset:0.24
I’m not a gambling kinda girl, but I think you might be wrong about next month’s figures; satellite temperatures will show a very high anomaly. If the current rapid ice melt continues then both figures will be very interesting, but let’s wait and see!
Mary Hinge (06:31:39):
Scientific validity derives not from plausible scenarios, but from provable quantitative results. Had you any real appreciation of that difference, you would not say most of the things you do. And you certainly would distinguish between similarity of recent data trends and the effects of dissimilar “centering” of anomalies on different base periods. In fact, you would have spotted the red-herring figure of ~0.15C (the actual offset is closer to 0.33C) that I threw out to entice you, corrected that figure, and insisted on a sizable bet. Instead, you offer the all-too-comfortable prediction in the fourth week of July that the month will come in with high values in all indices. Sorry, but I’m not impressed by such scientific prowess.
Of course you don’t like the graph, as it totally disproves what you have been trying to say! There has been no significant difference in trend in the last ten years, despite all the sensationalist and inaccurate claims on this site recently.
As I said, let’s wait and see what July’s figures bring, shall we? Arctic ice melt is still accelerating and lower-atmospheric temperatures are the highest recorded http://discover.itsc.uah.edu/amsutemps/execute.csh?amsutemps (curiously, despite the unique nature of this ongoing event, it is still not given a dedicated post; perhaps this might be rectified).
I can understand you not looking forward to July’s figures, especially as they will further discredit the theory of low sunspot numbers and temperatures etc. etc. (no decrease in temperature, no observable increase in cloud cover, and no mechanism to support it).
In a nutshell, let’s wait and see.
Mary Hinge:
You’re becoming ~seriously~ unhinged …
The anthropological climate records, anecdotes, and data of over 1000 years have not only shown that your remonstrations and vituperations are without substance, but that you are exhibiting a serious case of denial.
In three known cases of low sunspot activity, the Earth’s weather patterns reflected that lack of said activity.
Now, you might declare: Correlation does not equal causation.
But to date, you and your ilk have NOT shown how ANY OTHER AGENCY might have been responsible for the Earth’s past thermal variations, and how those could have changed WITHOUT the sunspot aspect.
If the sunspot activity wasn’t responsible, then WHAT WAS?
IF YOU can’t tell WHY for THEN, then YOU can’t tell WHY for NOW.
So, go ahead: Nitpick over minor variations in temperature and sea level changes, but the fact of the matter is JUST THIS: YOU are missing the forest for the trees.
steven mosher (22:55:15) :
for those interested in getting GIStemp working, along with some corrections, compiler issues, etc.:
http://clearclimatecode.org/
They are doing a re-write into Python. While that is a worthy goal (and they have lots of good info about what they ran into), I have a slightly different goal. I want to have the original code running with minimal changes (and all of them clearly causing no change of function) so that any “issues” found are clearly not the fault of the “port” to another language.
Basically, I’m taking more of a “forensics” approach with minimal change to the “evidence” in the lab…
DaveE (15:32:33) : You do need a specific compiler to get that code to run unaltered.
I’ll see what I can find & let you know.
In the STEP4_5 part, it imports a file written with FORTRAN “unformatted” I/O, which is actually highly formatted… and dependent on your computer’s “endian” type (are numbers stored with the most significant byte first, or the least significant?). So you must either run the code on a “big-endian” box like a Sun SPARC (or most other large workstations, but not all), or, if you run on an x86-type chip (i.e., on a PC, or anything using a gaggle of them, like some of the thousands-of-Pentiums “supercomputers”), you need a compiler with a “convert=swap” option on the file OPEN statement.
The newer (last time I looked, it was beta) release of the g95 compiler claims to support that feature.
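For anyone trying this at home, the incantation looks something like the following. (A sketch only: the actual record layout of SBBX.HadR2 is not reproduced here, and “SWAP” and “BIG_ENDIAN” are both accepted spellings of the convert= value in g95 / gfortran.)

      PROGRAM RDSBBX
C     Sketch: open a big-endian 'unformatted' file on a
C     little-endian (x86) box using the convert= extension
C     discussed above. Reading the records is omitted since
C     the real SBBX.HadR2 layout is not reproduced here.
      OPEN(UNIT=10, FILE='SBBX.HadR2', FORM='UNFORMATTED',
     &     STATUS='OLD', CONVERT='SWAP')
C     ... READ(10) statements would go here, record by record ...
      CLOSE(10)
      END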
Mary Hinge (06:31:39) : Like every other science there is never a straight correlation,
Where do you get this stuff? In MOST of science there very much is a straight correlation. Drop a rock, it falls. Drop it again, it falls. And with mathematically repeatable precision; good enough that pendulum clocks were the standard for precision timekeeping for centuries.
It is the odd and unusual cases that have non-intuitive correlations. And while there are enough of those to “keep it interesting”, to assert they are the dominant feature of science is bogus; to say “never a straight correlation” is just so wrong as to stop a reader in their tracks. Which I did. (No idea what else you said; it all became “uninteresting” at that point.)