NASA's Hansen Frees the Code!

One of the goals I and many other concerned citizens have had this summer is to get full disclosure of the measurement environment, data, methods, and computer source code used by NOAA and NASA's Goddard Institute for Space Studies (GISS) to arrive at adjustments to the surface temperature record. The error discovered in August, which forced a reshuffling of USA temperatures and the hottest-year rankings (see "1998 no longer hottest year on record"), renewed calls for full disclosure and put NASA GISS in a nearly indefensible position.

I’m happy to report that NASA GISS has in fact released the computer code used to arrive at temperature adjustments for the USA and the world.

Apparently we "court jesters" (as Dr. James Hansen calls us) carry some weight after all. Even with such unfortunate characterizations, I wish to publicly thank Dr. Hansen for making this new information available. It was the right thing to do. Thank you.

The first task is to make sure the released code matches what has been seen, and to verify that we have all of it. This is hugely important for independent verification of the surface temperature record. Following that comes an analysis of the methodology and a replication of the program's output to see if it matches the current data sets. Then perhaps we can fully understand why stations in "pristine" condition, such as Walhalla, SC, with no obvious microsite biases from 1916 to 2000, get "adjusted" by Hansen's techniques. Shouldn't good data stand on its own?

I got an email from one of the www.surfacestations.org volunteers, Chris Dunn, that sums up the problem pretty well:

I downloaded the raw and adjusted text versions of the GISS data for Walhalla and did a simple subtraction of annual figures: adjusted minus raw. It's clear that they created a step-up over time. They started by subtracting 0.3 from the early record, then progressively reduced this amount by 0.1 degree a couple of times until 1990, after which no adjustments were made. This artificial "stepping down" of the historical temperature record as you go back in time induces a false upward trend in the data where, in my opinion, there shouldn't be one. Consider that this is a rural site, that the CRS was never moved, and that it sat in the middle of a large, empty, level field in a relatively static, isolated setting from at least 1916 to 2000. There is just no justification for this whatsoever when looking at the site and the general area.

Of course, this "step" procedure is what McIntyre et al. have been documenting over on CA for some time now, but having visited the Walhalla site personally and seen how pristine it was during that period, I am just shocked to see how the data have been so clearly and systematically manipulated. It seems that if they can't find an upward trend, they simply create one. It's an outrage to an average citizen such as myself, especially when I think of the good people (private observers, among others) who dedicated their time every day for so long to create an accurate record. That's the real rub as I see it: the arrogant disregard of honest people who have put so much of their lives into it. I truly see just how important this work is that is being done by you and the folks over at Climate Audit.

I’m considering writing my congressmen, but will wait to see what the results are when McIntyre is done.
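To make the kind of comparison Chris describes reproducible, here is a minimal sketch. The file names and the simple two-column year/value layout are assumptions for illustration only; the actual GISS station text files use a different format.

```python
# Minimal sketch of the adjusted-minus-raw comparison described above.
# Assumes plain-text files with "year annual_mean" columns; the real GISS
# station files use a different layout, so treat this as illustrative.

def load_annual(path):
    """Read 'year value' pairs into a dict, skipping unparseable lines."""
    data = {}
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) >= 2:
                try:
                    data[int(parts[0])] = float(parts[1])
                except ValueError:
                    continue  # header line or missing-data marker
    return data

raw = load_annual("walhalla_raw.txt")            # hypothetical file name
adjusted = load_annual("walhalla_adjusted.txt")  # hypothetical file name

for year in sorted(raw.keys() & adjusted.keys()):
    # A stairstep shows up as runs of -0.3, -0.2, -0.1, 0.0 over time.
    print(f"{year}  {adjusted[year] - raw[year]:+.1f}")
```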

Now we’ll have a chance to understand this firsthand instead of having to reverse engineer the method. Perhaps we’ll go down this path and it will all be perfectly valid, in which case we have no argument. But independent verification is one of the basic tenets of science, and this has been long overdue.

Steve McIntyre expounds on the new revelation in his blog:

Reposted from www.climateaudit.org

Hansen has just released what is said to be the source code for their temperature analysis. The release was announced in a shall-we-say ungracious email to his email distribution list and a link is now present at the NASA webpage.

Hansen says resentfully that they would have liked a “week or two” to make a “simplified version” of the program and that it is this version that “people interested in science” will want, as opposed to the version that actually generated their results.

Reto Ruedy has organized into a single document, as well as practical on a short time scale, the programs that produce our global temperature analysis from publicly available data streams of temperature measurements. These are a combination of subroutines written over the past few decades by Sergej Lebedeff, Jay Glascoe, and Reto. Because the programs include a variety of languages and computer-unique functions, Reto would have preferred to have a week or two to combine these into a simpler more transparent structure, but because of a recent flood of demands for the programs, they are being made available as is. People interested in science may want to wait a week or two for a simplified version.

In recent posts, I've observed that long rural stations in South America and Africa do not show the pronounced ROW (rest-of-world) trend (Where's Waldo?) that is distinct from the U.S. temperature history, and that there is a total lack of long records from Antarctica covering the 1930s. Without mentioning climateaudit.org or myself by name, Hansen addresses the "lack of quality data from South America and Africa, a legitimate concern", concluding that this lack does not "matter" to the results.

Another favorite target of those who would raise doubt about the reality of global warming is the lack of quality data from South America and Africa, a legitimate concern. You will note in our maps of temperature change some blotches in South America and Africa, which are probably due to bad data. Our procedure does not throw out data because it looks unrealistic, as that would be subjective. But what is the global significance of these regions of exceptionally poor data? As shown by Figure 1, omission of South America and Africa has only a tiny effect on the global temperature change. Indeed, the difference that omitting these areas makes is to increase the global temperature change by (an entirely insignificant) 0.01C.

So the United States shows no material change since the 1930s, but this doesn't matter; South America doesn't matter; Africa doesn't matter; and Antarctica has no records relevant to the 1930s. Europe and northern Asia would seem to be plausible candidates for locating Waldo. (BTW, we are also told that the Medieval Warm Period was a regional phenomenon confined to Europe and northern Asia. Go figure.)
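As a back-of-the-envelope illustration of the area-weighting arithmetic behind the quoted 0.01C claim (all numbers below are invented for the example and are not GISS values): dropping a region from an area-weighted mean moves the mean by roughly the region's area fraction times its deviation from the rest of the world.

```python
# Invented numbers, purely to show the area-weighting arithmetic.
frac = 0.15          # hypothetical area fraction of the dropped regions
trend_region = 0.55  # hypothetical regional trend (deg C / century)
trend_rest = 0.60    # hypothetical trend everywhere else

with_region = frac * trend_region + (1 - frac) * trend_rest
without_region = trend_rest
print(without_region - with_region)  # ~0.0075: tiny unless the region diverges sharply
```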

On two separate occasions, Hansen, who two weeks ago contrasted royalty with “court jesters” saying that one does not “joust with jesters”, raised the possibility that the outside community is “wondering” why (using the royal “we”) he (a) “bothers to put up with this hassle and the nasty e-mails that it brings” or (b) “subject ourselves to the shenanigans”.

Actually, it wasn't something that I, for one, was wondering about at all. In my opinion, questions about how he did his calculations are entirely appropriate, and he had an obligation to answer them: an obligation that would have continued even if he had flounced off at the mere indignity of having to answer a mildly probing question. Look, ordinary people get asked questions all the time, and most of them don't have the luxury of "not bothering with the hassle" or "not subjecting themselves to the shenanigans". They just answer the questions as best they can and don't complain. So should Hansen.

Hansen provides some interesting historical context to his studies, observing that his analysis was the first to include Southern Hemisphere results, which supposedly showed that, contrary to the situation in the Northern Hemisphere, there wasn't cooling from the 1940s to the 1970s:

The basic GISS temperature analysis scheme was defined in the late 1970s by Jim Hansen when a method of estimating global temperature change was needed for comparison with one-dimensional global climate models. Prior temperature analyses, most notably those of Murray Mitchell, covered only 20-90N latitudes. Our rationale was that the number of Southern Hemisphere stations was sufficient for a meaningful estimate of global temperature change, because temperature anomalies and trends are highly correlated over substantial geographical distances. Our first published results (Hansen et al., Climate impact of increasing atmospheric carbon dioxide, Science 213, 957, 1981) showed that, contrary to impressions from northern latitudes, global cooling after 1940 was small, and there was net global warming of about 0.4C between the 1880s and 1970s.
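The "highly correlated over substantial geographical distances" rationale is the basis of the GISS station-combination scheme. As published (Hansen and Lebedeff, 1987), station anomalies contribute to a grid point with a weight that falls linearly from 1 at zero distance to 0 at 1200 km. A minimal sketch of that weighting, with made-up station data:

```python
import math

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance via the spherical law of cosines."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    cos_c = math.sin(p1) * math.sin(p2) + math.cos(p1) * math.cos(p2) * math.cos(dlon)
    return 6371.0 * math.acos(max(-1.0, min(1.0, cos_c)))

def gridpoint_anomaly(grid_lat, grid_lon, stations, radius_km=1200.0):
    """Weighted mean of station anomalies; weight falls linearly from 1
    at the grid point to 0 at radius_km (Hansen-Lebedeff style)."""
    num = den = 0.0
    for lat, lon, anomaly in stations:
        d = distance_km(grid_lat, grid_lon, lat, lon)
        if d < radius_km:
            w = 1.0 - d / radius_km
            num += w * anomaly
            den += w
    return num / den if den else None

# Made-up stations: (lat, lon, anomaly in deg C)
stations = [(34.8, -83.1, 0.2), (34.6, -83.3, -0.1), (33.0, -80.0, 0.5)]
print(gridpoint_anomaly(34.0, -82.0, stations))
```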

Earlier in the short essay, Hansen said that "omission of South America and Africa has only a tiny effect on the global temperature change". Surely, though, they would have an impact on land temperatures in the Southern Hemisphere? And, as the above paragraph shows, the calculation of SH land temperatures and their integration into global temperatures seems to have been a central theme in Hansen's own opus. If Hansen says that South America and Africa don't matter to "global", and thus presumably to Southern Hemisphere, temperature change, then it makes one wonder all the more: what does matter?

Personally, as I’ve said on many occasions, I have little doubt that the late 20th century was warmer than the 19th century. At present, I’m intrigued by the question as to how we know that it’s warmer now than in the 1930s. It seems plausible to me that it is. But how do we know that it is? And why should any scientist think that answering such a question is a “hassle”?

In my first post on the matter, I suggested that Hansen's most appropriate response was to make his code available promptly and cordially. Since a somewhat embarrassing error had already been identified, I thought that it would be difficult for NASA to completely stonewall, regardless of Hansen's own wishes in the matter. (I hadn't started an FOI request but was going to do so.)

Had Hansen done so, he could then, if he wished, have included an expression of confidence that the rest of the code contained no material defects. Now he has had to disclose the code anyway, and has done so in a rather graceless way.

17 Comments
O2converter
September 8, 2007 1:16 pm

Now that I've visited a few stations, I'm beginning to wonder if there isn't something to the location changes that accompanied the introduction of the MMTS equipment starting in the early eighties.
Now that the MMTS (and Nimbus) equipment are in the majority, it would be interesting to plot the average and individual temperatures of the 1221 stations against the cumulative number of site location and equipment changes; a sketch of that comparison follows below. Even some of the old CRS stations had a number of moves over the ~120 years of data collection that may not have been evaluated properly. Each station has a specific story to be gleaned.
In addition, there does seem to be a trend of winter lows increasing. Even in small towns that I flew over last winter at night, there seemed to be cloud formation directly over the main drags. Is it possible that water vapor from auto exhaust in the evenings is reducing heat radiation into space on winter nights?
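A minimal sketch of the comparison O2converter proposes. The station series and change events below are made-up placeholders; real inputs would come from the station history files.

```python
import matplotlib.pyplot as plt

# Made-up placeholder data; real inputs would come from station history files.
years = list(range(1900, 2008))
avg_temp = [12.0 + 0.005 * (y - 1900) for y in years]  # hypothetical series (deg C)
change_events = [1923, 1948, 1962, 1984, 1999]         # hypothetical moves/equipment swaps

# Cumulative count of site/equipment changes up to each year.
cumulative = [sum(1 for e in change_events if e <= y) for y in years]

fig, ax1 = plt.subplots()
ax1.plot(years, avg_temp)
ax1.set_xlabel("year")
ax1.set_ylabel("avg temp (deg C)")

ax2 = ax1.twinx()
ax2.step(years, cumulative, where="post", color="gray")
ax2.set_ylabel("cumulative site/equipment changes")

plt.title("Hypothetical station: temperature vs. station changes")
plt.show()
```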

Jeff
September 8, 2007 1:50 pm

This all just proves to me that a single global temperature number is a useless metric. Some places heat up, some cool, all on their own and for different reasons. Trying to obtain a mean or average for vastly differing regional climates is, well, beyond ridiculous. It’s like counting the number of seeds in all fruit and vegetables and then averaging them. The results are meaningless. The watermelons with many seeds will raise the average abnormally, where the avocados with one seed, or seedless fruit, will throw it off the other way. Not to mention the “fruitless” endeavor of comparing apples and green beans.

September 8, 2007 6:19 pm

I have loaded low-res browse images from the USGS Earth Explorer site for 1948, 1956, 1977, and 1994 (higher-res digital images can be ordered from the USGS EarthExplorer web site), just to demonstrate that archival imagery exists to evaluate how much a site has or has not changed. For Walhalla, it goes all the way back to 1948.
I’ve ordered two archival images for Paso Robles dated 1988 and 1976. They’re $3 each for digitized, $30 each for scanned … so I went with digitized to see how sharp they are.
If they look good I’ll get some for Walhalla and post them.

Papertiger
September 8, 2007 6:33 pm

Anthony, you are a mighty climate warrior.
Hats off to you.

Anthony Watts
September 8, 2007 6:43 pm

Thanks Leon,
Being able to track land use changes around the station will be key in determining whether station data adjustments reflect observed reality or not.

O2converter
September 8, 2007 7:14 pm

It would also be great to obtain evidence regarding the tree cutting in the '20s at the northern California site that had the temperature drop-off (discussed at ClimateAudit).

September 8, 2007 9:32 pm

Hi Anthony,
I've analyzed the low-res images and identified the Walhalla site; it's the same big empty field in 1948, 1956, 1977, 1994, 2000, and 2007!
I've ordered the 1948, 1956, 1977, and 1994 digitized images for Walhalla and will post them when I get them in a few weeks.
We need to work out a way to have USGS let us access pictures for free… perhaps we can get a registered account with them for our climate research? Maybe Pielke Sr. has a "research" account we can use?

sergei
September 8, 2007 10:00 pm

I am curious how the Realclimate folks will respond. The spin masters are busy working on the responses now…

GTTofAK
September 9, 2007 4:24 am

I too am very interested in RealClimate's response. If this plays out like I expect it to, Hansen will cover his own ass and start offering up the heads of his subordinates, the chief among those being Gavin.
So it will be very interesting to see how Gavin responds to this. While Gavin is a very smart scientist, this isn't science, it's politics, and from my understanding he is a lightweight there. He will stand forward and use RealClimate to protect Hansen as he always has, right up to the point that Hansen slits his throat, metaphorically speaking.

steven mosher
September 9, 2007 6:41 am

Walhalla-type adjustments are explained in Hansen 99;
you can find it on GISS, just look through Hansen's publications.
I believe it's section 5 or 6.
I'm checking the code to see if I can find the math.

O2converter
September 9, 2007 1:34 pm

RE: Walhalla adj.
I did find a paper describing the average difference between the min-max thermometers and the MMTS units. Possibly due to the faster response time of the MMTS, there were higher highs and lower lows that didn't quite balance, giving 0.1 deg C higher average readings for the day. So some kind of equipment correction would have been indicated. Maybe it was done by NOAA and that served as the "raw" for GISS. A TOB adjustment could still be required with the MMTS, I believe.
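A worked version of that imbalance, with invented numbers: if the faster-responding sensor reads the daily maximum 0.3 deg C higher but the minimum only 0.1 deg C lower, the (max + min)/2 daily mean shifts up by about 0.1 deg C.

```python
# Invented example of the unbalanced max/min excursions described above.
crs_max, crs_min = 30.0, 18.0      # older min-max thermometer readings (deg C)
mmts_max, mmts_min = 30.3, 17.9    # faster sensor: +0.3 on the high, -0.1 on the low

shift = (mmts_max + mmts_min) / 2 - (crs_max + crs_min) / 2
print(shift)  # ~ +0.1 deg C shift in the daily mean
```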

Chris D
September 9, 2007 6:39 pm

O2converter, I appreciate your effort to suggest a possible basis for it. Given that, I still struggle to understand why it is an incremental set of adjustments instead of simply a flat adjustment of the history up to 2000, which is when the observer told me the MMTS was installed. Furthermore, why is Santuck, another very clean, long-term, rural site that also converted from CRS to MMTS, left without a single adjustment throughout its entire history?
I suspect this whole thing might have something to do with the way the code blends the data geographically, considering that Walhalla is fairly near Toccoa, GA, which appears to be a relatively "hot" site, though it is urban.
It will be very interesting to see how these cases are explained as people continue to gain a better understanding of how the “code” works.

Evan Jones
Editor
September 9, 2007 7:08 pm

Congratulations, Rev!
It just didn’t seem possible that they could refuse to cough up. What’s astonishing is that they refused in the first place.
P.S. Source code is all very nice. A sine qua non, even. But did they lay the OPERATING MANUALS and related specs on you? Or are they just trying to bury you in ASCII?

steven mosher
September 10, 2007 6:29 am

Chris D., if you have Fortran skills, the urban/rural adjustment code is in step 2. I'm looking at it now, and two brains are better than one.
The INCREMENTAL adjustments are made in tenths of a degree C. For the US, data is read in as Fahrenheit and then converted to Celsius in tenths. The adjustments happen in TWO legs, or two linear adjustments. The first leg runs from the beginning of the record to 1950 (this is somewhat variable), and the second leg from 1950 to the end of the record.
So, if the adjustment says add 0.4 C to the record before 1950, that is broken up into four adjustments of 0.1 C. That's why it looks like a stairstep; a toy reconstruction of the pattern follows below.
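An illustrative sketch of the two-leg stairstep steven mosher describes. This is not the actual GISS Fortran, just a toy reconstruction of the pattern under the stated assumptions (a 0.4 C total adjustment, 0.1 C steps, a knee at 1950):

```python
# Toy reconstruction of the stairstep pattern; not the actual GISS code.

def stairstep_offsets(years, total_adj_c=0.4, knee=1950, step_c=0.1):
    """Spread a total pre-knee adjustment over the early record in fixed
    steps, largest at the oldest years, leaving the post-knee leg alone."""
    steps = int(round(total_adj_c / step_c))  # e.g. 0.4 C -> 4 steps of 0.1 C
    early = sorted(y for y in years if y < knee)
    span = max(1, len(early) // steps)        # years covered by each step
    offsets = {y: 0.0 for y in years}
    for i, y in enumerate(early):
        remaining = steps - i // span         # steps left as we approach the knee
        if remaining > 0:
            offsets[y] = -step_c * remaining
    return offsets

offsets = stairstep_offsets(range(1916, 2001))
print(offsets[1916], offsets[1940], offsets[1960])  # -0.4 -0.1 0.0
```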

Michael Jankowski
September 10, 2007 1:40 pm

The RC folks are busy deleting any comment that mentions the code release. I am sure their stance on the final analysis of the code, methodology, etc., is "it doesn't matter."

September 12, 2007 8:18 am

Hi Chris D.
I've posted close-up aerial photos of Walhalla for 1948, 1956, 1977, 1994, 2000, and 2007.
There's the same big empty field with the same copse of trees, so not a lot of change in 60 years.
Leon

Don Carlson
September 28, 2007 10:24 pm

I don't understand exactly what you guys are up to, but I'm sure pleased that you are up to it. Someone needs to be watching what the 'authorized' researchers are doing, to be sure major decisions are not made on wrong information, something that is all too common. Thank you again.