Questions on the evolution of the GISS temperature product

Blink comparator of GISS USA temperature anomaly – h/t to Zapruder

The last time I checked, the earth does not retroactively change its near-surface temperature.

True, all data sets go through some corrections, such as the recent change RSS made to improve the quality of the satellite record, which consists of a number of satellites spliced together. However, in the case of the near-surface temperature record, we have many long-period stations that span the majority of the time period shown above, and they have already been adjusted for TOBS, SHAP, FILNET etc. by NOAA prior to being distributed for use by organizations like GISS. These adjustments add a mostly positive bias.

In the recent data replication fiasco, GISS blames NOAA for providing flawed data rather than owning its own failure to catch the data repeated from September into October. They are correct that the issue arose with NOAA, but in business, when you are the supplier of a product, most savvy businessmen take a “the buck stops here” approach to correcting a product flaw rather than blaming the supplier. GISS provides a product for public consumption worldwide, so it seems to me that they should step up and take responsibility for errors that appear in their own product.

In the case above, what could be the explanation for the product changing?
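One way to frame the question concretely: take two archived versions of the same anomaly series and difference them year by year. A toy sketch (the numbers below are invented for illustration, not actual GISS values):

```python
# Hypothetical sketch: quantify how a published anomaly series changed
# between two archived versions. Values are made up for illustration.

v1999 = {1934: 1.25, 1998: 0.65, 2000: 0.50}   # as archived circa 1999
v2008 = {1934: 1.10, 1998: 1.15, 2000: 0.55}   # as published later

# The per-year delta is the retroactive change the blink comparator shows.
deltas = {yr: round(v2008[yr] - v1999[yr], 2) for yr in v1999}
for yr in sorted(deltas):
    print(f"{yr}: {v1999[yr]:+.2f} -> {v2008[yr]:+.2f} (delta {deltas[yr]:+.2f})")
```

If the deltas were random noise from genuine corrections, one would expect them to scatter around zero rather than tilt the whole series.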



551 Comments
Alan the Brit
November 14, 2008 9:05 am

It certainly changes the “best-fit” curve for the temperature trend, upwards naturally of course!
I note that in the latter graph the red temperature line is omitted at 1880 (which would bring the temperature down) and does not appear to extend past 2005, so why it is labelled US temps to 2008 I cannot think!

RW
November 14, 2008 9:07 am

These adjustments add mostly a positive bias – what you mean is, these adjustments mostly remove a negative bias.
As for why the numbers might change years after the event – one thing that happened after 1999 was improvements in how to correct for urbanisation effects. Correcting urban stations in a different and hopefully better way obviously changes the data. Would you rather they didn’t seek to improve the data, and instead never re-examined it and just left it frozen in a potentially flawed state?

Bill Marsh
November 14, 2008 9:12 am

RW,
So UHI provided a negative temperature bias? I would think it would have provided a positive bias to past urban stations.

joshv
November 14, 2008 9:14 am

You will notice that the change moves the 1990’s peak annual temperature from well below the 1930’s peak, to either equal with, or just above the 1930’s peak. Fascinating.

crosspatch
November 14, 2008 9:15 am

One explanation I have heard is that many stations lack a value for one or more months. These values are filled by using an average over time. This average is recalculated every month. So the temperature of a station (or nearby stations) reported this month can change the average value that is used to “fill” missing values in the past.
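The mechanism crosspatch describes can be sketched in a few lines (a toy version of the idea, not GISS's actual infilling code): fill a missing month with the station's long-term mean, and recompute that mean whenever new data arrives.

```python
# Toy illustration of the gap-filling behaviour described above: a missing
# month (None) is filled with the mean of all observed values, and that
# mean is recomputed every time a new month is appended.

def filled_series(observed):
    """Replace None gaps with the mean of the observed values."""
    known = [v for v in observed if v is not None]
    mean = sum(known) / len(known)
    return [v if v is not None else round(mean, 2) for v in observed]

history = [10.0, None, 11.0]      # one missing month
print(filled_series(history))     # gap filled with the mean of 10 and 11

history.append(14.0)              # a new, warmer month arrives
print(filled_series(history))     # the SAME past gap is now filled warmer
```

Note that the second call changes a value in the past purely because of a reading in the present, which is exactly the retroactive behaviour under discussion.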

TerryBixler
November 14, 2008 9:16 am

My bifocals need recalibration. The fact that it is not known why the numbers have been changed is the primary concern. The who, what, when, why and where of this rant all comes down to standards and version control. Without standards for data and program archiving, this flippant disregard for science will continue. The cost is immense.

Tallbloke
November 14, 2008 9:21 am

“Would you rather they didn’t seek to improve(sic) the data”
Yes.
“Correct for urbanisation effects”
How have recent temps in the graph got hotter, and ones from 1900 got colder if that’s the case?
Look at the data!!

Leon Brozyna
November 14, 2008 9:23 am

[snip] no ad homs please – Anthony

Bill in Vigo
November 14, 2008 9:26 am

From what little my pea-sized brain can understand, most of the adjustments for UHI seem, more often than not, not to bring the urban stations down to any useful extent, but to adjust the rural stations upward to match the urban set. I don’t know how you can adjust temps from 50+ years ago with any accuracy or dependability in an unbiased way, especially since a considerable number of the old rural stations have now been affected by urbanization. There are too many problems with the surface stations to give them a pass at this time. If the stations in the U.S. are supposed to be the best in the world, it makes me wonder about the rest of the world.
Bottom line is that I don’t trust the method that GISS or NOAA uses to adjust for urbanization. Therefore I shall wait for better science before I make up my mind. It would be a shame to destroy the economy of the world to cure a non-problem.
Bill Derryberry

November 14, 2008 9:28 am

Anthony,
I have just written a summary of what you call the fiasco here.
It ends with 6 questions regarding GISS, to which I have just added yours.
1. How many other errors, less obvious to the casual observer, are there in the GISS data?
2. Why does GISS not carry out any checks on the data before publishing it?
3. Where and how did these errors arise? As you rightly say, blaming NOAA is no excuse.
4. Why are there gaps in the GISS data, when the “missing” data is readily available?
5. Why has GISS’s number of stations used dropped so dramatically in recent years?
6. Why are so many of the remaining stations at airports? (Three quarters of the GISS stations in Australia are at airports.)
7. Why does GISS keep adjusting past temperatures, as shown here?

Kate
November 14, 2008 9:33 am

The GISS temperature record is a conflict of interest.
How could a global warming advocate be in charge of a temperature record that is used by various organizations to set policy? This is the equivalent of hiring Donald Trump to run a gambling addiction center. What’s really needed is an independent body keeping an expanded, minimally adjusted surface temperature record using only quality stations that meet rigorous standards, one that is thoroughly gone over with a fine-tooth comb to find any inconsistencies, such as those that have been occurring over at GISS, not just recently but in the past as well, and that is transparent to the public.
The GISS simply does not meet these standards and should be discarded, overhauled, or have new folks put in charge.
Let’s go over why such an organization is needed.
1. Independent body.
There needs to be a surface temperature record kept by an organization that publishes the data without little caveats such as “2007 would have been the warmest year on record if not for (that damned) La Niña” or “We expect 2007 to be the warmest year on record due to the ongoing El Niño event” (remember that one from the HadCRU guy at the beginning of January last year?).
2. Minimally adjusted stations
This one should be easy. What the temperature says is what the temperature is. With all the hoopla continuing about the latest GISS October 2008 Siberia gaffe, one can compare the temperatures entered into the GISS analysis against those from Weather Underground and notice that GISS repeatedly adds 1–2°C to the monthly averages for many stations used in their analysis. Of course, this also ties into #3, which is using data from quality stations that meet rigorous standards. In fact, one can eliminate #2 by using data from these quality-controlled stations instead of ones contaminated by UHI effects or by being placed near buildings, A/C units, parking lots, or a grove of young trees that will eventually grow to shade the temperature sensor.
4. Gone over to find any inconsistencies.
Obviously this is a problem for the folks over there at GISS.
Excerpt from Gavin Schmidt on RC in response to a comment
“Current staffing from the GISTEMP analysis is about 0.25 FTE on an annualised basis (i’d estimate – it is not a specifically funded GISS activity). To be able to check every station individually (rather than using an automated system), compare data to the weather underground site for every month, redo the averaging from the daily numbers to the monthly to double check NOAA’s work etc., to rewrite the code to make it more accessible, we would need maybe a half a dozen people working on this. With overhead, salary+fringe, that’s over $500,000 a year extra”
It would appear to me that they are not only underfunded but also understaffed. Apparently GISS isn’t equipped to handle the job properly. So why is an underfunded, understaffed organization put in charge of publishing one of the most important record-keeping endeavours in western society today? With all that hangs on global warming (taxes, policy, the future of the economy), wouldn’t we want one of the more prestigious (in the eyes of policymakers, environmentalists, governing bodies, etc.) keepers of the global temperature record to be an efficient and well-organized group of independent scientists rather than a mistake-prone, non-transparent outfit run by an advocate?
5. Expanded network of stations.
Quoting Gavin Schmidt once again
“There were 90 stations for which October numbers equalled September numbers in the corrupted GHCN file for 2008 (out of 908).”
Only 908 stations were used for the October 2008 GISS analysis, whereas some 40 years ago double that number of stations were used to derive an average global temperature. The stations used are becoming more and more sparse, and mysteriously, certain stations are being left out of the analysis. Why is there a different number of stations used from month to month, and why do certain stations report one month but not another? This would qualify the GISS dataset as non-homogeneous and therefore worthless. But it will continue to be trumpeted by alarmists as the most often cited dataset of the global temperature record, despite all the past errors found, all of which artificially inflated temperatures.
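Incidentally, the specific error Schmidt describes (October values equal to September values) is cheap to screen for automatically. A minimal sketch of such a check, with made-up station data for illustration:

```python
# Minimal QC sketch: flag stations whose newest monthly value exactly
# repeats the previous month -- the signature of the September/October
# carry-over error. Station names and values here are invented.

def flag_repeats(stations):
    """Return names of stations whose last two monthly values are identical."""
    return [name for name, months in stations.items()
            if len(months) >= 2 and months[-1] == months[-2]]

stations = {
    "Verkhoyansk": [-8.1, -15.2, -15.2],  # last month repeats: suspect
    "Omsk":        [-2.0, -6.5, -9.8],    # normal seasonal cooling
}
print(flag_repeats(stations))
```

A real check would of course allow for the (rare) legitimate case of two identical consecutive monthly means, but as a first-pass alarm it costs essentially nothing to run.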

Tim Clark
November 14, 2008 9:41 am

crosspatch (09:15:04) :
One explanation I have heard is that many stations lack a value for one or more months. These values are filled by using an average over time. This average is recalculated every month. So the temperature of a station (or nearby stations) reported this month can change the average value that is used to “fill” missing values in the past.
Therefore, if we are allegedly warming, and those warmer values are used to fill the past missing data, then the previously missing values would be nudged upward by this “adjustment”.
Better take a second gander at the graphs.

stan
November 14, 2008 9:45 am

Does anyone know if the original, unadjusted, uncorrupted temperatures for all stations over the years are still available? I believe it is very likely that all the garbage Hansen tosses into the soup will eventually be shown to be seriously flawed. Is there a record anywhere of the temperatures actually recorded at each site?

Steven Goddard
November 14, 2008 9:49 am

I wrote a piece on this topic a few months ago for The Register. It appears that the period from 1930 onwards was transformed by a counter-clockwise rotation, as can be seen in the video below. That creates the effect of older temperatures becoming colder, and younger temperatures becoming warmer.
http://www.theregister.co.uk/2008/05/02/a_tale_of_two_thermometers/
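A "rotation" of a time series is equivalent to tilting it about a pivot year: points before the pivot are adjusted down, points after it up. A toy illustration of that transformation (the pivot year and slope here are invented for the example, not fitted to any GISS data):

```python
# Toy sketch of a "tilt" adjustment: each value is shifted in proportion
# to its distance from a pivot year. Slope and pivot are illustrative only.

def tilt(series, pivot_year, deg_per_decade):
    """Shift each anomaly by deg_per_decade per decade from the pivot."""
    return {yr: round(t + (yr - pivot_year) / 10.0 * deg_per_decade, 2)
            for yr, t in series.items()}

anoms = {1930: 1.0, 1970: 0.0, 2000: 0.5}
print(tilt(anoms, 1970, 0.05))  # 1930 comes out cooler, 2000 warmer
```

The net effect is exactly what the animation shows: older values drift down, recent values drift up, while the pivot region barely moves.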

A few months ago someone on Climate Audit suggested Hansen’s law of temperature conservation: “If the present refuses to get warmer, then the past must become colder.”

Robert Wood
November 14, 2008 9:49 am

It’s man-made climate change, I tell yer!
Except that it is changing retroactively 🙂

November 14, 2008 10:07 am

I think it perfectly obvious why the old data changed:
We know that the earth is in thermal equilibrium,
and since Hansen’s old temperatures keep going down,
his newer temperatures HAVE to keep going up.
It’s for the same reason that northern Canadian and Alaskan temperatures show cooling: all of Hansen’s (unavailable, unaudited, un-maintained, un-standardized, and inconsistent) Siberian thermometers keep going up.

Mike Bryant
November 14, 2008 10:12 am

Alan the Brit,
True, on the 1999 example the red line shows that the 1880 anomaly was zero or very near to zero. The 2008 version simply does not show it. Of course the many adjustments have changed the trend, but still the 2005 graphic shows a mere 0.1 or 0.2 difference from the high in the ’30s. Also, as you mentioned, the 2008 graph is not current, which would put us within a whisker of 1880. Can someone please explain to me again why this warming is so catastrophic?
“Oh what a tangled web we weave, When first we practice to deceive”. Sir Walter Scott
Is it a deception, or a carefully prepared scientific reconstruction of temperatures for the edification and benefit of mankind?
“All other things being equal, the simplest solution is the best.” Occam’s Razor

November 14, 2008 10:13 am

Anthony
This ‘anomaly of anomalies’ has been known for some time. It is part of the continuous updating of historic temperature data which is the hallmark of James Hansen’s strivings.
(Another, quicker-blinking version appeared some years ago, in 2005.)
But it gets even better. Look here:
An even earlier version of that graph was published by Hansen in his 1999 paper on GISS- temperatures, see Fig 6 p37.
I suspect that the reason for publishing this was that 1998 was so warm (due to the major El Niño event, although they state the opposite).
1998 is when US temperatures reached the highest level since 1934 (but still trailing by ~0.6°C).
In the 1999 version of your graphs, this distance had shrunk to about 0.25°.
However, this wasn’t quite good enough for Hansen et al. Shortly after, a new (recalculated) version appeared in 2001, where 1998 had essentially caught up with 1934 (at least in the US). This is the second version of the same US temperature data shown in the blinking figures. In this paper Hansen et al. also purport to present a rationale for adjusting later temperatures up and earlier ones down.
There, essentially, you can find the official answer to your questions.
1998 had essentially caught up the entire 0.6° it had been trailing behind 1934, solely through Hansen’s ‘updating’ of the temperature record!
In comparison, the entire observed global warming trend over the last century was about 0.6°, regardless of how much of it might be attributed to AGW.

John M
November 14, 2008 10:21 am

Steven Goddard (09:49:06) :

It appears that the period from 1930 onwards was transformed by a counter-clockwise rotation

You know, sometimes it all snaps into clear focus.
The imaginary number “i” is a rotation operator!
Temperature data, meet imaginary numbers!

November 14, 2008 10:29 am

Or as Gavin Schmidt puts it in his answer (to comment 174):
The GISTEMP analysis is not … the ‘historical record’
(my emphasis)

November 14, 2008 10:30 am

Interesting gif presentation. Does the gerrymandering of the GISS data get Hansen the “smoking gun” he was looking for in Hansen et al, 1999 (section 11.1.3)?
See pdf file. http://pubs.giss.nasa.gov/abstracts/1999/Hansen_etal.html

November 14, 2008 10:30 am

Anthony: Does the GISS change reflect the differences between the USHCN versions 1 and 2? I believe the switch was made in 2007.
It appears to me, based on the dates of the references, that the USHCN (Version 1) was “corrected” per the following prior to 1999.
http://www.ncdc.noaa.gov/oa/climate/research/ushcn/ushcn.html
It also appears that the USHCN (Version 2) was introduced after 1999, but again, there’s no clear date listed on the following webpage.
http://www.ncdc.noaa.gov/oa/climate/research/ushcn/
This one helps clarify when the change took place.
http://www.ncdc.noaa.gov/oa/climate/research/ushcn/hcntmptrends.php
But are these the changes reflected by your graph?

Hasse@Norway
November 14, 2008 10:37 am

The purpose of GISS, in my opinion, is propaganda. It’s quasi-scientific and so complicated that it’s hard for the mainstream media to assess the quality of the claims being made. It gets the job done for the alarmist cause, though…

Braden Sneath
November 14, 2008 10:48 am

A quite simple explanation, I suspect. “The end justifies the means”.

November 14, 2008 10:55 am

I’ve characterized this phenomenon (and it is surely a natural phenomenon unrelated to any human intervention, but one not realized until recently) as Temporal Teleconnection with The Past. It’s a quantum-effect kind of thingy.
I’m working on a Properly Peer-Reviewed Paper for an Approved Climatologists-Type Journal having an extremely high Impact Factor. That’s how Science Works and that’s what Scientists do.
I’m also hoping that a bunch of Not-Certified Climatologists don’t find a problem with the concept before I get it published. I didn’t have time to check my results.
The Owner may snip at will, of course.
