Questions on the evolution of the GISS temperature product

Blink comparator of GISS USA temperature anomaly – h/t to Zapruder

The last time I checked, the earth does not retroactively change its near surface temperature.

True, all data sets go through some corrections, such as the recent change RSS made to improve the quality of the satellite record, which consists of a number of satellites spliced together. However, in the case of the near surface temperature record, we have many long-period stations that span the majority of the time period shown above, and they have already been adjusted for TOBS, SHAP, FILNET etc. by NOAA prior to being distributed for use by organizations like GISS. These adjustments add mostly a positive bias.

In the recent data replication fiasco, GISS blamed NOAA for providing flawed data rather than owning its own failure to catch the September data repeated into October. In that case they are correct that the issue arose with NOAA, but in business, when you are the supplier of a product, most savvy businessmen take a "the buck stops here" approach to correcting a product flaw rather than blaming the supplier. GISS provides a product for public consumption worldwide, so it seems to me that they should take responsibility for errors that appear in their own product.

In the case above, what could be the explanation for the product changing?


Alan the Brit

It certainly changes the “best-fit” curve line for temperature trend, upwards naturally of course!
I note that in the latter graph the red temperature line is omitted at 1880 (which would bring the trend down) and does not appear to extend past 2005, so why it is labelled US temps to 2008 I cannot think!

RW

These adjustments add mostly a positive bias – what you mean is, these adjustments mostly remove a negative bias.
As for why the numbers might change years after the event – one thing that happened after 1999 was improvements in how to correct for urbanisation effects. Correcting urban stations in a different and hopefully better way obviously changes the data. Would you rather they didn’t seek to improve the data, and instead never re-examined it and just left it frozen in a potentially flawed state?

Bill Marsh

RW,
So UHI provided a negative temperature bias? I would think it would have provided a positive bias to past urban stations.

joshv

You will notice that the change moves the 1990’s peak annual temperature from well below the 1930’s peak, to either equal with, or just above the 1930’s peak. Fascinating.

crosspatch

One explanation I have heard is that many stations lack a value for one or more months. These values are filled by using an average over time. This average is recalculated every month. So the temperature of a station (or nearby stations) reported this month can change the average value that is used to “fill” missing values in the past.
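The mechanism crosspatch describes can be sketched in a few lines of Python. This is purely illustrative (the function name and the simple long-term mean are my assumptions, not GISS's actual infilling code), but it shows why a new monthly report can change a value "filled" into the past:

```python
# Illustrative only: fill a station's missing month with the long-term
# mean of all known values for that month, recomputed as data arrive.
def fill_missing(series):
    """series: list of (year, temp) pairs; temp is None when missing."""
    known = [t for _, t in series if t is not None]
    mean = sum(known) / len(known)
    return [(year, mean if t is None else t) for year, t in series]

old = [(1930, 14.1), (1931, None), (1932, 13.9)]
print(fill_missing(old))   # 1931 gets the 1930/1932 mean, about 14.0

# One warm new year shifts the mean, and with it the value infilled
# into the 1931 gap; the "past" changes retroactively.
new = old + [(2008, 15.0)]
print(fill_missing(new))   # 1931 is now filled with a warmer value
```

If the recomputed mean drifts upward as warm years are appended, every gap it feeds drifts upward with it.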

TerryBixler

My bifocals need recalibration. The fact that it is not known why the numbers have been changed is the primary concern. The who, what, when, why and where all come down to standards and version control. Without standards for archiving data and programs, this flippant disregard for science will continue. The cost is immense.

tallbloke

“Would you rather they didn’t seek to improve(sic) the data”
Yes.
“Correct for urbanisation effects”
How have recent temps in the graph got hotter, and ones from 1900 got colder if that’s the case?
Look at the data!!

Leon Brozyna

[snip] no ad homs please – Anthony

Bill in Vigo

From what little my pea-sized brain can understand, most of the adjustments for UHI seem, more often than not, not to bring the urban stations down to any useful extent but to adjust the rural stations upward to match the urban set. I don't know how you can adjust temperatures from 50+ years ago with any accuracy or dependability in an unbiased way, especially since a considerable number of the old rural stations have now been affected by urbanization. There are too many problems with the surface stations to give them a pass at this time. If the stations in the U.S. are supposed to be the best in the world, it makes me wonder about the rest of the world.
Bottom line: I don't trust the method that GISS uses, nor the one NOAA uses, to adjust for urbanization. Therefore I shall wait for better science before I make up my mind. It would be a shame to destroy the economy of the world to cure a non-problem.
Bill Derryberry

Anthony,
I have just written a summary of what you call the fiasco here.
It ends with 6 questions regarding GISS, to which I have just added yours.
1. How many other errors, less obvious to the casual observer, are there in the GISS data?
2. Why does GISS not carry out any checks on the data before publishing it?
3. Where and how did these errors arise? As you rightly say, blaming NOAA is no excuse.
4. Why are there gaps in the GISS data, when the “missing” data is readily available?
5. Why has GISS’s number of stations used dropped so dramatically in recent years?
6. Why are so many of the remaining stations at airports? (Three quarters of the GISS stations in Australia are at airports.)
7. Why does GISS keep adjusting past temperatures, as shown here?

Kate

The GISS temperature record is a conflict of interest.
How could a global warming advocate be in charge of a temperature record that is used by various organizations to set policy? This is the equivalent of hiring Donald Trump to run a gambling addiction center. What's really needed is an independent body keeping an expanded, minimally adjusted surface temperature record using only quality stations that meet rigorous standards, one that is thoroughly gone over with a fine-tooth comb to find any inconsistencies, such as those that have been occurring over at GISS not just recently but in the past as well, and that is transparent to the public.
The GISS simply does not meet these standards and should be discarded, overhauled, or have new folks put in charge.
Let’s go over why such an organization is needed.
1. Independent body.
There needs to be a surface temperature record kept by an organization that publishes the data without little caveats such as "2007 would have been the warmest year on record if not for (that damned) La Niña" or "We expect 2007 to be the warmest year on record due to the ongoing El Niño event" (remember that one from the HadCRU guy at the beginning of January last year?).
2. Minimally adjusted stations
This one should be easy. What the thermometer says is what the temperature is. With all the hoopla continuing about the latest GISS October 2008 Siberia gaffe, one can compare the temperatures entered into the GISS analysis with those from Weather Underground and notice that GISS repeatedly adds 1-2°C to the monthly averages for many stations used in their analysis. Of course, this also ties into #3, which is using data from quality stations that meet rigorous standards. In fact, one can accomplish #2 by using data from these quality-controlled stations instead of ones contaminated either by UHI effects or by being placed near buildings, A/C units, parking lots, or a grove of young trees that will eventually grow to shade the temperature sensor.
4. Gone over to find any inconsistencies.
Obviously this is a problem for the folks over there at GISS.
Excerpt from Gavin Schmidt on RC in response to a comment
“Current staffing from the GISTEMP analysis is about 0.25 FTE on an annualised basis (i’d estimate – it is not a specifically funded GISS activity). To be able to check every station individually (rather than using an automated system), compare data to the weather underground site for every month, redo the averaging from the daily numbers to the monthly to double check NOAA’s work etc., to rewrite the code to make it more accessible, we would need maybe a half a dozen people working on this. With overhead, salary+fringe, that’s over $500,000 a year extra”
It would appear to me that they are not only underfunded but also understaffed. Apparently, GISS isn't equipped to handle the job properly. So why is an underfunded, understaffed organization put in charge of publishing one of the most important record-keeping endeavours in western society today? With all that hangs on global warming (taxes, policy, the future of the economy) wouldn't we want one of the more prestigious (in the eyes of policymakers, environmentalists, governing bodies, etc.) keepers of the global temperature record to be an efficient and well-organized group of independent scientists rather than a mistake-prone, non-transparent metric run by an advocate?
5. Expanded network of stations.
Quoting Gavin Schmidt once again
“There were 90 stations for which October numbers equalled September numbers in the corrupted GHCN file for 2008 (out of 908).”
Only 908 stations were used for the October 2008 GISS analysis, whereas some 40 years ago double that number of stations were used to derive an average global temperature. The stations used are becoming more and more sparse, and mysteriously, certain stations are being left out of the analysis. Why does the number of stations used differ from month to month, and why do certain stations report one month but not another? This would qualify the GISS dataset as non-homogeneous and therefore worthless. But it will continue to be trumpeted by alarmists as the most often cited dataset of the global temperature record, despite all the past errors found, all of which artificially inflated temperatures.

Tim Clark

crosspatch (09:15:04) :
One explanation I have heard is that many stations lack a value for one or more months. These values are filled by using an average over time. This average is recalculated every month. So the temperature of a station (or nearby stations) reported this month can change the average value that is used to “fill” missing values in the past.
Therefore, if we are allegedly warming, and those warmer values are used to fill in the past missing data, then the previous data would be skewed upward by this "adjustment".
Better take a second gander at the graphs.

stan

Does anyone know if the original, unadjusted, uncorrupted temperatures for all stations over the years are still available? I believe it is very likely that all the garbage Hansen tosses into the soup will (eventually) be shown to be seriously flawed. Is there a record anywhere of the temperatures actually recorded at each site?

Steven Goddard

I wrote a piece on this topic a few months ago for The Register. It appears that the period from 1930 onwards was transformed by a counter-clockwise rotation, as can be seen in the video below. That creates the effect of older temperatures becoming colder, and younger temperatures becoming warmer.
http://www.theregister.co.uk/2008/05/02/a_tale_of_two_thermometers/
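The "counter-clockwise rotation" described above is equivalent to adding a linear trend pivoted on a fixed year, which can be sketched in a few lines of Python. The pivot year and slope here are invented for illustration; this is not the actual GISS adjustment code:

```python
# Illustrative tilt: add a linear trend pivoted on pivot_year, so
# anomalies before the pivot are lowered and those after it are raised,
# while the value at the pivot itself is unchanged.
def tilt(anomalies, pivot_year=1965, slope_per_decade=0.05):
    return {year: a + slope_per_decade * (year - pivot_year) / 10.0
            for year, a in anomalies.items()}

series = {1930: 0.4, 1965: 0.0, 2000: 0.3}
print(tilt(series))   # 1930 lowered, 1965 unchanged, 2000 raised
```

Applied across a whole record, such a tilt reproduces the visual effect in the blink comparator: older temperatures become colder and newer ones warmer, without any station ever reporting a different reading.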

A few months ago someone on Climate Audit suggested Hansen's law of temperature conservation: "If the present refuses to get warmer, then the past must become colder."

Robert Wood

It’s man-made climate change, I tell yer!
Except that it is changing retroactively 🙂

I think it perfectly obvious why the old data changed:
We know that the earth is in thermal equilibrium,
and since Hansen’s old temperatures keep going down,
so his newer temperatures HAVE to keep going up.
It’s the same reason that North Canadian and Alaskan temperatures show cooler temperatures: all of Hansen’s (unavailable, unaudited, un-maintained, un-standardized, and inconsistent) Siberian thermometers keep going up.

Mike Bryant

Alan the Brit,
True, on the 1999 example the red line shows that the 1880 anomaly was zero or very near zero. The 2008 version simply does not show it. Of course the many adjustments have changed the trend, but still the 2005 graphic shows a mere 0.1 or 0.2 difference from the high in the '30s. Also, as you mentioned, the 2008 version is not current, which would put us within a whisker of 1880. Can someone please explain to me again why this warming is so catastrophic?
“Oh what a tangled web we weave, When first we practice to deceive”. Sir Walter Scott
Is it a deception, or a carefully prepared scientific reconstruction of temperatures for the edification and benefit of mankind?
“All other things being equal, the simplest solution is the best.” Occam’s Razor

Anthony
This 'anomaly of anomalies' has been known for some time. It is part of the continuous updating of historic temperature data which is the hallmark of James Hansen's strivings.
(Another quicker blinking version already appeared some years ago, in 2005)
But it gets even better. Look here:
An even earlier version of that graph was published by Hansen in his 1999 paper on GISS temperatures; see Fig 6, p. 37.
I suspect that the reason for publishing this was that 1998 was so warm (due to the major El Niño event), although they state the opposite.
1998 is when US temperatures reached the highest level since 1934 (but still trailing by ~0.6°C).
In the 1999 version of your graphs, this distance had shrunk to about 0.25°.
However, this wasn't quite good enough for Hansen et al. Shortly after, a new (recalculated) version appeared in 2001, where 1998 had essentially caught up with 1934 (at least in the US). This is the second version of the same US temperature data, shown in the blinking figures. In this paper Hansen et al also purport to present a rationale for adjusting later temperatures up and earlier ones down.
There, essentially, you can find the official answer to your questions.
1998 had essentially caught up the entire 0.6° by which it had earlier trailed 1934, solely through Hansen's 'updating' of the temperature record!
In comparison, the entire observed global warming trend over the last century was about 0.6°, regardless of how much of it might be attributed to AGW.

John M

Steven Goddard (09:49:06) :

It appears that the period from 1930 onwards was transformed by a counter-clockwise rotation

You know, sometimes it all snaps into clear focus.
The imaginary number “i” is a rotation operator!
Temperature data, meet imaginary numbers!

Or as Gavin Schmidt puts it in his answer (to comment 174):
The GISTEMP analysis is not … the ‘historical record’
(my emphasis)

Interesting gif presentation. Does the gerrymandering of the GISS data get Hansen the “smoking gun” he was looking for in Hansen et al, 1999 (section 11.1.3)?
See pdf file. http://pubs.giss.nasa.gov/abstracts/1999/Hansen_etal.html

Anthony: Does the GISS change reflect the differences between the USHCN versions 1 and 2? I believe the switch was made in 2007.
It appears to me, based on the dates of the references, that the USHCN (Version 1) was “corrected” per the following prior to 1999.
http://www.ncdc.noaa.gov/oa/climate/research/ushcn/ushcn.html
It also appears that the USHCN (Version 2) was introduced after 1999, but again, there’s no clear date listed on the following webpage.
http://www.ncdc.noaa.gov/oa/climate/research/ushcn/
This one helps clarify when the change took place.
http://www.ncdc.noaa.gov/oa/climate/research/ushcn/hcntmptrends.php
But are these the changes reflected by your graph?

Hasse@Norway

The purpose of GISS, in my opinion, is propaganda. It's quasi-scientific and so complicated that it's hard for the mainstream media to assess the quality of the claims being made. It gets the job done for the alarmist cause, though…

Braden Sneath

A quite simple explanation, I suspect. “The end justifies the means”.

I've characterized this phenomenon, and it is surely a natural phenomenon unrelated to any human intervention, but one not realized until recently, as Temporal Teleconnection with The Past. It's a quantum-effect kind of thingy.
I’m working on a Properly Peer-Reviewed Paper for an Approved Climatologists-Type Journal having an extremely high Impact Factor. That’s how Science Works and that’s what Scientists do.
I’m also hoping that a bunch of Not-Certified Climatologists don’t find a problem with the concept before I get it published. I didn’t have time to check my results.
The Owner may snip at will, of course.

RW

Bill Marsh – if, before 1999, urbanisation effects were over-corrected, an improved algorithm would reduce the correction. Your problem seems to be that you are so blinkered and prejudiced that you can’t possibly think about the issue in a scientific, objective way.
Bill in Vigo – "From what little my pea-sized brain can understand, most of the adjustments for UHI seem, more often than not, not to bring the urban stations down to any useful extent but to adjust the rural stations upward to match the urban set" – your candour about the size of your brain is admirable. Urban adjustments work in precisely the opposite way to what you described.

james griffin

ANTHONY, BETTER CHECK THIS ONE, ESP LAST PARA. ~ EVAN
We all know it has cooled and has done for quite a while.
For the scientific community in more normal times it would be just a matter of noting changes up or down and what, if anything, they were telling us long term… if at all.
However, these graphs have nothing to do with the fate of the world.
By closing the argument on the theory of AGW from the start and going as far as calling sceptics “Holocaust deniers” the “Warmers” have raised the stakes against themselves.
Every bit of data is in fact about the long term future of all leading AGW scientists, their lifestyle and importance…and of course the excuse for politicians to use AGW as a means for taxation and for the various stock exchanges to trade in carbon offsets.
How on earth have we got to this stage?
Well… using climate models that include the positives of the argument and ignore the negatives is a good start, but there's so much more.
Do not expect the Hansens and Gores of this world to admit they are wrong; early retirement or a low profile will be the order of the day.
PS
[snip] not relevant to this discussion – please don’t post on this topic contained in the [snip] again – Anthony

Mongo

Why does this look like a page from George Orwell's "1984"? Revisionism: rewriting history to support the present policy. This makes me ill.

Phil Johnson

Braden, I think the quote that may be most relevant here is:
He who controls the present controls the past. He who controls the past controls the future.

barbee butts

What troubles me is not that errors occur, it’s the possibility that erroneous data may be used to determine future economic and public policy.

Pierre Gosselin

Mongo,
Welcome to the New America.
Hang on for a lot more in the months ahead.

janama

What I find amazing is that the chart is based around +/- 1 degree C. How many mercury thermometers are accurate, or at least readable, to less than 1 degree C?

M White

Climate change ‘to halt ice age’
http://news.bbc.co.uk/1/hi/sci/tech/7722300.stm
A future post perhaps.

Tilo Reber

“if, before 1999, urbanisation effects were over-corrected, an improved algorithm would reduce the correction.”
They certainly were overcorrected if you need to keep the alarmism going. Since every adjustment seems to result in more warming, we are correctly skeptical. Do you have any evidence that urbanization effects were overcorrected before 1999, or are you simply making it up to let Hansen off the hook?

tarpon

Ahh yes, but can they change the sun?
Two thumbs up Phil … You nailed it.

tty

RW
Take a look at the data. Your hypothesis that UHI was overcorrected before 1999 simply doesn't hold water. What you need to postulate is that in 1999 it was discovered that UHI was:
Undercorrected for 1880-1900
Overcorrected for 1900-1968
Increasingly undercorrected after 1968
Sounds pretty plausible, yes?

barbee butts

Looks like they are gradually phasing out the infamous ‘hockey stick’?
Sneeeeeky. 😉

Mike Bryant

Record high temperatures by continent:
http://icecap.us/images/uploads/Continent.jpg
Interestingly, none of these were in the ’30s, and none of them were more recent than 1974.

As I suspected. Watching the blink comparator there was an obvious “kink” in the adjusted data circa 1964. Then Jonas N. provides a link to an early Hansen paper where, sure enough, there is a “kink” in the adjustment in 1964. See Fig 3. p35. Indefensible.

Peter

RW: “Your problem seems to be that you are so blinkered and prejudiced that you can’t possibly think about the issue in a scientific, objective way.”
Then please explain to us in a scientific, objective way how it happened that they apparently over-corrected for UHI from 1970 onwards as well as before 1900, but apparently under-corrected between 1900 and 1970?

Ed Scott

“Change” for the Worse
By Alan Caruba Friday, November 14, 2008
Previously I have written that the global warming hoax was essentially dead and that the many Green organizations advocating all kinds of programs to wreck the nation’s economy were “desperate.”
I was wrong.
The Sierra Club, the Friends of the Environment, and the countless other Green organizations are euphoric and they have reason to be.
The election of Barack Obama and a Democrat controlled Congress has put the Greens in the driver’s seat and we face at least four and possibly eight years of executive orders, legislation, and regulation based on a scientifically baseless lie that will introduce Americans to what life is like in Third World nations where electricity is both costly and unpredictable.
http://canadafreepress.com/index.php/article/6277

Hans Kelp

People can say whatever they want, but in my opinion the keyword in all this is "fabrication". I have been criticised for my position on this, but I maintain that with all these "brilliant" and "professional" scientists it has to be next to impossible to screw things up as badly as they do by accident. Given a declared goal of proving global warming and AGW, one can only become hugely skeptical about what's going on when temperature readings from earlier times are consistently corrected downwards while the temperatures of the last few decades are consistently corrected upwards. It just doesn't match up.

Mark

If you watch it, you can see that the updated graph tends to flatten out the older data while the newer data gets a steeper slope.
Frigging bogus if you ask me. Maybe they are trying to help Mann's hockey stick II.

Rod Smith

janama (11:48:30) :
“What I find amazing is that the chart is based around +/- 1 degree C. How many mercury thermometers are accurate, or at least readable, to less than 1 degree C?”
As an ancient weatherman, I can truthfully state that every official mercury thermometer I used was quite easily read to the nearest tenth of a degree with no more error induced than plus or minus one-tenth of a degree.

Spam

For me, the message is somewhat diluted by having the two graphs show different data: the 2008 version drops the 5-year trend for the leftmost section of the graph and adds in the extra data post-2000.
This effectively creates a strawman for people trying to distract the discussion from the changes in the data. Any chance that someone could redo the graph using only the same data periods?
REPLY: Unfortunately the 1999 data in raw form is not available to redo this. But if anyone knows of it, please advise. – Anthony

hunter

GISS is corrupt.
Apparently by design.

Paul Shanahan

M White (11:51:40) :Climate change ‘to halt ice age’
http://news.bbc.co.uk/1/hi/sci/tech/7722300.stm
A future post perhaps.
I have to say, the second paragraph stands out for me.
“Based on geological history, the Earth would be expected to enter a new ice age in 10,000 to 100,000 years. ”
Let's assume that the 10,000 years is correct; I think it's a little far-fetched to even assume man's influence on the planet then. I feel this is another nonsense report from the BBC.

Wondering Aloud

So according to RW, since a whole string of unsubstantiated, undocumented and darn unlikely conjectures explaining the historic changes might be true, we should just accept these changes. We should not question them or ask how they came about; we should just BELIEVE!
Heck, they might be true, and I might win Powerball tomorrow. The second of the two is the higher probability, though I haven't bought a ticket.

Phil

Why is all of this so important? The Massachusetts v. EPA Supreme Court decision in April 2007 apparently held that CO2 was an "air pollutant." Using that decision, the Sierra Club successfully appealed a permit granted by EPA on the basis that the EPA failed to apply "BACT" or best available control technology to limit CO2 emissions from a second waste-coal-fired electric generating unit at Deseret Power Electric Cooperative's existing Bonanza Power Plant near Bonanza, Utah. Permits for over 100 new coal-fired plants and expansion of refineries now appear to be in legal limbo, pending a decision by the new administration over what the BACT would be. The EPA appeals board, in a historic understatement, said: "In remanding this permit to the Region for reconsideration of its conclusions regarding application of BACT to limit CO2 emissions, the Board recognizes that this is an issue of national scope that has implications far beyond this individual permitting proceeding." http://yosemite.epa.gov/oa/EAB_Web_Docket.nsf/Recent~Additions/C8C5985967D8096E85257500006811A7/$File/Remand...39.pdf
IIRC, coal supplies about 50% of our electricity and no new refineries have been built in decades – only existing ones have been expanded. Since CO2 is now apparently classified as an actual “air pollutant” AND this decision may now serve as a precedent requiring control of CO2 emissions, ANY future emissions of CO2 will probably now be required to be regulated. No mention is made of how this will affect future political campaigns or oratory in Congress.

Araucan

What do TOBS, SHAP and FILNET mean?
Thanks