By Steve Goddard
h/t to reader “Phil.”, who led me to this discovery.
In a previous article, I discussed how UAH, RSS and HadCrut show 1998 to be the hottest year, while GISS shows 2010 and 2005 to be hotter.
But it wasn’t always like that. GISS used to show 1998 as 0.64 anomaly, which is higher than their current 2005 record of 0.61.
You can see this in Hansen’s graph below, which is dated August 25, 1999.
But something “interesting” has happened to 1998 since then. It was given a demotion by GISS from 0.64 to 0.57.
http://data.giss.nasa.gov/gistemp/graphs/Fig.A2.lrg.gif
The video below shows the changes.
Note that not only was 1998 demoted, but so were many other years since 1975 – the start of Tamino’s “modern warming period.” By demoting 1998, they are now able to show a continuous warming trend from 1975 to the present – which RSS, UAH and HadCrut do not show.
Now, here is the real kicker. The graph below appends the post-2000 portion of the current GISS graph to the August 25, 1999 GISS graph. Warming ended in 1998, just as UAH, RSS and HadCrut show.
The image below superimposes HadCrut on the image above. Note that without the post-1999 gymnastics, GISS and HadCrut match quite closely, with warming ending in 1998.
Conclusion: GISS recently modified their pre-2000 historical data, and is now inconsistent with other temperature sets. GISS data now shows a steady warming from 1975-2010, which other data sets do not show. Had GISS not modified their historical data, they would still be consistent with other data sets and would not show warming post-1998. I’ll leave it to the readers to interpret further.
————————————————————————————————————-
BTW – I know that you can download some of the GISS code and data, and somebody checked it out and said that they couldn’t find any problems with it. No need to post that again.
If you want to prove a trend, then make sure there is a trend by modifying the past. Bingo! The good old hockey stick rears its ugly head. Same with CO2 levels. Alarmists have shown that it remained at a steady 250ppmv until the 20th century, when it starts to climb. They fail to look at past data sets which show CO2 levels of up to 500ppmv in the late 1800s. They also forget that plants require CO2 levels of at least 200ppmv to survive, and without CO2 this planet would be a barren wasteland with no life of any sort.
I have just been looking at the apparent best-fit segment slopes for the decimal date interval of 2003.419 through 2010.542 from various sources.
GISS land station data only: 0.018 deg C / yr.
GISS land-ocean combined: 0.008 deg C / yr.
NOAA NCDC land-ocean: 0.000 deg C / yr.
HadCrut3 land-ocean: -0.007 deg C / yr. (cooling)
It seems like you get your choice of fresh or stewed cherries. All of these appear to have slopes near 0.027 deg C / yr from 1994 to 2003. Depending on the criteria you choose, your mileage may vary.
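A best-fit segment slope like the ones quoted above is just an ordinary least-squares trend over the stated decimal-date window. A minimal sketch follows; the anomaly series here is a made-up placeholder, not actual GISS, NCDC or HadCrut data, so only the method is illustrated.

```python
# Least-squares trend (deg C per year) over a decimal-date window.
# The anomaly values below are illustrative placeholders, NOT real data.

def trend_deg_c_per_year(dates, anomalies):
    """Ordinary least-squares slope of anomaly vs. decimal date."""
    n = len(dates)
    mean_x = sum(dates) / n
    mean_y = sum(anomalies) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(dates, anomalies))
    den = sum((x - mean_x) ** 2 for x in dates)
    return num / den

# Example: a synthetic, perfectly linear 0.018 deg C / yr series
# spanning roughly the 2003.419-2010.542 interval discussed above.
dates = [2003.419 + i / 12 for i in range(86)]
anoms = [0.40 + 0.018 * (d - dates[0]) for d in dates]
print(round(trend_deg_c_per_year(dates, anoms), 3))  # 0.018
```

Running the same fit over different windows (1994-2003 vs. 2003-2010) is exactly the cherry-choice being described: the slope you report depends entirely on the interval you pick.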
Please explain this.
http://nsidc.org/data/seaice_index/images/daily_images/S_timeseries.png
And please read this.
“Twentieth century bipolar seesaw of the Arctic and Antarctic surface air temperatures”
http://www.agu.org/pubs/crossref/2010/2010GL042793.shtml
Sorry if ‘off-topic’, but there have been references above to Roger Harrabin’s BBC R4 piece.
The guy is rising in my estimation, and I would say that his piece today was fair and balanced. Prior to Climategate I had him marked as an AGW ideologue; since then I reckon that he has risen to the standard we Brits expect from the Beeb.
Does this matter – the performance of a non-scientist broadcaster? You bet! The key to neutering the AGW scare story is public opinion. The Hansens and Watsons of this world will never concede defeat, even if there were five consecutive winters with New York immobilised by blizzards: they’ll say ‘temporary reprieve’ until they die. No, when the ordinary Joe from Missouri bursts out laughing at any mention of Global Warming, that’s when AGW can be declared dead; that’s when the IPCC will implode.
“Conclusion: GISS recently modified their pre-2000 historical data, and is now inconsistent with other temperature sets. GISS data now shows a steady warming from 1975-2010, which other data sets do not show”
Wrong.
James Sexton says:
They have an algorithm that changes historical data each time it’s run? What good is it? If you don’t like the word create, how about over-exaggerate. Well, only for present time. Apparently the algorithm reverses course as it goes back in history. Nice. What I believe is unlikely is that this is anything but intentional. I wonder how often they run this “algorithm”? Every time data gets 10 years older? How is it possible to determine an anomaly for global temps if the historic temps continually change?
They run it once per month.
Perhaps I am being a little too nuanced, but they are not changing existing historical data. It is their estimates for missing data that change. However, because so many data points are missing from the record (like holes in Swiss cheese), their estimation strongly influences the global summaries they publish every month.
And yes, because their previous results change over time, it certainly does cause one to question the value of the results.
John Goetz
The algorithm you describe would seem to lend itself to producing almost any desired result.
Give it a rest. UAH had an error in their calculations; they fixed it and published that they had fixed an error…unlike most of the researchers on the other side of the debate. I find it trite and self-serving to bring that up as some sort of proof that UAH is not doing good work. What that tells me is that you are just reaching when you have to bring that up.
As an example, Von Braun and his team blew up quite a few rockets before they got it right. By your reckoning, they would be forever branded by their failures instead of the many successes and general advancement of the discipline. You know, that is what happens when people actually do stuff: they make mistakes, learn, improve and keep on going – while a REMF like yourself casts stones without any skin in the game.
[snip – valid comment but a false email address – see policy page]
With no real evidence except the various apparent global temperature anomaly data differences, all that can be said is that these inconsistencies just serve to indicate the basic random ‘fuzziness’ associated with the information.
I stopped reading when I came to this:
Steven Mosher says:
August 29, 2010 at 2:49 pm
but not a single one of those issues is important to the real debate about the uncertainties of global warming. As new data comes into the algorithm the past ESTIMATE will change. This is a consequence of that method.
What kind of BS is this? If an algorithm, which is man-made and written by these charlatans, keeps showing different figures for past historical climate data trends, it is a fraudulent and unreliable piece of computer program that is designed to cheat. What kind of stupid justification is being given for this by some people here? Anyone who thinks this is no big deal and is very normal should have their head examined.
What the hell is wrong with people who try to justify a computer program “estimating” past facts? How the hell is that acceptable? A past fact is a past fact, and any estimation of that which changes reality is a deliberate act of fraud.
Steven Goddard’s conclusion and thrust of this post was very clear that GISS deliberately altered past data by using dubious algorithms. People who argue against that conclusion are far removed from reality.
Couldn’t this just be those who might know about such things removing earlier accentuations in the temps, so that they don’t give a false impression of cooling in more recent times?
kadaka (KD Knoebel) says:
August 29, 2010 at 4:00 pm
Ah, so your in-depth research has concluded he is using the unicode smiley (ctrl-shift-u together then 263a then enter for my Debian Linux setup) versus the ANSI smiley (alt with the 1 on the numeric keypad for a Windoze box)? How did you determine the difference?
There is no ANSI smiley face – ANSI (ISO 8859) is an 8-bit character encoding standard. Don’t get confused by the many ways different computers/software use to enter characters – you can easily View Source for web pages and see the underlying codes in any hex editor.
The smiley is a 16-bit Unicode character – as is his evil twin: ☻
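For anyone who would rather check the codepoints than argue encodings: both smileys sit in the Unicode Basic Multilingual Plane, so they need 16 bits and cannot appear in any 8-bit ISO 8859 table (which tops out at 0xFF). A quick sketch:

```python
# U+263A WHITE SMILING FACE and U+263B BLACK SMILING FACE are
# Basic Multilingual Plane codepoints -- beyond the 0x00-0xFF range
# of any 8-bit ISO 8859 character set.
import unicodedata

for ch in ("\u263a", "\u263b"):
    cp = ord(ch)
    print(f"U+{cp:04X} {unicodedata.name(ch)} fits_in_8_bits={cp <= 0xFF}")
```

Viewing a page's source in a hex editor, as suggested above, shows the same thing: the byte sequence for these characters is a multi-byte UTF-8 (or UTF-16) encoding, not a single ANSI byte.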
James Sexton:
The name of the game is higher highs. It’s about the PR, not the science. Without higher highs, the bull market in AGW is dead.
Think about this for a moment. If you believe in AGW and that CO2 is the main driver of the increasing anomalies, and if your belief is unshakable and supported by the vast majority of your colleagues, you are going to look for ways to adjust the methods of analysis so that the results fit the science, especially if temps look like they aren’t going up enough. I mean the temps have to be going up. They just have to be. And temps had to have been cooler when there was less CO2 (which also implies that that cooling the past is totally rational and justified). So even if I do a little fudging here and there — such as creating values where none exist and adjusting these non-existent values periodically and such as smearing a single value over a huge geographic area without any data (what luck ye gods!) — the end result is that the analysis is still going to be accurate even if it is somewhat synthetic. In a way, I’m really doing everyone a favor by beating nature to the punch.
The bet is that the anomalies are going to catch up one way or another. A little divergence from the other three datasets isn’t going to kill the analysis over the short haul. There is still some wiggle room. And don’t forget the sandbagging of the literature in support of the methods, but that’s a completely different story and one that provides for a great deal of entertainment.
That’s the game in a nutshell.
John Goetz says:
August 30, 2010 at 5:28 am
John, thanks for responding. I’m in agreement with you except I’m not nearly as generous as to motivations and expected outcomes of their methods.
“Perhaps I am being a little too nuanced, but they are not changing existing historical data.”
Yes, perhaps our disagreement is over what counts as ‘changing historical data.’ I would contend that they do, indeed, change historical data, in terms of adjustments and homogenizing and otherwise processing the temps. I understand the need to do so, but this consistent historical downward trend gives one pause, and more. Perhaps if they printed DEFINITIVE STATEMENTS SUBJECT TO CHANGE in bold on everything they state and publish, then I probably wouldn’t cast a cynical eye when viewing their assertions.
[snip off topic – taunt]
“Daniel H says:
August 29, 2010 at 8:45 am
Could this discrepancy be related to Hansen’s post Y2K adjustment error? That was discussed here:
http://wattsupwiththat.com/2007/08/11/does-hansens-error-matter-guest-post-by-steve-mcintyre/
As I explained when I first posted about this, that’s exactly why the adjustment was made, but that appears to have been left out of this post.”
Phil. (or anyone else)
Can you explain to me how a post-2000 data change to US TEMPS affected the GLOBAL anomaly number from 1998?
Realclimate has already gone on record saying the Y2K problem made “no difference” on a global scale. So therefore it has absolutely nothing to do with why 1998 went from .64C to .57C.
John, what do you mean “… does cause one to question the value of the results.”?
Every month they run a computer program which “guesses” what the temperature might have been in an area where they have no measuring instruments. This program, you have already admitted, probably is influenced by the existing trend (I think that’s a fair summation of what you wrote earlier).
I worked in four different areas of business in my working life, and if I had ever used a trick like that to justify an action or a business decision I would have been fired on the spot.
The temperatures are what they are. If you have places with no record then too bad; work with what you have. More and more we see (at least anecdotal) evidence of a system which deliberately selects certain weather stations and then extrapolates in a way which distorts the record.
And at the end of all this we are seeing claims that these programs, algorithms, smoothings, whatever, can still tell us the variations in the earth’s temperature to hundredths of a degree.
Forgive me if I see no justification at all for defending an organisation or an individual that plays games like this with (in effect) the future of humanity.
[snip – see above note on original – repeating it twice doesn’t change the conditions]
Smokey says:
August 29, 2010 at 5:51 pm
By claiming that I’m so ignorant of the subject, you’re implying that you’re knowledgeable.
I can watch somebody play tennis and realize he’s terrible, without being a world ranked player myself. But I was encouraging you this time, not claiming you’re ignorant – certainly you’ll be more knowledgeable in 30 or 40 years.
So speaking of guesses, give us your expert opinion: GISStimate your prediction for, oh, I don’t know… how about the lowest N.H. ice extent in 2011? What’ll it be? And don’t pull a Hansen by using the Sharpshooter’s Fallacy; be specific. This dummy wants to see a renowned expert’s prediction. Got cojones?☺☺
I’d like to see how WUWT handles this Arctic summer minimum first. I made a specific prediction months ago, as did stevengoddard. If the final result, which is only weeks away, is handled with honesty and integrity, I might make another prediction for next summer.
I’m looking forward to the Cryosat-2 data – that ice thickness data, plus the post-mortem of this summer’s melt season, would underlie any prediction for next summer’s minimum.
Of course, any prediction I make would lie somewhere between young molten Earth and snowball Earth, so it is all within “natural variability” for someone with a 4.6 billion year perspective ☻
Venter
No one’s work is beyond question or review. That is how science works.
It was once said, and is sometimes still said, that there are “Lies, Damned Lies, and Statistics!”
I move we henceforth and evermore also include the phrase that there are “Lies, Damned Lies, and Gisstimates!”, and that this phrase take on special derogatory significance when speaking of anything related to the contrived, voodoo-science field of Climatology and/or NASA’s credibility in generating reliable climate data.
” Venter says:
August 30, 2010 at 7:13 am (Edit)
What kind of BS is this? If an algorithm, which is man-made and written by these charlatans, keeps showing different figures for past historical climate data trends, it is a fraudulent and unreliable piece of computer program that is designed to cheat. What kind of stupid justification is being given for this by some people here? Anyone who thinks this is no big deal and is very normal should have their head examined.”
I will give you a simple one.
The Reference Station Method works something like this. Say you have 2 stations that are in close proximity to each other.
One starts in 1985, the other starts in 1900. In the year 2000 the stations have 15 years of OVERLAP. The algorithm “combines” stations at the “same” location SUBJECT to overlap rules. If there are fewer than 20 years of overlap, they are not combined. The station that starts in 1985 IS NOT INCLUDED. No overlap, no inclusion. Time marches on. THEN in 2005, you do have 20 years of overlap, so the station gets “included” in the reference station time series and the past (2000-2004) will change. What changes is the ESTIMATE of temps. It changes BECAUSE the data changed. What changed in the data? MORE stations reporting in the 1985-2004 period. A station that WAS excluded can now be INCLUDED in the estimate. It is one of the BENEFITS of the RSM method. It uses more data than, say, the CAM method which CRU uses.
Don’t you people read the code you criticize?
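The overlap rule described in that comment can be sketched roughly as follows. This is a simplified illustration of the inclusion test only, not the actual GISTEMP code; the 20-year threshold and the 1900/1985 station examples are taken from the comment itself.

```python
# Sketch of the overlap-based inclusion rule described above.
# Simplified illustration only -- NOT the actual GISTEMP algorithm.

MIN_OVERLAP_YEARS = 20  # threshold quoted in the comment

def overlap_years(start_a, end_a, start_b, end_b):
    """Whole years two station records have in common."""
    return max(0, min(end_a, end_b) - max(start_a, start_b))

def can_combine(ref_start, cand_start, current_year):
    """A candidate station is combined with the reference record only
    once their overlap reaches the threshold."""
    return overlap_years(ref_start, current_year,
                         cand_start, current_year) >= MIN_OVERLAP_YEARS

# Station A starts in 1900, station B in 1985.
print(can_combine(1900, 1985, 2000))  # False: only 15 years of overlap
print(can_combine(1900, 1985, 2005))  # True: 20 years, so B is now
# included -- and the published ESTIMATE for 2000-2004 changes.
```

This shows the mechanism both sides of the thread are arguing about: the raw station readings never change, but a station crossing the overlap threshold retroactively changes the combined estimate for earlier years.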
Just a comment;
In reply to Günther Kirschbaum, Phil, jeez, et al.:
Are global surface temperatures actually increasing or are we walking up a ‘down’ escalator?
In 10 years, is GISS going to claim 2020 is the hottest ever with a .6 anomaly, while claiming 2010 was actually .57 and 1998 was at .54?
Either temps are actually increasing or GISS is engaged in some sort of tribalistic statistical rain dance.
I don’t need to look at the data or the code to see this; the claim is that .6 is the hottest ever, in 1998 the anomaly was claimed to be above .6 and is now shown below .6. Who am I to trust, GISS data and code or my own lying eyes?
If temps are increasing there’s no need to adjust the old temperatures down to show this. If temperatures are merely staying the same while you adjust old temperature data downward to show warming then GISS has a lot of explaining to do. I don’t think anybody would provide much funding to mitigate that problem or study that issue. They’d simply conclude that you are a cargo cult and refer students to documentation regarding the phenomenon.
Now, explain the logic behind any changes that caused the discrepancy or be a laughing stock. Your choice.