GISStimating 1998

By Steve Goddard

h/t to reader “Phil.” who led me to this discovery.

In a previous article, I discussed how UAH, RSS and HadCRUT show 1998 as the hottest year, while GISS shows 2010 and 2005 to be hotter.

But it wasn’t always like that. GISS used to show 1998 with a 0.64 anomaly, which is higher than their current 2005 record of 0.61.

You can see this in Hansen’s graph below, which is dated August 25, 1999.

But something “interesting” has happened to 1998 since then. It was given a demotion by GISS from 0.64 to 0.57.

http://data.giss.nasa.gov/gistemp/graphs/Fig.A2.lrg.gif

The video below shows the changes.

Note that not only was 1998 demoted, but so were many other years since 1975 – the start of Tamino’s “modern warming period.” By demoting 1998, they are now able to show a continuous warming trend from 1975 to the present – which RSS, UAH and HadCRUT do not show.

Now, here is the real kicker. The graph below appends the post-2000 portion of the current GISS graph to the August 25, 1999 GISS graph. Warming ended in 1998, just as UAH, RSS and HadCRUT show.

The image below superimposes HadCRUT on the image above. Note that without the post-1999 gymnastics, GISS and HadCRUT match quite closely, with warming ending in 1998.

Conclusion: GISS recently modified their pre-2000 historical data and is now inconsistent with the other temperature sets. GISS data now shows a steady warming from 1975-2010, which the other data sets do not show. Had GISS not modified their historical data, they would still be consistent with the other data sets and would not show warming post-1998. I’ll leave it to the readers to interpret further.

————————————————————————————————————-

BTW – I know that you can download some of the GISS code and data, and somebody checked it out and said that they couldn’t find any problems with it. No need to post that again.

325 Comments
DR
August 30, 2010 7:38 pm

You know, if GISS (or HadCRUT for that matter) data is as good as some think, then that means the surface really is warming faster than the LT, and the greenhouse effect hypothesis is upside down.
Either that, or all the satellite data (completely independent products since 2002) are bunk.
The LT is supposed to be warming faster than the surface, isn’t it… somewhere? Or am I mistaken?

James Sexton
August 30, 2010 7:41 pm

James Sexton says:
August 30, 2010 at 7:17 pm
Thanks mods. I thought it was apropos in more than one way here. Thanks for seeing that.

James Sexton
August 30, 2010 7:57 pm

James Sexton says:
August 30, 2010 at 7:35 pm
“REPLY: snip – please rewrite this and feel free to resubmit when you don’t have to use vulgarity to get your point across. – Anthony”
Yeah, I know. Truly, it is sometimes difficult to convey thought and expression in type. I don’t typically resort to vulgarity, but it would have conveyed the appropriate sentiment across the blogosphere.

August 30, 2010 8:17 pm

jeez
Please detail GISS “underlying data” for the Arctic. I’m keen to hear how they measure temperatures without the use of any temperature sensors.

Ralph Dwyer
August 30, 2010 8:22 pm

Steven Mosher says:
August 30, 2010 at 4:02 pm
Not that I’m anybody, but I think the point you’re trying to make is being missed by everybody on this thread. And people need to lighten up and hear what you are saying: It’s DATA QUALITY!
Slap me down if I’m wrong; but, in every example you’ve given, the second data set has been higher (I’m going to guess that there are very few second data sets that are lower). Therefore, the objective should be to examine the quality of these second data sets (for heat island, adjustments therefore and thereto, etc.). I think Mr. Watts, et al. are examining this as we converse (soon to be published?). And what happens when those better methods you mentioned are used with the new quality data? We’ll all have to live with it.
It’s time to take a *chill* pill (pun intended)!
Thanks for your time,
Ralph Dwyer

James Sexton
August 30, 2010 8:23 pm

jeez says:
August 30, 2010 at 7:54 pm
Ok, if you don’t like “alarmist” how about apologist for the alarmists? How difficult is it for you to say, “I don’t know.”? How about, “We don’t have a way of knowing right now.”?
You asked, Seriously this is just Math isn’t it? So what is the equation since there are no subjective decisions to be made?
I’m not the one making assertions. It isn’t for me to create the equation. But, were I the one to do so, I wouldn’t create an equation that would be, by my own admission, incorrect. I’d at least have a plausible explanation of why my assertions were correct if/when I made an assertion. From the discussion on this thread, there is no reasonable expectation of GISS pronouncements being correct. BY THEIR OWN ADMISSION!!!
You also said, “I have also said in this thread that I think GISS’s constant estimation is a flawed method in my opinion, but it also doesn’t matter in the big picture.”
Here’s a big picture for you. Almost daily, across the globe, we pass laws making food and energy more difficult to obtain because of the “equations” and guesstimations. You do know that people suffer and die because of this, right? You do know that people in other countries are condemned to poverty because of the laws we are passing based on these guesstimations that are admittedly wrong to begin with? Right? You know, I wish I had the convenience of a lack of conscience. If I were a betting man, and I am, I’d bet GISS is more correct than wrong. That said, I’m not willing to bet someone else’s life or livelihood on it, much less an entire world’s. For the life of me, I can’t see why you don’t see the “what is right today is wrong tomorrow, but may be right or wrong again in the future” as wrong.
I’ll ask you the same thing I asked Mosh.
Is being wrong in a backward trend by 0.07 degrees acceptable over a decade? Well, if the trend continues, GISS will have been off by 0.7 degrees in a century. If we were discussing 10 degrees/century, maybe that wouldn’t matter. But that isn’t what we’re discussing, is it? We’re talking 2-3 degrees/century. Damn man!!!! If this keeps up, 1998 will be at zero with the baseline still being used today! Are you going to tell me 1998 wasn’t, or isn’t going to be, an exceptional year if we continue to use the current baseline?
Is this sufficient?

Ralph Dwyer
August 30, 2010 8:34 pm

In case anyone is wondering to which I was referring, here is the exact quote:
Steven Mosher says:
August 30, 2010 at 4:02 pm
“The ACTION in this debate is about the data quality. PERIOD. Not about the processing method. That’s the wrong grounds. It’s the wrong grounds because the BEST METHOD shows more warming than their flawed method. GET IT!
“Don’t aim your gun at your foot.”

August 30, 2010 8:35 pm

Kada.
“Offhand the reason might be because those are not their adjustments, not done the way they want to do them. So basically, they take the pre-adjusted historical records, then transmogrify them into their own version of the historical record. Which leads to the commonly-accepted principle, “You modify it, you own it.””
Well, first you would have to understand the “adjustments” that NCDC made. One used to be a UHI adjustment. Since Hansen had his own procedure, it is perfectly acceptable to apply your own adjustment. Second, look at the magnitude of the adjustments. GISS adjustments are inconsequential to the overall average. Personally, I would take a different approach, but they document what they do. More troubling are the adjustments that GHCN makes. That is the REAL ISSUE: the largely undocumented and untraceable changes made PRIOR to data IN at GISS. From data IN at GISS to data OUT, we know the steps. We can quibble about the 1/100ths of a degree that GISS fiddles with. That’s a sideshow. OR you can focus on the real problems:
1. the metadata that describes a station
2. The adjustments that get made PRIOR to DATA IN at GISS, CRU et al.
Sideshow or real problem. Personally, I’d like more REAL HELP in the trenches. I’d like armchair analysts to get off their butts and join me in FOIAs to NOAA. How many have you done? I have 500 pages from my last FOIA: programming notes, memos, phone logs, meeting notes. You get the idea. GHCN v3 is going to happen; have you downloaded the code that explains the homogenization? Probably not. That task has been left to me and a couple of other guys. So once again a few people will do the work while others opine without real facts or real experience.

James Sexton
August 30, 2010 8:50 pm

Ralph Dwyer says:
August 30, 2010 at 8:22 pm
Ralph, I understand what you are saying. One of the many problems I have is that the properties of mercury have been well known for quite some time. We knew the properties of mercury back in 1998 as well as we do today. Oddly, we are now treating 1998 as if it were 1798, and apparently back then they didn’t know how to keep the thermometer out of the outhouse. So the reading gets adjusted downward. And rightfully so – their thermometer was in a pile of excrement! But today, today’s temp is imagined. And it is imagined to be cooler than it really is, so we adjust it upward. This is all OK because we have an algorithm that does so, CONSISTENTLY. We know it’s right because the programmer showed us step by step how it works that way. And we all know, if it is coded, then it is true. Forget what we said back then; we now have further insight into truth, and it is better truth than the last truth. Of course, tomorrow will be better truth than today, but all of it is truth nonetheless, on a day-to-day basis.
Yep, just like the college sophomores I always remembered.

James Sexton
August 30, 2010 8:57 pm

Ralph Dwyer says:
August 30, 2010 at 8:34 pm
“In case anyone is wondering to which I was referring, here is the exact quote:
Steven Mosher says:
August 30, 2010 at 4:02 pm
“The ACTION in this debate is about the data quality. PERIOD. Not about the processing method. That’s the wrong grounds. It’s the wrong grounds because the BEST METHOD shows more warming than their flawed method. GET IT!”
Yeah, I’m still wondering how he can show that. Look here and tell me what you think. Be sure to explore the site; it is neutral and well worth visiting.
http://www.woodfortrees.org/plot/gistemp/from:1998/trend/plot/hadcrut3vgl/from:1998/trend/plot/rss/from:1998/trend/plot/uah/from:1998/trend
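For readers who want to check such plots for themselves: woodfortrees.org fits an ordinary least-squares line to each series. A minimal sketch of that fit follows; the annual anomaly values here are invented for illustration, not the actual GISS/HadCRUT/RSS/UAH data.

```python
# Sketch of the least-squares trend fit that woodfortrees.org applies
# to each temperature series. The anomaly numbers below are made up
# for illustration only.

def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

# Hypothetical annual anomalies (degrees C) for 1998-2009:
years = list(range(1998, 2010))
anoms = [0.57, 0.33, 0.35, 0.48, 0.56, 0.55,
         0.49, 0.62, 0.54, 0.57, 0.44, 0.57]

slope = ols_slope(years, anoms)
print(f"Trend: {slope * 10:.3f} C/decade")
```

Running the same fit against each data set from the same start year is exactly the comparison the linked plot makes.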

August 30, 2010 8:58 pm

Steven Mosher
Most adults recognize that there is more than one way to analyze a problem. Rather than assuming that your view of the world is the only correct one, you might want to open your mind to the idea that other people have good ideas too.
The arrogance displayed here is indeed breathtaking.

James Sexton
August 30, 2010 9:11 pm

stevengoddard says:
August 30, 2010 at 8:58 pm
“Most adults recognize that there is more than one way to analyze a problem. Rather than assuming that your view of the world is the only correct one, you might want to open your mind to the idea that other people have good ideas too.
The arrogance displayed here is indeed breathtaking.”
Character is a funny thing. Many of us know we lack character because we have the intelligence to know character is a desirable, if not, necessary trait. Those that lack intelligence do not pursue character.
Einstein once said, “Weakness of attitude becomes weakness of character.”

August 30, 2010 9:16 pm

Ralph
“Not that I’m anybody, but I think the point you’re trying to make is being missed by everybody on this thread. And people need to lighten up and hear what you are saying: It’s DATA QUALITY!”
Yes. In any analysis problem there are always two components: data quality and algorithm limitations. When it comes to algorithms for processing data, there are usually subjective/methodological choices. If we are good analysts, we test a variety of choices to see how they shape the answer. We want to minimize bias and minimize uncertainty.

Let me give you an example of a “subjective” decision: the baseline period for CAM. Why does CRU pick 30 years? Jones actually discusses this issue in the mails, but the reading-averse probably don’t recall that mail. The question is: does the answer change if you pick 29 years? 25? 18? 32? 35? Well, we can test that, and the answer does change. Not much, but it does. And we know why it changes. What number of years is objectively correct? Wrong question. The question is: does the selection change the MOM (measure of merit), and by how much? CRU decided that 15 years are required. What about 16? What about 30? What about 20 years where each year has a minimum of 11 months? All good questions. And for someone like me, they are questions I answer by varying the parameter throughout the range to see the effect. So I don’t sit at home and opine; I actually do the test.

We also test the methods with synthetic data. Does it capture a signal buried in noise? How well? With bias or without? But how many people read Jeff’s work or Chad’s work? Who here even read McIntyre’s work on Hansen’s RSM method, written over TWO years ago? What Steve did was study the code, implement an emulation, write his own approach and compare the results. He didn’t cut and paste a graph and count pixels. He did real work. Real work that has a statistical point. So we understand that RSM doesn’t necessarily give us an unbiased answer. By comparing Hansen’s method with Roman’s, we gain insight into RSM’s shortcomings. We also gain insight into the shortcomings of CRU and of FDM (preferred by Willis and EM). As I noted, Jeff’s work changed my view of FDM. But here is SteveMc’s real work on Hansen. If people have not read that, they have no standing in my eyes. No credibility. It’s required reading.
http://climateaudit.org/2008/06/28/hansens-reference-method-in-a-statistical-context/
The sum total of all the work by people like JeffID, Roman, Chad, Zeke, Nick Stokes, and others is this: the methodological issues amount to a VERY SMALL component of the final answer. If anything, the methods used by CRU and GISS underestimate the warming. So my suggestion is that the focus should turn to data quality. EXCLUSIVELY. Focus your brains where the real problem is. Focus your efforts on understanding the homogenization code. Demonstrate that you have a nose for the real issue and not the sideshow. In a nutshell, that is the message. Don’t talk about GISS unless you know the code. You will make mistakes. You will make the same mistakes I made before I read the code. You will walk down paths that others have. You will miss the real problem and waste time and bandwidth on tangents. On metadata, I need someone with GIS talent. Otherwise I will have to teach myself how to do that. I will, but every second I spend correcting bogus issues is time I cannot spend on understanding the homogenization code or on getting better metadata.
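The parameter sweep described above can be sketched in a few lines. The series below is synthetic (a linear trend plus noise), and the anomaly routine is a deliberate simplification of CAM; the point is only the testing pattern: vary the baseline choice and observe how much the resulting number moves.

```python
# Sketch of a baseline-sensitivity test: compute anomalies against
# baselines of different lengths and see how the mean anomaly shifts.
# The data is synthetic (trend + noise), purely for illustration.

import random

def anomalies(values, base_start, base_len):
    """Anomalies relative to the mean of a chosen baseline window."""
    base = values[base_start : base_start + base_len]
    ref = sum(base) / len(base)
    return [v - ref for v in values]

random.seed(0)
raw = [0.01 * i + random.gauss(0, 0.2) for i in range(100)]

for base_len in (15, 20, 25, 30, 35):
    anoms = anomalies(raw, 0, base_len)
    print(base_len, round(sum(anoms) / len(anoms), 3))
```

Note that changing the baseline shifts the level of the anomalies but not a fitted trend; in real CAM implementations the baseline choice also determines which stations have enough data to qualify, which is where the answer genuinely moves.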

August 30, 2010 9:23 pm

stevengoddard says:
August 29, 2010 at 9:13 am
Phil.
It is one of the miracles of modern science how GISS corrections almost invariably seem to make the past cooler, and the present warmer.

Thanks for the acknowledgment, I was glad I was able to lead you to the discovery that you had previously been wrong about the GISS temperature record.
This correction to the data can be laid at the door of that well-known ‘warmist’, Steve McIntyre.
He brought to Hansen’s attention that a change in the database used by GISS had led to a change from a TOBS-adjusted series to a series without that adjustment, the result of which was a jump in temperature. When GISS corrected that error, the jump was removed and the 1998 global temp dropped as you described. I don’t know why you didn’t include that in your post as well?
Sooner or later they may be able to correct away the Dust Bowl entirely.
Seems a rather odd comment, since this correction reduced the 1998 temperature compared with the 1934 value; if they’d been trying to correct away the Dust Bowl, surely they would have adjusted upwards?

August 30, 2010 9:25 pm

Steven Mosher
Most adults recognize that there is more than one way to analyze a problem.
#########
It has been my argument that there are MANY ways to address the problem. Did you read what Zeke and I wrote on the various methods? We recognize that there are many ways. The difference is that we actually code up the different approaches and TEST THEM to see how important those differences are. Did you read where I wrote about the different analytical choices? Probably not. Did you read where McKitrick wrote about the question of methods? Probably not. The bottom line is there are basically these methods:
1. a least squares approach
2. CAM
3. RSM
4. FDM
Now, I used to put FDM in position 2, but reading Jeff’s work changed my mind about that approach. It changed other guys’ minds as well. But the bottom line is this: NONE of those choices change the final answer in any substantial way. Are the differences interesting technical discussions? Yup. Do they change the science of AGW? Nope.
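For readers unfamiliar with the acronyms, CAM (the common anomaly method) is the easiest of the four to sketch: each station is converted to anomalies against its own baseline mean, then the anomalies are averaged across stations. The station data below is invented for illustration; real implementations also handle missing months, gridding and area weighting.

```python
# Minimal sketch of the Common Anomaly Method (CAM). Station records
# here are invented; real code would also grid and weight stations.

def cam(stations, base_years):
    """stations: dict name -> {year: temp}. Returns {year: mean anomaly}."""
    anoms = {}
    for name, series in stations.items():
        # Each station's reference is its own mean over the baseline.
        ref = sum(series[y] for y in base_years) / len(base_years)
        for year, temp in series.items():
            anoms.setdefault(year, []).append(temp - ref)
    return {y: sum(v) / len(v) for y, v in sorted(anoms.items())}

stations = {
    "A": {1970: 10.0, 1971: 10.2, 1972: 10.6},
    "B": {1970: 4.0, 1971: 4.1, 1972: 4.5},
}
print(cam(stations, base_years=[1970, 1971]))
```

Because each station is referenced to itself, stations with very different absolute temperatures (10 C vs 4 C above) can still be combined into one anomaly series.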

James Sexton
August 30, 2010 9:30 pm

Steven Mosher says:
August 30, 2010 at 8:35 pm
“……OR you can focus on the real problems
1. the metadata that describes a station
2. The adjustments a that get made PRIOR to DATA IN at GISS, CRU et al.”
Steven, you don’t think GISS is aware of your points 1 or 2? RU KIDDING ME? But then you’ll lend credence and validity to their assertions? Again, are you kidding me?
lol, Steven, did you lose your ability to think critically in this regard?

August 30, 2010 9:38 pm

“jeez
Please detail GISS “underlying data” for the Arctic. I’m keen to hear how they measure temperatures without the use of any temperature sensors.”
They don’t measure. They estimate.
I will ask you a couple of questions; see if you can answer.
1. You have a measurement at 80N, 30E for 100 years. That measure shows 1 C per century of warming.
2. You have a measurement at 80N, 150W for 100 years. That measure shows 1 C per century.
What is your best estimate of the trend at 90N?
A. No guess (CRU)
B. Estimate 1 C (GISS)
C. Cooling?
D. The global average.
I weighed 10 lbs at birth. At age 12 I weighed 120 lbs. What’s your best estimate of my weight at age 6? You have no measurement. Are you misleading people if you say,
“Based on the data at hand, I guess X”? No. You yourself make estimates of what you think the ice will be at. How’d that work out? You estimate area by counting pixels? How’d that work out? You estimate slopes by fitting OLS lines; you know that method has assumptions.

August 30, 2010 9:41 pm

steven,
Are you practicing Phil.’s rudeness, or perfecting it?

August 30, 2010 9:49 pm

Steven Mosher
GISS has very little data in the Arctic. Is that a difficult concept to understand?

August 30, 2010 9:50 pm

Steven Mosher
Your rudeness has been on a par with your arrogance and your hidden agenda.

James Sexton
August 30, 2010 9:53 pm

“If people have not read that they have no standing in my eyes. No credibility. its required reading.
http://climateaudit.org/2008/06/28/hansens-reference-method-in-a-statistical-context/
I can’t speak for the rest of the people here, but I believe many, if not most, of us have read the GD post! Do you think you live in a vacuum? I was reading CA before I ever heard of WUWT! Personally, I followed the whole damned thing. Stop. For whatever reason, you’re trying to convince people that GISS is proper in their assertions. You know they are not.
You’ve gone from absolute rejection of other people’s thoughts, to a condescending manner about how people can’t understand basic math principles, to rationalizing that it isn’t GISS doing the “adjustments” but GHCN and NCDC. I notice you didn’t mention NOAA. Steven, it isn’t rational to believe GISS doesn’t understand what these groups are doing to the data. Further, it isn’t rational to accept the assertions GISS makes knowing that the assertions will change in the near future. Seek help, or show me where it is rational to hold such beliefs.

August 30, 2010 9:55 pm

Anthony Watts says:
August 29, 2010 at 9:44 am
Phil.
And what of 1934 Phil. ? The GISS Y2K data step failure made 1934 the hottest year in the USA on record by a small margin.
That seems to have been reversed also.

I’m not sure why you say that, Hansen has always said that 1934 is the record.
“In the contiguous 48 states the statistical tie among 1934, 1998 and 2005 as the warmest year(s) was unchanged. In the current analysis, in the flawed analysis, and in the published GISS analysis (Hansen et al. 2001), 1934 is the warmest year in the contiguous states (not globally) but by an amount (magnitude of the order of 0.01°C) that is an order of magnitude smaller than the uncertainty.”
In his 2001 paper, before he was aware of the data error Hansen said:
“The U.S. annual (January-December) mean temperature is slightly warmer in 1934 than in 1998 in the GISS analysis (Plate 6). This contrasts with the USHCN data, which has 1998 as the warmest year in the century. In both cases the difference between 1934 and 1998 mean temperatures is a few hundredths of a degree. The main reason that 1998 is relatively cooler in the GISS analysis is its larger adjustment for urban warming. In comparing temperatures of years separated by 60 or 70 years the uncertainties in various adjustments (urban warming, station history adjustments, etc.) lead to an uncertainty of at least 0.1°C. Thus it is not possible to declare a record U.S. temperature with confidence until a result is obtained that exceeds the temperature of 1934 by more than 0.1°C.”
It seems off the radar now; it never gets a mention anymore, either in a global or CONUS context:
http://www.giss.nasa.gov/research/news/20100121/
“Although 2008 was the coolest year of the decade, due to strong cooling of the tropical Pacific Ocean, 2009 saw a return to near-record global temperatures. The past year was only a fraction of a degree cooler than 2005, the warmest year on record, and tied with a cluster of other years — 1998, 2002, 2003, 2006 and 2007 — as the second warmest year since recordkeeping began. ”
Well it is a CONUS record, that quote is about global data.

James Sexton
August 30, 2010 9:59 pm

“How’d that work out? You estimate area from counting pixels?”
Can you show how pixel counting is less accurate?

August 30, 2010 10:10 pm

James:
Steven, you don’t think GISS is aware of your points 1. or 2.? RU KIDDING ME? But, then you’ll lend credence and validity to their assertions? Again, are you kidding me?
**********
James, yes, there were some metadata issues that GISS was not aware of, and some metadata solutions that NOAA was apparently not aware of. As for the adjustment issues, yes, there are some issues that have not been raised. Further, whether or not they are aware of all the issues is immaterial to the question of what you should focus on. You can choose to focus on the weak part of their argument or the strong part. You can focus on the thing that requires work, or you can focus on writing comments about things that don’t require much in the way of thought, much less work. So, I have respect and admiration for the work that Anthony has done. It was real work: a hunch, followed up by investigation, followed up by field work. And I think the more people focus on that corner of the problem, the better. The more people make this a place where non-problems get raised, or where minor issues get blown out of proportion, the worse.

August 30, 2010 10:39 pm

Everyone involved with image processing counts pixels of one form or other. A pixel is a digital representation of an equal area of data.
Some people think they are being really clever by repeatedly displaying their ignorance on the topic.
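The point about equal-area pixels can be made concrete: in an equal-area projection, area estimation reduces to counting pixels and multiplying by the area each pixel represents. The grid and cell size below are invented for illustration.

```python
# Sketch of area estimation by pixel counting in an equal-area image.
# The toy grid and cell size are made up for illustration.

grid = [
    [0, 1, 1, 0],
    [1, 1, 1, 0],
    [0, 1, 0, 0],
]  # 1 = ice-covered pixel in a hypothetical equal-area image

KM2_PER_PIXEL = 625.0  # e.g. a 25 km x 25 km equal-area cell

count = sum(cell for row in grid for cell in row)
area_km2 = count * KM2_PER_PIXEL
print(area_km2)  # 6 pixels -> 3750.0 km^2
```

The accuracy of the method is then a question of pixel size and classification error, not of the counting itself.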