Chiefio asks: why does GISS make us see red?

A GISS anomaly map with a '9999' hot ocean from baseline = report period

The empty ocean goes infinite hot on a null anomaly

What to make of THIS bizarre anomaly map?

What Have I Done?

I was exploring another example of The Bolivia Effect where an empty area became quite “hot” when the data were missing (Panama, posting soon) and that led to another couple of changed baselines that led to more ‘interesting red’ (1980 vs 1951-1980 baseline). I’m doing these examinations with a 250 km ’spread’ as that tells me more about where the thermometers are located. The above graph, if done instead with a 1200 km spread or smoothing, has the white spread out to sea 1200 km with smaller infinite red blobs in the middles of the oceans.

I thought it would be ‘interesting’ to step through parts of the baseline bit by bit to find out where it was “hot” and “cold”. (Thinking of breaking it into decades…. still to be tried…) When I thought:

Well, you always need a baseline benchmark, even if you are ‘benchmarking the baseline’, so why not start with the “NULL” case of baseline equal to report period? It ought to be a simple all white land area with grey oceans for missing data.

Well, I was “A bit surprised” when I got a blood red ocean everywhere on the planet.

You can try it yourself at the NASA / GISS web site map making page.

In all fairness, the land does stay white (no anomaly against itself) and that’s a very good thing. But that Ocean!

ALL the ocean area with no data goes blood red and the scale shows it to be up to ‘9999′ degrees C of anomaly.

“Houston, I think you have a problem”…

Why Don’t I Look In The Code

Well, the code NASA GISS publishes, and says is what they run, is not the code that they are running to make these maps.

Yes, they are not publishing the real code. In the real code running on the GISS web page to make these anomaly maps, you can change the baseline and you can change the “spread” of each cell. (Thus the web page that lets you make these “what if” anomaly maps). In the code they publish, the “reach” of that spread is hard coded at 1200 km and the baseline period is hard coded at 1951-1980.

So I simply cannot do any debugging on this issue, because the code that produces these maps is not available.

But what I can say is pretty simple:

If a map that by definition can have no areas of unusual warmth (since baseline = report period) comes out looking like this, something is wrong.
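The failure mode the map suggests can be sketched in a few lines. This is only a guess at the mechanism, not NASA/GISS's actual code (which is unpublished): assume the anomaly step flags missing cells with a 9999 sentinel, and the color-mapping step then bins cells by value without checking the flag, so every empty ocean cell lands in the hottest bin.

```python
# Sketch of the hypothesized bug. The 9999 flag, the color bins and
# the function names are illustrative assumptions, not GISS's code.
MISSING = 9999.0

def anomaly(report, baseline):
    """Cell anomaly; cells missing either period get the MISSING flag."""
    if report is None or baseline is None:
        return MISSING
    return report - baseline

def color_buggy(value, lo=-4.0, hi=4.0):
    """Bins by value only: the 9999 flag falls into the top (red) bin."""
    if value > hi:
        return "red"
    if value < lo:
        return "blue"
    return "white"

def color_fixed(value, lo=-4.0, hi=4.0):
    """Checks the flag first, so missing cells render gray as documented."""
    if value == MISSING:
        return "gray"
    return color_buggy(value, lo, hi)

# The null case (baseline = report period): land with data stays white,
# but the empty ocean goes red under the buggy mapping.
land = anomaly(14.2, 14.2)
ocean = anomaly(None, None)
```

Under this sketch the land correctly cancels to zero anomaly, while the flag-as-data path paints the data-free ocean at "9999 degrees" — which is exactly what the map shows.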

I’d further speculate that that something could easily be what causes The Bolivia Effect where areas that are lacking in current data get rosy red blobs. Just done on a spectacular scale.

Further, I’d speculate that this might go a long way toward explaining the perpetual bright red in the Arctic (where there are no thermometers so no thermometer data). This “anomaly map” includes the HadCRUT SST anomaly map for ocean temperatures. The striking thing about this one is that those two bands of red at each pole sure look a lot like the ‘persistent polar warming’ we’ve been told to be so worried about. One can only wonder if there is some “bleed through” of these hypothetical warm spots when the ‘null data’ cells are averaged in with the ‘real data cells’ when making non-edge case maps. But without the code, it can only be a wonder:

With 250 km ‘spread’ and HadCRUT SST anomalies we get bright red poles

The default 1200 km present date map for comparison:

GISS Anomaly Map for November 2009

I’m surprised nobody ever tried this particular ‘limit case’ before. Then again, experienced software developers know to test the ‘limit cases’ even if they do seem bizarre, since that’s where the most bugs live. And this sure looks like a bug to me.

A very hot bug…


161 Comments
rbroberg
February 1, 2010 3:07 pm

9999 is ‘null value’
Smith is graphing a null value and suggesting that NASA preforms the same idiotic procedure that he does.
:eye roll:

pat
February 1, 2010 3:08 pm

Economic Times, India: IPCC claims on Amazon, ice not based on science
The pressure is on the IPCC to improve its procedures. “The goof ups that are being reported are all from Working Group II. Clearly, evangelism has overtaken science. I am told that there are many things in the summary for policymakers that are not there in the Working Group reports. There is a clear need to distinguish science from advocacy, and the IPCC should stick with science,” environment minister Jairam Ramesh said.
http://economictimes.indiatimes.com/news/politics/nation/IPCC-claims-on-Amazon-ice-not-based-on-science/articleshow/5525992.cms

February 1, 2010 3:11 pm

JJ (14:35:34) :
The value 9999 is a standard ‘no data’ value. This is nonsense.

The color for no data on the GISS maps is Gray not Red and it won’t show up in the temp anomaly scale either:

Note: Gray areas signify missing data.

http://tiny.cc/F20SI
There is a bug in the code that they use for their mapping program. When you add in the SST data the red in the oceans turns white.

kadaka
February 1, 2010 3:11 pm

Michael (14:51:46) :
Oh, that’s just Barry playing his Chicago-style politics, “Pass Cap and Trade or Granny won’t get her medicine and babies will starve.” Etc. Don’t worry, they’ll make the money up with the massive defense cuts once the job is labeled finished in Iraq and Afghanistan. After all, Osama bin Laden agreed global warming is a major threat therefore he is now an ally. Time to shake hands and share the apologies! Oh, and kudos to North Korea and Iran for helping to save the planet with clean energy. Great job, friends!

kadaka
February 1, 2010 3:18 pm

I doubt the 9999 is getting worked into the calculations, that would make quite a change. But they should have a different shading for the NULL case. Too bad there are no “grey areas” in (peer-reviewed!) Climate Science. And that’s settled!

February 1, 2010 3:19 pm

You know, it might be nice to do an analysis to determine if GISS is letting 9999s bleed into the real observations before accusing them of doing so. Given how much a single 9999 would skew the global temperature (since anomalies are generally ~+/- 5 C max), it wouldn’t be too hard to catch.
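A back-of-the-envelope version of that check, with made-up cell values, shows why a leaking 9999 would be hard to miss when real anomalies sit within roughly ±5 C:

```python
# Illustrative only: the cell values and count are invented, but the
# arithmetic shows how loudly one unmasked 9999 announces itself.
MISSING = 9999.0

cells = [0.3, -0.8, 1.1, 0.2] * 250  # 1000 plausible cell anomalies

def global_mean(values):
    """Average, skipping the MISSING flag as the code should."""
    good = [v for v in values if v != MISSING]
    return sum(good) / len(good)

clean = global_mean(cells)
polluted = sum(cells + [MISSING]) / (len(cells) + 1)  # flag averaged in
```

One unmasked sentinel among a thousand cells shifts the mean by roughly 10 C, which is why a bleed-through into the published global series would stick out immediately.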
I’d suggest poking through the Python version of GISTemp instead of the old FORTRAN mess: http://clearclimatecode.org/
Until you identify an error, this is just blowing smoke. The much more likely explanation of high arctic temps is polar amplification: http://www.realclimate.org/index.php/archives/2006/01/polar_amplification/

Jan
February 1, 2010 3:24 pm

It gets even better, when one compares interval 1951-1980 to interval 2009-2009.
Then one gets a bug-map of the area where GISS now has no real coverage. One sees directly the geography where GISS now avoids any real data: half of Africa, the center of South America, almost all of Greenland and the Arctic, big chunks of Canada, all the cold thermohaline surroundings, and even the center of Antarctica (hasn't NASA yet discovered there is a big base at the South Pole?…) etc.
Here I made some pictures for a quick comparison:
http://xmarinx.sweb.cz//gissmaperror.JPG
http://xmarinx.sweb.cz//polargissmaperror.JPG
It looks like the area covered by the GISTEMP panel shrinks even much faster than the infamous arctic sea ice… 😉
One then doesn't much wonder why GISTEMP has recently diverged so much from the UAH, RSS and even HADCRUT3 trends:
http://preview.tinyurl.com/yar6759

Harold Blue Tooth
February 1, 2010 3:24 pm

Shoddy workmanship? Accidental? I don’t think so. This is NASA.
Errors that have been found with NASA's work in the GISS (James Hansen, Gavin Schmidt) department over the years would not happen at a lesser establishment, let alone NASA. Someone along the line would have caught and corrected them. Especially after the particularly embarrassing Sep/Oct 2008 "error", you'd think GISS would be keen to root out these issues.
I could speculate that there is a tentacle of politics extending from Washington to GISS that uses these computer “errors” for its purposes. I think other people are speculating the same.

Tom_R
February 1, 2010 3:30 pm

>> Arizona CJ (14:35:57) :
Interesting that all the errors of this nature (buggy code, etc) that I’ve seen show warming. I wonder what the odds of that are, if one assumes that they are indeed all errors and no bias was present? My guess is very long odds indeed, and that argues strongly for a non-accidental explanation. <<
Since they expect warming, the true believers are much less likely to take a second look at an error that increases warming. Errors that show cooling (along with non-errors) would be highly scrutinized.

February 1, 2010 3:34 pm

Zeke,
The Python code would be exactly the wrong thing to look at.
The graphic displayed on the NASA site is not generated from the code
that NASA has released. Since the CCC project is rewriting THAT code,
your suggestion doesn't help much. Simply put, the Python code doesn't draw the graphic that EM is pointing to. What code does? Dunno.
have those charts ever been used in publication? Dunno.
Now, if journals had reproducible-results requirements (the DATA AS USED
and the CODE AS USED to produce any result: table, chart, graph, textual claim) we might be able to tell.
So, think of EMs post as a bug report.
is there a bug?
yes.
Does the bug hit anything other than a graphic drawn on the web page?
dunno.

Harold Blue Tooth
February 1, 2010 3:34 pm

rbroberg (15:07:35) :
After you have finished rolling your eyes would you have a look at GISTemp temperature values compared to other data sets?
GISTemp shows warmer values than all other sets. GISS is carrying on about hottest year this, 2nd hottest year that. No one else is.
ps. is the sky falling? are you living in a world of "terrifying vistas of reality"?
:rolls eyes:

Peter
February 1, 2010 3:42 pm

A bit OT, but you have to have a look at the Pachauri video on Richard North’s blog (eureferendum)
Pachauri seems hell-bent on single-handedly destroying the credibility of the IPCC.
I mean, calling Jonathan Leake a skeptic conspirator… who would have thought?
Oh, and the Himalayan glaciers will melt by 2050, not 2035 – from the horse’s mouth.

February 1, 2010 3:46 pm

Chiefo,
Do you have a recommendation on an effective way for me, "USA Taxpayer", to contact GISS to request the release of the code and more open GISS forums for analysis?
I would greatly appreciate it.
John

kadaka
February 1, 2010 3:47 pm

Zeke Hausfather (15:19:40) :
He said in the article he is not poking through the real code that creates this effect, FORTRAN or otherwise, as it is not publicly available.
Clear Climate Code seeks to emulate the GISTEMP program with easy-to-follow code, and otherwise make reliable code for real climate science. What value is there in playing with their stuff when one is seeking to find out how the real GISS code works?
Oh look, you cited Real Climate as an authoritative source of reliable scientific information. Yeah, that’ll sure win you some points around here.

E.M.Smith
Editor
February 1, 2010 3:48 pm

rbroberg (15:07:35) : 9999 is ‘null value’
Smith is graphing a null value and suggesting that NASA preforms the same idiotic procedure that he does.

Um, no.
The graphs are straight from GISS. I’ve done nothing but download their graph from their software and their data. It’s 100% them, and not about me at all. So you can stop rolling your eyes and try to focus them on the graph…

Peter of Sydney
February 1, 2010 3:51 pm

I've been a serious computer programmer for over two decades and I can assure you that null is not 9999. It's typically 0, but can be other values depending on where it's used. I have never seen 9999 for null. Besides, null literally means unknown in computer programming. So, if 9999 is somehow "bleeding" into the way they are interpolating the readings then it's proof that climate science is corrupt science. I doubt this is the case, but if it is then we would have irrefutable proof that they are committing a fraud, as no one, not even a high school student, would do this.

February 1, 2010 3:52 pm

Mosh,
I agree with you on how to best debug their online maps. However, the more pernicious accusation that 9999s are bleeding through into the actual temperature data would be much easier to catch in the python implementation unless you really enjoy trying to read through poorly documented Fortran.

E.M.Smith
Editor
February 1, 2010 3:56 pm

Zeke Hausfather (15:19:40) : You know, it might be nice to do an analysis to determine if GISS is letting 9999s bleed into the real observations before accusing them of doing so. Given how much a single 9999 would skew the global temperature (since anomalies are generally ~+/- 5 C max), it wouldn’t be too hard to catch.
Nice idea. So I ran with it. Don’t know how long this GISS map stays up on their site, but I just did 2009 vs 2008 baseline. The “red” runs up to 10.3 C on the key.
http://data.giss.nasa.gov/work/gistemp/NMAPS/tmp_GHCN_GISS_250km_Anom12_2009_2009_2008_2008/GHCN_GISS_250km_Anom12_2009_2009_2008_2008.gif
So unless we’ve got a 10 C + heat wave compared to last year, well, I think it’s a bug …
I’d suggest poking through the Python version of GISTemp instead of the old FORTRAN mess:
Nope. KEY point about debugging: you always test the code being run. The docs are nice, the pseudo code is nice, even some new translation is nice; but they do not exactly capture all the bugs. So for debugging, it's the real deal only. (Heck, I've even had cases where the written code did something different from the binary – compiler bugs…)

February 1, 2010 4:03 pm

@mosher: Does the bug hit anything other than a graphic drawn on the web page?
The bug seems to be limited to graphics that meet these two conditions:
a) NASA is reporting with a 250km Smoothing Radius
b) NASA is reporting an interval where the reporting period = the baseline period
I can’t recall seeing any such graph.
Have you?

TerryS
February 1, 2010 4:06 pm

Re: rbroberg (15:07:35) :

9999 is ‘null value’
Smith is graphing a null value and suggesting that NASA preforms the same idiotic procedure that he does.

How do you know the 9999 is a 'null value'? Do you have access to the code that generates the maps? Because I don't believe anybody outside of NASA GISS has access.
What if the code that produces the 9999 is something along the lines of:
if val > MAX_VALUE then val = 9999
instead of
if val == NULL then val = 9999
or
if val == ERROR then val = 9999
Because if it's the first, then it means the code has the potential to produce artificially high anomalies.

Iren
February 1, 2010 4:09 pm

“Earlier this month, the Bureau of Meteorology released its 2009 annual climate statement. It found 2009 was the second hottest year in Australia on record and ended our hottest decade. In Australia, each decade since the 1940s has been warmer than the last…

What they leave out is that they’re only counting since 1961! Apparently, 1890 was the end of our warmest decade.

hotrod ( Larry L )
February 1, 2010 4:11 pm

E.M.Smith (15:56:43) :

Nope. KEY point about debugging. You always test the code being run.

Very important point. Also, since we have no known version control and most reports do not list exactly what model version was used to generate the run, you have no way of knowing if the results that appear in a given report are in fact the real output of the software version that allegedly produced the model run.
In that sense you are chasing ghosts regarding older reports where we have no hope of determining exactly what build the model run was made on.
Along with your note about the compiler errors, it is possible that a given code could run differently on different hardware due to such deeply hidden errors. Unless you can run your code on the NASA system there is no way to exclude that possibility.
Larry

E.M.Smith
Editor
February 1, 2010 4:11 pm

John Whitman (15:46:41) : Do you have a recommendation on what is an effective way for me, “USA Taxpayer”, to contact GISS for the release of the code by and request more open GISS forums for analysis?
Well, they have released some of the code. I've got a copy from prior to 15 November 2009 that has a fixed 1200 km. I've seen one report that the new code they installed then has an RCRIT value that is not hard coded and can be either 1200 km or 250 km (but I've not seen the code myself… another thing for the 'to do' list… download the new walnut shell and start looking for where the 'pea' is all over again … :-{ ). So it is possible that the 15 Nov 2009 code is what makes the data that feeds this graph, but then we're still missing the link from that data to the graph. They have contact information on their web site, FWIW. See:
http://data.giss.nasa.gov/gistemp/
where at the bottom it says Hansen does the 'scientific' stuff while Ruedy does the webpages and the technical side. So I'd guess it's him:
http://www.giss.nasa.gov/staff/rruedy.html
contact info is on his page.

kadaka
February 1, 2010 4:24 pm

*sigh*
Why does moderation involve a LIFO stack instead of FIFO? While it is mildly amusing to see the order in which comments miraculously appear after a long moderator break, this is more than offset by the irritation at seeing the newest posts appear (and get replied to) before the older longer-waiting ones (like mine), with problems following a thread by having to check backwards from the “newest” to see if others have suddenly appeared since the comments were last checked.
Wouldn’t it be easier for the moderators to go to FIFO, besides being easier for us readers?
[Reply: You are right. But WordPress sets the template, and it doesn’t make starting moderation with FIFO simple. It’s a hassle to go back for as many pages as necessary to start at the oldest post. Sometimes I do it that way, if the posts are on the first Edit page. When that’s the case I’ll start doing them from the bottom up from now on. Just speaking for myself here. ~dbs]

CrossBorder
February 1, 2010 4:39 pm

@rbroberg (15:07:35) :
“Smith is graphing a null value and suggesting that NASA preforms the same idiotic procedure that he does.”
NASA pre-forming some procedure – looks just about what they might be doing. Hmm, some PERformance!
(Signed, another grammar n*zi)
