Chiefio asks: why does GISS make us see red?

A GISS anomaly map with a 9999 hot ocean from baseline = report period

The empty ocean goes infinitely hot on a null anomaly

What to make of THIS bizarre anomaly map?

What Have I Done?

I was exploring another example of The Bolivia Effect, where an empty area became quite “hot” when the data were missing (Panama, posting soon), and that led to a couple of changed baselines that produced more ‘interesting red’ (1980 vs 1951-1980 baseline). I’m doing these examinations with a 250 km ‘spread’ as that tells me more about where the thermometers are located. The above graph, if done instead with a 1200 km spread or smoothing, has the white spread out to sea 1200 km, with smaller infinite red blobs in the middles of the oceans.

I thought it would be ‘interesting’ to step through parts of the baseline bit by bit to find out where it was “hot” and “cold”. (Thinking of breaking it into decades…. still to be tried…) Then I thought:

Well, you always need a baseline benchmark, even if you are ‘benchmarking the baseline’, so why not start with the “NULL” case of baseline equal to report period? It ought to be a simple all white land area with grey oceans for missing data.

Well, I was “a bit surprised” when I got a blood red ocean everywhere on the planet.

You can try it yourself at the NASA / GISS web site map making page.

In all fairness, the land does stay white (no anomaly against itself) and that’s a very good thing. But that Ocean!

ALL the ocean area with no data goes blood red, and the scale shows it to be up to ‘9999’ degrees C of anomaly.

“Houston, I think you have a problem”…

Why Don’t I Look In The Code

Well, the code NASA GISS publishes, and says is what they run, is not the code that is actually running to make these maps.

Yes, they are not publishing the real code. In the real code running behind the GISS web page that makes these anomaly maps, you can change the baseline and you can change the “spread” of each cell (that is what lets the web page produce these “what if” anomaly maps). In the code they publish, the “reach” of that spread is hard coded at 1200 km and the baseline period is hard coded at 1951-1980.

So I simply can not do any debugging on this issue, because the code that produces these maps is not available.

But what I can say is pretty simple:

If a map that by definition can have no areas of unusual warmth (baseline = report period) has this happen, something is wrong.
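To be concrete about what ought to happen, here is a trivial sketch (my own illustration of the expectation, not anything from GISS) of a null-baseline anomaly calculation:

```python
import math

MISSING = float("nan")   # a cell with no data

def anomaly(report_mean, baseline_mean):
    # An anomaly is just the report-period mean minus the baseline mean.
    if math.isnan(report_mean) or math.isnan(baseline_mean):
        return MISSING    # no data should stay "no data" (grey), never 9999
    return report_mean - baseline_mean

# With baseline period == report period the two means are identical:
print(anomaly(14.2, 14.2))         # 0.0  -> white land cell
print(anomaly(MISSING, MISSING))   # nan  -> grey ocean cell, not blood red
```

Land at zero, missing cells staying missing; anything else is a bug.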

I’d further speculate that that something could easily be what causes The Bolivia Effect where areas that are lacking in current data get rosy red blobs. Just done on a spectacular scale.

Further, I’d speculate that this might go a long way toward explaining the perpetual bright red in the Arctic (where there are no thermometers, and so no thermometer data). This “anomaly map” includes the HadCRUT SST anomaly map for ocean temperatures. The striking thing about this one is that those two bands of red at each pole sure look a lot like the ‘persistent polar warming’ we’ve been told to be so worried about. One can only wonder if there is some “bleed through” of these hypothetical warm spots when the ‘null data’ cells are averaged in with the ‘real data’ cells when making non-edge-case maps. But without the code, it can only be a wonder:

With 250 km ‘spread’ and HadCRUT SST anomalies we get bright red poles

The default 1200 km present date map for comparison:

GISS Anomaly Map for November 2009

I’m surprised nobody ever tried this particular ‘limit case’ before. Then again, experienced software developers know to test the ‘limit cases’ even if they do seem bizarre, since that’s where the most bugs live. And this sure looks like a bug to me.

A very hot bug…

Socratease

I recall reading somewhere else that ‘9999’ isn’t a temperature but is instead an error code. I’m not sure what it’s complaining about; you’d need to look at the code to find out, and there is probably more than one set of conditions that produces that output.

Fred from Canuckistan

wonder if that model was peer reviewed or beer reviewed?

Craigo

Reminds me of all that Steigian red from last year’s Antarctic smearing.

So, if the thermometer is dead, it’s red!

Chris

Two winners today, no three
(1) Times are a-changin’ when Google Ads shows this
First-rate, too, IMHO. Want to run a thread on it here?
(2) important post here: the ridiculous QC of GISS has got to come out alongside that of CRU
(3) yesterday’s UHI arrived here this morning, just after I sent Warren Meyer a set of pics to think of using to enliven above video, one of which was my rendition of Anthony’s Reno urban transect overlaid on another pic resembling that used for Case 8. Ha, I feel proud.
But I still don’t know how Anthony manages to keep rolling it out, watching other blogs, writing papers, challenging Menne, and putting courteous notes in for the (declining) trolls here. Thank you from the bottom of my heart, moderators too.

John Galt

If GISS truly uses 9999 for null, it just shows how out-of-date their software is.
It appears they don’t discard or ignore that value, either, when they create an image from the data. Wouldn’t it be better to show those areas as black?

p.g.sharrow "PG"

One more indication that the code is either poorly done or cooked.

Andy Scrase

Can we test this hypothesis with a bit of reverse engineering of data? Null values in data sets are a common problem I find in my work. (Software dev)
Why don’t they publish the freakin source code? Surely all these scientist types are open source zealots anyway.

dearieme

So CRU don’t have a monopoly on crap warmmonger software. Who’d have thunk it?

stumpy

999 or 9999 (and negative versions) are typically used to represent errors, as values such as 0 for no data can be misinterpreted as real data. Blank values can also cause an error in some code, hence the use of a silly low or silly high value, with the intention that the user immediately notice it or that it can be immediately identified as wrong. The code *should* ignore such values, but it’s not unheard of for these values to be pulled in because someone has used “9999” instead of the “999” value the software ignores.
I understand the code works by smearing hot spots around (homogenisation), so there is potential for one of these large numbers to make it through, only to be smeared around between the other low numbers so the error is not immediately obvious to the eye. Hansen et al would expect (or would like) to see warming at the poles, so they may not see this as an error, due to their preconceived solid belief in global warming.
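As a toy illustration (not the actual GISS code, just the arithmetic) of how one unfiltered sentinel poisons a smeared average:

```python
# One 9999 sentinel left in amongst genuine anomalies (deg C)
values  = [0.3, -0.1, 0.6, 9999.0, 0.2]   # the 9999 should have been discarded
weights = [1.0, 0.8, 0.6, 0.4, 0.2]       # e.g. distance weights

smeared = sum(v * w for v, w in zip(values, weights)) / sum(weights)
print(round(smeared, 1))   # ~1333.4 "degrees" - absurd here, but after further
                           # averaging it might only nudge a map cell warmer
```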
As chiefio states, this is why it’s so important to test the software or code to extremes. I use numerous engineering and hydraulic modelling packages, and regularly find small and occasionally large flaws. I have a similar habit of testing extreme scenarios so I know where the software can and can not be trusted, but also of manually replicating sections of the software engine using their own equations and comparing results. Using this method I end up identifying faults that have gone unnoticed for years!
Great work! This just demonstrates why we should never stop questioning, even if we are lowly peasant “non-climate scientists” incapable of computer engineering, maths, statistics etc.

JJ

The value 9999 is a standard ‘no data’ value. This is nonsense.

Arizona CJ

Interesting that all the errors of this nature (buggy code, etc) that I’ve seen show warming. I wonder what the odds of that are, if one assumes that they are indeed all errors and no bias was present? My guess is very long odds indeed, and that argues strongly for a non-accidental explanation.
Arizona CJ

Antonia

Slightly OT, mod, but I need an answer fast and here’s the place to get it.
Australia’s Climate Change Minister, Penny Wong, claimed in an article in today’s The Australian that, “Globally,
Surely that’s a porky.

Antonia

Slightly OT, mod, but Australia’s Climate Change Minister has claimed the following in today’s The Australian.
“Earlier this month, the Bureau of Meteorology released its 2009 annual climate statement. It found 2009 was the second hottest year in Australia on record and ended our hottest decade. In Australia, each decade since the 1940s has been warmer than the last…
Globally, 14 of the 15 warmest years on record occurred between 1995 and 2009.”

Michael

Breaking News;
Glenn Beck just reported on Dr Rajendra Pachauri’s steamy new sex novel and on Obama’s new budget, which includes $650 billion from carbon taxing. Says Beck, “what does the Obama administration know that we don’t know?”
The world wealthy elite are pushing ahead with their global government carbon taxing schemes.
And if you think this is a joke, it’s not.
I watched the Davos world economic forum interviews and the most powerful people in the world are talking as if the global carbon tax is a given.
Carbon taxing is based on a lie for Christ’s sake.

hotrod ( Larry L )

Just another example of poorly written or intentionally deceptive code.
If a computer model cannot be audited in public, and it cannot be validated that the code being reviewed is the actual code used to run the simulations, the output is no better than throwing dice or tossing darts at a wall map, and should never be used in any way, shape or form in the development of policy or legislation.
We need a law passed that explicitly blocks use of any model simulation for policy, regulation or legislation without a verifiable audit process.
Any computer programmer that has gotten past “Hello world” level programming knows it is impossible to write error free code containing thousands or millions of lines of code. The best you can do is to thoroughly test for and eliminate the coding and logic errors that are most likely to bite you under unexpected conditions.
A couple of decades ago, I was asked to beta test a relatively simple program used by FEMA to process some shelter data. The first thing I did was, every time it asked for a value, hit every odd ball key on the keyboard like ! @ # etc. I blew the program up about a dozen times simply because such nonsense key strokes were not tested for and trapped with an error routine that enforced reasonable input limits.
If this code cannot be open sourced or audited by a professional soft ware audit process it should be barred from use.
Larry

carrot eater

You just broke their online widget by asking it to do something that didn’t make any sense. Do you really think the error flag 9999 is entering into real calculations anywhere?
I really don’t understand your fascination with changing the baseline. It’s a waste of time. You’ll just nudge the absolute values of the anomalies up and down, but the trends won’t change.
As for your Bolivia effect, you could probably figure it out with just one day’s effort. GISS finds the temperature at each grid point by adding in surrounding stations, linearly weighted based on the distance from the grid point. Just take the current surrounding stations, and you can confirm their calculation. Then, you can easily check to see how good the interpolation is.
If there is data from Bolivia before 1990 or whenever, just repeat that calculation, pretending that you can’t see the Bolivian stations. Then compare the interpolation to the actual Bolivian data, and you’ll see how good the interpolation is. If you want people to take you seriously that the interpolated values for Bolivia are not good, why don’t you actually do that analysis and find out?
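Roughly, something like the following is all it would take (made-up station numbers and my own paraphrase of the weighting, not the actual GISTEMP routine):

```python
def interpolate(stations, radius_km=1200.0):
    """Distance-weighted mean of (distance_km, anomaly) pairs, with the
    weight falling off linearly to zero at radius_km."""
    pairs = [(1.0 - d / radius_km, a) for d, a in stations if d < radius_km]
    total = sum(w for w, _ in pairs)
    return sum(w * a for w, a in pairs) / total if total else None

# Pretend the Bolivian stations are invisible and interpolate from neighbours:
surrounding = [(300.0, 0.4), (650.0, 0.1), (900.0, -0.2)]   # hypothetical
bolivia_actual = 0.15                                        # hypothetical
estimate = interpolate(surrounding)
print(estimate, abs(estimate - bolivia_actual))   # how good is the interpolation?
```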

Trev

OFF TOPIC –
The British monthly car magazine ‘What Car’ has an article on global warming. At first glance it seems to take the ‘rising CO2 is going to cause dangerous harm – and man is to blame’ claim as factual and even has a 700,000 year graph of CO2 which looks pretty hockey stickish to me at its end.
I hope, if need be, someone will correct any of their errors. The magazine claims that current CO2 is at an all-time high, but my dim memory recalls reading something about it being much higher in the distant past.

I see you haven’t noticed it doesn’t matter if this part of NASA is correct or not, because they just keep getting the money. The moon program is now dead and Obama is rerouting that money to climate monitoring. Hansen and the boys are all grins. That’s what makes me see red!! Obama needs cap and trade for the taxes and the control. As long as these type of people are in control we will all be seeing red, especially in our bank statements. I don’t know when everyone will wake up and see this. Maybe Mass. was a start; we’ll see.

pat

lengthy interview with Pachauri in the Hindustan Times:
Pachauri: ‘They can bend me, but they can’t break me’
Q: Can you provide us revenue for 10 years to prove there is no link between IPCC and TERI?
A: …. I have proved myself in several aspects in the world. Not in eyes of Sunday Telegraph. Fortunately, there are a few people, thank God, like the Sunday Telegraph. But yes, if you want, we can provide the accounts, the payments made over these 10 years.
http://www.hindustantimes.com/They-can-bend-me-but-they-can-t-break-me/H1-Article1-504204.aspx
COMMENT BELOW ARTICLE: Yes, we do want … what are you waiting for? We’ve already asked. Richard North, Sunday Telegraph
also from the interview which needs to be read in full:
Q: Did failure at Copenhagen help climate skeptics?
A: No agreement at Copenhagen in fact encouraged some of the deniers, and those who are financing them with, maybe millions, who knows, billions of dollars.

Hi,
I’ve been digging through the code (just getting it to run on my beefy Linux box). I notice that when they do the gridding:
1) They seem to use a linear weighting around the center of each grid, so a station twice as far away from the center contributes half as much to the total. Shouldn’t that be some form of inverse-square relationship instead, since the temperature at a given point interacts in all directions? Using a purely linear falloff is very akin to resistance over distance in a wire (see the sketch below).
2) From what I’ve seen in the code (I may be wrong, it being a good many years since I did FORTRAN) they have some weird rule whereby a station can be outside 1200 km yet still inside the grid box, but in this case they give all such stations the weight of the furthest station within the 1200 km radius, so pulling such external stations ‘inside’ the 1200 km boundary.
Once I have the code running I’ll see how this impacts the results.
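In the meantime, here is the comparison I have in mind, in Python rather than the original FORTRAN (my own paraphrase of the weighting, so take it with salt):

```python
def linear_weight(dist_km, radius_km=1200.0):
    # What the published code appears to do: weight falls linearly to zero
    return max(0.0, 1.0 - dist_km / radius_km)

def inverse_square_weight(dist_km, eps_km=1.0):
    # The alternative I'm wondering about (not in the code, just for contrast)
    return 1.0 / (dist_km + eps_km) ** 2

for d in (100, 300, 600, 1100):
    print(d, round(linear_weight(d), 3), format(inverse_square_weight(d), ".2e"))
```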

Hank Hancock

It seems to me that temperature data analysis methods must accept both positive and negative single or double precision values for input. For this reason, the developer will use an out-of-range value to represent NULL data. If the application is written properly, whatever value is used to represent NULL should be tested for and not be used in any calculation that drives output.
Many modern programming languages now support the NaN value in numeric data types, which can be used to represent NULL or erroneous data. The beauty of NaN is that it either propagates or throws an exception when used in a calculation, and errors out when used as direct output to any control that expects a real number. That assures a NaN cannot accidentally slip into an analysis as if it were a real value, even through a programming error. Methinks it is time for NASA to upgrade to modern programming languages and systems.
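A minimal illustration of the difference (Python here, purely as an example):

```python
import math

temps = [14.2, 13.8, float("nan"), 14.5]   # NaN marks the missing reading

print(sum(temps) / len(temps))    # nan - the gap propagates instead of
                                  # masquerading as a plausible temperature

clean = [t for t in temps if not math.isnan(t)]
print(sum(clean) / len(clean))    # ~14.17 - missing data must be filtered
                                  # out explicitly before it can be averaged
```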

Pingo

I wonder if they’ll “lose” data for Scotland this winter now that we’ve heard they have had their coldest ever Dec-Jan on record.

Oh, all you skeptic scientists have missed out and I’m sorry. But at least you have kept your dignity and everything else. Keep after them- it just might turn around in your favor. Everyone else help out as much as possible by voting and giving.

9999 is ‘null value’
Smith is graphing a null value and suggesting that NASA preforms the same idiotic procedure that he does.
:eye roll:

pat

Economic Times, India: IPCC claims on Amazon, ice not based on science
The pressure is now on the IPCC to improve its procedures. “The goof ups that are being reported are all from Working Group II. Clearly, evangelism has overtaken science. I am told that there are many things in the summary for policymakers that are not there in the Working Group reports. There is a clear need to distinguish science from advocacy, and the IPCC should stick with science,” environment minister Jairam Ramesh said.
http://economictimes.indiatimes.com/news/politics/nation/IPCC-claims-on-Amazon-ice-not-based-on-science/articleshow/5525992.cms

JJ (14:35:34) :
The value 9999 is a standard ‘no data’ value. This is nonsense.

The color for no data on the GISS maps is gray, not red, and it won’t show up in the temp anomaly scale either:

Note: Gray areas signify missing data.

http://tiny.cc/F20SI
There is a bug in the code that they use for their mapping program. When you add in the SST data the red in the oceans turns white.

kadaka

@ Michael (14:51:46) :
Oh, that’s just Barry playing his Chicago-style politics, “Pass Cap and Trade or Granny won’t get her medicine and babies will starve.” Etc. Don’t worry, they’ll make the money up with the massive defense cuts once the job is labeled finished in Iraq and Afghanistan. After all, Osama bin Laden agreed global warming is a major threat therefore he is now an ally. Time to shake hands and share the apologies! Oh, and kudos to North Korea and Iran for helping to save the planet with clean energy. Great job, friends!

kadaka

I doubt the 9999 is getting worked into the calculations, that would make quite a change. But they should have a different shading for the NULL case. Too bad there are no “grey areas” in (peer-reviewed!) Climate Science. And that’s settled!

You know, it might be nice to do an analysis to determine if GISS is letting 9999s bleed into the real observations before accusing them of doing so. Given how much a single 9999 would skew the global temperature (since anomalies are generally ~+/- 5 C max), it wouldn’t be too hard to catch.
I’d suggest poking through the Python version of GISTemp instead of the old FORTRAN mess: http://clearclimatecode.org/
Until you identify an error, this is just blowing smoke. The much more likely explanation of high arctic temps is polar amplification: http://www.realclimate.org/index.php/archives/2006/01/polar_amplification/
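As a rough illustration of the kind of check I mean (hypothetical data layout, not GISTEMP’s actual structures):

```python
PLAUSIBLE_C = 5.0   # rule of thumb: real anomalies rarely exceed ~+/- 5 C

def suspect_cells(grid):
    """Return any gridded anomaly too large to be physical - a leaked 9999,
    or even a distance-weighted fraction of one, would stand out instantly."""
    return [(lat, lon, a) for lat, lon, a in grid if abs(a) > PLAUSIBLE_C]

grid = [(60.0, -30.0, 0.8), (75.0, 10.0, 2.1), (82.0, 140.0, 9999.0)]
print(suspect_cells(grid))   # [(82.0, 140.0, 9999.0)]
```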

Jan

It gets even better, when one compares interval 1951-1980 to interval 2009-2009.
Then one gets a bug-map of the area where GISS now has no real coverage. One sees the geography directly, where GISS now lacks any real data: half of Africa, the center of South America, almost all of Greenland and the Arctic, big chunks of Canada, all the cold thermohaline surroundings, and even the center of Antarctica (hasn’t NASA yet discovered there is a big base at the South Pole?…) etc.
Here I made some pictures for a quick comparison:
http://xmarinx.sweb.cz//gissmaperror.JPG
http://xmarinx.sweb.cz//polargissmaperror.JPG
It looks like the area covered by the GISTEMP panel shrinks even faster than the infamous Arctic sea ice… 😉
One then doesn’t much wonder why GISTEMP has recently shown such a big divergence from the UAH, RSS and even HADCRUT3 trends:
http://preview.tinyurl.com/yar6759

Harold Blue Tooth

Shoddy workmanship? Accidental? I don’t think so. This is NASA.
Errors that have been found with NASA’s work in the GISS (James Hansen, Gavin Schmidt) department over the years would not happen at a lesser establishment, let alone NASA. Someone along the line would have caught and corrected them. Especially after the particularly embarrassing Sep/Oct 2008 “error”, you’d think GISS would be keen to root out these issues.
I could speculate that there is a tentacle of politics extending from Washington to GISS that uses these computer “errors” for its purposes. I think other people are speculating the same.

Tom_R

>> Arizona CJ (14:35:57) :
Interesting that all the errors of this nature (buggy code, etc) that I’ve seen show warming. I wonder what the odds of that are, if one assumes that they are indeed all errors and no bias was present? My guess is very long odds indeed, and that argues strongly for a non-accidental explanation. <<
Since they expect warming, the true believers are much less likely to take a second look at an error that increases warming. Errors that show cooling (along with non-errors) would be highly scrutinized.

Zeke,
The python code would be the exact wrong thing to look at.
The graphic displayed on the NASA site is not generated from the code
that NASA has released. Since the CCC project is rewriting THAT code,
your suggestion doesn’t help much. Simply, the python code doesn’t draw the graphic that EM is pointing to. What code does? Dunno.
have those charts ever been used in publication? Dunno.
Now, if journals had reproducible-results requirements (the DATA AS USED
and the CODE AS USED to produce any result: table, chart, graph, textual claim) we might be able to tell.
So, think of EM’s post as a bug report.
is there a bug?
yes.
Does the bug hit anything other than a graphic drawn on the web page?
dunno.

Harold Blue Tooth

rbroberg (15:07:35) :
After you have finished rolling your eyes would you have a look at GISTemp temperature values compared to other data sets?
GISTemp shows warmer values than all other sets. GISS is carrying on about hottest year this, 2nd hottest year that. No one else is.
ps. is the sky falling? are you living in a world of “terrifying vistas of reality”?
:rolls eyes:

Peter

A bit OT, but you have to have a look at the Pachauri video on Richard North’s blog (eureferendum)
Pachauri seems hell-bent on single-handedly destroying the credibility of the IPCC.
I mean, calling Jonathan Leake a skeptic conspirator… who would have thought?
Oh, and the Himalayan glaciers will melt by 2050, not 2035 – from the horse’s mouth.

John Whitman

Chiefio,
Do you have a recommendation on what is an effective way for me, “USA Taxpayer”, to contact GISS to request the release of the code and more open GISS forums for analysis?
I would greatly appreciate it.
John

kadaka

@ Zeke Hausfather (15:19:40) :
He said in the article he is not poking through the real code that creates this effect, FORTRAN or otherwise, as it is not publicly available.
Clear Climate Code seeks to emulate the GISTEMP program with easy-to-follow code, and otherwise make reliable code for real climate science. What value is there in playing with their stuff when one is seeking to find out how the real GISS code works?
Oh look, you cited Real Climate as an authoritative source of reliable scientific information. Yeah, that’ll sure win you some points around here.

rbroberg (15:07:35) : 9999 is ‘null value’
Smith is graphing a null value and suggesting that NASA preforms the same idiotic procedure that he does.

Um, no.
The graphs are straight from GISS. I’ve done nothing but download their graph from their software and their data. It’s 100% them, and not about me at all. So you can stop rolling your eyes and try to focus them on the graph…

Peter of Sydney

I’ve been a serious computer programmer for over two decades and I can assure you that null is not 9999. It’s typically 0, but can be other values depending on where it’s used. I have never seen 9999 for null. Besides, null literally means unknown in computer programming. So, if 9999 is somehow “bleeding” into the way they are interpolating the readings, then it’s proof that climate science is corrupt science. I doubt this is the case, but if it is then we would have irrefutable proof that they are committing a fraud, as no one, not even a high school student, would do this.

Mosh,
I agree with you on how to best debug their online maps. However, the more pernicious accusation that 9999s are bleeding through into the actual temperature data would be much easier to catch in the python implementation unless you really enjoy trying to read through poorly documented Fortran.

Zeke Hausfather (15:19:40) : You know, it might be nice to do an analysis to determine if GISS is letting 9999s bleed into the real observations before accusing them of doing so. Given how much a single 9999 would skew the global temperature (since anomalies are generally ~+/- 5 C max), it wouldn’t be too hard to catch.
Nice idea. So I ran with it. Don’t know how long this GISS map stays up on their site, but I just did 2009 vs 2008 baseline. The “red” runs up to 10.3 C on the key.
http://data.giss.nasa.gov/work/gistemp/NMAPS/tmp_GHCN_GISS_250km_Anom12_2009_2009_2008_2008/GHCN_GISS_250km_Anom12_2009_2009_2008_2008.gif
So unless we’ve got a 10 C + heat wave compared to last year, well, I think it’s a bug …
I’d suggest poking through the Python version of GISTemp instead of the old FORTRAN mess:
Nope. KEY point about debugging. You always test the code being run. The docs are nice, the pseudo code is nice, even some new translation is nice; but they do not exactly capture all the bugs. So for debugging, it’s the real deal only. (Heck, I’ve even had cases where the written code did something different from the binary – compiler bugs…)

@mosher: Does the bug hit anything other than a graphic drawn on the web page?
The bug seems to be limited to graphics that meet these two conditions:
a) NASA is reporting with a 250km Smoothing Radius
b) NASA is reporting an interval where the reporting period = the baseline period
I can’t recall seeing any such graph.
Have you?

TerryS

Re: rbroberg (15:07:35) :

9999 is ‘null value’
Smith is graphing a null value and suggesting that NASA preforms the same idiotic procedure that he does.

How do you know the 9999 is a ‘null value’? Do you have access to the code that generates the maps? Because I don’t believe anybody outside of NASA GISS has access.
What if the code that produces the 9999 is something along the lines of:
if val > MAX_VALUE then val = 9999
instead of
if val == NULL then val = 9999
or
if val == ERROR then val = 9999
Because if it’s the first, then it means the code has the potential to produce artificially high anomalies.
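To make the distinction concrete, a small sketch (speculative, since nobody outside GISS can see the real code):

```python
MAX_VALUE = 20.0    # hypothetical sanity limit in deg C
SENTINEL  = 9999.0

def clamp_style(val):
    # If the code does this, a genuine out-of-range calculation gets
    # relabelled as the sentinel - hiding an artificially high anomaly.
    return SENTINEL if val > MAX_VALUE else val

def flag_style(val):
    # If it does this, only genuinely absent data ever carries the sentinel.
    return SENTINEL if val is None else val

print(clamp_style(147.3))   # 9999.0 - but it was a computed (bad) number
print(flag_style(None))     # 9999.0 - honestly "no data"
```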

Iren

“Earlier this month, the Bureau of Meteorology released its 2009 annual climate statement. It found 2009 was the second hottest year in Australia on record and ended our hottest decade. In Australia, each decade since the 1940s has been warmer than the last…

What they leave out is that they’re only counting since 1961! Apparently, 1890 was the end of our warmest decade.

hotrod ( Larry L )

E.M.Smith (15:56:43) :

Nope. KEY point about debugging. You always test the code being run.

Very important point. Also, since we have no known version control, and most reports do not list exactly what model version was used to generate the run, you have no way of knowing if the results that appear in a given report are in fact the real output of the software version that allegedly produced the model run.
In that sense you are chasing ghosts regarding older reports where we have no hope of determining exactly what build the model run was made on.
Along with your note about the compiler errors, it is possible that a given code could run differently on different hardware due to such deeply hidden errors. Unless you can run your code on the NASA system there is no way to exclude that possibility.
Larry

John Whitman (15:46:41) : Do you have a recommendation on what is an effective way for me, “USA Taxpayer”, to contact GISS for the release of the code by and request more open GISS forums for analysis?
Well, they have released some of the code. I’ve got a copy from prior to 15 November 2009 that has a fixed 1200 km. I’ve seen one report that the new code they installed then has an RCRIT value that is not hard coded and can be either 1200 km or 256 km (but I’ve not seen the code myself… another thing for the ‘to do’ list… download the new walnut shell and start looking for where the ‘pea’ is all over again … :-{ ), so it is possible that the 15 Nov 2009 code is what makes the data that feeds this graph, but then we’re still missing the link from that data to the graph. They have contact information on their web site, FWIW. See:
http://data.giss.nasa.gov/gistemp/
where at the bottom it says Hansen does the ‘scientific’ stuff while Ruedy does the webpages and technical side. So I’d guess it’s him:
http://www.giss.nasa.gov/staff/rruedy.html
contact info is on his page.

kadaka

*sigh*
Why does moderation involve a LIFO stack instead of FIFO? While it is mildly amusing to see the order in which comments miraculously appear after a long moderator break, this is more than offset by the irritation of seeing the newest posts appear (and get replied to) before the older, longer-waiting ones (like mine), and by the difficulty of following a thread when you have to check backwards from the “newest” to see if others have suddenly appeared since the comments were last checked.
Wouldn’t it be easier for the moderators to go to FIFO, besides being easier for us readers?
[Reply: You are right. But WordPress sets the template, and it doesn’t make starting moderation with FIFO simple. It’s a hassle to go back for as many pages as necessary to start at the oldest post. Sometimes I do it that way, if the posts are on the first Edit page. When that’s the case I’ll start doing them from the bottom up from now on. Just speaking for myself here. ~dbs]

CrossBorder

@rbroberg (15:07:35) :
“Smith is graphing a null value and suggesting that NASA preforms the same idiotic procedure that he does.”
NASA pre-forming some procedure – looks just about what they might be doing. Hmm, some PERformance!
(Signed, another grammar n*zi)