Quote Of The Week #13


From Gary Strand, software engineer at the National Center for Atmospheric Research (NCAR) commenting on Climate Audit:

As a software engineer, I know that climate model software doesn’t meet the best standards available. We’ve made quite a lot of progress, but we’ve still quite a ways to go.

I’ll say. NASA’s GISS Model E, written in some of the worst FORTRAN coding ever seen, is a challenge to even get running. NASA GISTEMP is even worse. Yet our government has legislation under consideration that rests significantly on output from the models Jim Hansen started. His 1988 speech to Congress was based entirely on model scenarios.

Do we really want Congress to make trillion-dollar tax decisions today based on “software [that] doesn’t meet the best standards available”?

There’s more. Steve McIntyre comments:

Re: Gary Strand (#56),

Gary, if this is what you think, then this should have been reported in IPCC AR4 so that politicians could advise themselves accordingly. I do not recall seeing any such comment in AR4 – nor for that matter in any review comments.

…and to the second part of the comment:

Re: Gary Strand (#56),

If we can convince funding agencies to better-fund software development, and continued training, then we’ll be on our way. It’s a little harsh, IMHO, to assign blame to software engineers when they’re underpaid and overworked.

Boo-hoo. Hundreds of millions of dollars, if not billions, are being spent. Perhaps the money should be budgeted differently, but IMO there’s an ample amount of overall funding to have adequate software engineers. Maybe there should be some consolidation in the climate model industry, as in the auto industry. If none of the models have adequate software engineering, then how about voluntarily shutting down one of the models and suggesting that the resources be redeployed so that the better models are enhanced?

I’m not making this QOTW to pick on Gary Strand, though I’m sure he’ll see it that way. It is a frank and honest admission by him. I’m making it QOTW because Gary highlights a real problem that we see when we look at code coming from NASA GISS.

But don’t take my word for it, download it yourself and have a look. Take it to a software engineer at your own company and ask them what they think.


GISS Model E global climate model source here

GISTEMP (surface temperature analysis) source here

Sure, this is one of many climate modeling programs out there, but it happens to be the most influential, since GISS and GISTEMP are the most widely cited outputs in the popular media.

U.S. industry seems to do a better job of software development than government programs, because in business, if something doesn’t work, or doesn’t work well, contracts get lost and/or people get fired. There are consequences for shoddy work.

In academia, the solution is usually to ask for more grant money.


164 Comments
Pete W
July 6, 2009 5:05 pm

henrychance (15:17:41) :
Thank you for the clarification/education. So there is an actual degree in “software engineering”. Good to know. I’m just ruffled by people that call themselves engineers when they aren’t. M. Simon quite likely is one.
Bobn (16:36:30) :
I agree.
In addition to being complex the climatology models are – and are supposed to be – constantly changing as they learn more of the natural forcings at play. To suggest they should be required to stop now and re-write the code from scratch is… shall we say… not very reasonable.
The models they are working with appear to have been proven fairly accurate over the last 20 years. This video contains some historical perspective on climate models.

Pete

Chris Reed
July 6, 2009 5:41 pm

I had to take a FORTRAN 90 class while I was studying for my Meteorology degree at Texas A&M. I had to write three fairly simple programs in FORTRAN 90. It took me most of my study time during the weeks they were due just to work out all the bugs. This was the reason I decided not to go to grad school: I did not want to deal with FORTRAN 90 anymore. Once I was finished with the class, I did not write any more programs in FORTRAN 90. I did not like the syntax of that programming language. I did write a program for automating trading on the FOREX market a few weeks ago, and that took two days to complete. A program of the same length in FORTRAN 90 took a week to complete because of the debugging.
I did some modeling as a student at Texas A&M, but never dealt with the actual FORTRAN code. I used the MM5 model to model the weather conditions around the Houston area in the summer of 2000. I spent two weeks just reading how to set up the model and run it properly. I then spent about a month just looking at how the model performed during that time period. If something did not look right or was way off, we made adjustments to the model and reran it.
As a Meteorologist, I learned to look at the model as a tool. If a model did not perform well, I did not use it, or use it as a base for my forecast. I learned that skill quite well thanks to many more seasoned Meteorologists who taught me to look at the model in that way. I bet that is a skill none of the people who work on these climate models ever learned: looking at the model and recognizing when it is not performing at an acceptable level.

Mike Bryant
July 6, 2009 5:46 pm

A plumber comments:
It seems that an engineer is in search of reality, while the climate model programmer is in search of continued funding.
Real data in = future warming = continued funding
Real data in = future equivocal = new job opening
Of course I could be wrong…
Mike Bryant

Andrew
July 6, 2009 5:50 pm

If “the models” are “fairly accurate” why are they “constantly changing”?
Is it because of their proven “accuracy” that someone feels the need to “constantly change” them?
Andrew

July 6, 2009 5:57 pm

My two Baht’s worth. I’ve worked with computers all my life as a field engineer (glorified mechanic for the most part) and as a code developer who knew the hardware side. My experience has been that the programmer is given specific parameters the code is to work within, and away he/she goes. Personally I have never written a piece of “clean code” on the first try. Just keeping all the “if this then that” routines functioning together is challenging, especially on a tight schedule.
The programmer finally makes the original “assumptions” work as directed, then someone says they actually want something else because of unexpected developments. This is a programmer’s worst nightmare and can impact thousands of lines of code, not to mention the new coding required for the “unexpected developments”. Computer coding is a dangerous thing if it changes on a regular basis. I once developed a program to track a billion dollar project. Actually the original assignment was easy, even with the tight deadline. As the project developed, management changed their wants and needs, and they always wanted it tomorrow. By the time the project ended, I had a bloated program that worked but had become such a nightmare that no one but me would ever be able to follow my logic. Ugly.
I actually feel sorry for the guys who are driven like slaves to accommodate the “climate scientists” whims. It has to be frustrating.
Just my opinion.
Jesse

Doug Ferguson
July 6, 2009 6:00 pm

John Balttutis (15:30:18) :
John(above) seems to be one of the few commentators here that have focused on the science and not the programming. (Lot’s of touchy programers out there!)That was the whole point of my story on Edward Lorenz above. Good or bad programing(providing it works at all) has more to do with maintaining it and modifiying it(the software model) than it does with whether or not it is a good model. It’s the assumptions on which the model is built is what matters. Lorenz sensed and later showed that chaotic systems like our climate and any other turbulant system are impossible to model for any long range forecasting or predicting because they are not systematic and never really repeat their cycles exactly. Until we can accurately model clouds and cloud behavior, we don’t have a chance of modeling our whole climate with it’s interaction with the earth’s ecosystems.

Bill Illis
July 6, 2009 6:10 pm

The climate models do not magically produce results. The problem is not code-related.
They are designed to follow global warming theory on how temperatures increase as GHGs increase. Different modelers will have slightly different assumptions about the temp impact or build in longer lag times or incorporate different assumptions about feedbacks like water vapour or have different assumptions about Aerosols.
But the climate models are just weather models, with a LN (GHGs) module and then some random white noise processes built-in. They all are building toward +3.0C by 2100 give or take and a simple spreadsheet would work just as well.
It is not a question of whether the code is done properly; it is a question of whether the theory is right in the first place. So far, the theory needs lots of little plugs to plug the Watt leakages.
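Illis’s “simple spreadsheet” point is easy to sketch. The toy calculation below uses the standard logarithmic CO2 forcing relation together with an assumed climate sensitivity per doubling; the 3.0 °C default and the concentration growth scenario are illustrative assumptions, not values taken from any particular model.

```python
import math

def delta_t(c_new, c_ref, sensitivity_per_doubling=3.0):
    """Equilibrium warming for a CO2 change, using the standard
    logarithmic forcing approximation. The 3.0 C-per-doubling default
    is an illustrative assumption, not a value from any model."""
    return sensitivity_per_doubling * math.log(c_new / c_ref) / math.log(2.0)

# Toy scenario (illustrative, not a real emissions pathway):
# ~385 ppm in 2009, growing 0.5% per year out to 2100.
c_2009 = 385.0
c_2100 = c_2009 * 1.005 ** (2100 - 2009)
warming = delta_t(c_2100, 280.0)  # warming relative to a 280 ppm baseline
print(round(warming, 2))
```

A few lines like this reproduce the headline “degrees C by 2100” number from nothing but a growth rate and an assumed sensitivity, which is exactly the point: the interesting question is whether the assumed sensitivity is right, not how the arithmetic is coded.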

Doug Ferguson
July 6, 2009 6:15 pm

Sorry about the mispelled words in the previous post. I just rushed it out without the engineer’s handy tool, Spellcheck!

Allan M R MacRae
July 6, 2009 6:24 pm

ON HOW CLIMATE MODELS OVERSTATE GLOBAL WARMING
Edited from a previous post:
Allan M R MacRae (12:54:27) :
“There are actual measurements by Hoyt and others that show NO trends in atmospheric aerosols, but volcanic events are clearly evident.”
But increased atmospheric CO2 is NOT a significant driver of global warming – that much is obvious by now.
What has that to do with aerosols?
**************************
The Sensitivity of global temperature to increased atmospheric CO2 is so small as to be inconsequential – much less than 1 degree C for a doubling of atmospheric CO2.
Climate models assume a much higher Sensitivity, by assuming that CO2 feedbacks are positive, when in fact there is strong evidence that these feedbacks are negative.
Climate model hindcasting fails unless false aerosol data is used to “cook” the model results.
Connecting the dots:
The false aerosol data allows climate model hindcasting to appear credible while assuming a false high Sensitivity of global temperature to atmospheric CO2.
The false high Sensitivity is then used to forecast catastrophic humanmade global warming (the results of the “cooked” climate models).
What happens if the false aerosol data is not used?
No false aerosol data > no credible model hindcasting > no false high climate Sensitivity to CO2 > no model forecasting of catastrophic humanmade global warming.
Regards, Allan
Supporting P.S.:
Earth is cooling, not warming. Pass it on…
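The sensitivity dispute MacRae raises comes down to the assumed feedbacks. A minimal sketch of the textbook linear-feedback relation, dT = dT0 / (1 - f), shows how the same no-feedback response yields very different sensitivities depending on the sign of the assumed net feedback; the ~1.2 °C no-feedback (Planck-only) response and the feedback fractions tried below are illustrative assumptions, not values from any particular model.

```python
def equilibrium_sensitivity(no_feedback_response=1.2, feedback_fraction=0.0):
    """Textbook linear-feedback relation: dT = dT0 / (1 - f).
    The ~1.2 C no-feedback response per CO2 doubling and the feedback
    fractions tried below are illustrative assumptions."""
    return no_feedback_response / (1.0 - feedback_fraction)

# Net-negative, zero, and net-positive feedback cases.
for f in (-0.5, 0.0, 0.5):
    print(f, round(equilibrium_sensitivity(feedback_fraction=f), 2))
```

With f = +0.5 the response doubles to about 2.4 °C; with f = -0.5 it drops below 1 °C, which is roughly the shape of the disagreement between the models and MacRae’s claim.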

Steve in SC
July 6, 2009 6:38 pm

I am not a software engineer.
I am an engineer. I have both mechanical and electrical degrees, with many years of dealing with chemists and chemical engineering PhDs in research.
Software is a necessary evil. Make no mistake about it: it is evil.
That said, I have written software for 30+ years. It is evil. I started with Fortran IV in 1967. I make no claim to be an expert, but I do have some small degree of competence and a physical understanding of phenomena that most folks do not possess.
There are coders, developers, software engineers, and programmers behind every bush. Competent ones are very difficult to find. The rest are just code pumps.
Regarding the disparaging comment about someone being self taught, Newton was self taught.
A lot of this code is indeed ancient. It does have the lava flow signature and appears to be patched in a random sort of way. (I guess they were afraid to touch it after they got it to run.)
While it is not undocumented I would have preferred it to be a little more heavily commented.
While not completely horrible ( what I have looked at), it is fairly painful.
The long and short of my brief analysis is that given a monster set of initial conditions, this thing will chew on it for a period of time and regurgitate factored data based on their assumptions of the way the world works. Therein lies the rub, as it would seem to me that these guys haven’t figured that part out yet. (really nobody else has either but a lot of their assumptions are faulty.)
If these people really wanted to have a good or even better than they have model they would throw the flow chart for this beast out there and let everyone have at it. Let everyone pick it apart and accept reasonable suggestions then code from that. I doubt very seriously that a flow chart for this beastie has ever been done. Yeah, I do flow charts because I’m lazy and don’t like doing stuff over N+1 times.
Just my $0.02 worth.

Jim
July 6, 2009 6:51 pm

Jesse (17:57:27) : The situation you describe, with the requirements changing as time goes by, is normal. That’s what good programming practices are for – making all that change manageable.

Jim
July 6, 2009 6:57 pm

Allan M R MacRae (18:24:23) : I’m glad you brought up aerosols. Something has been bothering me. The alarmists say aerosols cool, but some clouds warm. But isn’t a cloud in fact an aerosol? The particle size can vary from the CN (condensation nucleus) to a large hailstone, but most of it is an aerosol, no? If aerosols cool, shouldn’t clouds that have not progressed to precipitation also cool? (I realize clouds bring a lot more to the table than that, but the aerosol thing has been bothering me.)

Reply to  Jim
July 6, 2009 6:58 pm

Jim:
The term aerosol when used by AGW proponents usually means sulphate aerosols.

henrychance
July 6, 2009 7:06 pm

Gary Strand was close to providing 2 answers for 2 questions. Did I miss them? Is Gary now staying away? I personally find modeling in Fortran interesting.

Jim
July 6, 2009 7:08 pm

jeez (18:58:44) : I thought aerosols acted by virtue of their small size. What’s the significant difference between sulfate aerosols and hydroxyl ones?

Reply to  Jim
July 6, 2009 7:10 pm

I’m not qualified to discuss aerosol chemistry. I was addressing your original question.

July 6, 2009 7:09 pm

Mike Bryant (16:54:01) :

I miss Gary Strand… He should be commenting here…

Yes, he should be. But Mr. Strand is hiding out. He has been hiding out ever since Caspar and the Jesus Paper was mentioned. Strand put his tail between his legs and ran off yelping. [Can’t blame him; that link completely destroys Strand’s whole argument.]
Also, thanx to Bob Tisdale for his link to a very interesting back-and-forth with several other commenters: click. I’ve just re-read the entire thread, and it’s as good as it gets in the blogosphere. The putative “authority”, Mr. Strand, gets positively owned; as soon as Bishop Hill’s well-documented article was mentioned, Strand headed for the hills.


crosspatch
July 6, 2009 7:46 pm

If I were to build a financial model with the same results compared to observations as these climate models, I would be fired. Unless, of course, I was modeling the housing market for a mortgage guarantee pseudo-government corporation. In that case I would get a bonus.
Orwell is laughing.

Frederick Michael
July 6, 2009 7:49 pm

William Woody (10:32:26) :
has it just about right.
I LIVE in FORTRAN — and I’m not underpaid. I was literally up until 2am last night coding an FFT post-processor in FORTRAN. I have a customized version of the old Personal Editor that converts FOR to F90 fast. The LF95 compiler makes FORTRAN the best for serious number crunching.
This old code doesn’t look that bad to me. I DO NOT have time to fiddle with it this week but, with only one or two small miracles, will have time in a week or two. I’ll try to convert it to something readable.

Milwaukee Bob
July 6, 2009 8:19 pm

I wrote my first code in 1963, in a language? called SPS. Then Autocoder, Fortran, RPG, COBOL, Basic. (and now HTML) But I stopped being a programmer because I realized that it was nothing but being a slave to the system! So I became a systems analyst. But I stopped being a systems analyst because I realized that it was nothing but being a slave to the guy holding the money! So I became one of those guys with the money. And I made a lot of money because I remembered what it was like to be a slave to IT and them that hold it. THAT’S PROBABLY WHERE EVERY “PROGRAMMER” OF GLOBAL CLIMATE MODELS IS AT: A SLAVE TO WHATEVER ANSWERS THE GUYS WITH THE MONEY WANT TO SEE.

However, I believe what every one of them knows, but would NEVER admit to, is that no digital computer, no matter how perfectly programmed, will ever be able to “model” the super complex ANALOG system we know as global climate. Will there ever be a computer (and language) capable of such? Probably, but it’ll be a “Bionary” (remember you read it here first) unit?, device?, that will be based in nano-biological functionality and be capable of self-correcting its “programming” as it runs processes and finds out-of-bounds anomalies throughout the “steps” therein. And these “steps” (functions) will not be limited to set mathematical increments; they will be analog at their core, providing an infinite range of conditions and interactions within themselves. Even with such a device it may not be possible to model the global climate, or at least not for a long time. Why? Because we do not even know WHAT WE DO NOT KNOW!

We don’t know enough about the oceans, clouds, dust from land, dust from space, CO2 dispersions and concentrations, and even that big bright ball in the sky to fill a small bit bucket, much less feed into a computational device and expect that some lowly programmer has somehow written the code of the century that is going to spit out what the temperature at 6 ft above the ground is going to be at the corner of 5th and Main in Timbuktu at 3:31 am on the 14th day of July, 2077. And with historical record of any of the above (and more) how could we ever test “to reality” what we created?

I can program a computer to whistle Dixie while dancing an Irish jig and tell you that in 10 years you’ll be 10 years older – if something doesn’t happen. That doesn’t mean the computer was built in Ireland or programmed at Georgia Tech, or make me a scientist. AND it doesn’t somehow magically put into my hands the data I need to RUN THE MODEL. You can have the most powerful computer physically possible (even a Bionary one) and the most sophisticated language and a genius of an analyst and of course a master coder, and without the RIGHT data, YOU GOT NOTHING.

Milwaukee Bob
July 6, 2009 8:22 pm

“And with historical record of any of the above…” That should be: “And without a historical record of any of the above…”

GlennB
July 6, 2009 8:28 pm

Frederick,
Convert? Gak! This may be of interest to you if you haven’t seen it:
“The GISS GCM model development process is over 25 years old…”
http://www.giss.nasa.gov/tools/modelE/modelE.html
from Anthony’s link
http://www.giss.nasa.gov/tools/modelE/

Doug Ferguson
July 6, 2009 8:48 pm

To Milwaukee Bob,
Well said and Amen!

CodeTech
July 6, 2009 8:59 pm

Steve in SC, if you’re referring to my comment as “disparaging”, I didn’t mean it that way. I wrote:

There is a difference between someone writing code using established methodologies and someone who taught themselves and gets mired in an oversized project.

Many times I have had to clean up after people who got bogged down in something bigger than they can handle. Part of the “established methodologies” I refer to include stepping back and seeing the big picture, then breaking down or modularizing into sections, then concentrating on building functionality for each module or section, then optimizing.
Most self-taught programmers are very good at one or more of these steps, but few do all of them. Obviously, since I’m writing this, I like to think I am one of those; at least I keep getting called back, and others who follow my work never wade through several feet of muck to get where they need to be.
Either way, I’m trying to summarize what many have also tried:
NO, it’s not all about the code. Sure, it’s about the physics. However, without the code to support the physics you have muck, you can’t verify or confirm accuracy, you can’t test anything, and you have a horrid time making any changes.
The whole point of spreadsheets, for example, was to take programming out of the picture and allow those who worked with numbers to just go ahead and plug in numbers. Yes, programmers were involved, you just didn’t see them. VisiCalc and Lotus 1-2-3 created something of a revolution by letting the numbers and formulas people just work with numbers and formulas, and not have to fight with the computer as well.
Maybe we need to see the development of an independent, publicly accessible, open source GCM, and allow people to share their tweaks and assumptions. Surely then we could come up with a better model.
(yes, I know the current group don’t actually WANT a better model)

GlennB
July 6, 2009 9:02 pm

Smokey (19:09:54) :
Mike Bryant (16:54:01) :
I miss Gary Strand… He should be commenting here…
“Yes, he should be. But Mr. Strand is hiding out.”
Gary has been over at climateaudit puffing up and obfuscating.

Andrew
July 6, 2009 9:07 pm

“These codes are what they are – the result of 30 years and more effort by dozens of different scientists (note, not professional software engineers), around a dozen different software platforms and a transition from punch-cards of Fortran 66, to Fortran 95 on massively parallel systems.” – Gavin Schmidt, RealClimate.org