
Guest Post by Steven Goddard
Global Climate Models (GCMs) are very complex computer models containing millions of lines of code, which attempt to model the cosmic, atmospheric and oceanic processes that affect the earth’s climate. They have been built over the last few decades by groups of very bright scientists, including many of the top climate scientists in the world.
During the 1980s and 1990s, the earth warmed at a faster rate than it did earlier in the century. This led some climate scientists to develop a high degree of confidence in models which predicted accelerated warming, as reflected in IPCC reports. However, during the last decade the accelerated warming trend has slowed or reversed. Many climate scientists have acknowledged this and explained it as “natural variability” or “natural variations.” Some believe that the pause in warming may last as long as 30 years, as recently reported by the Discovery Channel.
But just what’s causing the cooling is a mystery. Sinking water currents in the north Atlantic Ocean could be sucking heat down into the depths. Or an overabundance of tropical clouds may be reflecting more of the sun’s energy than usual back out into space.
“It is possible that a fraction of the most recent rapid warming since the 1970’s was due to a free variation in climate,” Isaac Held of the National Oceanic and Atmospheric Administration in Princeton, New Jersey wrote in an email to Discovery News. “Suggesting that the warming might possibly slow down or even stagnate for a few years before rapid warming commences again.”
Swanson thinks the trend could continue for up to 30 years. But he warned that it’s just a hiccup, and that humans’ penchant for spewing greenhouse gases will certainly come back to haunt us.
What has become obvious is that there are strong physical processes (natural variations) which are not yet understood, and are not yet adequately accounted for in the GCMs. The models did not predict the current cooling. There has been lots of speculation about what is causing the present pattern – changes in solar activity, changes in ocean circulation, etc. But whatever it is, it is not adequately factored into any GCMs.
One of the most fundamental rules of computer modeling is that if you don’t understand something and you can’t explain it, you can’t model it. A computer model is a mathematical description of a physical process, written in a human-readable programming language, which a compiler can translate to a computer-readable language. If you cannot describe a process in English (or your native tongue), you certainly cannot describe it mathematically in Fortran. The Holy Grail of climate models would be the following function, which of course does not exist.
      FUNCTION FREEVARIATION(ALLOTHERFACTORS)
C     Calculate the sum of all other natural factors influencing the temperature
      .....
      RETURN
      END
Current measured long-term warming rates range from 1.2-1.6C/century. Some climatologists claim 6+C for the remainder of the century, based on climate models. One might think that these estimates are suspect, due to the empirically observed limitations of the current GCMs.
As one small example, during the past winter NOAA’s Climate Prediction Center (CPC) forecast that the upper Midwest would have above-normal temperatures. Instead, temperatures were well below normal.
http://www.cpc.ncep.noaa.gov/products/archives/long_lead/gifs/2008/200810temp.gif
http://www.hprcc.unl.edu/products/maps/acis/mrcc/Last3mTDeptMRCC.png
Another, much larger, example is that the GCMs would be unable to explain the causes of ice ages. Clearly the models need more work, and more funding. The BBC printed an article last year titled “Climate prediction: No model for success.”
And Julia Slingo from Reading University (now the UK Met Office’s Chief Scientist) admitted it would not get much better until they had supercomputers 1,000 times more powerful than at present.
“We’ve reached the end of the road of being able to improve models significantly so we can provide the sort of information that policymakers and business require,” she told BBC News.
“In terms of computing power, it’s proving totally inadequate. With climate models we know how to make them much better to provide much more information at the local level… we know how to do that, but we don’t have the computing power to deliver it.”
……
One trouble is that as some climate uncertainties are resolved, new uncertainties are uncovered.
Some modellers are now warning that feedback mechanisms in the natural environment which either accelerate or mitigate warming may be even more difficult to predict than previously assumed.
Research suggests the feedbacks may be very different on different timescales and in response to different drivers of climate change.
…….
“If we ask models the questions they are capable of answering, they answer them reliably,” counters Professor Jim Kinter from the Center for Ocean-Land-Atmosphere Studies near Washington DC, who is attending the Reading meeting.
“If we ask the questions they’re not capable of answering, we get unreliable answers.”
I am not denigrating the outstanding work of the climate modelers – rather I am pointing out why GCMs may not be quite ready yet for forecasting temperatures 100 years out, and that politicians and the press should not attempt to make unsupportable claims of Armageddon based on them. I would appreciate it if readers would keep this in mind when commenting on the work of scientists, who for the most part are highly competent and ethical people, as is evident from this UK Met Office press release.
Stop misleading climate claims
11 February 2009
Dr Vicky Pope
Dr Vicky Pope, Met Office Head of Climate Change, calls on scientists and the media to ‘rein in’ some of their assertions about climate change.
She says: “News headlines vie for attention and it is easy for scientists to grab this attention by linking climate change to the latest extreme weather event or apocalyptic prediction. But in doing so, the public perception of climate change can be distorted. The reality is that extreme events arise when natural variations in the weather and climate combine with long-term climate change. This message is more difficult to get heard. Scientists and journalists need to find ways to help to make this clear without the wider audience switching off.”

Let me check if I understand this correctly. Those “leading climate scientists” produce climate models whose predictions range anywhere between the oceans freezing and boiling. And as the general public is eager to know which it is going to be, more funding is just what is needed to solve the problem?
I find it interesting that people believe they can model our entire climate system when in fact they cannot even model hurricane activity with any reliable precision. Take hurricane Katrina for example (the AGW poster child): models were not (and are not) able to accurately predict landfall location or intensity any further than 24-48 hours out. We are talking about a tiny fraction of our climate system here, and one that is observable over a series of weeks. Yet if you look at hurricane path predictions, they are quite vague; results vary greatly between models (that is why they average several, as sketched below), and those predictions are updated and changed frequently. Hurricane models have been around for quite a long time and have probably improved more than almost any other type of chaotic-system model. They are getting pretty good, in relative terms; however, even those models are far from what I would call “accurate”.
This all raises the question: if models cannot predict hurricane paths and intensity with any discernible accuracy beyond 48 hours, why would people have so much faith in predictions from models that are much more complex, and that try to predict global temperatures to a tenth of a degree, precipitation in every region around the world, sea levels around the world, polar ice coverage, glacial ice content around the world, and more, and make those predictions for 50, 100, 200, or 1,000 years out?
I would question the mere sanity of any individual that would assert such a notion.
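At its simplest, that “consensus” averaging of hurricane models is just a point-wise mean of the forecast positions. A rough sketch in the same Fortran spirit as the article (the positions below are made up for illustration, not real model output):

      PROGRAM CONSEN
C     Consensus hurricane position as the point-wise mean of several
C     model forecasts at one lead time (coordinates are made up).
      INTEGER N, I
      PARAMETER (N=4)
      REAL LAT(N), LON(N), MLAT, MLON
      DATA LAT /25.1, 25.9, 24.6, 25.4/
      DATA LON /-80.2, -81.0, -79.5, -80.6/
      MLAT = 0.0
      MLON = 0.0
      DO 10 I = 1, N
         MLAT = MLAT + LAT(I)
         MLON = MLON + LON(I)
   10 CONTINUE
      WRITE(*,*) 'Consensus lat/lon:', MLAT/N, MLON/N
      END

Real consensus products are more elaborate (some weight or bias-correct the member models), but the principle is the same.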
Don Keiller (11:55:42) :
Where? And what’s the point?
If burning all the fossil fuels currently known to exist fails to make the ocean acidic (it would actually end up slightly alkaline, assuming sea critters and plants did nothing to sequester any of it), what possible purpose is gained by such a demonstration?
Global climate models are extraordinarily useful tools. Among other things, they have successfully predicted the rise in temperature as greenhouse gases increased, the cooling of the stratosphere even as the troposphere warmed, the increase in the height of the tropopause, polar amplification due to the ice-albedo feedback, the greater increase in nighttime than in daytime temperatures, and the magnitude and duration of the cooling and the water vapour feedbacks caused by the eruption of Mount Pinatubo.
One is tempted to ask what is meant by ‘the current cooling’, given that the 18-month, 10-year, 20-year and 30-year trends are all positive. Is it the modest cooling trend that has occurred over the last 8 years or so? If so, it is not true, nor particularly relevant, to say that no model predicted this. IPCC projections are based on the aggregated results of many model runs, averaged out. Some of these models did predict the cooling trend, while most did not; however, over such a short period the ‘noise’ is large compared to the long-term signal. Here James Annan shows that since 2001, 95% of the model predictions are within the uncertainty band of the observations. In normal statistical usage that means that the two are consistent. Lucia agrees:
Using all the above, I find that the best estimate of the underlying trend in GMST based on the average of 38 runs (2.22 C/century) is not inconsistent to a significance level of 95% with the observed trend of -0.59 C/century.
Climate is by definition a long-term phenomenon, and we can see the longer-term model-based projections made by the IPCC here. The midrange scenario A2 projects a temperature rise from 1990-2010 of 0.35C, equivalent to a linear increase of 0.175C per decade. How are they doing? Well, the trends in the four main indices since 1990 (in C/decade) are:
UAH 0.168
Hadley 0.171
GISS 0.183
RSS 0.185
(That GISS is such an outlier, huh? ;-). Do these observations increase or decrease our confidence in the models?
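For anyone wanting to reproduce such numbers, the underlying calculation is an ordinary least-squares trend. A minimal sketch in the article’s Fortran style (the anomaly values below are invented for illustration, not taken from any actual index):

      PROGRAM TREND
C     Ordinary least-squares slope of an anomaly series, reported
C     in C/decade.  The data below are invented for illustration.
      INTEGER N, I
      PARAMETER (N=5)
      REAL T(N), A(N), SX, SY, SXX, SXY, SLOPE
C     Years since 1990 (offsetting the years avoids precision loss)
      DATA T /0., 5., 10., 15., 20./
      DATA A /0.25, 0.32, 0.40, 0.48, 0.56/
      SX = 0.0
      SY = 0.0
      SXX = 0.0
      SXY = 0.0
      DO 10 I = 1, N
         SX = SX + T(I)
         SY = SY + A(I)
         SXX = SXX + T(I)*T(I)
         SXY = SXY + T(I)*A(I)
   10 CONTINUE
      SLOPE = (N*SXY - SX*SY) / (N*SXX - SX*SX)
      WRITE(*,*) 'Trend (C/decade):', 10.0*SLOPE
      END

In practice one would use monthly data and also compute the uncertainty of the slope, which is what the consistency tests mentioned above rely on.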
maksimovich (11:09:17) :
There are physical reasons beyond the issues of dealing with a chaotic system, which I assume is the reason for your mirth. Current models use a grid that’s too coarse to model thunderstorms. Using a 10X finer resolution would be a big help, but that means a thousand-fold increase in the number of data points if all three spatial axes were refined, and more if you also increase the time resolution. (I suspect you could make a decent case for 5X the vertical resolution and 2X the time resolution.) Poor resolution also makes modeling hurricanes atrocious, and those move a huge amount of heat and water poleward.
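To make that arithmetic explicit: refining each of the three spatial axes by a factor R multiplies the grid points by R**3, and a stability-limited (CFL) explicit time step typically shrinks by another factor of R on top of that. A trivial sketch:

      PROGRAM GRIDCO
C     Back-of-envelope cost scaling for grid refinement by factor R:
C     R**3 from the three spatial axes, R**4 once the CFL-limited
C     time step is refined as well.
      INTEGER R
      R = 10
      WRITE(*,*) 'Spatial grid-point factor (R**3):  ', R**3
      WRITE(*,*) 'Including time refinement (R**4): ', R**4
      END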
I don’t see the chaotic effects as being significant – we’re not forecasting the weather, we’re forecasting the boundary limits and chaotic attractors, i.e. climate.
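A toy illustration of that weather-versus-climate distinction (this is the Lorenz 1963 system, not anything from a GCM): two runs from nearly identical initial states diverge completely as trajectories, yet both stay bounded on the same attractor, which is the “climate” of the system.

      PROGRAM LORENZ
C     Two Lorenz-63 runs from nearly identical initial states.
C     The trajectories diverge (weather), but both remain on the
C     same bounded attractor (climate).
      REAL X1, Y1, Z1, X2, Y2, Z2, DT
      INTEGER I
      X1 = 1.0
      Y1 = 1.0
      Z1 = 1.0
      X2 = 1.000001
      Y2 = 1.0
      Z2 = 1.0
      DT = 0.005
      DO 10 I = 1, 10000
         CALL STEP(X1, Y1, Z1, DT)
         CALL STEP(X2, Y2, Z2, DT)
   10 CONTINUE
      WRITE(*,*) 'Run 1:', X1, Y1, Z1
      WRITE(*,*) 'Run 2:', X2, Y2, Z2
      END

      SUBROUTINE STEP(X, Y, Z, DT)
C     One forward-Euler step of the Lorenz 1963 equations with the
C     classic parameters sigma=10, rho=28, beta=8/3.
      REAL X, Y, Z, DT, DX, DY, DZ
      DX = 10.0*(Y - X)
      DY = X*(28.0 - Z) - Y
      DZ = X*Y - (8.0/3.0)*Z
      X = X + DT*DX
      Y = Y + DT*DY
      Z = Z + DT*DZ
      RETURN
      END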
BTW, one thing that hasn’t been discussed here, and shouldn’t be discussed much, is NCAR’s CCSM (Community Climate System Model). Source and lots of documentation is at http://www.ccsm.ucar.edu. It’s supported on various supercomputers, but some people use it on Linux clusters. More at http://chiefio.wordpress.com/2009/03/06/gistemp-start_here/#comment-101
If You Can’t Explain It, You Can’t Model It
The headline is wrong. You might want to talk to anyone who does any part of “modeling”.
Start with railways:
http://www.elhamvalleylinetrust.org/Railway%20model.jpg
and check whether those who build the model can explain everything.
I understand if you bundle 100 sub-prime mortgages you get a great investment opportunity.
Models without validation are just so much arithmetic without meaning. Build your models, set T0 = 1 AD and predict the 2000 years that follow. That’s what real engineers would do.
Roger Sowell (11:37:18) :
At no point will the judge accept any single view as “settled science.”
Just a shame that in the UK’s Kingsnorth power station vandalism trial only one view was on show. The prosecution wasn’t about to disturb the consensus and put up anyone to dispute Mr Hansen’s views.
Dr Vicky Pope also contradicted herself in a Met Office press release for the recent Copenhagen conference. The first sentence reads:
“If greenhouse gas emissions continue to rise at current rates, then global temperatures could rise by more than 6C over this century”
The last sentence is slightly modified:
“Dr Pope warns: If the world fails to make the required reductions, it will be faced with adapting not just to a 2C rise in temperature but to 4C or more by the end of the century.”
To me this is the very same alarmist propaganda which she had earlier criticised.
I have to say that a model using neural nets, as Tsonis et al. have done, if used judiciously so that the differential equations that enter the problem are also simulated, would be a better direction to go in creating new models.
I have been asking whether an analogue computer could not be devised for climate, where the coupled differential equations are entered as electrical elements, and whether it would be more successful and economical in describing the time evolution of climate.
On the subject of processing power, many of Hansen’s predictions were based on very early models running on computers that were woefully underpowered compared to today’s. Yet years later he is still standing by the output of that crude technology.
I think you’re overblowing it with the “highly complex” and “millions of lines of code” stuff. As you should know, an overly massive amount of code usually means that most of it is unnecessary bloat from being badly written. The fundamental equations and computations that are used in these models can be written in very few lines of code. We don’t know so much about Hadley (except that it’s in Fortran too), but GISS is 70s Fortran, largely written by students, poorly documented, with massive amounts of redundancy and duplication, some obviously unchecked coding errors, and very little in the way of validation. It is very far from being commercial-quality code, or even good academic code (of which there is a lot). Modern multiphysics or fluid-dynamics software, and even those ultimately crappy financial models, are all a good deal more complex/sophisticated and written by far better programmers. In fact I’ll guarantee that the coding for the movie Shrek was about 100 times more mathematically complex. As for modeling the atmosphere with 3-deep elements, each one the size of Florida – words fail me.
John Philips,
The cooling trend which NOAA’s Isaac Held was referring to above has been over the last decade.
Since 2002, RSS and UAH both show cooling at a rate of nearly 4C/century.
http://www.woodfortrees.org/plot/uah/from:2002/trend/plot/rss/from:2002/trend
Which model predicted that?
I should have said “Kyle Swanson of the University of Wisconsin-Milwaukee” rather than “NOAA’s Isaac Held” in the last post.
http://www.msnbc.msn.com/id/29469287/wid/18298287/
Slightly OT, though the question of scientific veracity versus personal belief systems and associated personal actions is relevant.
Is Dr. Hansen on a sabbatical?
I read that he is going to lead another demonstration – this time in the UK.
http://www.timesonline.co.uk/tol/news/uk/article5908377.ece
Perhaps our alleged Prime Minister could arrange to meet him and ask him to take the DVDs Mr. Obama loaned him on his recent trip to Washington back to the White House video library. Brown must have had time by now to watch all of them. I assume Barry’s gift was just a holding gesture whilst he arranged to make Hansen available.
Why IS the US government paying one of its employees (one way or another) to act as a focal point for troublemakers in another country? Is this a new direction for foreign policy?
“Sadly Vicky Pope contradicted herself within a matter of a few weeks. Just last week she said that 85% of the Amazon was going to disappear because of even modest climate change. This alarmist information came to her by way of a computer model.
The study, which has been submitted to the journal Nature Geoscience, used computer models to investigate how the Amazon would respond to future temperature rises.”
1) The unpublished paper has yet to pass “peer review”, as I understand it (Revkin, NYT).
2) The ‘model’ (without having to even read it) will encompass a number of mathematical fictions, e.g. Kolmogorov 1956.
3) The “model” will violate the second law, e.g. Brillouin 1956, Morowitz 1989.
4) There is NO reducible algorithm (the complexity described cannot be expressed more concisely than by writing down all the observed phenomena), e.g. Monod 1972, i.e. 10^27 degrees of freedom.
5) Point 4 brings to the fore the Turing–Kolmogorov–Chaitin complexity problem.
6) Who wrote this paper, JK Rowling?
If all the Amazon rain forest is cleared it will get hotter, because the sun will then be shining on exposed soil. Maybe CO2 might do something as well, but the sun will heat the ground with immediate effect. In climate models forest is given the same albedo rating as cities; I do not believe this is correct. I used to go gliding and became reasonably proficient at staying up by using thermals, in one case for over 6 hours. The strongest thermals were always from buildings, roads, car parks etc. The next source was open farmland. I never experienced any thermals from forest, although I understand more experienced pilots can get them.
What concerns me is that we will focus on fixing CO2 while at the same time continuing all the other land-clearing activities, including draining, that will definitely make it hotter. We view all these activities as local and therefore dismissible. But when such clearances can be measured in size by using countries for scale, the effect ceases to be local.
The arguments over GW have become partisan and are not helping anyone. Is this how it was on Easter Island? Maybe they knew something was not right, but they cleared all the trees anyway.
Mr Maksimovich observed (11:09:17) :
“as Vladimir Arnold said (address to the Fields Institute)
For example, we deduce the formulas for the Riemannian curvature of a group endowed with an invariant Riemannian metric. Applying these formulas to the case of the infinite-dimensional manifold whose geodesics are motions of the ideal fluid, we find that the curvature is negative in many directions. Negativeness of the curvature implies instability of motion along the geodesics (which is well-known in Riemannian geometry of infinite-dimensional manifolds). In the context of the (infinite-dimensional) case of the diffeomorphism group, we conclude that the ideal flow is unstable (in the sense that a small variation of the initial data implies large changes of the particle positions at a later time).”
I’ll be able to sleep well tonight now I know that. Then tomorrow I can set about learning Swahili so that I can translate it. Or maybe it’s the world’s longest anagram. Either way, a fun day beckons.
John Philip,
You said that some of the models did predict the recent cooling.
What was the reason for the cooling, and what do those same correct models forecast for the remainder of the century?
Incidentally, Mr Sowell (at 11.37.18) mentioned judicial notice. I was once involved in a trial where one of my opponents was trying to persuade the judge that he could not properly take judicial notice of a particular matter. In the course of his argument he was endeavouring to explain what “judicial notice” means and said “For example, Your Lordship would not be able to take judicial notice of the fact that Arsenal beat West Ham United at football yesterday.” To which the judge replied “I wouldn’t need to, I was at the match.”
Ric Werme (12:37:55) :
I don’t see the chaotic effects as being significant – we’re not forecasting the weather, we’re forecasting the boundary limits and chaotic attractors, i.e. climate.
The divergence from reality (from unknown quantities of ‘natural variation’) at present is one level of uncertainty.
The level of mathematical skill of the “proprietors” of “state of the art” GCMs is another, as Dymnikov 2007 observes:
“It should be emphasized that most of the scientific community concerned with climate models are generally not interested in mathematical problems and treat a model as a specific finite-dimensional construction with a specific description of physical processes, thus reducing the study of the model’s quality to a numerical experiment alone…
…It is necessary to prove some statements for the system of partial differential equations describing the climate system’s model.
I. The global theorem of solvability on an arbitrarily large time interval t.
Unfortunately, there is presently no such theorem in a spherical coordinate system with “correct” boundary conditions. This is not a consequence of the absence of such theorems for three-dimensional Navier–Stokes equations. The equations of the state-of-the-art climate models have a dimension of 2.5 because the hydrostatic equation is used instead of the full third equation of motion.”
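(For anyone puzzled by “dimension 2.5”: the primitive equations used in these models replace the full vertical momentum equation with hydrostatic balance, ∂p/∂z = −ρg, so vertical accelerations are not explicitly resolved and the vertical dimension is, in effect, diagnostic rather than prognostic.)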
JamesG (13:24:26) :
We don’t know so much about Hadley (except that it’s in Fortran too), but GISS is 70s Fortran, largely written by students, poorly documented, with massive amounts of redundancy and duplication, some obviously unchecked coding errors, and very little in the way of validation. It is very far from being commercial-quality code, or even good academic code (of which there is a lot).
Indeed. I used to write Fortran 77 code and if we assume the skeleton function in the article is supposed to be F77, I can identify at least 4-5 formal issues with it (and I am ignoring the “….” ellipses), even though it does nothing. One issue is fatal, it isn’t going to compile. Writing commercial quality code is harder than most people think 🙂
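For the curious, here is a minimal sketch of what a formally F77-conforming skeleton might look like (FREVAR and FACTRS are hypothetical trimmed names of mine; the body remains the placeholder the article says nobody knows how to write):

      REAL FUNCTION FREVAR(FACTRS)
C     Sum of all other natural factors influencing the temperature.
C     Names trimmed to the F77 six-character limit, statements in
C     columns 7-72, and the function result assigned before RETURN.
      REAL FACTRS
C     Placeholder: the actual physics is exactly what is unknown.
      FREVAR = 0.0
      RETURN
      END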
Why IS the US government paying one of its employees (one way or another) to act as a focal point for troublemakers in another country? Is this a new direction for foreign policy?
That’s nothing. Ed Miliband said young British activists should take direct action to other countries to bully them against using their resources. Talk about imperialism. I am a very patriotic Brit, and this kind of thing is completely out of line with post-imperial British values.
Because nobody in this country has the gonads to do the right thing and fire the bastard!
Which model predicted that?
I would be more inclined to rely upon the natural world around me, like the behavior of animals & plants, rather than on a computer model.
The flora & fauna are properly programmed; the computer models are at best guesswork based on incomplete information. They are in their infancy.