
Guest Post by Steven Goddard
Global Climate Models (GCMs) are very complex computer programs, containing millions of lines of code, which attempt to model the cosmic, atmospheric and oceanic processes that affect the earth’s climate. They have been built over the last few decades by groups of very bright scientists, including many of the top climate scientists in the world.
During the 1980s and 1990s, the earth warmed at a faster rate than it did earlier in the century. This led some climate scientists to develop a high degree of confidence in models which predicted accelerated warming, as reflected in IPCC reports. However, during the last decade the accelerated warming trend has slowed or reversed. Many climate scientists have acknowledged this and explained it as “natural variability” or “natural variations.” Some believe that the pause in warming may last as long as 30 years, as recently reported by The Discovery Channel.
But just what’s causing the cooling is a mystery. Sinking water currents in the north Atlantic Ocean could be sucking heat down into the depths. Or an overabundance of tropical clouds may be reflecting more of the sun’s energy than usual back out into space.
“It is possible that a fraction of the most recent rapid warming since the 1970’s was due to a free variation in climate,” Isaac Held of the National Oceanic and Atmospheric Administration in Princeton, New Jersey wrote in an email to Discovery News. “Suggesting that the warming might possibly slow down or even stagnate for a few years before rapid warming commences again.”
Swanson thinks the trend could continue for up to 30 years. But he warned that it’s just a hiccup, and that humans’ penchant for spewing greenhouse gases will certainly come back to haunt us.
What has become obvious is that there are strong physical processes (natural variations) which are not yet understood, and are not yet adequately accounted for in the GCMs. The models did not predict the current cooling. There has been lots of speculation about what is causing the present pattern – changes in solar activity, changes in ocean circulation, etc. But whatever it is, it is not adequately factored into any GCMs.
One of the most fundamental rules of computer modeling is that if you don’t understand something and can’t explain it, you can’t model it. A computer model is a mathematical description of a physical process, written in a human-readable programming language which a compiler can translate into a machine-readable one. If you cannot describe a process in English (or your native tongue), you certainly cannot describe it mathematically in Fortran. The Holy Grail of climate models would be the following function, which of course does not exist.
FUNCTION FREEVARIATION(ALLOTHERFACTORS)
C Calculate the sum of all other natural factors influencing the temperature
…..
RETURN
END
Current measured long-term warming rates range from 1.2-1.6 C/century. Some climatologists claim 6+ C of warming for the remainder of the century, based on climate models. One might think that these estimates are suspect, given the empirically observed limitations of the current GCMs.
As one small example, during the past winter NOAA’s Climate Prediction Center (CPC) forecast that the upper Midwest would have above-normal temperatures. Instead, temperatures were well below normal.
http://www.cpc.ncep.noaa.gov/products/archives/long_lead/gifs/2008/200810temp.gif
http://www.hprcc.unl.edu/products/maps/acis/mrcc/Last3mTDeptMRCC.png
Another much larger example is that the GCMs would be unable to explain the causes of ice ages. Clearly the models need more work, and more funding. The BBC printed an article last year titled “Climate prediction: No model for success.”
And Julia Slingo from Reading University (Now the UK Met Office’s Chief Scientist) admitted it would not get much better until they had supercomputers 1,000 times more powerful than at present.
“We’ve reached the end of the road of being able to improve models significantly so we can provide the sort of information that policymakers and business require,” she told BBC News.
“In terms of computing power, it’s proving totally inadequate. With climate models we know how to make them much better to provide much more information at the local level… we know how to do that, but we don’t have the computing power to deliver it.”
……
One trouble is that as some climate uncertainties are resolved, new uncertainties are uncovered.
Some modellers are now warning that feedback mechanisms in the natural environment which either accelerate or mitigate warming may be even more difficult to predict than previously assumed.
Research suggests the feedbacks may be very different on different timescales and in response to different drivers of climate change
…….
“If we ask models the questions they are capable of answering, they answer them reliably,” counters Professor Jim Kinter from the Center for Ocean-Land-Atmosphere Studies near Washington DC, who is attending the Reading meeting.
“If we ask the questions they’re not capable of answering, we get unreliable answers.”
I am not denigrating the outstanding work of the climate modelers; rather, I am pointing out why GCMs may not yet be ready for forecasting temperatures 100 years out, and why politicians and the press should not make unsupportable claims of Armageddon based on them. I would appreciate it if readers would keep this in mind when commenting on the work of scientists, who for the most part are highly competent and ethical people, as is evident from this UK Met Office press release.
Stop misleading climate claims
11 February 2009
Dr Vicky Pope
Dr Vicky Pope, Met Office Head of Climate Change, calls on scientists and the media to ‘rein in’ some of their assertions about climate change.
She says: “News headlines vie for attention and it is easy for scientists to grab this attention by linking climate change to the latest extreme weather event or apocalyptic prediction. But in doing so, the public perception of climate change can be distorted. The reality is that extreme events arise when natural variations in the weather and climate combine with long-term climate change. This message is more difficult to get heard. Scientists and journalists need to find ways to help to make this clear without the wider audience switching off.

Change Scenarios: Did any of the Governments / Top Financial Institutions predict the Global Financial problems until they hit them? No.
The financial system should be a much easier system to model than the climate, and there would be loads of money and brainpower swilling around the trough of banks and big finance to try to make such predictions if it were possible.
But no, with all these highly paid financial whizz kids and all the money put in, no one said “Houston, we have a problem.”
Why would climate modellers be any better?
Sadly, Vicky Pope contradicted herself within a matter of weeks. Just last week she said that 85% of the Amazon was going to disappear because of even modest climate change. This alarmist information came to her by way of a computer model.
http://www.guardian.co.uk/environment/2009/mar/11/amazon-global-warming-trees
The study, which has been submitted to the journal Nature Geoscience, used computer models to investigate how the Amazon would respond to future temperature rises.
Computer models cannot even predict protein folding, where the underlying physics is well understood. I’m no expert, but I don’t understand how models can claim to predict climate when even the simplest processes are not well understood. For example, I read in an oceanography text that evaporation from the open ocean cannot be accurately modelled, because the effects of wind and waves are too difficult to deal with. How can anyone make a thousand guesses and come up with something meaningful?
Aron,
I suspect it is somewhat easier to model the impact of a pre-defined temperature change on a particular ecosystem, than it is to model what the temperature change is likely to be over time.
AdrianS (10:03:15) :
Change Scenarios: Did any of the Governments / Top Financial Institutions predict the Global Financial problems until they hit them? No.
Even better, the skeptics were ignored and told they didn’t know what they were talking about.
Steven,
But on what basis can they predict that 85% of the Amazon will disappear, when most of the deforestation has been from logging and nobody can tell whether the feedback effects of warming will result in drought or in increased humidity? It seems they approached the subject expecting that a positive feedback would result in drought, without even considering that it might be the other way around. We could tell a model that warming will result in increased precipitation, and thus more rainfall. We could also tell the model that the Amazon river will overflow because of rising sea levels and thus cause floods. Or we could get a model to give us a less alarming result.
When you bundle the output of 15 models, no matter what happens, “the models predicted it.” Sometimes, I get tired of hearing that. Even I find a nickel on the pavement from time to time. (/mild mini-rant OFF)
I’m curious why the poll is present. Sure, it’s a bit of fun, but it is entirely useless. For one, it doesn’t have an answer I would have chosen. The closest is
“No, they are not adequate at all.” The problem is that they’ve never been tested for conformance to reality except in hindsight, so their adequacy is indeterminate. The same can be said about their uncertainty. If they truly are identical models run from alternate starting points, then the output indicates they are no better than a random number generator, or than predicting that next year’s temperature will be the same as this year’s, +/- 1 C.
I suspect it is somewhat easier to model the impact of a pre-defined temperature change on a particular ecosystem, than it is to model what the temperature change is likely to be over time.
Both are impossible, intractable problems. Regarding the first: “ecosystems” are undefined conceptual entities, i.e. they don’t actually exist in the real world. No computer model of any real world area has been created that accurately predicts “the impact of temperature change.” Model predictions fail at population change, growth-and-yield, presence-absence, and every other characteristic of real world vegetation and animals.
You think climate models are bad? You should see the junk that comes from ecosystem modeling. Even the most simplified, focused, much-studied problem of single-species tree growth at the stand level (the basic question of tree farm management) has not produced accurate models. Imagine how inaccurate the models are that attempt to predict changes in multiple species.
Please do not assume that environmental modeling efforts in scientific disciplines other than climate are any more valid than GCMs. They are not. We call them GAP models, for “guess all parameters.”
The Amazon models are pure speculation without foundation in the real world. Absolute GIGO.
In a prior life I created a computer model of the human visual system as a commercial product. When tuning the engine I tweaked the sensitivities and relationships of visual primitives to predict the class of objects in the field of view. I would not make a tuning move that wasn’t understood on the basis of first principles. The further I strayed from first principles on tuning any data set, the worse the performance on other data sets.
The need to understand the process in order to model it is true only where the model is expected to work on multiple data sets.
If you will tolerate a little sarcasm:
The less you know about something, the easier it is to model. Just use a neural net and the model can get very good results on the training data set. Even without a neural net, it takes very little skill to tweak sensitivities and relationships to obtain good correlations with the training data set.
The GCMs used by the IPCC were tuned to reproduce the climate data that was current at the time. And sure enough, the models could predict the past. Since the IPCC models were tweaked to show that global temperature rises with an increase in CO2, the models have failed (been disproved) in the years since, with falling temperatures and rising CO2.
And Julia Slingo from Reading University (Now the UK Met Office’s Chief Scientist) admitted it would not get much better until they had supercomputers 1,000 times more powerful than at present.
“We’ve reached the end of the road of being able to improve models significantly so we can provide the sort of information that policymakers and business require,” she told BBC News.
“In terms of computing power, it’s proving totally inadequate. With climate models we know how to make them much better to provide much more information at the local level… we know how to do that, but we don’t have the computing power to deliver it.”
Hilarious statements like that bring mathematical physics into disrepute.
In an interesting chapter entitled “Engineers’ Dreams” from his book “Infinite in All Directions”, Freeman Dyson explains the reasons for the failure of Von Neumann and his team at the prediction and control of hurricanes.
Von Neumann’s dream
“As soon as we have good enough computers we will be able to divide the phenomena of meteorology cleanly into two categories, the stable and the unstable. The unstable phenomena are those which are upset by small disturbances, and the stable phenomena are those that are resilient to small disturbances. All processes that are stable we will predict; all processes that are unstable we will control.”
– Freeman Dyson, page 183.
What went wrong? Why was Von Neumann’s dream such a total failure? The dream was based on a fundamental misunderstanding of the nature of fluid motions. It is not true that we can cleanly divide fluid motions into those that are predictable and those that are controllable. Nature, as usual, is more imaginative than we are. There is a large class of classical dynamical systems, including non-linear electrical circuits as well as fluids, which easily fall into a mode of behavior described by the word “chaotic.” A chaotic motion is generally neither predictable nor controllable. It is unpredictable because a small disturbance will produce an exponentially growing perturbation of the motion. It is uncontrollable because small disturbances lead only to other chaotic motions, and not to any stable and predictable alternative.
Or as Vladimir Arnold said (in an address to the Fields Institute):
For example, we deduce the formulas for the Riemannian curvature of a group endowed with an invariant Riemannian metric. Applying these formulas to the case of the infinite-dimensional manifold whose geodesics are motions of the ideal fluid, we find that the curvature is negative in many directions. Negativeness of the curvature implies instability of motion along the geodesics (which is well-known in Riemannian geometry of infinite-dimensional manifolds). In the context of the (infinite-dimensional) case of the diffeomorphism group, we conclude that the ideal flow is unstable (in the sense that a small variation of the initial data implies large changes of the particle positions at a later time). Moreover, the curvature formulas allow one to estimate the increment of the exponential deviation of fluid particles with close initial positions and hence to predict the time period when the motion of fluid masses becomes essentially unpredictable.
For instance, in the simplest and utmost idealized model of the earth’s atmosphere (regarded as two-dimensional ideal fluid on a torus surface), the deviations grow by the factor of 10^5 in 2 months. This circumstance ensures that a dynamical weather forecast for such a period is practically impossible (however powerful the computers and however dense the grid of data used for this purpose)
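The exponential divergence Arnold describes can be seen in even the simplest chaotic system. Below is a minimal Python sketch using the textbook Lorenz-63 equations with their standard parameters; it is an illustration of sensitive dependence on initial conditions, not a weather model, and the perturbation size, time step and run length are arbitrary choices.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0/3.0):
    # One forward-Euler step of the Lorenz-63 equations (illustration only).
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

a = (1.0, 1.0, 1.0)          # reference trajectory
b = (1.0 + 1e-9, 1.0, 1.0)   # identical except for one part in a billion

for step in range(1, 5001):
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    if step % 1000 == 0:
        sep = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"step {step:5d}   separation {sep:.3e}")

The separation grows by many orders of magnitude before saturating at roughly the size of the attractor, which is the practical content of Arnold’s remark that a dynamical forecast beyond a certain period is impossible however powerful the computer.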
Fortunately, we can understand why mathematics is a limiting factor for the Met Office.
http://www.rsc.org/aboutus/news/pressreleases/2007/chinesemaths.asp
If the underlying principles are not well understood, the model can only spit out guesswork. The effort, though herculean, is rather misdirected. Focus on doing the work that connects and follows all the paths. They did quite the job on DNA. The same amount of work needs to be done on climate before the models can be used for practical purposes.
We have had quite enough of hitting the panic button both ways: Ice Age Coming and Global Overheating.
Enough of this already.
Get back to work.
Maurice,
You can purchase stock prediction models that use a sixth- or seventh-order back-fitted equation, and that are blindingly accurate looking backwards. Unfortunately, they are no better than a random number generator looking forward.
“With four parameters I can fit an elephant, and with five I can make him wiggle his trunk”
– John Von Neumann
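The back-fitting trap is easy to demonstrate. Here is a minimal Python sketch (using numpy) in which a seventh-order polynomial is fitted to a synthetic, made-up “price” series: it reproduces the past almost perfectly and goes wrong the moment it is asked to extrapolate.

import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)
t = np.arange(30.0)
price = 100.0 + 0.3 * t + rng.normal(0.0, 1.0, t.size)   # gentle trend plus noise

fit = Polynomial.fit(t, price, 7)                         # seventh-order back-fit

print(f"worst in-sample error : {np.abs(fit(t) - price).max():.2f}")   # small; looks impressive
print(f"polynomial at t = 39  : {fit(39.0):.1f}")                      # usually far off the trend
print(f"underlying trend at 39: {100.0 + 0.3 * 39.0:.1f}")

The extra parameters soak up the noise in the fitting period, and those same parameters dominate outside it, which is exactly the trap Von Neumann’s quip warns about.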
The IPCC model is a very expensive video game. Something the kids would play on Wii or Nintendo. We can ill afford for grown men to be wasting time playing with such things. The CO2 lag was already known from the Ice Cores.
Get back to work.
Would anyone agree with me that it is absurd to claim (or believe) that the science is settled, when there exist multiple climate models (GCMs), each from a learned scientist or group, none of which agrees with the others?
The very fact that there are conflicting models is proof positive the science is not “settled.” This fact is quite useful to attorneys in a trial; we simply bring in an expert witness with a different, but equally credible, opinion. In complex trials, there may be multiple expert witnesses, each with a view that conflicts with the others. At no point will the judge accept any single view as “settled science.”
In the legal realm, a judge may take “judicial notice” of some facts, and does not require presentation of evidence to prove them. In California (other states use similar definitions), judicial notice is granted for “Facts and propositions that are not reasonably subject to dispute and are capable of immediate and accurate determination by resort to sources of reasonably indisputable accuracy.” (source: California Evidence Code section 452(h))
In the scientific realm, a model of gravity, the Coriolis force, and air resistance can be used to predict quite accurately the landing spot of a ballistic shell fired from a cannon, given starting conditions such as projectile mass and volume, muzzle velocity, angle of aim, and drag coefficient. THAT is settled science.
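For comparison, that settled-physics calculation takes only a few lines. The Python sketch below time-steps a shell’s flight under gravity and quadratic air drag; every number in it (mass, diameter, drag coefficient, muzzle velocity, launch angle) is a made-up example value, not data for any real gun.

import math

g, rho = 9.81, 1.225                   # gravity (m/s^2), air density (kg/m^3)
mass, diameter, cd = 10.0, 0.10, 0.3   # example shell: 10 kg, 10 cm, assumed drag coefficient
area = math.pi * (diameter / 2.0) ** 2
v0, angle = 300.0, math.radians(35.0)  # example muzzle velocity and elevation

x, y = 0.0, 0.0
vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)
dt = 0.001                             # time step (s)

while y >= 0.0:                        # integrate until the shell returns to ground level
    speed = math.hypot(vx, vy)
    k = 0.5 * rho * cd * area * speed / mass   # drag deceleration per unit velocity
    vx -= k * vx * dt
    vy -= (g + k * vy) * dt
    x += vx * dt
    y += vy * dt

print(f"predicted range: roughly {x:.0f} m")

Given the same inputs the answer is repeatable, and it can be checked against a test firing; that is what makes it settled.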
The GCM guys/gals have a ways to go.
If this “experiment” to teach people about ocean “acidification” is anything to go by: “Take a clam shell and put it in vinegar”!!!
pH of vinegar: about 2.5
pH of seawater: about 8
This “demo” exaggerates the acidity difference by at least 5 orders of magnitude.
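For what it’s worth, the arithmetic behind that figure is simple. pH is the negative base-10 logarithm of the hydrogen-ion concentration, so taking vinegar at pH 2.5 and seawater at about pH 8:
[H+] vinegar / [H+] seawater = 10^(-2.5) / 10^(-8) = 10^5.5, roughly 300,000 times more acidic.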
A real quality site
I’ve often wondered who actually thought they could model climate in the first place. The best I can figure is that someone without a clue about computer models, but with a good deal of salesmanship, sold the idea somewhere. The rest, unfortunately, is history.
We see the AGW believers admit the models are worthless every time they open their mouths and say the climate is getting “worse” than predicted. Of course, most of them don’t realize they are damning the very work that is being used to support their cause.
Save us from agenda-funded yet brilliant scientists, and from well-meaning idiots with inadequate computer models.
This whole article is a very, very good analysis of GIGO: garbage in, garbage out. Well-meaning idiots relying on inadequate computer models have destroyed the global economy and are hell-bent on preventing any recovery due to planned excessive carbon taxes.
Can we ever hope to return to reality-based science?
I used to think GCMs were Global Climate Models; but then I kept reading papers from people who presumably know better; and they were calling them Global Circulation Models; NOT Climate models.
Well on planet earth it seems that at reasonable (non-geological) time scales, we have basically Ocean Waters, Atmosphere, and perhaps energy that are capable of Circulating. At longer geological time scales, the land itself is circulating; so let’s not go there.
Well it seems to me that in order to get circulation of either water, or atmosphere or energy, you MUST have differences in Temperature both in time and in space.
But climate, we are told, is (by definition) the long-term average of weather. Therefore it is ideally a single number; not a graph, which implies changes over time or space; which would be weather rather than climate.
So climate and circulation seem to be mutually incompatible concepts. You can’t get circulations or model them, without having temperature (and other) differences in time and space, and that means averaging is verboten, because it eliminates differences.
A model that depicts the earth as an isothermal body at about 15 deg C, that radiates 390 Watts per square meter from every point on it 24 hours a day, doesn’t have any temperature differences in time or space to use with the laws of Physics to compute circulations.
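(For reference, 390 Watts per square meter is just the Stefan-Boltzmann law applied to a black body at 15 deg C: sigma x T^4 = 5.67x10^-8 W m^-2 K^-4 x (288 K)^4 ≈ 390 W/m^2, assuming an emissivity of 1.)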
Yet an untoward focus of climate science, and certainly climate politics, rests on what happens to a single number that gets updated each year, or maybe monthly, or maybe every five years; namely the GISStemp anomaly of Dr James Hansen at NASA.
Do any of these vaunted GCMs, whatever they are, predict both the Mediaeval Warm Period and the Little Ice Age; the Maunder and Dalton minima; or any other widely recognized instance of a climate change that presumably had a natural reason?
As for the good Met Office lady’s request for more powerful computers (and any other form of research (taxpayer) dollar expenditure): nonsense; those more powerful computers will simply spit out garbage at a much faster clip; but it will still be garbage, because clearly the models are NOT adequate to explain what is happening, because there isn’t even scientific agreement on what the key physical processes are; let alone any experimental verification of the extent to which any such processes are operating.
The only climate or circulation model that works, and it works very well, is the model being run on the data by planet earth itself, with its oceans, atmosphere, and the physical geography of its land masses; not to mention all the biological processes going on at the same time. And then there is the sun, of course, which is credited with everything from being the cause of it all to having nothing to do with earth’s climate.
When the GCMs make even a token effort to emulate the model that planet earth is running; they might start to get some meaningful results.
But I can assure you that faster computers are neither the problem nor the solution. Computers can be made to do the stupidest things at lightning speed; the trick is to have them not doing stupid things in the first place.
The old astronomy professor was addressing his brand new freshman astronomy class for their first lecture.
“When I was an undergraduate like you all,” he said, “we could write down the universal equations of motion on the back of a postage stamp; but it took us six months, with a battery of students working with log tables, to solve those equations and determine the motions of the universe.”
“Now that has all changed; we have modern high-speed computers that can solve those dynamical equations in microseconds; but it now takes a battery of students six months to program those computers with meaningful code to even start working on those problems of the universe.”
I know a little bit about computer modelling. I spend hours each day modelling optical systems with a quite powerful desktop computer (eight Intel processors). Well, there are a number of other people in the company who have the same lens design software, and maybe not quite such powerful computers, who presumably could do the same things I do with my computer; but they can’t, and they don’t.
You see, they are all mechanical packaging engineers who have been told that optics is just packaging with transparent materials. They can’t do what I do, because they don’t know a darn thing about the physics of optical systems. What’s more, they don’t know that that is why they can’t do what I do; it isn’t my eight-core computer at all.
Don Keiller,
This may come as quite a revelation to you, but as you neutralize an acid with pH 2.5, the pH passes through all intermediate levels on the way to its final value.
Sometimes you have to think a little for yourself, though it can be painful for some.
What has always just floored me about the debates over models and global warming has been the arrogance of some people (usually politicians and not scientists, but some scientists too) who evidently believe that we have a good enough understanding of the Earth’s climate to be able to design a computer program that can predict what temperature it will be in 2100 (or beyond!). We can’t accurately predict the weather tomorrow, yet we know what it will be in 2100. We have zero evidence that any climate model can accurately predict the weather next year, but they surely can predict 2100. Breathtaking arrogance.
Natural systems are amazingly complex things. They aren’t easily understood. There are so many factors influencing global climate (if there is such a thing), and we don’t understand them all…not to mention there are almost certainly factors that we don’t even know exist that influence it, and new ones are discovered from time to time.
It’s time not to shut up about varying beliefs and opinions, but to display a little humility. And to stop politicians from hijacking this issue in an attempt to push a political agenda that has origins far from climate.
Quote:
“The study, which has been submitted to the journal Nature Geoscience, used computer models to investigate how the Amazon would respond to future temperature rises.”
—————————–
Did the models account for the carbon used by lots and lots of petrol-driven chainsaws, or the action of their teeth? It is not the (mostly) natural and modest level of climate change that is harming the Amazonian rain forests. It is the action of thousands and thousands of loggers cutting the trees down, leaving huge gaps in the canopies that no longer trap evaporating water, so the ground dries out and the rivers feeding the mighty Amazon also dry up.
Mankind’s activities are indeed damaging the Amazon rain-forest, but it is NOT our production of CO2 that is to blame.
Your poll is based on a false premise. There is no single GCM so there is no single prediction. A large number of modelled outcomes with no logical basis to prefer one over another is no prediction at all. If I claimed to be a psychic and you asked me to divine which card you will choose from a deck, you wouldn’t be impressed by my psychic powers if I rattled off twenty or thirty possibilities as my answer even if one of them turned out to be right.
I don’t think this statement is supported by any reliable evidence.
The thing is, we have been through this before, in biochemistry. In the late 1960s and early 1970s people believed that they could model non-equilibrium thermodynamic processes using constants derived from equilibrium thermodynamics. They couldn’t.
Kacser, H. & Burns, J.A. ((1973) Symp. Soc. Exp. Biol. 27, 65-104) therefore came up with metabolic control theory, now called metabolic control analysis. This is the way forward: use steady-state models and recognize that you are dealing with a non-equilibrium system.
It is not hard to realize that these models are fundamentally, thermodynamically, flawed.