If You Can't Explain It, You Can't Model It

Source: Center for Multiscale Modeling of Atmospheric Processes

Guest Post by Steven Goddard

Global Climate Models (GCMs) are very complex computer programs containing millions of lines of code, which attempt to model the cosmic, atmospheric and oceanic processes that affect the earth’s climate.  They have been built over the last few decades by groups of very bright scientists, including many of the top climate scientists in the world.

During the 1980s and 1990s, the earth warmed at a faster rate than it did earlier in the century.  This led some climate scientists to develop a high degree of confidence in models which predicted accelerated warming, as reflected in IPCC reports.  However, during the last decade the accelerated warming trend has slowed or reversed.  Many climate scientists have acknowledged this and explained it as “natural variability” or “natural variations.”  Some believe that the pause in warming may last as long as 30 years, as recently reported by The Discovery Channel.

But just what’s causing the cooling is a mystery. Sinking water currents in the north Atlantic Ocean could be sucking heat down into the depths. Or an overabundance of tropical clouds may be reflecting more of the sun’s energy than usual back out into space.

“It is possible that a fraction of the most recent rapid warming since the 1970’s was due to a free variation in climate,” Isaac Held of the National Oceanic and Atmospheric Administration in Princeton, New Jersey wrote in an email to Discovery News, “suggesting that the warming might possibly slow down or even stagnate for a few years before rapid warming commences again.”

Swanson thinks the trend could continue for up to 30 years. But he warned that it’s just a hiccup, and that humans’ penchant for spewing greenhouse gases will certainly come back to haunt us.

What has become obvious is that there are strong physical processes (natural variations) which are not yet understood and are not yet adequately accounted for in the GCMs.  The models did not predict the current cooling.  There has been much speculation about what is causing the present pattern – changes in solar activity, changes in ocean circulation, etc.  But whatever it is, it is not adequately factored into any GCM.

One of the most fundamental rules of computer modeling is that if you don’t understand something and can’t explain it, you can’t model it.  A computer model is a mathematical description of a physical process, written in a human-readable programming language which a compiler can translate into a machine-readable one.  If you cannot describe a process in English (or your native tongue), you certainly cannot describe it mathematically in Fortran.  The Holy Grail of climate models would be the following function, which of course does not exist.

FUNCTION FREEVARIATION(ALLOTHERFACTORS)

C    Calculate the sum of all other natural factors influencing the temperature

…..

RETURN

END
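
By contrast, here is the sort of thing that can be coded, precisely because every term in it can first be stated in plain English.  It is only a toy, zero-dimensional energy-balance sketch – nothing remotely like a real GCM – and the program name, the albedo and the effective emissivity below are assumed, textbook-style round numbers chosen purely for illustration.

! A reader's back-of-the-envelope sketch in free-form Fortran, not any GCM's code.
program toy_energy_balance
  implicit none
  real, parameter :: s0    = 1361.0    ! solar constant, W/m^2
  real, parameter :: alpha = 0.30      ! planetary albedo (assumed round number)
  real, parameter :: sigma = 5.670e-8  ! Stefan-Boltzmann constant
  real, parameter :: eps   = 0.61      ! effective emissivity (assumed round number)
  real :: absorbed, t_eq

  ! Sunlight absorbed per square meter, averaged over the whole sphere
  absorbed = s0 * (1.0 - alpha) / 4.0

  ! Equilibrium temperature from eps * sigma * T**4 = absorbed
  t_eq = (absorbed / (eps * sigma))**0.25

  ! Prints roughly 288 K, i.e. about 15 deg C
  print *, 'Equilibrium surface temperature (K): ', t_eq
end program toy_energy_balance

Every line above corresponds to a sentence of physics that can be written down before any Fortran is typed.  Nobody can write the equivalent sentences for FREEVARIATION, which is exactly the problem.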

Current measured long-term warming rates range from 1.2-1.6 C/century.  Some climatologists claim 6+C for the remainder of the century, based on climate models.  One might think that these estimates are suspect, given the empirically observed limitations of the current GCMs.
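
A quick back-of-the-envelope check of those two numbers (assuming, for the sake of the arithmetic, that the 6+C claim covers the roughly 91 years remaining between 2009 and 2100 – the claims are rarely stated that precisely):

! Reader's arithmetic sketch only; the 91-year horizon is an assumption.
program rate_check
  implicit none
  real, parameter :: observed_low  = 1.2   ! C/century, low end quoted above
  real, parameter :: observed_high = 1.6   ! C/century, high end quoted above
  real, parameter :: claimed_total = 6.0   ! C claimed for the rest of the century
  real, parameter :: years_left    = 91.0  ! assumed horizon: 2009 to 2100
  real :: claimed_rate

  claimed_rate = claimed_total / years_left * 100.0   ! about 6.6 C/century
  print *, 'Claimed rate (C/century): ', claimed_rate
  print *, 'Times the observed low  : ', claimed_rate / observed_low    ! roughly 5.5
  print *, 'Times the observed high : ', claimed_rate / observed_high   ! roughly 4.1
end program rate_check

In other words, the model-based claims imply warming roughly four to five times faster than anything in the measured record.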

As one small example, during the past winter NOAA’s Climate Prediction Center (CPC) forecast above-normal temperatures for the upper Midwest.  Instead, temperatures were well below normal.

http://www.cpc.ncep.noaa.gov/products/archives/long_lead/gifs/2008/200810temp.gif


http://www.hprcc.unl.edu/products/maps/acis/mrcc/Last3mTDeptMRCC.png

Another, much larger, example is that the GCMs would be unable to explain the causes of ice ages.  Clearly the models need more work, and more funding.  The BBC ran an article last year titled “Climate prediction: No model for success.”

And Julia Slingo of Reading University (now the UK Met Office’s Chief Scientist) admitted it would not get much better until they had supercomputers 1,000 times more powerful than at present.

“We’ve reached the end of the road of being able to improve models significantly so we can provide the sort of information that policymakers and business require,” she told BBC News.

“In terms of computing power, it’s proving totally inadequate. With climate models we know how to make them much better to provide much more information at the local level… we know how to do that, but we don’t have the computing power to deliver it.”

……

One trouble is that as some climate uncertainties are resolved, new uncertainties are uncovered.

Some modellers are now warning that feedback mechanisms in the natural environment which either accelerate or mitigate warming may be even more difficult to predict than previously assumed.

Research suggests the feedbacks may be very different on different timescales and in response to different drivers of climate change.

…….

“If we ask models the questions they are capable of answering, they answer them reliably,” counters Professor Jim Kinter from the Center for Ocean-Land-Atmosphere Studies near Washington DC, who is attending the Reading meeting.

“If we ask the questions they’re not capable of answering, we get unreliable answers.”

I am not denigrating the outstanding work of the climate modelers – rather I am pointing out why GCMs may not be quite ready yet for forecasting temperatures 100 years out, and that politicians and the press should not attempt to make unsupportable claims of Armageddon based on them.  I would appreciate it if readers would keep this in mind when commenting on the work of scientists, who for the most part are highly competent and ethical people, as is evident from this UK Met Office press release.

Stop misleading climate claims

11 February 2009

Dr Vicky Pope

Dr Vicky Pope, Met Office Head of Climate Change, calls on scientists and the media to ‘rein in’ some of their assertions about climate change.

She says: “News headlines vie for attention and it is easy for scientists to grab this attention by linking climate change to the latest extreme weather event or apocalyptic prediction. But in doing so, the public perception of climate change can be distorted. The reality is that extreme events arise when natural variations in the weather and climate combine with long-term climate change. This message is more difficult to get heard. Scientists and journalists need to find ways to help to make this clear without the wider audience switching off.”


Bridgekeeper: Stop. What… is your name?
King Arthur: It is Arthur, King of the Britons.
Bridgekeeper: What… is your quest?
King Arthur: To seek the Holy Grail.
Bridgekeeper: What … is the air-speed velocity of an unladen swallow?
King Arthur: What do you mean? An African or European swallow?
Bridgekeeper: Huh? I… I don’t know that.
[he is thrown over]
Bridgekeeper: Auuuuuuuugh.
Sir Bedevere: How do you know so much about swallows?
King Arthur: Well, you have to know these things when you’re a king, you know.

216 Comments

Roger Knights
March 18, 2009 12:05 am

Richard S. Courtney wrote:
“The climate system is seeking an equilibrium that it never achieves. The Earth obtains radiant energy from the Sun and radiates that energy back to space. The energy input to the system (from the Sun) may be constant (although some doubt that), but the rotation of the Earth and its orbit around the Sun ensure that the energy input/output is never in perfect equilibrium.
“The climate system is an intermediary in the process of returning (most of) the energy to space (some energy is radiated from the Earth’s surface back to space). And the Northern and Southern hemispheres have different coverage by oceans. Therefore, as the year progresses the modulation of the energy input/output of the system varies. Hence, the system is always seeking equilibrium but never achieves it.
“Such a varying system could be expected to exhibit oscillatory behaviour. And, importantly, the length of the oscillations could be harmonic effects which, therefore, have periodicity of several years. Of course, such harmonic oscillation would be a process that – at least in principle – is capable of evaluation.
“However, there may be no process because the climate is a chaotic system. Therefore, the observed oscillations (ENSO, NAO, etc.) could be observation of the system seeking its chaotic attractor(s) in response to its seeking equilibrium in a changing situation.”

I’m thankful to read that, because it provides thoughtful and well-informed backing for the hand-waving guess I posted yesterday:
“Indeed, it seems to me that there needn’t be any external cause for the variation–variation might be generated internally as the means by which a system’s stability is attained. A tight-rope walker maintains his balance because he constantly counterbalances himself back-and-forth against his balance-pole. If he were forced to eliminate his wobbling and walk steadily, he would fall. (I owe this insight to an essay by a Hungarian author in a book of his odd-ball essays published within the last ten years whose title I’ve forgotten.) Similarly, in a dynamical system with long-term feedback loops and natural counter-balances and counter-counter-balances, multi-decadal zigs and zags are probably (IMO) part of an overall equilibrium.”

John Philip
March 18, 2009 12:05 am

Given the seven year drop in temperatures, I take it that you are predicting a rapid rise over the next year or two to get back on the scenario A curve?
For new readers, and ‘dealing with reality’: in 1988 Dr Hansen testified to Congress on global warming and as part of the presentation submitted projections of the change in global temperatures from an early version of the NASA climate model under three scenarios, A, B and C, which can be simplified to ‘High’, ‘Medium’ and ‘Low’ [This is a gross simplification – see the original paper for details]. In practice the temperature projections under Scenario B have proven to be extraordinarily accurate – the difference between the projected and the actual trend is actually less than the observational uncertainty, or to put it another way, a ‘perfect’ model would not have done better.
Now in some quarters this evidence of the predictive skill of a climate model has not gone down well, and there have been some attempts to rewrite history, notably by Michael Crichton and also over at Climate Audit, to the effect that Hansen actually believed Scenario A was most likely, or Mr Goddard’s claim that A most accurately tracks the actual forcings. Fortunately we have Hansen’s testimony …
These scenarios are designed to yield sensitivity experiments for a broad range of future greenhouse forcings. Scenario A, since it is exponential, must eventually be on the high side of reality in view of finite resource constraints and environmental concerns even though the growth of emissions in scenario A (~1.5%/yr) is less than the rate typical of the past century (~4%/yr). Scenario C is a more drastic curtailment of emissions than has generally been imagined. It represents elimination of chlorofluorocarbon (CFC) emissions by 2000, and reduction of CO2 and other trace gas emissions to a level such that the annual growth rates are zero (i.e. the sources just balance the sinks) by the year 2000. Scenario B is perhaps the most plausible of the three.
Pat Michaels also testified to Congress and in doing so simply erased scenarios B and C, misrepresenting Hansen as an exaggerator. Hansen was not impressed and responded:
One of the skeptics, Pat Michaels, has taken the graph from our 1988 paper with simulated global temperatures for scenarios A, B and C, erased the results for scenarios B and C, and shown only the curve for scenario A in public presentations, pretending that it was my prediction for climate change. Is this treading close to scientific fraud?
I know my answer to that particular question. Now, NASA have published estimated radiative forcings up until 2003, and comparing these to the scenarios shows that Scenario B was easily the closest match to reality, tracking about 10% higher than observed – while A was 25% higher and C about 25% lower. CO2 has increased slightly faster than in scenario B, but this is offset by a reduction in CFCs thanks to the Montreal Protocol and a plateauing of methane concentrations, for reasons largely unknown.
So, to answer the question, no I am not expecting a reversion to Scenario A anytime soon.
Footnotes: You can download the scenario and forcing data from this page.
The 1988 projections are probably reaching the end of their useful lifetime – a key parameter, climate sensitivity, was set to a value about a third higher than our current best estimate. Over the first few decades this has little impact, but over time it will have the effect of causing the model to overestimate temperatures, if the modern value is more accurate. Seven-year and longer flat or cooling trends are not unusual in Scenario B; still, the planet warms.

Roger Knights
March 18, 2009 12:17 am

Smokey wrote:
“Observed global warming, as minor as it is, is primarily the result of the planet emerging from the last Ice Age.”
And also the result of emerging from the Little Ice Age, and of most oceanic oscillations being set to Warm for the past 30 years, a factor the IPCC was unaware of 20 years ago.

Steven Goddard
March 18, 2009 4:01 am

John Philip,
Nice long-winded attempt at obfuscation.
You wrote : Scenario A, since it is exponential, must eventually be on the high side of reality
To date, Scenario A is on the low side of CO2 emissions, as Hansen et al constantly remind us. CO2 has risen faster than Scenario A, and temperatures have risen much slower. The fact that 30 years ago he considered scenario B CO2 to be the most plausible is irrelevant, because atmospheric CO2 has risen much faster than he expected and his prediction was incorrect.
Alarmists can’t have it both ways – claiming that lower scenarios are valid while simultaneously forecasting a 6+C rise in temperatures and “global warming much worse than predicted.” That is exactly the marketing scam which people in Copenhagen are trying to pull off.
Despite widespread concern over global warming, humans are adding carbon to the atmosphere even faster than in the 1990s, researchers warned Saturday.
http://www.huffingtonpost.com/2009/02/14/global-warming-seen-worse_n_167002.html

John Philip
March 18, 2009 5:09 am

Steve,
The model forcings are for all GHGs combined, not just CO2. Please provide a single reference to support your assertion that Hansen, or anyone else, has claimed that the actual forcing trajectory is higher than Scenario A.
In 2006 Hansen wrote:
Real-world GHG climate forcing so far has followed a course closest to scenario B. The real world even had one large volcanic eruption in the 1990s, Mount Pinatubo in 1991, whereas scenario B placed a volcano in 1995.
Global Temperature Change PNAS.
thanks.

March 18, 2009 5:29 am

John Philip,
Being given the opportunity to predict three highly diverging scenarios in the 1980s does not lend credibility to the prognosticator in 2009, when one of the three scenarios is still inaccurate, merely less inaccurate than the other two.
Face it, Hansen was wrong then and he is wrong now. And 250 more words in response won’t change that fact.

Bill Illis
March 18, 2009 5:38 am

Steve McIntyre found the actual GHG forcing numbers Hansen used in Scenarios A, B and C, and the actual GHG numbers are just slightly under Scenario B.
I note that Scenario B projects a temperature anomaly of 0.85C in 2008 while GISS was only 0.435C in 2008.
So it is off by 50% in 20 years.
Likewise, GISS Model E is now off by nearly 50% in just 5 years.

John Philip
March 18, 2009 5:45 am

‘Highly divergent’? Really?

Steven Goddard
March 18, 2009 6:13 am

http://www.telegraph.co.uk/news/uknews/1553508/CO2-rising-three-times-faster-than-expected.html
CO2 ‘rising three times faster than expected’
Global emissions of carbon dioxide are increasing three times faster than scientists previously thought, with the bulk of the rise coming from developing countries, an authoritative study has found.
The increase in emissions of the gases responsible for global warming suggests that the effects of climate change to come in this century could be even worse than United Nations scientists have predicted.
The report, by leading universities and institutes on both sides of the Atlantic, will create renewed pressure on G8 leaders who are meeting this week in Heiligendamm, on Germany’s Baltic coast.
Top of the agenda are proposals by Angela Merkel, the German Chancellor, to halve global emissions by 2050.
There were violent clashes at the weekend in the nearby city of Rostock between police and protesters during a march by tens of thousands demonstrating about the summit.
The latest study was written by scientists from the Oak Ridge National Laboratory in the United States, the University of East Anglia and the British Antarctic Survey, as well as institutes in France and Australia.
It shows that carbon dioxide emissions have been increasing by three per cent a year this decade, compared to a 1.1 per cent a year rise in the 1990s. Three quarters of this rise came from developing countries, with a particularly rapid increase in China.
The rise is much faster than even the most fossil-fuel intensive scenario developed by the Intergovernmental Panel on Climate Change (IPCC) during the 1990s.

John Philip
March 18, 2009 8:40 am

That report found emissions rose by 1.1% per annum during the 1990s, rising to 3% this decade. Hansen’s scenario A used a 1.5% per annum increment, so taking into account the ‘compound interest’ effect, overall scenario A is still way higher than observed. Singling out CO2 for a moment, scenario A projected a concentration in 2008 of 464ppm; the actual was 386ppm, so for this GHG scenario A is a 20% overestimate.
So what exactly is the origin of this idea: “To date, Scenario A is on the low side of CO2 emissions, as Hansen et al constantly remind us. CO2 has risen faster than Scenario A, and temperatures have risen much slower.”?
Bill – I assume you are aware that the GISS and scenario anomalies use different baselines? It makes a slight difference. Also, rather than using a single (La Nina) year for your comparison, try looking at the trends.

Jack
March 18, 2009 4:30 pm

John Philip
According to the UAH data, the lower troposphere is now about 0.2 deg C warmer than it was 30 years ago. How does this increase match with Hansen’s predictions? (I am assuming that the most recent 30 years qualifies the observations as climate rather than weather.)

John Philip
March 19, 2009 12:04 am

Jack, I don’t know where you get 0.2C from. http://www.woodfortrees.org/plot/uah/plot/uah/trend
Steve – do you stand by your assertion regarding Scenario A, or should we believe Steve McIntyre?

Bill Illis
March 19, 2009 5:56 am

To John Philip,
Hansen’s actual baseline was 1960. He also describes 1984 as a baseline.
All the scenarios and GISS temp are effectively at zero in 1960 and they are all just above zero in 1984.
They are on the same baseline.

John Philip
March 19, 2009 8:09 am

Bill – Both your statements are wrong, but the difference between right and wrong has little effect on the outcome (<0.1C), which is why I said it only makes a slight difference above.
Hansen’s scenario baseline for the anomalies (or ‘Control Year’ as it is called in the paper) is actually 1958:
The zero point for observations is the 1951-1980 mean and the zero point for the model is the control run mean (Caption to Fig 3). Different baseline, in other words.
And the observational dataset used by the paper was an early iteration of GISTEMP and so GISTEMP also uses the mean of the years 1951-1980 as its baseline.

mehmehmeh
March 19, 2009 10:14 am

multi-scale modelling’s a bitch.

Eric
March 21, 2009 10:59 am

George E. Smith (12:05:17) wrote:
“I used to think GCMs were Global Climate Models; but then I kept reading papers from people who presumably know better; and they were calling them Global Circulation Models; NOT Climate models.
Well on planet earth it seems that at reasonable (non-geological) time scales, we have basically Ocean Waters, Atmosphere, and perhaps energy that are capable of Circulating. At longer geological time scales, the land itself is circulating; so let’s not go there.
Well it seems to me that in order to get circulation of either water, or atmosphere or energy, you MUST have differences in Temperature both in time and in space.
But climate we are told (by definition) is the long term average of weather. Therefore it is ideally a single number; not a graph, which implies changes over time or space; which would be weather rather than climate.
So climate and circulation seem to be mutually incompatible concepts. You can’t get circulations or model them, without having temperature (and other) differences in time and space, and that means averaging is verboten, because it eliminates differences.
A model that depicts the earth as an isothermal body at about 15 deg C, that radiates 390 Watts per square meter from every point on it 24 hours a day, doesn’t have any temperature differences in time or space to use with the laws of Physics to compute circulations.”
You are presenting a straw man argument here. Climate models do not assume that the earth is at a uniform temperature, etc. The method is to use an ensemble of models that have different initial conditions close to the actual initial conditions.
Because of the chaotic nature of the system, a distribution of initial conditions must be used.
Each simulation proceeds to calculate the trajectory of conditions, which certainly do vary around the world. The results are summarized as a global average versus time for some purposes. If you believe a different sort of result should be reported, please say what that should be.
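
To make the ensemble idea concrete, here is a toy illustration – only a sketch of the concept, nothing like actual GCM code – using the classic Lorenz-63 equations; the member count, perturbation size and step count are arbitrary choices made up for the example.

! Toy sketch of an initial-condition ensemble; all numbers are illustrative.
program toy_ensemble
  implicit none
  integer, parameter :: nmem  = 10      ! ensemble members
  integer, parameter :: nstep = 5000    ! integration steps
  real, parameter :: dt = 0.005         ! time step
  real, parameter :: s = 10.0, r = 28.0, b = 8.0/3.0   ! Lorenz-63 parameters
  real :: x(nmem), y(nmem), z(nmem)
  real :: dx, dy, dz
  integer :: i, k

  ! Each member starts near, but not exactly at, the same state
  do i = 1, nmem
     x(i) = 1.0 + 0.01 * real(i)
     y(i) = 1.0
     z(i) = 20.0
  end do

  ! Crude forward-Euler integration of every member
  do k = 1, nstep
     do i = 1, nmem
        dx = s * (y(i) - x(i))
        dy = x(i) * (r - z(i)) - y(i)
        dz = x(i) * y(i) - b * z(i)
        x(i) = x(i) + dt * dx
        y(i) = y(i) + dt * dy
        z(i) = z(i) + dt * dz
     end do
  end do

  ! Individual members diverge because the system is chaotic; the
  ! reported statistic is the ensemble mean, analogous to the averaged
  ! output a modeling center would summarize.
  print *, 'Ensemble-mean x after integration: ', sum(x) / real(nmem)
end program toy_ensemble

The individual trajectories separate quickly, which is exactly why no single run is trusted and the ensemble statistics are what get reported.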

George E. Smith continued:
“Yet an untoward focus of climate science, and certainly climate politics, rests on what happens to a single number that gets updated each year or maybe monthly or maybe every five years; namely the GISTEMP anomaly of Dr James Hansen at NASA.
Do any of these vaunted GCMs, whatever they are, predict both the Mediaeval Warm Period and the Little Ice Age; the Maunder and Dalton minima; or any other widely recognized instance of a climate change that presumably had a natural reason?”
In order to do that they would have to have an accurate idea of what the climate forcings were in that time frame and a good set of global initial conditions.
The solar output is not well known, and volcanic action can only be guessed at.
I don’t see how that question is relevant to the utility or validity of the models in modern times.
“As for the good Met Office lady’s request for more powerful computers (and any other form of research (taxpayer) dollar expenditures): nonsense; those more powerful computers will simply spit out garbage at a much faster clip; but it will still be garbage, because clearly the models are NOT adequate to explain what is happening, because there isn’t even scientific agreement on what the key Physical processes even are; let alone any experimental verification of the extent to which any such processes are operating.
The only climate or circulation model that works; and it works very well, is the model that is being run on the data, by planet earth with its oceans, and atmospheres, and the physical geography of its land masses; not to mention all the biological processes going on at the same time. Well then there’s the sun of course; which ranges from the cause of everything; to having nothing to do with earth’s climate.
When the GCMs make even a token effort to emulate the model that planet earth is running; they might start to get some meaningful results.”
They do emulate the model that the planet earth is running. They use a mixture of empirically derived and physical models. The modelers do not claim complete accuracy. They provide a range of values and average trends as the final output.
“But I can assure you that faster computers are neither the problem nor the solution. Computers can be made to do the stupidest things at lightning speed; the trick is to have them not doing stupid things in the first place.
The old astronomy professor was addressing his brand new freshman astronomy class for their first lecture.
“When I was an undergraduate like you all,” he said, “we could write down the universal equations of motion on the back of a postage stamp; but it took us six months with a battery of students working with log tables to solve those equations and determine the motions of the universe.”
“Now that has all changed; we have modern high speed computers that can solve those dynamical equations in microseconds; but it now takes a battery of students six months to program those computers with meaningful code to even start working on those problems of the universe.”
I know a little bit about computer modelling. I spend hours each day modelling optical systems with a quite powerful, yet desktop, computer (eight Intel processors). Well, there are a number of other people in the company that have the same lens design software, and maybe not quite so powerful computers, who presumably could do the same things I do with my computers; but they can’t; and they don’t.
You see, they are all mechanical packaging engineers, who have been told that optics is just packaging with transparent materials. They can’t do what I do, because they don’t know a darn thing about the Physics of Optical systems. What’s more, they don’t know that that is why they can’t do what I do; it isn’t my eight-core computers.”
Are you claiming that scientists who have PhDs and study this problem for a living are not as smart as you are? That they don’t know a darn thing about the earth’s climate, and don’t know how to program a computer? Where is the evidence of that?
Your old astronomy professor obviously was a fossil who was given undergraduate courses to teach because they didn’t trust him to mentor graduate students.
