If You Can't Explain It, You Can't Model It

Source: Center for Multiscale Modeling of Atmospheric Processes

Guest Post by Steven Goddard

Global Climate Models (GCMs) are very complex computer programs containing millions of lines of code, which attempt to model the cosmic, atmospheric and oceanic processes that affect the earth’s climate.  These have been built over the last few decades by groups of very bright scientists, including many of the top climate scientists in the world.

During the 1980s and 1990s, the earth warmed at a faster rate than it did earlier in the century.  This led some climate scientists to develop a high degree of confidence in models which predicted accelerated warming, as reflected in IPCC reports.  However, during the last decade the accelerated warming trend has slowed or reversed.  Many climate scientists have acknowledged this and explained it as “natural variability” or “natural variations.”  Some believe that the pause in warming may last as long as 30 years, as recently reported by the Discovery Channel.

But just what’s causing the cooling is a mystery. Sinking water currents in the north Atlantic Ocean could be sucking heat down into the depths. Or an overabundance of tropical clouds may be reflecting more of the sun’s energy than usual back out into space.

“It is possible that a fraction of the most recent rapid warming since the 1970’s was due to a free variation in climate,” Isaac Held of the National Oceanic and Atmospheric Administration in Princeton, New Jersey wrote in an email to Discovery News, “suggesting that the warming might possibly slow down or even stagnate for a few years before rapid warming commences again.”

Swanson thinks the trend could continue for up to 30 years. But he warned that it’s just a hiccup, and that humans’ penchant for spewing greenhouse gases will certainly come back to haunt us.

What has become obvious is that there are strong physical processes (natural variations) which are not yet understood, and are not yet adequately accounted for in the GCMs.  The models did not predict the current cooling.  There has been lots of speculation about what is causing the present pattern – changes in solar activity, changes in ocean circulation, etc.  But whatever it is, it is not adequately factored into any GCMs.

One of the most fundamental rules of computer modeling is that if you don’t understand something and you can’t explain it, you can’t model it.  A computer model is a mathematical description of a physical process, written in a human-readable programming language, which a compiler can translate into a computer-readable language.  If you cannot describe a process in English (or your native tongue), you certainly cannot describe it mathematically in Fortran.  The Holy Grail of climate models would be the following function, which of course does not exist.

      FUNCTION FREEVARIATION(ALLOTHERFACTORS)
C     Calculate the sum of all other natural factors influencing the temperature
      .....
      RETURN
      END

Current measured long term warming rates range from 1.2-1.6 C/century.  Some climatologists claim 6+ C for the remainder of the century, based on climate models.  One might think that these estimates are suspect, due to the empirically observed limitations of the current GCMs.

As one small example, during the past winter NOAA’s Climate Prediction Center (CPC) forecast that the upper midwest would have above-normal temperatures.  Instead, temperatures were well below normal.

http://www.cpc.ncep.noaa.gov/products/archives/long_lead/gifs/2008/200810temp.gif


http://www.hprcc.unl.edu/products/maps/acis/mrcc/Last3mTDeptMRCC.png

Another much larger example is that the GCMs would be unable to explain the causes of ice ages.  Clearly the models need more work, and more funding.  The BBC printed an article last year titled “Climate prediction: No model for success.”

And Julia Slingo from Reading University (now the UK Met Office’s Chief Scientist) admitted it would not get much better until they had supercomputers 1,000 times more powerful than at present.

“We’ve reached the end of the road of being able to improve models significantly so we can provide the sort of information that policymakers and business require,” she told BBC News.

“In terms of computing power, it’s proving totally inadequate. With climate models we know how to make them much better to provide much more information at the local level… we know how to do that, but we don’t have the computing power to deliver it.”

……

One trouble is that as some climate uncertainties are resolved, new uncertainties are uncovered.

Some modellers are now warning that feedback mechanisms in the natural environment which either accelerate or mitigate warming may be even more difficult to predict than previously assumed.

Research suggests the feedbacks may be very different on different timescales and in response to different drivers of climate change

…….

“If we ask models the questions they are capable of answering, they answer them reliably,” counters Professor Jim Kinter from the Center for Ocean-Land-Atmosphere Studies near Washington DC, who is attending the Reading meeting.

“If we ask the questions they’re not capable of answering, we get unreliable answers.”

I am not denigrating the outstanding work of the climate modelers – rather I am pointing out why GCMs may not be quite ready yet for forecasting temperatures 100 years out, and that politicians and the press should not attempt to make unsupportable claims of Armageddon based on them.  I would appreciate it if readers would keep this in mind when commenting on the work of scientists, who for the most part are highly competent and ethical people, as is evident from this UK Met Office press release.

Stop misleading climate claims

11 February 2009

Dr Vicky Pope

Dr Vicky Pope, Met Office Head of Climate Change, calls on scientists and the media to ‘rein in’ some of their assertions about climate change.

She says: “News headlines vie for attention and it is easy for scientists to grab this attention by linking climate change to the latest extreme weather event or apocalyptic prediction. But in doing so, the public perception of climate change can be distorted. The reality is that extreme events arise when natural variations in the weather and climate combine with long-term climate change. This message is more difficult to get heard. Scientists and journalists need to find ways to help to make this clear without the wider audience switching off.”


Bridgekeeper: Stop. What… is your name?
King Arthur: It is Arthur, King of the Britons.
Bridgekeeper: What… is your quest?
King Arthur: To seek the Holy Grail.
Bridgekeeper: What … is the air-speed velocity of an unladen swallow?
King Arthur: What do you mean? An African or European swallow?
Bridgekeeper: Huh? I… I don’t know that.
[he is thrown over]
Bridgekeeper: Auuuuuuuugh.
Sir Bedevere: How do you know so much about swallows?
King Arthur: Well, you have to know these things when you’re a king, you know.

216 Comments
Steven Goddard
March 15, 2009 4:10 pm

Tamino thinks that the models forecast rapid warming since 2002.
http://tamino.files.wordpress.com/2008/03/rahmstorf2.jpg

tallbloke
March 15, 2009 4:12 pm

I wonder if we could get Hansen arrested at heathrow under the terrorism laws. Plenty of examples of him using language likely to incite hatred.

Paul S
March 15, 2009 4:25 pm

sod (14:33:18) :
http://www.realclimate.org/images/Hansen06_fig2.jpg
but you knew that, of course?!?

But scenarios A & B are the ones promoted. Perhaps we should dismiss those scenarios for scenario C, seeing as it is currently, note, currently, closest to reality.

Paul S
March 15, 2009 4:29 pm

DaveE (14:57:20) :
Let them try it in China for instance, it would keep them out of OUR hair for a year or two 😉

That’s assuming the eco-terrorist group, Plane Stupid, will let them fly there…

Roger H
March 15, 2009 4:37 pm

Well, call me simple minded or whatever, but when the amount of experience, time, money and people involved in near-time weather forecasting cannot accurately predict what the conditions are going to be 5 days from now, why should I trust our Climate experts to be accurate 10 – 20 – 50 – 100 years from now? I know a lot of the experts claim that Weather forecasting and Climate Forecasting are different things entirely. Well, Math and Physics are also separate, but if you can’t do the math accurately you aren’t going to answer many physics questions.

Paul S
March 15, 2009 4:38 pm

Bill Illis (16:04:30) :
Hansen estimated it only took 6 years back in 1988 when he was 100% sure he had everything figured out and even testified to Congress to that fact. He has now changed the equilibrium response time to 1500 years (but he has “nailed it” for sure this time).

6 years = accountability
1500 years = “Who’s James Hansen?”

JohnB
March 15, 2009 4:40 pm

Question for Anthony, possibly concerning models. I believe that it was almost a year ago that you published a letter regarding the Aqua Satellite results. As I recall, they were not being released pending further peer review since the findings were at odds with current thinking [models?]. Any update on the status?

Squidly
March 15, 2009 4:42 pm

Bill Illis (16:04:30) :

Thanks Bill! I get the gist of it now. I think I am on the same page, hence the probable accuracy of my last post. lol…

red432
March 15, 2009 5:03 pm

JohnG, In my experience giant code is often not “just bloat” — it’s hacking. Like for instance “if the surface of the ocean at the tropics freezes force fix the data…” That’s the kind of thing I’d be looking for if I audited the code. I agree that the code really *should* be simple to have any hope of validity.

pft
March 15, 2009 5:11 pm

With four free parameters, a mathematician can build a model that describes exactly everything that an elephant can do. Given a fifth parameter, the elephant can be forecast to fly.
Climate modelling involves a great many more than five free parameters due to uncertainties about the effects of clouds, dust, moisture and more.
So maybe that elephant can be made to travel in time to tell us what the climate will be in 100 years in the future.
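[Editor’s note: the elephant quip has a concrete analogue worth seeing in code. With n free parameters you can fit any n data points exactly and still have zero predictive skill. This is my own minimal sketch with made-up numbers, not anything drawn from a GCM.]

```python
# Sketch of the "four parameters fit an elephant" point: a polynomial
# with as many free parameters as data points fits them perfectly,
# then extrapolates to nonsense.  (Illustrative made-up data.)

def lagrange_fit(points):
    """Return the unique polynomial passing exactly through (x, y) points."""
    def f(x):
        total = 0.0
        for i, (xi, yi) in enumerate(points):
            term = yi
            for j, (xj, _) in enumerate(points):
                if i != j:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total
    return f

# Five invented "anomalies" fit exactly by a model with five parameters...
data = [(0, 0.1), (1, 0.3), (2, 0.2), (3, 0.5), (4, 0.4)]
model = lagrange_fit(data)
print(all(abs(model(x) - y) < 1e-9 for x, y in data))  # fits every point
print(model(10))  # ...but extrapolates to an absurd value
```

The perfect in-sample fit says nothing about the out-of-sample forecast, which is the commenter’s point about free parameters.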

kurt
March 15, 2009 5:29 pm

I have always had a few questions regarding the modeling processes used by the IPCC, and although these questions are argumentative in that they reveal my opinion about the usefulness of climate models, perhaps someone here is in a position to address the factual bases for these questions or explain why the questions might be misplaced.
Based on a very cursory reading of the latest IPCC report, it is my understanding that the conclusion of a causal effect between GHGs and warming is premised in part on a finding that existing climate models cannot simulate current climate trends without a substantial CO2 warming effect. Due to the chaotic nature of the climate system, and the unknown nature of what initial conditions to use in the models, a model is “run” many times with different initial conditions to produce a range of climate trends, and the reasoning is that, because none of these model runs show the existing warming trend without the modeled CO2 sensitivity, it is reasonable to conclude that the existing warming trend in the real world is caused by CO2.
My first question is whether I have fairly summarized both the modeling process of the IPCC, and the reasoning behind its finding of a causal effect between CO2 and warming; because if so, this logic seems flawed. All this exercise demonstrates is that the models used by the IPCC require a substantial CO2 contribution to temperature in order to reasonably fit known climate trends. Nothing more. It certainly does not demonstrate that NO climate model could be constructed that both reasonably fits known climate data and uses an insignificant sensitivity to CO2. In other words, the premise that models A, B, and C cannot simulate climate trends without relationship 1 does not logically support the conclusion that relationship 1 holds true in the real world system that models A, B, and C attempt to simulate. Models A, B, and C may simply be wrongly constructed in one or more respects.
This leads into the second question. Has anyone actually tried to construct a model that has little or no CO2 effect, and then spent as much effort tweaking it to best fit past and current climate as has been spent tweaking climate models that DO assume substantial warming?
My last two questions are a little more general. What possible utility is there in modeling a relationship that cannot be physically measured, either directly or indirectly, e.g. the temperature response of the Earth as a function of CO2 concentration? If you don’t have the ability to measure a physical system at the level of detail that you need to perform a task, why would someone think that this deficiency could be overcome by simply modeling the system at the desired level of detail and testing the model against the things you can measure?
Finally, given that a computer does nothing more than what the programmer instructed it to do, how can merely running the computer model tell you something that you did not already know or assume about the system modeled? I can accept that a computer model can be highly useful as a shortcut in performing tasks too complex to be reliably done manually, once it has first been established that the computer model is accurate with respect to the specific task at hand. I have a much harder time believing that a computer model can TEACH you anything about the system modeled, particularly with respect to modeled relationships that can’t be independently verified apart from the simulations.

March 15, 2009 5:34 pm

The Law of Computer Models:
The results from computer models tend towards the desires and expectations of the modelers.
[source]

Steven Goddard
March 15, 2009 5:40 pm

John Philip,
Given the seven year drop in temperatures, I take it that you are predicting a rapid rise over the next year or two to get back on the scenario A curve?

FatBigot
March 15, 2009 5:41 pm

That nice Mr Sowell commented (14:57:20) :
T (12:51:52) :
“Just a shame that in the UK’s Kings North power station vandalism trial only one view was on show. The prosecution wasn’t about to disturb the consensus and put up anyone to dispute Mr Hansen’s views.”
I agree, it would have been great to have a well-spoken, knowledgeable expert to provide balance.”
Such evidence would not have been admissible. See this and the comments:
http://thefatbigot.blogspot.com/2008/09/is-gordon-criminal-damage.html

David Holliday
March 15, 2009 5:42 pm

“… it would not get much better until they had supercomputers 1,000 times more powerful than at present.”
The “we just need more computing power” line is both tired and false. Increased computational power does not convey increased understanding.

March 15, 2009 5:53 pm

FatBigot (13:56:55) :

“Incidentally, Mr Sowell (at 11.37.18) mentioned judicial notice. I was once involved in a trial where one of my opponents was trying to persuade the judge that he could not properly take judicial notice of a particular matter. In the course of his argument he was endeavouring to explain what “judicial notice” means and said “For example, Your Lordship would not be able to take judicial notice of the fact that Arsenal beat West Ham United at football yesterday.” To which the judge replied “I wouldn’t need to, I was at the match.””


Your opponent was either a brave man, or foolhardy, I would say. To attempt to explain to a judge what judicial notice means very likely would result in unfavorable rulings from that point on, as the judge would be insulted!

Austin
March 15, 2009 5:57 pm

I wonder what the systemic error of the model is given the very large elements they use and what is the systemic error due to feedback of the errors?
My guess is that it’s very, very large and the feedback completely dwarfs any meaningful analysis.
I know the current numerical models used by NWS and others suffer from tremendous feedback issues.

Richard M
March 15, 2009 5:59 pm

I often wonder about the typical definition of climate vs. weather. Why is climate 30 years or 60 years and not 1000 years? Do we really understand whether the last 50 years is not just weather?
This notion that we understand enough to determine what climate is, is itself QUESTIONABLE. It seems to me that climate really should be closer to 1000 years than anything else I’ve heard. Of course, that would be a useless time frame, so we’ve conveniently made it shorter. However, by doing this we may be confusing the picture, which makes modelling a tricky job. We look for short term climate effects and probably miss other contributors.
All this continues to point out just how little we know.

kurt
March 15, 2009 6:06 pm

“when . . . people involved in near-time weather forecasting cannot accurately predict what the conditions are going to be 5 days from now, why should I trust our Climate experts to be accurate 10 – 20 – 50 – 100 years from now? I know a lot of the experts claim that Weather forecasting and Climate Forecasting are different things entirely.”
Let’s say that I have a very weak, repetitive audio signal that I’m sending across the country. On its path, that audio signal is subject to quite a bit of random noise. When received, over short intervals the signal is going to represent mostly the random spurious noise, and reconstructing the repeating pattern will be almost impossible. With sufficient time, however, the original signal can be reconstructed because its pattern holds over time while the random noise cancels itself out. Once reconstructed, you can accurately predict the up and down trends of the signal so long as they are averaged over a sufficient interval, but accurately predicting any instantaneous point will be impossible.
Notice in this example, however, that the source of the signal and the source of the random noise are mathematically uncorrelated. This is what permits you to separate the short term random noise from the long term repeating signal. The same physical phenomena that produce weather, however, produce climate. In fact, weather is nothing more than the instantaneous manifestation of climate. Those who argue that long term climate is predictable even though short term weather is not are essentially making an unstated assumption that the chaotic nature of the earth’s climate system fades out over frequencies longer than four decades or so.
Which side of the argument you take depends on how reasonable you think this assumption is. I think it’s a lousy assumption. For example, when I look at long-term climate reconstructions, I don’t see any pattern in the various climate optimums and minimums. The standard deviation for the length of ice ages/interglacial periods, from eyeballing these reconstructions, also seems a lot longer than four or five decades. All of this seems to indicate to me that there is a lot of randomness in the Earth’s climate system that is not removed by taking averages over a period of decades.
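[Editor’s note: kurt’s signal-averaging analogy can be made concrete. A minimal sketch of my own, with arbitrary signal and noise levels: averaging many repeats of a weak periodic signal cancels noise only because that noise is uncorrelated across repeats, which is exactly the assumption at issue for climate vs. weather.]

```python
# Averaging repeated passes of a weak periodic signal: uncorrelated
# noise cancels, and the error shrinks roughly as 1/sqrt(n_repeats).
import math
import random

random.seed(0)
PERIOD = 50
signal = [math.sin(2 * math.pi * t / PERIOD) for t in range(PERIOD)]

def average_repeats(n_repeats, noise_sd):
    """Average n noisy repeats of one period of the signal."""
    avg = [0.0] * PERIOD
    for _ in range(n_repeats):
        for t in range(PERIOD):
            avg[t] += (signal[t] + random.gauss(0, noise_sd)) / n_repeats
    return avg

def rms_error(est):
    return math.sqrt(sum((e - s) ** 2 for e, s in zip(est, signal)) / PERIOD)

print(rms_error(average_repeats(10, 1.0)))    # noisy reconstruction
print(rms_error(average_repeats(1000, 1.0)))  # much cleaner reconstruction
```

If the noise were instead correlated with the signal source, as the comment argues weather is with climate, no amount of averaging would separate the two.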

G Alston
March 15, 2009 6:16 pm

kurt — Has anyone actually tried to construct a model that has little or no CO2 effect, and then spend as much effort tweaking it to best fit past and current climate, as has been spent tweaking climate models that DO assume substantial warming.
That’s the thing. These models were written from the ground up to simulate climate response to CO2. It’s what they do; why they exist. That anyone is surprised when the answer to the question is CO2 still amazes me.
Were you to make a model simulating traffic patterns where the primary variable under investigation is driver stupidity, the answer you will derive — no matter what — will be some form of driver stupidity. This will be true even if the traffic patterns are more influenced by construction timing, flow control and so on. If you’re looking for driver stupidity, you are guaranteed to find it.
If these models were designed originally to calculate albedo averages and the answer came back as CO2, this would be impressive. But that’s not what is happening. All these people are doing is using the word “computer” and claiming “really truly complicated science” and expecting us to all say “ooooh, an impartial computer said it so it must be true.” It’s psychological voodoo, not science.

idlex
March 15, 2009 6:19 pm

The less you know about something the easier it is to model.
That’s very encouraging. Now that my solar system orbital simulation model is working tolerably well, and I have a planet Earth in it spinning once every sidereal day, and orbiting just about once every year, I’m thinking that I may extend it to include the terrestrial atmosphere, which I’ll divide up into layers in some sort of grid. Knowing the thermal conductivity and heat capacity of air, sea, and land, and the incident solar radiation at any point on the planet surface, and the heat loss from the planet interior, I’ll be able to work out the conductive heat flow vertically to outer space at absolute zero, and horizontally through the atmosphere and ground.
I’m not sure at the moment what to do about air flows using this grid system. I’m wondering if, instead of having a fixed rectangular grid, I might have mobile air masses which would ‘jostle’ with each other as they expanded and contracted, no doubt forming warm and cold “fronts” surging to and fro, and spinning clockwise and anticlockwise under the influence of coriolis forces, and very likely northern and southern “jet streams” as well.
That should all be working by next Thursday. Then I’ll add in clouds and precipitation. I’ve heard this is difficult to do, but I can’t see any big problem there. As my mobile air masses jostle with each other, the warmer lighter humid ones will float to the top of the atmosphere, and will turn white as the water in them condenses into clouds and rain. I expect I’ll have cirrus and stratus and cumulus clouds appearing shortly after that. And I’ll be able to make weather predictions for the next few days, although obviously not quite as accurate as the Met Office’s (or perhaps more so?).
It should be early April by the time that’s working. Looking at climate over a long period of time should be straightforward. I’ll just have to leave the laptop running all night under my bed where Twinkle can’t find it and start treading on the keyboard like she enjoys doing when I’m working on it.
Mid-April now. And then there’ll be the final tweak to get the CO2 and global warming working, maybe just by decreasing air conductivity a bit, or boosting solar insolation by the same fraction.
Finally, in early May, I’ll release the results of my GCM to the world in a series of increasingly panic-stricken press releases that will describe how all my mobile air masses have been jostling and bumping into each other to produce the entirely new and hitherto unforeseen phenomenon of Global Swelling, which is what happens when warm sticky air masses form a “traffic jam” around the equator, and pile up higher and higher on top of each other, before rolling downhill towards the poles again, very likely destroying entire civilizations in the process – unless governments Do Something About It Before It’s Too Late – like buy me a newer and faster laptop with WiFi and BlueTooth and all the other latest fancy trimmings.

March 15, 2009 6:22 pm

John Philip (12:35:05) :
Global climate models are extraordinarily useful tools, and among other things have successfully predicted the rise in temperature as greenhouse gases increased…
That’s hilarious. Lots of people predicted the same with calculations on the backs of envelopes. Are you saying that it was just serendipity that the GCMs produced this astonishing result? Some others consider that the models were designed and tuned with the assumption that the observed warming of the late 20th century was caused by GHGs. Thus, you get the expected result.

Philip_B
March 15, 2009 6:31 pm

My first question is whether I have fairly summarized both the modeling process of the IPCC, and the reasoning behind it’s finding of a causal effect between CO2 and warming;
You have.
This leads into the second question. Has anyone actually tried to construct a model that has little or no CO2 effect, and then spend as much effort tweaking it to best fit past and current climate, as has been spent tweaking climate models that DO assume substantial warming.
All attempts to produce numerical predictions of future climate are models. I think what you are asking is, are there Global Climate (Circulation) Models that have little or no CO2 effect?
The simple answer is we don’t know, because models are proprietary and only selected model runs are published.
However, with reference to your question 1, this means the IPCC’s case is built on an unverified and unscientific assumption. Namely, the published runs are representative of all possible runs.
What possible utility is there in modeling a relationship that cannot be physically measured, either directly or indirectly, e.g. the temperature response of the Earth as a function of CO2 concentration? If you don’t have the ability to measure a physical system at the level of detail that you need to perform a task, why would someone think that this deficiency could be overcome by simply modeling the system at the desired level of detail and testing the model against the things you can measure?
The IPCC used to be upfront about the fact that the models are wildly wrong on almost every climate metric except temperature. They then bizarrely concluded we can trust the models’ temperature predictions.
Anyone can see that the reason they get temperature right (when hindcasting known data) and all other values wrong, is the models are tuned to the temperature data.
Finally, given that a computer does nothing more than what the programmer instructed it to do, how can merely running the computer model tell you something that you did not already know or assume about the system modeled?
You are correct, a computer model doesn’t tell you anything you don’t already know or had assumed.
How the computer models work is that they iteratively process datasets, with the output of one iteration becoming the input of the next. Each iteration is a time period.
What computer models allow you to do in theory is to produce predictions of much greater accuracy.
There is no empirical evidence that climate models do in fact produce more accurate predictions than, for example, drawing a line on a sheet of graph paper or the computer equivalent.
The IPCC’s belief in the temperature predictions of the climate models is purely faith without any scientific basis.
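[Editor’s note: the iteration Philip_B describes, where each step’s output becomes the next step’s input, can be sketched in a few lines. The numbers are hypothetical; real GCMs integrate physical equations over a 3-D grid, not a single scalar.]

```python
# Toy time-stepping model: the output of one step is the input to the
# next, so any per-step bias compounds over the length of the run.

def run_model(t0, steps, forcing=0.02, bias=0.0):
    """Iterate a scalar 'temperature' forward in time."""
    t = t0
    history = [t]
    for _ in range(steps):
        t = t + forcing + bias   # this output feeds the next iteration
        history.append(t)
    return history

# A tiny per-step bias of 0.005 compounds into a large end-of-run drift.
truth = run_model(14.0, 100, bias=0.0)
biased = run_model(14.0, 100, bias=0.005)
print(biased[-1] - truth[-1])  # 100 steps x 0.005 per step of accumulated drift
```

This is the sense in which iterative models can amplify small errors rather than average them away, which bears on whether they beat a ruler on graph paper.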

Steve Hempell
March 15, 2009 6:32 pm

Steven:
In the introduction you state:
“During the 1980s and 1990s, the earth warmed at a faster rate than it did earlier in the century. ”
This is not correct. In Patrick Michaels newest book he states : Page 12
“The IPCC history shows two distinct periods of warming, one roughly from 1910 through 1945, and then another that begins rather abruptly in about 1975. Their warming rates are statistically indistinguishable. In the past three decades ending in 2005, the warming rate was 0.178 Deg C +/- 0.021 per decade. In the period 1916-45 the rate was 0.151 Deg C +/- 0.014 per decade.”
I have an old (only goes to 2001) Hadcrut version from the Hadley Site. The results are
1908-1941 33 yrs; change + 0.46; rate 0.139/decade
1975-2001 26 yrs; change +0.45; rate 0.173/decade
New “modified” Hadcrut V3 as of Feb 2009
1909-1944 35 yrs; change +0.651; rate 0.186/decade
1974-2005 31 yrs; change +0.705; rate 0.227/decade
Isn’t manipulation wonderful? Still the rate isn’t unprecedented.
By the way try using the “latest” Hadcrut and figure out
1862-1878 (must have had an El Nino) 16 yrs; change +0.651; Rate 0.404/decade
1878-1909 31 yrs; change -0.595; Rate -0.192/decade (hope this isn’t in the cards!!)
Also see:
http://www.climate4you.com/ under Climate Reflections.
This is a pet peeve of mine. No one has been able to explain why the early and late 20th century warmings are essentially the same. Until someone does, colour me skeptical. (Michaels thinks it is the sun, but I’m pretty securely in the Leif Svalgaard camp.) You will have to get the book to see why Michaels thinks the way he does.
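[Editor’s note: the decadal rates quoted in the comment above follow from simple arithmetic, change divided by span in years, times ten. A quick check of the comment’s own figures:]

```python
# Verify the decadal-rate arithmetic quoted in the comment:
# rate per decade = (temperature change in C / number of years) * 10

def decadal_rate(change_c, years):
    return change_c / years * 10

print(round(decadal_rate(0.46, 33), 3))   # 1908-1941: 0.139 C/decade
print(round(decadal_rate(0.45, 26), 3))   # 1975-2001: 0.173 C/decade
print(round(decadal_rate(0.651, 35), 3))  # 1909-1944: 0.186 C/decade
print(round(decadal_rate(0.705, 31), 3))  # 1974-2005: 0.227 C/decade
```

The arithmetic reproduces the rates in the comment, so the comparison between the early and late 20th century warmings rests on the underlying Hadcrut change figures, not on the calculation.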

deadwood
March 15, 2009 6:34 pm

I am curious as to whether one of the modelers has recalibrated his model to the now 10 years of stagnation in warming.
If this has been done, I’ve not heard about it.
If not, then why not? (Other than, of course, it would mean the sensitivity of CO2 would need to be cranked way down!)