Guest post by Steven Goddard
In his recent article, NSIDC’s Dr. Meier answered Question #9 “Are the models capable of projecting climate changes for 100 years?” with a coin flipping example.
However, Willis claims that such a projection is not possible because climate must be more complex than weather. How can a more complex situation be modeled more easily and accurately than a simpler situation? Let me answer that with a couple more questions:

1. You are given the opportunity to bet on a coin flip. Heads you win a million dollars. Tails you die. You are assured that it is a completely fair and unbiased coin. Would you take the bet? I certainly wouldn't, as much as it'd be nice to have a million dollars.

2. You are given the opportunity to bet on 10000 coin flips. If heads comes up between 4000 and 6000 times, you win a million dollars. If heads comes up less than 4000 or more than 6000 times, you die. Again, you are assured that the coin is completely fair and unbiased. Would you take this bet? I think I would.
Dr. Meier is correct that his coin flip bet is safe. I ran 100,000 iterations of 10,000 simulated random coin flips, which created the frequency distribution seen below.

The chances of getting less than 4,000 or greater than 6,000 heads are essentially zero. However, this is not an appropriate analogy for GCMs. The coin flip analogy assumes that each iteration is independent of all others, which is not the case with climate.
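The original simulation code is not shown in the post, but a minimal C++ sketch of the same experiment, using the standard <random> engine rather than rand(), would look something like this:

```cpp
#include <algorithm>
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(12345);                 // fixed seed so the run is repeatable
    std::bernoulli_distribution flip(0.5);   // fair, unbiased coin

    const int runs = 100000;                 // simulated bets
    const int flipsPerRun = 10000;           // coin flips per bet
    int losses = 0, minHeads = flipsPerRun, maxHeads = 0;

    for (int r = 0; r < runs; ++r) {
        int heads = 0;
        for (int f = 0; f < flipsPerRun; ++f)
            if (flip(rng)) ++heads;
        minHeads = std::min(minHeads, heads);
        maxHeads = std::max(maxHeads, heads);
        if (heads < 4000 || heads > 6000) ++losses;   // the "you die" outcomes
    }

    std::printf("losing runs: %d of %d (heads ranged from %d to %d)\n",
                losses, runs, minHeads, maxHeads);
    return 0;
}
```

With a fair coin the heads count stays within a few hundred of 5,000, so the losing condition essentially never triggers.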
[Note: Originally I used Microsoft's random number generator, which isn't the best, as you can see below. The plot above, which I added within an hour of the first post, uses the gnu rand() function, which generates a much better-looking Gaussian.]

Climate feedback is at the core of Hansen's catastrophic global warming argument. Climate feedback is based on the idea that today's weather is affected by yesterday's weather, and this year's climate depends on last year's. For example, climate models (incorrectly) forecast that Arctic ice would decrease between 2007 and 2010. This would have caused a loss of albedo and led to more absorption of incoming shortwave radiation, a critical calculation. Thus climate model runs made in 2007 also incorrectly forecast the radiative energy balance in 2010, and that error cascaded into the calculations for later years. The same argument can be made for cloud cover, snow cover, ocean temperatures, etc. Each year and each day affects the next. If the 2010 calculations are wrong, then the 2011 and 2100 calculations will also be incorrect.
Because of feedback, climate models are necessarily iterative. NCAR needs a $500 million supercomputer to do very long iterative runs decades into the future. It isn't reasonable to claim both independence (randomness) and dependence (feedback). Climate model errors compound through successive iterations rather than cancel out; how could they correct themselves?
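The difference between independent draws and an iterated calculation can be shown with a toy sketch (purely illustrative numbers, not a climate model): if each step carries the previous state forward, a small per-step error accumulates instead of averaging out.

```cpp
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    std::normal_distribution<double> error(0.01, 1.0);  // small systematic bias of +0.01 per step

    const int steps = 10000;

    // Independent errors: averaging leaves only the tiny per-step bias.
    double sum = 0.0;
    for (int i = 0; i < steps; ++i) sum += error(rng);
    std::printf("average of independent errors: %+.4f\n", sum / steps);

    // Iterated state: each step starts from the previous one, so the
    // bias (and the random wander) accumulates instead of cancelling.
    double state = 0.0;
    for (int i = 0; i < steps; ++i) state += error(rng);
    std::printf("state after %d feedback steps:  %+.1f\n", steps, state);

    return 0;
}
```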
Speaking of Arctic ice cover and albedo, the sun is starting to get high in the sky in the Arctic, and ice extent is essentially unchanged from 30 years ago. How does this affect climate calculations?
GCMs are similar to weather models, with added parameters for factors that may change over time, such as atmospheric composition, sea surface temperatures, and ice cover. We know that weather models are very accurate for about three days and then quickly break down due to chaos. There is little reason to believe that climate models will do any better through successive iterations. The claim is that the errors average out over time and produce a regionally correct forecast, even if it is incorrect for a specific location.
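A textbook illustration of that kind of breakdown is the Lorenz system (a toy convection model, not a weather model): two runs that start one part in a billion apart end up in completely different places after a few thousand steps.

```cpp
#include <cmath>
#include <cstdio>

// Lorenz '63 system integrated with a simple Euler step.
struct State { double x, y, z; };

void step(State& s, double dt) {
    const double sigma = 10.0, rho = 28.0, beta = 8.0 / 3.0;
    double dx = sigma * (s.y - s.x);
    double dy = s.x * (rho - s.z) - s.y;
    double dz = s.x * s.y - beta * s.z;
    s.x += dt * dx;
    s.y += dt * dy;
    s.z += dt * dz;
}

int main() {
    State a{1.0, 1.0, 1.0};
    State b{1.0 + 1e-9, 1.0, 1.0};   // initial condition perturbed by one part in a billion
    const double dt = 0.01;

    for (int i = 1; i <= 5000; ++i) {
        step(a, dt);
        step(b, dt);
        if (i % 1000 == 0)
            std::printf("step %4d  separation in x: %g\n", i, std::fabs(a.x - b.x));
    }
    return 0;
}
```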
A good example of how inaccurate climate forecasts can be is shown in the two images below. NOAA's Climate Prediction Center issued a long-range forecast for the past winter in February 2009. Brown and orange represent above-normal temperatures, and as you can see they got most of the US backwards.
NOAA CPC’s long range forecast for winter 2009-2010
NOAA’s reported results for winter 2009-2010
The UK Met Office seasonal forecasts have also been notoriously poor, culminating in their forecast of a warm winter in 2009-2010.
The Met Office climate models forecast declining Antarctic sea ice, which is the opposite of what has been observed.
NSIDC’s observed increase in Antarctic sea ice
Conclusion: I don't see much theoretical or empirical evidence that climate models produce meaningful information about the climate in 100 years.
R. Gates,
Explain to me which forcing caused the warming between 1910 and 1945?
And you get this picture with it:
http://i39.tinypic.com/261p2tu.png
Wren (13:13:23) :
Sneak preview of something I am writing up. Here is GISS vs. CO2 concentration.
http://docs.google.com/View?id=ddw82wws_5998xrxhzhc
“R. Gates (13:10:23) :
[…]
It has been a major effort of climate scientists to weed out short term noise from the longer cycles to understand the role that CO2 plays.”
You mean like getting accustomed to moving averages? Well, they still stumble when using Hamming windows so there’s still some learning curve…
http://wattsupwiththat.com/2010/04/04/ipcc-how-not-to-compare-temperatures/
And in their "major effort to weed out noise" they might someday even learn to do a Fourier or Laplace transform. I'm not holding my breath, though.
You wanted to know what skeptics say to this “2010 hottest year on instrumental record.” Well.
My all-time favorite quote: It’s a minor short-term issue. [R. Gates]
Mike McMillan (13:06:23) :
I’m beginning to have some sympathy for climate scientists like Dr Meier.
I read the letters to the editor in Aviation Week & Space Technology, and some fellow will make a point about some jet engine, then next week he’ll get nailed by a dozen specialists in rocket science, engineering, corporate accounting, etc., who will detail why he’s wrong.
Climate scientists try to tease a climate signal out of weather noise, then extrapolate to the future. They have to gather data, use statistics, program computers, deal with politicians, get grants and funding, and a dozen other things, then publicly produce a conclusion/result/paper. They have to be generalists.
Meanwhile, here’s WUWT.
We have generalist climate scientists the equal of any. SurfaceStation volunteers and UAH know in detail about gathering data. We have statisticians who can spot misapplications in an instant. We have computer experts who deconstruct their shaky code and expose the tricks, accidental and otherwise. We have government types who see through the machinations to find the motivations. We have academics who note the violations of publication protocols.
There is no skill that a climate scientist uses that the collective expertise of WUWT doesn’t far surpass.
Big Climate can rely on only the likes of the perky Katie Couric to hide the facts, and the crooked politicians to keep up the funding. We’ll see how that climate changes in November.
A tip of the hat to Dr Meier and the others who have ventured over here to make their case.
————-
….suddenly, without warning, the blog “Watts Up With That” became self-aware, and the intelligence and personality of thousands of contributors melded into one vast, sentient and analytical consciousness!! Worldwide, the servers of climate research universities crashed from the multiple hits, seeking data and information….
Didn’t they make a movie about that once? Good comments, Mike, thanks! This seems to be a very eclectic group across many disciplines, and the sum of these posts is often quite amazing.
The one fault as I see it in Meier’s ‘$1m or death’ option is the reasoned and reasonable position that death is a certainty anyway, sooner or later, while this is probably the only chance to acquire $1m.
And the argument that is being postulated — namely that the precautionary principle demands action “just in case” — is flawed because every decision we take (or refuse to take, and that in itself is a decision), individually or collectively, opens up a further range of possible decisions and so on.
So far I have not seen sufficient evidence from real-world observations to convince me that we are far enough along the road of irreversible global warming to demand that we impoverish the third world more than it is impoverished already and impoverish ourselves into the bargain.
The more so because all the “poster children” (and their baby brothers) are so easily debunked and the increasingly strident cherry-picking of factoids is enough to drive any right-minded person to the conclusion that “the lady doth protest too much, methinks”!
The Warmists use a two-headed coin.
@Steven Goddard
If I may, I’d suggest any of several high end PC Statistics packages if you intend to pursue this sort of thing. Don’t waste your time with Excel or other consumer products.
SAS: http://www.sas.com/
Statistica: http://www.statsoft.com/textbook/ , http://www.statsoft.com/
Mathematica: http://www.wolfram.com/products/mathematica/index.html
kadaka (13:32:07) :
Visual Studio is an unbeatable environment for developing C++ code. I just need to remember to run it on gnu platforms.
“Wren (13:27:02) :
[…]
Hansen’s projections made back in the 1980’s are looking better and better. No wonder he won an award for his contributions to climate modeling.”
Oh, that’s funny. Let’s look.
http://climateaudit.org/2008/01/16/thoughts-on-hansen-et-al-1988/
Looks… pretty bad for Hansen.
Ya, but, should we take “any” odds
when it's Al Gore who tells us
the coin is unbiased and fair?
You need to correct the second question: You are given the opportunity to bet on 10000 coin flips. If heads comes up more than 5000 times, you win a million dollars. If heads comes up less than 5000 times, you die. Again, you are assured that the coin is completely fair and unbiased. Would you take this bet?
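For what it's worth, that change turns Dr. Meier's near-certain bet into roughly a coin toss. A quick check in the same style as the sketch in the post (hypothetical code, drawing the heads count from the binomial distribution directly):

```cpp
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(7);
    std::binomial_distribution<int> headsCount(10000, 0.5);   // heads in 10,000 fair flips

    const int runs = 100000;
    int winWindow = 0, winThreshold = 0;

    for (int r = 0; r < runs; ++r) {
        int h = headsCount(rng);
        if (h >= 4000 && h <= 6000) ++winWindow;   // original bet: 4000-6000 heads wins
        if (h > 5000) ++winThreshold;              // "corrected" bet: more than 5000 heads wins
    }

    std::printf("window bet won:    %d of %d runs\n", winWindow, runs);
    std::printf("threshold bet won: %d of %d runs\n", winThreshold, runs);
    return 0;
}
```

The window bet is won essentially every time; the threshold bet only about half the time.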
“Curiousgeorge (13:49:43) :
@Steven Goddard
If I may, I’d suggest any of several high end PC Statistics packages if you intend to pursue this sort of thing. Don’t waste your time with Excel or other consumer products.”
You forgot the wonderful and free R.
http://www.r-project.org/
DirkH (13:53:29) :
Thanks for the tip about R. Free software is always nice!
Put another way...
Looks like "The House" (D.C.) is the one that sets the odds on this
coin flip of chance before us now.
Never bet against “The House”.
Change The House, or find better odds some other way.
Steve Goddard (13:50:46) :
kadaka (13:32:07) :
Visual Studio is an unbeatable environment for developing C++ code. I just need to remember to run it on gnu platforms.
——————–
Reply:
Steve, jump into C#. It is a true object oriented language built that way from the ground up. It is a Microsoft product and completely compatible with Visual Studio. Indeed, some say VS was built for C#.
@DirkH (13:53:29) : Thanks, I wasn't aware of that package. 🙂 I'll take a look at it. When I was gainfully employed ( retired 10 years ago ) I was provided with the 3 I mentioned by the company. If you fly commercial, you are the beneficiary of some of my work. 🙂
I don’t see that a random number generator is needed to make the point. The problem is described so as to be well defined, i.e., with a known distribution of Heads for a given number of flips.
My own view on Question #9 “Are the models capable of projecting climate changes for 100 years?”
Current models do not seem capable because they are underspecified. In the future, with better specified models I see no reason why models could not project climate changes forward for periods roughly equivalent to the length of time for which we have reliable records for the variables that are incorporated in the models.
“RockyRoad (13:35:24) :
[…]
The US becoming a completely socialist/marxist nation–a recent study showed that countries that did so (and this wasn’t a future projection, prediction, or prognostication; it was based on case studies) saw an immediate 40% drop in GDP.”
USA GDP/capita in 2006: 43468 USD
Let’s see. Switch to socialism: minus 40%. 60% of 43468 are 26080.8 USD.
USA GDP/capita in 1994: 26247 USD
So the introduction of socialism will warp you back to 1994.
Numbers are from
EarthTrends (http://earthtrends.wri.org) Searchable Database Results
Our local weather bureau (South Australia) gives weather predictions like "There is a 50% chance that we will have a wetter than average winter" without a hint of irony. I don't have much confidence in their predictions 100 years out.
Stock Market Analogy – Well, it is not a model for sure. It is a guess about the future based on observations of the past. We don't know anything about the future stock market, in the particular (one stock) or the aggregate (all stocks). The Quants modeled the hell out of stocks and the models worked fine until they didn't work anymore. Wall Street crashed and burned, and it was all terribly disruptive, remember? The quantitative models failed. You can say they were of limited value – climate modelers say their models have limitations. But a failed model is just that – the model doesn't work. Climate models can't work. All the coin toss bet models worked great because the variables were known and controlled. All climate models are doomed to failure because the variables are neither known nor controlled.
“RockyRoad (14:10:35) :
[…]
Steve, jump into C#. It is a true object oriented language built that way from the ground up. It is a Microsoft product and completely compatible with Visual Studio. Indeed, some say VS was built for C#.”
C# is the new flagship language for Microsoft; that's why they focus their VS efforts on it. VC++ is a second-class citizen there now.
C# is very nicely done; the architect behind it is a Dane called Anders Hejlsberg, a US citizen for a long time now. He was also the mastermind behind Turbo Pascal / Borland Pascal / Delphi before defecting to Microsoft.
The caveat is, you get tied to the Microsoft platforms. There are attempts to provide a runtime and compiler for the language under Linux; Mono and Rotor are the project names, but while the language is in the ECMA standardization process, Microsoft will not free their runtime class libraries.
Personally, I still prefer C++ because it's available on so many different platforms.
Thank you, Steven Goddard!
The argument that the climate models are not the same as the weather models obfuscates the real issue. They are not identical, but the part that leads to the unpredictability of the future is basically the same.
‘Iterations feeding back into successive iterations! Aye, There’s the rub!’
I can Monte-Carlo my sub-surface models till the cows come home. But the models are still constrained by the parameters I elect to include, the ranges of parameters I think apply, the boundary conditions etc, etc. Simply performing Monte-Carlo will not necessarily give me the correct trend.
So too are climate models constrained. If you believe CO2 is the main driver, performing Monte Carlo won't necessarily give you the right answer if other causes driving the climate have not been included (e.g. the current crop of climate models) AND if you don't calibrate and validate against the climatic record.
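A toy sketch of that constraint (made-up numbers, not any actual model): Monte Carlo over the parameters a model does include cannot recover a driver the model omits, so the ensemble mean simply converges more precisely on the wrong answer.

```cpp
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(1);
    std::normal_distribution<double> paramUncertainty(0.0, 0.005);  // spread on the included driver

    const int years = 100;
    const double includedDriver = 0.020;   // trend per year the model knows about (made up)
    const double omittedDriver  = 0.015;   // trend per year the model leaves out (made up)

    // "Truth" includes both drivers.
    double truth = years * (includedDriver + omittedDriver);

    // Monte Carlo only over the uncertainty of the included driver.
    const int realisations = 10000;
    double sum = 0.0;
    for (int r = 0; r < realisations; ++r)
        sum += years * (includedDriver + paramUncertainty(rng));
    double ensembleMean = sum / realisations;

    std::printf("ensemble mean change: %.2f   \"true\" change: %.2f\n", ensembleMean, truth);
    return 0;
}
```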
DirkH (14:17:54) :
(….)
So the introduction of socialism will warp you back to 1994.
——————–
Reply:
And since when is that a good thing?
But let me ask you this: Could you live on 60% of what you’re currently making? I mean in real terms.
Could you pay for your house, your car and your food with 60% of what you’re making now?
It would be just as catastrophic to see a 10-15 degree drop in average temperature with the onset of the next Ice Age, which means only the southern fringe of the US could grow foodstuff crops. Now what–we become earthworms and eat the soil?
Consider what would happen to Denver, CO, if there was just a 10-degree drop in average low temps. Here’s the current monthly average for both highs and lows:
http://www.weather.com/weather/wxclimatology/monthly/graph/USCO0105?role=
Reduce them all by 10 degrees and there’s only 4 months (June, July, August, and September) where the average is above freezing. But remember, that’s the AVERAGE! Undoubtedly there would be several nights each month in the mix where the low falls below freezing–plants don’t go by averages, they’re killed by a single sub-freezing night. I just put a bunch of seeds in Jiffy pots for my garden and noticed the shortest required 62 days while some required 120 days.
This all means that Denver would be too cold for crops. And I’m predicting most of the US would be too cold for crops, too. No longer would it be a bread basket. You could call it a barren, starving freezer.
I hope you’re all fluent in Spanish.
Sorry… that monthly distribution for average temperatures in Denver is actually:
http://www.weather.com/weather/wxclimatology/monthly/graph/USCO0105?role=