Guest post by Steven Goddard
In his recent article, NSIDC’s Dr. Meier answered Question #9 “Are the models capable of projecting climate changes for 100 years?” with a coin flipping example.
1. You are given the opportunity to bet on a coin flip. Heads you win a million dollars. Tails you die. You are assured that it is a completely fair and unbiased coin. Would you take the bet? I certainly wouldn’t, as much as it’d be nice to have a million dollars.

2. You are given the opportunity to bet on 10000 coin flips. If heads comes up between 4000 and 6000 times, you win a million dollars. If heads comes up less than 4000 or more than 6000 times, you die. Again, you are assured that the coin is completely fair and unbiased. Would you take this bet? I think I would.
Dr. Meier is correct that his coin flip bet is safe. I ran 100,000 iterations of 10,000 simulated random coin flips, which created the frequency distribution seen below.

The chances of getting less than 4,000 or greater than 6,000 heads are essentially zero. However, this is not an appropriate analogy for GCMs. The coin flip analogy assumes that each iteration is independent of all others, which is not the case with climate.
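For readers who want to check the numbers themselves, here is a minimal sketch of the same experiment, using Python's built-in Mersenne Twister and fewer trials than the 100,000 used for the plot:

```python
import random

# Re-run Dr. Meier's bet on a smaller scale: batches of 10,000 fair coin
# flips, counting how often heads lands outside the 4,000-6,000 window.
random.seed(1)
TRIALS = 1000   # the plot above used 100,000 iterations
FLIPS = 10000

outside = 0
for _ in range(TRIALS):
    heads = sum(random.getrandbits(1) for _ in range(FLIPS))
    if heads < 4000 or heads > 6000:
        outside += 1

print(outside)  # 0 -- the bounds sit 20 standard deviations from the mean
```

With a mean of 5,000 heads and a standard deviation of √(10,000 · 0.5 · 0.5) = 50, the 4,000/6,000 bounds lie 20σ from the mean, so the chance of losing the bet really is effectively zero.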
[Note: Originally I used Microsoft’s random number generator, which isn’t the best, as you can see below. The plot above, which I added within an hour of the first post, uses the gnu rand() function, which generates a much better looking Gaussian.]

Climate feedback is at the core of Hansen’s catastrophic global warming argument. Climate feedback is based on the idea that today’s weather is affected by yesterday’s weather, and this year’s climate depends on last year’s. For example, climate models (incorrectly) forecast that Arctic ice would decrease between 2007 and 2010. This would have caused a loss of albedo and led to more absorption of incoming shortwave radiation – a critical calculation. Thus climate model runs in 2007 also incorrectly forecast the radiative energy balance in 2010, and that error cascaded into future-year calculations. The same argument can be made for cloud cover, snow cover, ocean temperatures, etc. Each year and each day affects the next. If the 2010 calculations are wrong, then the 2011 and 2100 calculations will also be incorrect.
Because of feedback, climate models are necessarily iterative. NCAR needs a $500 million supercomputer to do very long iterative runs decades into the future. It isn’t reasonable to claim both independence (randomness) and dependence (feedback). Climate model errors compound through successive iterations rather than correcting themselves. How could they self-correct?
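A toy iteration (with made-up numbers, not taken from any actual GCM) illustrates the compounding point: a small systematic bias in the yearly update grows with the forecast horizon instead of averaging away.

```python
# Hypothetical illustration: an iterative model whose yearly update step
# carries a small systematic bias. Because each year's state feeds the
# next, the bias compounds rather than cancelling.
true_growth = 1.000    # "reality": flat, no trend
model_growth = 1.002   # model update biased by 0.2% per year (made up)

state_true, state_model = 1.0, 1.0
for year in range(100):
    state_true *= true_growth
    state_model *= model_growth

error = state_model - state_true
print(round(error, 3))  # 0.221 -- a 0.2%/yr bias becomes a ~22% error by year 100
```

By contrast, if the yearly errors were independent coin-flip-style noise, they would partially cancel and grow only as the square root of the number of years – which is exactly the independence assumption the coin analogy smuggles in.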
Speaking of Arctic ice cover and albedo, the sun is starting to get high in the sky in the Arctic, and ice extent is essentially unchanged from 30 years ago. How does this affect climate calculations?
GCMs are similar to weather models, with added parameters for factors which may change over time – like atmospheric composition, changes in sea surface temperatures, changes in ice cover, etc. We know that weather models are very accurate for about three days, and then quickly break down due to chaos. There is little reason to believe that climate models will do any better through successive iterations. The claim is that the errors average out over time and produce a regionally correct forecast, even if incorrect for a specific location.
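The three-day breakdown is a consequence of sensitive dependence on initial conditions. A standard toy demonstration (the logistic map at r = 4 – not a weather model, just the textbook chaos example) shows how fast a tiny state error saturates:

```python
# Two states of the logistic map x -> 4x(1-x), differing by one part in
# a billion, are tracked until they disagree by more than 0.1.
x, y = 0.4, 0.4 + 1e-9
steps = 0
while abs(x - y) < 0.1:
    x = 4.0 * x * (1.0 - x)
    y = 4.0 * y * (1.0 - y)
    steps += 1

print(steps)  # a few dozen steps: the error roughly doubles each iteration
```

Since the error roughly doubles each step, it takes only about log2(0.1 / 1e-9) ≈ 27 iterations for a one-part-per-billion difference to become macroscopic – the discrete analogue of a weather forecast going to pieces after a few days.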
A good example of how inaccurate climate forecasts can be is shown in the two images below. NOAA’s Climate Prediction Center issued a long-range forecast for the past winter in February 2009. Brown and orange represent above-normal temperatures, and as you can see, they got most of the US backwards.
NOAA CPC’s long range forecast for winter 2009-2010
NOAA’s reported results for winter 2009-2010
The UK Met Office seasonal forecasts have also been notoriously poor, culminating in their forecast of a warm winter in 2009-2010.
The Met Office climate models forecast declining Antarctic sea ice, which is the opposite of what has been observed.
NSIDC’s observed increase in Antarctic sea ice
Conclusion: I don’t see much theoretical or empirical evidence that climate models produce meaningful information about the climate in 100 years.
Good image of Jesus.
I flipped a coin once and it landed on its side, standing up on a hardwood floor!
So it’s not 50-50 heads or tails; it’s (50−x), 2x, (50−x) odds, where 2x is the chance that the coin will land on its side and stay standing!
The Objective Reality of Nature doesn’t care how we model things, she does her own thing.
Randomness is inherent in many simple systems, and the weather and climate certainly generate their own share of randomness. See Wolfram, A New Kind of Science – it’s something that the climate modelers, aka soothsayers, haven’t taken into account, I’ll bet.
Your coin flip distribution is conspicuously symmetrical…
I lost it when reference was made to the burden of proof being on the sceptics. It was our duty to prove it wasn’t all caused by human generated CO2.
No one has handed me evidence that the moon is not made of cheese.
Love your map of the US for weather projections by the NOAA. What’s their projection for next year, and is there any way I can get a color negative of it so it will be more accurate?
the frequency distribution is an outline of someone praying to win a million dollars and not die
http://climateinsiders.files.wordpress.com/2010/04/meierquestion9probabilityplot1.jpg?w=510&h=358
In general an excellent post, but one thing to keep in mind is that the point of GCMs is not that we can predict the exact year we’ll see the Arctic ice-free, or even what the trend will be over a short period, but rather what the longer-term trends will be. For example, there is a better-than-even chance (based on GCMs) that we’ll see an ice-free summer Arctic by 2050, though no model can tell you exactly which year.
Climate is a chaotic process within a prescribed range.
If someone were to offer me a bet that if the whole Arctic was ice-free on Dec. 31, 2015 I’d die, and if it was not ice-free on that same date I’d win a million dollars, I’d take that bet. Yes, there are many factors that influence climate, but we understand the most important ones, and GHG forcing is a biggie. As long as GHGs continue to rise, it will get warmer over the long term, with other influences, such as solar cycles, ENSO, and the PDO, contributing to the chaotic-process-within-a-prescribed-range phenomenon. The greatest influence on climate in the longest term is of course the Milankovitch cycles.
http://en.wikipedia.org/wiki/File:MilankovitchCyclesOrbitandCores.png
They rule in the longest term, but right now, GHGs seem to be the prime influence.
RhudsonL (11:34:15) :
The symmetry and patterns are indicative of the fact that off the shelf random number generators are not much use for very sensitive Monte Carlo calculations. Something in the Visual Studio 2008 rand() function is generating cycles, which ideally should not be happening.
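The low-order-bit weakness of simple linear congruential generators is easy to demonstrate. Here is a sketch using the textbook ANSI C constants – this is the generic LCG failure mode, not a claim about the exact algorithm inside Visual Studio 2008:

```python
# Classic LCG with a power-of-two modulus: the lowest bit of the raw
# state strictly alternates, so using it as a coin flip gives H,T,H,T,...
def lcg_states(seed, n, a=1103515245, c=12345, m=2**31):
    out = []
    for _ in range(n):
        seed = (a * seed + c) % m
        out.append(seed)
    return out

low_bits = [s & 1 for s in lcg_states(1, 16)]
print(low_bits)  # [0, 1, 0, 1, ...] -- period 2, useless for Monte Carlo
```

Sample implementations of rand() discard the low-order bits for exactly this reason, which helps but doesn’t cure the short cycles in the low state bits; generators like KISS or the Mersenne Twister avoid the problem.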
“climate models (incorrectly) forecast that Arctic ice would decrease between 2007 and 2010”
Not true (just look at your later graph), and the fact that you think this would be worth mentioning in any case demonstrates a profound misunderstanding of what climate models do. They are not meant to predict the variation in ice extent over any given three year period. Over three years, unforced variations are larger than long term changes. Unforced variations are not climate. Long term changes are.
Yet again, someone fails to understand the most basic distinction in climate science. Why? It’s really not hard.
Speaking of coin flips in the context of weather forecasting and climate modeling … so that’s how it’s done!
The coin flip analogy hinges on the concept of the fair coin. But would he take the bet if the person supplying the coin had to pay the $1 million or stood to collect on your inheritance? The coin may be perfectly fair for commerce, but not fair at all for flipping. Secondly, he “proves” his point by running a computer simulation. And the simulation assumes an ideal coin flipper. What happens if the flipper gets to control initial conditions and always starts with a heads up coin? It’s often assumptions like these that skeptics contest as being the problem with models.
Weather and climate forecasts are made by humans. Human shortcomings are part of those forecasts. We all know how wrong weathermen can be. And we can see by comparison with real world observation how wrong climate predictions are.
(“computer climate model outputs not matching observation”
http://www.scribd.com/doc/904914/A-comparison-of-tropical-temperature-trends-with-model-predictions
“models perform poorly”
http://www.scribd.com/doc/4364173/On-the-credibility-of-climate-predictions )
Some will say the difference for ‘climate’ is that a climate forecast for 100 years from now is made by a computer and not a human. But computers are machines built by humans. And computer programs that predict future climate are made by humans with their shortcomings a part of it. Computers cannot take on a life of their own that overcomes the human shortcomings in the computer program. That only happens in movies.
“Never make predictions, especially about the future.”
~~Yogi Berra (credited)
Monte Carlo computer runs done for Poissonian statistics are a waste of perfectly good electrons.
Yes, there can be trend obscured by noise in a chaotic system. No, the climate models aren’t anywhere near complete enough to make useful trend predictions.
I tried generating the coin flip Gaussian using g++ rand(), and it is remarkable how much better a job it does than Visual Studio.
http://docs.google.com/View?id=ddw82wws_597cn34skhf
I always thought that the important point about the GCMs was that they perform multiple runs using slightly different starting points. The results from all these different simulations then give an average result – this was probably what Dr. Meier was suggesting with the coin flip example.
And comparing short-range forecasting models with long-range models is simply misleading; they are completely different tools for very different tasks.
Thanks, Fun article
“the idea that today’s weather is affected by yesterday’s weather, and this year’s climate is dependent on last year”
Isn’t this essentially true, since the oceans retain heat longer than land masses and carry the impact of summer into winter via currents and trade winds?
Of course they were wrong 10 out of ten years. The nine years show their bias.
But the forecasts for 2035,2050 and 2100 are accurate. We are facing tax rates based on their accuracy. It tells me 2 things. They have bias and admit it. Fudging data will skew the models. Just stop cheating and changing the numbers and then try to do a prediction. It may get a little better.
Steven, your random number generator is garbage.
Of course Meier is right about the coin flips. It’s amazing that some posters seemed to think otherwise. Is it an “appropriate” analogy for GCMs? That’s debatable. Analogies usually aren’t perfect.
I prefer the stock market analogy. I would have much more confidence in a forecasted rise in the Dow over a 50-year period than over a 1-year period. Why? Because I have observed that the market fluctuates a lot from year to year, but has risen over the long term. Average global temperature, like the stock market, fluctuates from year to year but has been rising over the long term.
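The stock-market point is a statement about averaging. With made-up drift and volatility numbers (not real market or temperature statistics), a sketch shows why a long-horizon sum is more predictable in sign than a single year:

```python
import random

# Simulate yearly changes as drift + noise (illustrative numbers only:
# drift 0.5%/yr, noise sd 2%/yr). Over 50 years the noise partially
# cancels while the drift accumulates, so the sign is far more certain.
random.seed(0)

def positive_fraction(horizon, trials=2000, drift=0.005, sd=0.02):
    wins = 0
    for _ in range(trials):
        total = sum(random.gauss(drift, sd) for _ in range(horizon))
        if total > 0:
            wins += 1
    return wins / trials

one_year = positive_fraction(1)
fifty_year = positive_fraction(50)
print(one_year < fifty_year)  # True
```

This is the same mathematics as Meier’s second bet – but, as the post argues, it only applies if the year-to-year errors really are independent.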
So to avoid the accumulation of errors in climate models, you avoid iteration, and for every forecast you calculate forward x amount of time from the same base year – using your verified set of equations that require only the starting conditions and time elapsed, most likely.
Someone let me know if and when they can do that.
Tom_R (12:39:09) :
It is interesting how bad the Visual Studio rand() function is. I often use the KISS random number generator at work.
http://www.fortran.com/kiss.f90
The random number generator used in the coin toss trials above is obviously not very random. The dots have an almost mirror image on each side. The dots should be much more randomly distributed, though probably in something like a bell curve. Many programming languages have very poor random number generators. Even good random number generators often can’t even come close to ideal behavior. For example if a random number generator is asked to return a random number in a range, it is often the case that the generator is only capable of returning an extremely small portion of the possible numbers in the range before it starts to return the same sequence of numbers again. The Wikipedia article on this subject is interesting.
REPLY: See the updated run, refresh the page.
Wren (12:41:26) :
Your argument boils down to “temperatures are rising.” In that case, a simple extrapolation is more valuable than a $500 million supercomputer GCM simulation.
http://docs.google.com/View?id=ddw82wws_5998xrxhzhc
According to the long term GISS trend, temperatures will rise about 0.6C by the end of the century.
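The extrapolation in the linked plot is just a least-squares line pushed forward. A sketch of the arithmetic with synthetic anomaly numbers (not actual GISS data – the 0.006 °C/yr slope is chosen to match the ~0.6 °C figure above):

```python
# Fit a least-squares trend to synthetic anomaly data (NOT real GISS
# values) and extrapolate a century ahead.
years = list(range(2000, 2011))
anoms = [0.40 + 0.006 * (y - 2000) for y in years]  # made-up, perfectly linear

n = len(years)
mx = sum(years) / n
my = sum(anoms) / n
slope = sum((x - mx) * (a - my) for x, a in zip(years, anoms)) \
        / sum((x - mx) ** 2 for x in years)

century_rise = slope * 100
print(round(slope, 4), round(century_rise, 2))  # 0.006 0.6
```

This reproduces the ~0.6 °C-per-century arithmetic; the point of the comparison is that a straight-line extrapolation needs no supercomputer.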