Response to Dr. Meier's answer #9 – coin flips in the context of climate modeling

Guest post by Steven Goddard

In his recent article, NSIDC's Dr. Meier answered Question #9, "Are the models capable of projecting climate changes for 100 years?", with the following coin flipping example:

However, Willis claims that such a projection is not possible because climate must be more complex than weather. How can a more complex situation be modeled more easily and accurately than a simpler situation? Let me answer that with a couple more questions:

1. You are given the opportunity to bet on a coin flip. Heads you win a million dollars. Tails you die. You are assured that it is a completely fair and unbiased coin. Would you take the bet? I certainly wouldn't, as much as it'd be nice to have a million dollars.
2. You are given the opportunity to bet on 10,000 coin flips. If heads comes up between 4,000 and 6,000 times, you win a million dollars. If heads comes up less than 4,000 or more than 6,000 times, you die. Again, you are assured that the coin is completely fair and unbiased. Would you take this bet? I think I would.

Dr. Meier is correct that his coin flip bet is safe. I ran 100,000 iterations of 10,000 simulated random coin flips, which created the frequency distribution seen below.

Coin Flips using the gnu rand() function

The chances of getting less than 4,000 or greater than 6,000 heads are essentially zero. However, this is not an appropriate analogy for GCMs. The coin flip analogy assumes that each iteration is independent of all others, which is not the case with climate.
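For a quick check of just how safe that bet is, the binomial statistics can be worked directly (a standard normal approximation, nothing specific to climate here):

$$\mu = np = 10000 \times 0.5 = 5000, \qquad \sigma = \sqrt{np(1-p)} = \sqrt{10000 \times 0.25} = 50$$

$$P(X < 4000 \ \text{or}\ X > 6000) = P(|X - \mu| > 20\sigma) \approx 2\,\Phi(-20) \approx 5 \times 10^{-89}$$

The 4,000 to 6,000 window is 20 standard deviations wide on each side of the mean, so the chance of losing the bet is effectively zero.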

[Note: Originally I used Microsoft's random number generator, which isn't the best, as you can see below. The plot above, which I added within an hour of the first post, uses the gnu rand() function, which generates a much better-looking Gaussian.]

Coin Flips using the Microsoft random number function

Climate feedback is at the core of Hansen's catastrophic global warming argument. Climate feedback is based on the idea that today's weather is affected by yesterday's weather, and this year's climate is dependent on last year's. For example, climate models (incorrectly) forecast that Arctic ice would decrease between 2007 and 2010. This would have caused a loss of albedo and led to more absorption of incoming short wave radiation – a critical calculation. Thus climate model runs in 2007 also incorrectly forecast the radiative energy balance in 2010. And that error cascaded into future year calculations. The same argument can be made for cloud cover, snow cover, ocean temperatures, etc. Each year and each day affects the next. If 2010 calculations are wrong, then 2011 and 2100 calculations will also be incorrect.

Because of feedback, climate models are necessarily iterative. NCAR needs a $500 million supercomputer to do very long iterative runs decades into the future. It isn't reasonable to claim both independence (randomness) and dependence (feedback). Climate model errors compound through successive iterations, rather than correcting. How could they correct?
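The contrast can be sketched in a few lines of code. This is a toy illustration only, not a climate model; the per-step error and feedback gain are made-up numbers chosen just to show the shape of the two behaviours (independent errors tend to average toward zero, while an error fed through an iterative loop with positive feedback grows).

// error_compounding.cc -- toy contrast between independent and fed-back errors
#include <cstdio>
#include <cstdlib>

int main()
{
    const int steps = 100;
    const double per_step_error = 0.01;   // hypothetical 1% error per step
    const double feedback_gain = 1.02;    // hypothetical 2% positive feedback per step

    // Case 1: independent random errors, like coin flips -- they largely cancel.
    srand(42);
    double sum = 0.0;
    for (int i = 0; i < steps; i++)
    {
        double sign = (rand() & 1) ? 1.0 : -1.0;
        sum += sign * per_step_error;
    }
    printf("Mean of %d independent errors: %f\n", steps, sum / steps);

    // Case 2: the same initial error carried through an iterative feedback loop.
    double state_error = per_step_error;
    for (int i = 0; i < steps; i++)
    {
        state_error *= feedback_gain;     // each step inherits and amplifies the last
    }
    printf("Error after %d fed-back iterations: %f\n", steps, state_error);
    return 0;
}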

Speaking of Arctic ice cover and albedo, the sun is starting to get high in the sky in the Arctic, and ice extent is essentially unchanged from 30 years ago. How does this affect climate calculations?

Source: Cryosphere Today

GCMs are similar to weather models, with added parameters for factors which may change over time – like atmospheric composition, changes in sea surface temperatures, changes in ice cover, etc. We know that weather models are very accurate for about three days, and then quickly break down due to chaos. There is little reason to believe that climate models will do any better through successive iterations. The claim is that the errors average out over time and produce a regionally correct forecast, even if incorrect for a specific location.
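The "breaks down due to chaos" behaviour is easy to demonstrate with a toy system. Below is a minimal sketch using the logistic map, a standard textbook example of a chaotic iteration (it is not a weather or climate model): two runs that start about one part in a million apart agree for a while and then diverge completely.

// logistic_divergence.cc -- sensitivity to initial conditions in an iterated map
#include <cstdio>
#include <cmath>

int main()
{
    const double r = 4.0;      // chaotic regime of the logistic map x -> r*x*(1-x)
    double a = 0.400000;       // first initial condition
    double b = 0.400001;       // second, differing by roughly one part in a million

    for (int step = 1; step <= 60; step++)
    {
        a = r * a * (1.0 - a);
        b = r * b * (1.0 - b);
        if (step % 10 == 0)
            printf("step %2d: a = %.6f  b = %.6f  |a-b| = %.6f\n",
                   step, a, b, fabs(a - b));
    }
    return 0;
}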

A good example of how inaccurate climate forecasts can be is shown in the two images below. NOAA's Climate Prediction Center issued a long range forecast for the past winter in February 2009. Brown and orange represent above normal temperatures, and as you can see they got most of the US backwards.

NOAA CPC’s long range forecast for winter 2009-2010

https://i0.wp.com/www.hprcc.unl.edu/products/maps/acis/DJF10TDeptUS.png?resize=500%2C400

NOAA’s reported results for winter 2009-2010

The UK Met Office seasonal forecasts have also been notoriously poor, culminating in their forecast of a warm winter in 2009-2010.

The Met Office has now admitted to BBC News that its annual global mean forecast predicted temperatures higher than actual temperatures for nine years out of the last 10.

The Met Office climate models forecast declining Antarctic sea ice, which is the opposite of what has been observed.

Graph of Sea-ice area: Time series

Met Office sea ice forecast

https://i0.wp.com/nsidc.org/data/seaice_index/images/s_plot_hires.png?resize=500%2C300

NSIDC’s observed increase in Antarctic sea ice

Conclusion: I don't see much theoretical or empirical evidence that climate models produce meaningful information about the climate in 100 years.


RhudsonL
April 10, 2010 11:34 am

Good image of Jesus.

pwl
April 10, 2010 11:50 am

I flipped a coin once and it landed on its side, standing up on a hard wood floor!
So it's not 50-50 heads or tails, it's 50-x, x*2, 50-x odds, where x is the chance that the coin will land on its side and stay standing!
The Objective Reality of Nature doesn't care how we model things, she does her own thing.
Randomness is inherent in many simple systems, and the weather and climate certainly generate their own share of randomness. See Wolfram, A New Kind of Science – it's something that the climate modelers aka soothsayers haven't taken into account, I'll bet.

Kirk A
April 10, 2010 11:50 am

Your coin flip distribution is conspicuously symmetrical…

Henry chance
April 10, 2010 11:51 am

I lost it when reference was made to the burden of proof being on the sceptics. It was our duty to prove it wasn’t all caused by human generated CO2.

Question 14: Regarding climate, what action (if any) should we take at this point?
This is of course an economic and political question, not a scientific question, though the best scientific evidence we have can and should inform the answer. So far there
… isn't any scientific evidence that refutes NH2 and we conclude that the processes that influenced climate …
in the past are doing so today and will continue to do so in the future. From this we conclude that humans are having an impact on climate and that this impact will become more significant in the future as we continue to increase GHGs in the atmosphere.

No one has handed me evidence that the moon is not made of cheese.

RockyRoad
April 10, 2010 11:52 am

Love your map of the US for weather projections by the NOAA. What’s their projection for next year, and is there any way I can get a color negative of it so it will be more accurate?

April 10, 2010 12:00 pm

the frequency distribution :
is an outline of someone praying to win a million dollars and not die
http://climateinsiders.files.wordpress.com/2010/04/meierquestion9probabilityplot1.jpg?w=510&h=358

R. Gates
April 10, 2010 12:04 pm

In general an excellent post, but one thing to keep in mind is that the point of GCM’s is not that we can predict exactly when we’ll see the arctic ice free to the exact year, or even what the trend will be over a short period, but rather, what the longer term trends will be. For example, there is a better than even odds chance (based on GCM’s) that we’ll see an ice free summer arctic by 2050, though no model can tell you exactly when that year will be.
Climate is a chaotic process within a prescribed range.
If someone was to give me a bet that if the whole arctic was ice free on Dec. 31 2015 I'd die, and if it was not ice free on that same date, that I'd win a million dollars, I'd take that bet. Yes, there are many factors that influence climate, but we understand the most important ones, and GHG forcing is a biggie. As long as GHG's continue to rise, it will get warmer over the long term, with other influencers, such as solar cycles, ENSO, PDO, contributing to the chaotic process within a prescribed range phenomenon. The greatest influencer of climate in the longest term is of course the Milankovitch cycles.
http://en.wikipedia.org/wiki/File:MilankovitchCyclesOrbitandCores.png
It rules in the longest term, but right now, GHG’s seem to be the prime influencer.

Steve Goddard
April 10, 2010 12:04 pm

RhudsonL (11:34:15) :
The symmetry and patterns are indicative of the fact that off the shelf random number generators are not much use for very sensitive Monte Carlo calculations. Something in the Visual Studio 2008 rand() function is generating cycles, which ideally should not be happening.
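For anyone repeating the experiment who wants to sidestep library rand() entirely, here is a sketch of the same histogram built with the C++11 <random> facilities (this assumes a C++11 compiler is available; the Mersenne Twister engine has a far longer period and better statistical behaviour than typical rand() implementations):

// coin_flip_mt.cc -- coin-flip histogram using std::mt19937 instead of rand()
#include <iostream>
#include <random>
#include <vector>

int main()
{
    const int flips_per_trial = 10000;
    const int trials = 100000;

    std::mt19937 engine(12345);              // fixed seed, so runs are reproducible
    std::bernoulli_distribution coin(0.5);   // fair coin

    std::vector<int> histogram(flips_per_trial + 1, 0);
    for (int t = 0; t < trials; t++)
    {
        int heads = 0;
        for (int f = 0; f < flips_per_trial; f++)
            if (coin(engine))
                heads++;
        histogram[heads]++;
    }

    // Print the non-empty bins: heads count, frequency (plot with gnuplot as before).
    for (int h = 0; h <= flips_per_trial; h++)
        if (histogram[h])
            std::cout << h << " " << histogram[h] << "\n";
    return 0;
}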

rw
April 10, 2010 12:05 pm

“climate models (incorrectly) forecast that Arctic ice would decrease between 2007 and 2010”
Not true (just look at your later graph), and the fact that you think this would be worth mentioning in any case demonstrates a profound misunderstanding of what climate models do. They are not meant to predict the variation in ice extent over any given three year period. Over three years, unforced variations are larger than long term changes. Unforced variations are not climate. Long term changes are.
Yet again, someone fails to understand the most basic distinction in climate science. Why? It’s really not hard.

Leon Brozyna
April 10, 2010 12:14 pm

Speaking of coin flips in the context of weather forecasting and climate modeling … so that’s how it’s done!

The Most Casual Observer
April 10, 2010 12:18 pm

The coin flip analogy hinges on the concept of the fair coin. But would he take the bet if the person supplying the coin had to pay the $1 million or stood to collect on your inheritance? The coin may be perfectly fair for commerce, but not fair at all for flipping. Secondly, he “proves” his point by running a computer simulation. And the simulation assumes an ideal coin flipper. What happens if the flipper gets to control initial conditions and always starts with a heads up coin? It’s often assumptions like these that skeptics contest as being the problem with models.

April 10, 2010 12:22 pm

Weather and climate forecasts are made by humans. Human shortcomings are part of those forecasts. We all know how wrong weathermen can be. And we can see by comparison with real world observation how wrong climate predictions are.
(“computer climate model outputs not matching observation”
http://www.scribd.com/doc/904914/A-comparison-of-tropical-temperature-trends-with-model-predictions
“models perform poorly”
http://www.scribd.com/doc/4364173/On-the-credibility-of-climate-predictions )
Some will say the difference for ‘climate’ is that a climate forecast for 100 years from now is made by a computer and not a human. But computers are machines built by humans. And computer programs that predict future climate are made by humans with their shortcomings a part of it. Computers cannot take on a life of their own that overcomes the human shortcomings in the computer program. That only happens in movies.

April 10, 2010 12:24 pm

“Never make predictions, especially about the future.”
~~Yogi Berra (credited)

Lon Hocker
April 10, 2010 12:24 pm

Monte Carlo computer runs done for calculations Poissonian statistics is a waste of perfectly good electrons.
Yes, there can be trend obscured by noise in a chaotic system. No, the climate models aren’t anywhere near complete enough to make useful trend predictions.

Steve Goddard
April 10, 2010 12:25 pm

I tried generating the coin flip Gaussian using g++ rand() and it is remarkable how much better a job it does than Visual Studio
http://docs.google.com/View?id=ddw82wws_597cn34skhf

John Pattinson
April 10, 2010 12:25 pm

I always thought that the important point about the GCMs was that they perform multiple runs using slightly different starting points. The results from all these different simulations then give an average result – this was probably what Dr Meier was suggesting with the coin flip example.
And comparing short range forecasting models with long range models is simply misleading; they are completely different tools for very different tasks.

Lon Hocker
April 10, 2010 12:25 pm

OOPS, trying again.
Monte Carlo computer runs done for Poissonian statistics is a waste of perfectly good electrons.
Yes, there can be trend obscured by noise in a chaotic system. No, the climate models aren't anywhere near complete enough to make useful trend predictions.

John from CA
April 10, 2010 12:29 pm

Thanks, Fun article
"the idea that today's weather is affected by yesterday's weather, and this year's climate is dependent on last year"
Isn't this essentially true, since the oceans retain heat longer than land masses and impart the impact of summer to winter via currents and trade winds?

Henry chance
April 10, 2010 12:35 pm

The Met Office has now admitted to BBC News that its annual global mean forecast predicted temperatures higher than actual temperatures for nine years out of the last 10.

Of course they were wrong 10 out of ten years. The nine years show their bias.
But the forecasts for 2035, 2050 and 2100 are accurate. We are facing tax rates based on their accuracy. It tells me 2 things. They have bias and admit it. Fudging data will skew the models. Just stop cheating and changing the numbers and then try to do a prediction. It may get a little better.

Tom_R
April 10, 2010 12:39 pm

Steven, your random number generator is garbage.

Wren
April 10, 2010 12:41 pm

Of course Meier is right about the coin flips. It's amazing that some posters seemed to think otherwise. Is it an "appropriate" analogy for GCM's? That's debatable. Analogies usually aren't perfect.
I prefer the stock market analogy. I would have much more confidence in a forecasted rise in the Dow for a 50-year period than a 1-year period. Why? Because I have observed the market has fluctuated a lot from year to year, but has risen over the long term. Average global temperature, like the stock market, fluctuates from year to year but has been rising over the long term.

kadaka
April 10, 2010 12:51 pm

So to avoid the accumulation of errors in climate models, you avoid iteration and for every forecast you calculate out for x amount of time from the same base year. Using your verified set of equations that only require the starting conditions and time elapsed, most likely.
Someone let me know if and when they can do that.

Steve Goddard
April 10, 2010 12:51 pm

Tom_R (12:39:09) :
It is interesting how bad the Visual Studio rand() function is. I often use the KISS RN generator at work.
http://www.fortran.com/kiss.f90

Mindbuilder
April 10, 2010 12:55 pm

The random number generator used in the coin toss trials above is obviously not very random. The dots have an almost mirror image on each side. The dots should be much more randomly distributed, though probably in something like a bell curve. Many programming languages have very poor random number generators. Even good random number generators often can’t even come close to ideal behavior. For example if a random number generator is asked to return a random number in a range, it is often the case that the generator is only capable of returning an extremely small portion of the possible numbers in the range before it starts to return the same sequence of numbers again. The Wikipedia article on this subject is interesting.
REPLY: See the updated run, refresh the page.

Steve Goddard
April 10, 2010 12:56 pm

Wren (12:41:26) :
Your argument boils down to "temperatures are rising." In that case, a simple extrapolation is more valuable than a $500 million supercomputer GCM simulation.
http://docs.google.com/View?id=ddw82wws_5998xrxhzhc
According to the long term GISS trend, temperatures will rise about 0.6C by the end of the century.
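As a rough back-of-the-envelope check on that figure (assuming, purely for illustration, a long-term GISS trend of roughly 0.006 to 0.007 °C per year; the exact slope depends on the period chosen):

$$0.0065\ {}^{\circ}\mathrm{C/yr} \times 90\ \mathrm{yr} \approx 0.6\ {}^{\circ}\mathrm{C}$$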

April 10, 2010 1:00 pm

Wren (12:41:26),
Bad example.
If you adjust the stock market for inflation between 1966 and 1982, for example, stock appreciation was flat. In constant dollars, the real stock appreciation was zero.
No GCMs [computer climate models] predicted the flat temperatures over the past 15 years [per Phil Jones].
None of them was right. Not a single one. They all predicted rising temperatures. That’s because they are programmed by people who are paid to show a particular outcome. Garbage in, Gospel out.

April 10, 2010 1:02 pm

But think of the money we can make by claiming we know that which can never be known.

Steve in SC
April 10, 2010 1:03 pm

Simulated coin flips prove only that your computer is functioning.
Test the validity of your model against real coins.
CO2 causing global warming is analogous to California not defaulting on its debt in the next 30 years.

April 10, 2010 1:06 pm

I’m beginning to have some sympathy for climate scientists like Dr Meier.
I read the letters to the editor in Aviation Week & Space Technology, and some fellow will make a point about some jet engine, then next week he’ll get nailed by a dozen specialists in rocket science, engineering, corporate accounting, etc., who will detail why he’s wrong.
Climate scientists try to tease a climate signal out of weather noise, then extrapolate to the future. They have to gather data, use statistics, program computers, deal with politicians, get grants and funding, and a dozen other things, then publicly produce a conclusion/result/paper. They have to be generalists.
Meanwhile, here’s WUWT.
We have generalist climate scientists the equal of any. SurfaceStation volunteers and UAH know in detail about gathering data. We have statisticians who can spot misapplications in an instant. We have computer experts who deconstruct their shaky code and expose the tricks, accidental and otherwise. We have government types who see through the machinations to find the motivations. We have academics who note the violations of publication protocols.
There is no skill that a climate scientist uses that the collective expertise of WUWT doesn’t far surpass.
Big Climate can rely on only the likes of the perky Katie Couric to hide the facts, and the crooked politicians to keep up the funding. We’ll see how that climate changes in November.
A tip of the hat to Dr Meier and the others who have ventured over here to make their case.

Larry
April 10, 2010 1:08 pm

The important thing with computer models is whether they have been shown to have predictive power over several cycles. The upside and the downside can have completely different characteristics. The failure to predict the recent downturn (or lack of increase) demonstrates that a large proportion of the forcings must not be modelled – or some inputs are not known in advance which completely counteract the ones being modelled. It is the inability to predict changes in direction which makes long distance predictions implausible. I suspect the coin flippings were referring to a sort of Monte Carlo style simulation, i.e. a lot of random events occur, and so over a long period – because of the distribution you have shown above – the output is closer to a different simulation run or the real system. The problem with that, of course, is that it suggests the forcings in the model are essentially random.

Steve Goddard
April 10, 2010 1:10 pm

If anyone wants to reproduce the Gaussian on their favorite platform:
g++ -g -o coin_flip coin_flip.cc
I used gnuplot to generate the graph

// coin_flip.cc
#include <iostream>
#include <vector>
#include <cstdlib>

int main(int argc, char** argv)
{
    if (argc < 3)
    {
        std::cerr << "usage: coin_flip <flips_per_trial> <trials>" << std::endl;
        return 1;
    }
    size_t sample_size = (size_t)atoi(argv[1]);   // flips per trial, e.g. 10000
    int iterations = atoi(argv[2]);               // number of trials, e.g. 100000

    // One bin for each possible heads count (0 .. sample_size inclusive).
    std::vector<int> ones_count_vector(sample_size + 1, 0);

    for (int i = 0; i < iterations; i++)
    {
        int number_of_ones = 0;
        for (size_t j = 0; j < sample_size; j++)
        {
            int random_number = rand();
            if ((random_number & 1) == 1)         // low bit of rand() as the coin
            {
                number_of_ones++;
            }
        }
        ones_count_vector[number_of_ones]++;
    }

    // Print the non-empty bins: heads count, frequency (suitable for gnuplot).
    for (size_t i = 0; i <= sample_size; i++)
    {
        if (ones_count_vector[i])
        {
            std::cout << i << " " << ones_count_vector[i] << std::endl;
        }
    }
    return 0;
}

R. Gates
April 10, 2010 1:10 pm

Wren said:
"I prefer the stock market analogy. I would have much more confidence in a forecasted rise in the Dow for a 50-year period than a 1-year period."
————-
Bingo. GCM's are best when viewed with this kind of general trend analysis. The climate undergoes a "random walk" on shorter time scales, but the biggest influencers (Milankovitch Cycles) work on the longest cycles, while the intermediate influencers (Greenhouse Gases) work on the medium-term cycles, and the smallest influencers (solar cycles, ENSO, PDO, etc.) work on the shortest cycles and become the "noise" imposed on those longer cycles. It has been a major effort of climate scientists to weed out short term noise from the longer cycles to understand the role that CO2 plays. A good example to see this directly can be found in this chart:
http://www.climate4you.com/Sun.htm#Global temperature and sunspot number
Notice the noise of the solar cycles imposed on the background of the steadily increasing temperatures. The dips in temperature rise occur during the solar minimums, as we saw during the last few years, but now the rise continues once more. 2010 is likely to be the warmest year on instrument record. To what do the AGW skeptics attribute this year's heat?

Steve Goddard
April 10, 2010 1:12 pm

Looks like the html parser removed iostream and vector from the #include statements above

Wren
April 10, 2010 1:13 pm

Steve Goddard (12:56:54) :
Wren (12:41:26) :
Your argument boils down to "temperatures are rising." In that case, a simple extrapolation is more valuable than a $500 million supercomputer GCM simulation.
http://docs.google.com/View?id=ddw82wws_5998xrxhzhc
According to the long term GISS trend, temperatures will rise about 0.6C by the end of the century.
======
A simple extrapolation assumes the rise is simply a function of time. We know better than that, don’t we?

Larry
April 10, 2010 1:14 pm

So, to describe what I mean in easier terms: suppose the coin flipper was instead a casino, and the casino owner was filing a fixed amount off the coin every week to increase the odds. The casino owner makes a computer model to predict the coin flip. You can imagine that if his filing was not perfect, the computer model would be better at predicting long range than short – the errors in filing would average out over time.

April 10, 2010 1:14 pm

The assertion is frequently made that it is easier to correctly model 100 years of climate than it is to model 10 years. The assertion is that errors would cancel out!
For this to be true each event must be totally random and independent of all previous events, which is NOT TRUE OF CLIMATE EVENTS!
This is nonsense. A model is like an algebra test with 10 questions where the answer to #1 is the input to question #2, the answer to question #2 is the input to question #3, and so on.
If the feed forward is POSITIVE, errors are amplified so that a tiny error in the output of question #1 becomes monstrous by answer #10. Even if the basic equations are perfectly understood [which they aren't], the errors compound until the output is worthless.
In a negative feedback system, which the climate alarmists assure us is not the case, the errors would drive the answer to the set point and they would not accumulate as badly, but they would still accumulate.

Steve Goddard
April 10, 2010 1:16 pm

Steve in SC (13:03:22) :
One billion coin flips would be needed to reproduce it with real coins. That would take about 31 years, at one toss per second.
And you are going to come up with the same answer anyway.
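For reference, the arithmetic behind that figure:

$$100{,}000 \times 10{,}000 = 10^{9}\ \text{flips}, \qquad \frac{10^{9}\ \mathrm{s}}{3.16 \times 10^{7}\ \mathrm{s/yr}} \approx 31.7\ \mathrm{yr}$$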

DirkH
April 10, 2010 1:16 pm

At least Microsoft’s random function produces epiphanies…

Mike Davis
April 10, 2010 1:18 pm

However, according to even longer term climate trends, the temperature will cycle warmer and colder during the next 100 years, and without accurate knowledge of where in the "LONG" term trend we are, we can not know where we will be 100 years from now. We can only know where we are going after we get there, unless we have an accurate road map.
The coin toss thing is an attempt to draw attention away from the main issue of models based on bad input. "GIGO"

Mitch
April 10, 2010 1:20 pm

The WORST Gaussian distribution EVER!

Paul Coppin
April 10, 2010 1:26 pm

Steve Goddard (12:25:19) :
I tried generating the coin flip Gaussian using g++ rand() and it is remarkable how much better a job it does than Visual Studio
http://docs.google.com/View?id=ddw82wws_597cn34skhf

I hope the irony implicit in this and subsequent discussion about statistical accuracy of coin flipping isn’t lost in the bigger concern over climate model tossing… (not picking on you or anyone specifically, Steve.)

Wren
April 10, 2010 1:27 pm

Smokey (13:00:45) :
Wren (12:41:26),
Bad example.
If you adjust the stock market for inflation between 1966 and 1982, for example, stock appreciation was flat. In constant dollars, the real stock appreciation was zero.
No GCMs [computer climate models] predicted the flat temperatures over the past 15 years [per Phil Jones].
None of them was right. Not a single one. They all predicted rising temperatures. That's because they are programmed by people who are paid to show a particular outcome. Garbage in, Gospel out.
====
I know what you are saying about inflation, but 1966-82 doesn’t look like 50 years to me.
Flat temperatures over the past 15 years? If you are referring to what Jones said about "statistical significance", I'm afraid you are misinterpreting what the term means.
Hansen's projections made back in the 1980s are looking better and better. No wonder he won an award for his contributions to climate modeling.

TimiBoy
April 10, 2010 1:28 pm

“No GCMs [computer climate models] predicted the flat temperatures over the past 15 years [per Phil Jones].
None of them was right. Not a single one. They all predicted rising temperatures. That's because they are programmed by people who are paid to show a particular outcome. Garbage in, Gospel out."
…Exactly.

Bruckner8
April 10, 2010 1:31 pm

Are local weather forecasts even in the 50/50 range? I bet not…

kadaka
April 10, 2010 1:32 pm

Keep running the Microsoft random number generator, looks like a Time Lord is sending you a message.
(For those of us who wonder what “quirks” M$ programmers stuck in their code… Does an “easter egg” mystery image sound all that far-fetched?)

Michael
April 10, 2010 1:32 pm

I’m able to flip a fair coin so that it reliably comes up heads.
I’ve learned how much force to apply in the toss so the coin does one and a half rotations before coming down. It then becomes a matter of (by feel) making sure the coin is tails-up at the point that I toss it.

RockyRoad
April 10, 2010 1:35 pm

Wren (12:41:26) :
(….)
I prefer the stock market analogy.
————
I submit you’re forgetting two parallels:
The US becoming a completely socialist/marxist nation–a recent study showed that countries that did so (and this wasn’t a future projection, prediction, or prognostication; it was based on case studies) saw an immediate 40% drop in GDP.
The world dropping out of the Holocene Epoch into the next Ice Age.
Both are catastrophic to their participants.

Bernie
April 10, 2010 1:38 pm

As others have pointed out, the context of the bet is important, i.e., is the coin flipper an expert flipper? The conditions under which the bet is made also matter, since they actually change the expected value of the bet. If I have been given 3 months to live, the bet looks very different than if I was in good health and had every expectation of living another 40 years or more.
It still seems to me that significant variables relevant to understanding climate, and hence GCMs, are not modeled very effectively because of our limited understanding of key climate relevant processes and the interactions among both understood and not understood processes. If we don't know, we don't know, and the uncertainty of the models has to be acknowledged accordingly. Sometimes it is wise to say you don't know and simply decline to make a prediction. As Matt Briggs keeps reminding us, "too many people are too certain of too many things".

Glenn
April 10, 2010 1:38 pm

“I ran 100,000 iterations of 10,000 simulated random coin flips”
Not the real thing. Go back and flip coins that many times, and don’t control the experiment.

C. Shannon
April 10, 2010 1:39 pm

@Pwl,
Wouldn’t it be: 50-(x/2), x, 50-(x/2)?
A coin only has one edge and the chances of it landing on that edge would detract evenly from the chances of it landing on any particular side.
Just a thought…

Ibrahim
April 10, 2010 1:43 pm

R. Gates,
Explain to me which forcing caused the warming between 1910 and 1945?
And you get this picture with it:
http://i39.tinypic.com/261p2tu.png

Steve Goddard
April 10, 2010 1:44 pm

Wren (13:13:23) :
Sneak preview of something I am writing up. Here is GISS vs. CO2 concentration.
http://docs.google.com/View?id=ddw82wws_5998xrxhzhc

DirkH
April 10, 2010 1:44 pm

“R. Gates (13:10:23) :
[…]
It has been a major effort of climate scientists to weed out short term noise from the longer cycles to understand the role that CO2 plays.”
You mean like getting accustomed to moving averages? Well, they still stumble when using Hamming windows so there’s still some learning curve…
http://wattsupwiththat.com/2010/04/04/ipcc-how-not-to-compare-temperatures/
And in their "major effort to weed out noise" they might some day even learn to do a Fourier or Laplace transform; I'm not holding my breath, though.
You wanted to know what skeptics say to this "2010 hottest year on instrumental record." Well.
My all-time favorite quote: It's a minor short-term issue. [R. Gates]

CRS, Dr.P.H.
April 10, 2010 1:44 pm

Mike McMillan (13:06:23) :
I'm beginning to have some sympathy for climate scientists like Dr Meier.
I read the letters to the editor in Aviation Week & Space Technology, and some fellow will make a point about some jet engine, then next week he'll get nailed by a dozen specialists in rocket science, engineering, corporate accounting, etc., who will detail why he's wrong.
Climate scientists try to tease a climate signal out of weather noise, then extrapolate to the future. They have to gather data, use statistics, program computers, deal with politicians, get grants and funding, and a dozen other things, then publicly produce a conclusion/result/paper. They have to be generalists.
Meanwhile, here's WUWT.
We have generalist climate scientists the equal of any. SurfaceStation volunteers and UAH know in detail about gathering data. We have statisticians who can spot misapplications in an instant. We have computer experts who deconstruct their shaky code and expose the tricks, accidental and otherwise. We have government types who see through the machinations to find the motivations. We have academics who note the violations of publication protocols.
There is no skill that a climate scientist uses that the collective expertise of WUWT doesn't far surpass.
Big Climate can rely on only the likes of the perky Katie Couric to hide the facts, and the crooked politicians to keep up the funding. We'll see how that climate changes in November.
A tip of the hat to Dr Meier and the others who have ventured over here to make their case.
————-
….suddenly, without warning, the blog “Watts Up With That” became self-aware, and the intelligence and personality of thousands of contributors melded into one vast, sentient and analytical consciousness!! Worldwide, the servers of climate research universities crashed from the multiple hits, seeking data and information….
Didn’t they make a movie about that once? Good comments, Mike, thanks! This seems to be a very eclectic group across many disciplines, and the sum of these posts is often quite amazing.

Sam the Skeptic
April 10, 2010 1:46 pm

The one fault as I see it in Meier’s ‘$1m or death’ option is the reasoned and reasonable position that death is a certainty anyway, sooner or later, while this is probably the only chance to acquire $1m.
And the argument that is being postulated, namely that the precautionary principle demands action "just in case", is flawed because every decision we take (or refuse to take, and that in itself is a decision), individually or collectively, opens up a further range of possible decisions and so on.
So far I have not seen sufficient evidence from real world observations that convince me that we are far enough along the road of irreversible global warming to demand that we impoverish the third world more than it is impoverished already and impoverish ourselves into the bargain.
The more so because all the “poster children” (and their baby brothers) are so easily debunked and the increasingly strident cherry-picking of factoids is enough to drive any right-minded person to the conclusion that “the lady doth protest too much, methinks”!

pat
April 10, 2010 1:47 pm

The Warmists use a two-headed coin.

Curiousgeorge
April 10, 2010 1:49 pm

@ Steven Goddard
If I may, I’d suggest any of several high end PC Statistics packages if you intend to pursue this sort of thing. Don’t waste your time with Excel or other consumer products.
SAS: http://www.sas.com/
Statistica: http://www.statsoft.com/textbook/ , http://www.statsoft.com/
Mathematica: http://www.wolfram.com/products/mathematica/index.html

Steve Goddard
April 10, 2010 1:50 pm

kadaka (13:32:07) :
Visual Studio is an unbeatable environment for developing C++ code. I just need to remember to run it on gnu platforms.

DirkH
April 10, 2010 1:51 pm

“Wren (13:27:02) :
[…]
Hansen's projections made back in the 1980s are looking better and better. No wonder he won an award for his contributions to climate modeling."
Oh, that’s funny. Let’s look.
http://climateaudit.org/2008/01/16/thoughts-on-hansen-et-al-1988/
Looks… pretty bad for Hansen.

igloowhite
April 10, 2010 1:51 pm

Ya, but, should we take “any” odds
when it's Al Gore who tells us
the coin is unbiased and fair?

jim murta
April 10, 2010 1:51 pm

You need to correct the second question: You are given the opportunity to bet on 10000 coin flips. If heads comes up more than 5000 times, you win a million dollars. If heads comes up less than 5000 times, you die. Again, you are assured that the coin is completely fair and unbiased. Would you take this bet?

DirkH
April 10, 2010 1:53 pm

“Curiousgeorge (13:49:43) :
@ Steven Goddard
If I may, I'd suggest any of several high end PC Statistics packages if you intend to pursue this sort of thing. Don't waste your time with Excel or other consumer products."
You forgot the wonderful and free R.
http://www.r-project.org/

Steve Goddard
April 10, 2010 2:03 pm

DirkH (13:53:29) :
Thanks for the tip about R. Free software is always nice!

igloowhite
April 10, 2010 2:08 pm

Put another way…
Looks like "The House" (D.C.) are the ones who set the odds on this
coin flip of chance before us now.
Never bet against “The House”.
Change The House, or find better odds some other way.

RockyRoad
April 10, 2010 2:10 pm

Steve Goddard (13:50:46) :
kadaka (13:32:07) :
Visual Studio is an unbeatable environment for developing C++ code. I just need to remember to run it on gnu platforms.
——————–
Reply:
Steve, jump into C#. It is a true object oriented language built that way from the ground up. It is a Microsoft product and completely compatible with Visual Studio. Indeed, some say VS was built for C#.

Curiousgeorge
April 10, 2010 2:12 pm

@ DirkH (13:53:29) : Thanks, I wasn't aware of that package. 🙂 I'll take a look at it. When I was gainfully employed (retired 10 years ago) I was provided with the 3 I mentioned by the company. If you fly commercial, you are the beneficiary of some of my work. 🙂

Bernie
April 10, 2010 2:14 pm

I don’t see that a random number generator is needed to make the point. The problem is described so as to be well defined, i.e., with a known distribution of Heads for a given number of flips.
My own view on Question #9, "Are the models capable of projecting climate changes for 100 years?"
Current models do not seem capable because they are underspecified. In the future, with better specified models I see no reason why models could not project climate changes forward for periods roughly equivalent to the length of time for which we have reliable records for the variables that are incorporated in the models.

DirkH
April 10, 2010 2:17 pm

“RockyRoad (13:35:24) :
[…]
The US becoming a completely socialist/marxist nation – a recent study showed that countries that did so (and this wasn't a future projection, prediction, or prognostication; it was based on case studies) saw an immediate 40% drop in GDP."
USA GDP/capita in 2006: 43468 USD
Let's see. Switch to socialism: minus 40%. 60% of 43468 is 26080.8 USD.
USA GDP/capita in 1994: 26247 USD
So the introduction of socialism will warp you back to 1994.
Numbers are from
EarthTrends (http://earthtrends.wri.org) Searchable Database Results

A C
April 10, 2010 2:21 pm

Our local weather bureau (South Australia) gives weather predictions like "There is a 50% chance that we will have a wetter than average winter" without a hint of irony. I don't have much confidence in their predictions 100 years out.

April 10, 2010 2:24 pm

Stock Market Analogy – Well, it is not a model for sure. It is a guess about the future based on observations of the past. We don't know anything about the future stock market, in the particular (one stock) or aggregate (all stocks). The Quants modeled the hell out of stocks and the models worked fine until they didn't work anymore. Wall Street crashed and burned, it was all terribly disruptive, remember? The quantitative models failed. You can say they were of limited value – climate modelers say their models have limitations. But a failed model is just that – the model doesn't work. Climate models can't work. All the coin toss bet models worked great because the variables were known and controlled. All climate models are doomed to failure because the variables are neither known nor controlled.

DirkH
April 10, 2010 2:24 pm

“RockyRoad (14:10:35) :
[…]
Steve, jump into C#. It is a true object oriented language built that way from the ground up. It is a Microsoft product and completely compatible with Visual Studio. Indeed, some say VS was built for C#.”
C# is the new flagship language for Microsoft, that’s why they focus their efforts with VS on it. VC++ is a second rank citizen there now.
C# is very nicely done; the architect behind it is a Dane called Anders Hejlsberg, a US citizen for a long time now. He was also the mastermind behind Turbo Pascal / Borland Pascal / Delphi before defecting to Microsoft.
Caveat is, you get tied to the Microsoft platforms. There are attempts to provide a runtime and compiler for the language under Linux; Mono and Rotor are the project names, but while the language is in the standardization process of ECMA, Microsoft will not free their runtime class libraries.
Personally, i still prefer C++ because it’s available on so many different platforms.

Jim Clarke
April 10, 2010 2:33 pm

Thank you, Steven Goddard!
The argument that the climate models are not the same as the weather models obfuscates the real issue. They are not identical, but the part that leads to the unpredictability of the future is basically the same.
‘Iterations feeding back into successive iterations! Aye, There’s the rub!’

FrankK
April 10, 2010 2:45 pm

I can Monte-Carlo my sub-surface models till the cows come home. But the models are still constrained by the parameters I elect to include, the ranges of parameters I think apply, the boundary conditions etc, etc. Simply performing Monte-Carlo will not necessarily give me the correct trend.
So too are climate models constrained. If you believe CO2 is the main driver, performing Monte Carlo won't necessarily give you the right answer if other causes driving climate have not been included (e.g. the current crop of climate models) AND if you don't calibrate and validate against the climatic record.

RockyRoad
April 10, 2010 2:46 pm

DirkH (14:17:54) :
(….)
So the introduction of socialism will warp you back to 1994.
——————–
Reply:
And since when is that a good thing?
But let me ask you this: Could you live on 60% of what you’re currently making? I mean in real terms.
Could you pay for your house, your car and your food with 60% of what you’re making now?
It would be just as catastrophic to see a 10-15 degree drop in average temperature with the onset of the next Ice Age, which means only the southern fringe of the US could grow foodstuff crops. Now what–we become earthworms and eat the soil?
Consider what would happen to Denver, CO, if there was just a 10-degree drop in average low temps. Here’s the current monthly average for both highs and lows:
http://www.weather.com/weather/wxclimatology/monthly/graph/USCO0105?role=
Reduce them all by 10 degrees and there’s only 4 months (June, July, August, and September) where the average is above freezing. But remember, that’s the AVERAGE! Undoubtedly there would be several nights each month in the mix where the low falls below freezing–plants don’t go by averages, they’re killed by a single sub-freezing night. I just put a bunch of seeds in Jiffy pots for my garden and noticed the shortest required 62 days while some required 120 days.
This all means that Denver would be too cold for crops. And I’m predicting most of the US would be too cold for crops, too. No longer would it be a bread basket. You could call it a barren, starving freezer.
I hope you’re all fluent in Spanish.

RockyRoad
April 10, 2010 2:48 pm

Sorry… that monthly distribution for average temperatures in Denver is actually:
http://www.weather.com/weather/wxclimatology/monthly/graph/USCO0105?role=

R. Gates
April 10, 2010 2:48 pm

DirkH said:
"You wanted to know what skeptics say to this '2010 hottest year on instrumental record.' Well.
My all-time favorite quote: It's a minor short-term issue. [R. Gates]"
_______________
Except 2010 follows on the heels of the warmest decade on instrument record, and so is part of an ongoing trend. You see, I have no problem with the leveling of the growth in temperatures in the 2003-2008 timeframe, as I can accept that the solar minimum would do this, just as is seen in past solar minimums, but now that the solar minimum is past us, we would expect the signal of GHG forcing to once more be the dominant signal. In other words, if the 2010 warmth had occurred in the middle of 20 years of a downward trend in temperatures I would consider it a minor short-term "blip", but coming after the warmest decade ever on instrument record, it is a continuation of a trend.
So far, except for saying "it's all El Nino" (which it is not), I've not heard a reasonable explanation for why 2010 is so warm from AGW skeptics, but AGW models say the odds are better than 50/50 that we'll see each successive decade warmer than the last, so with the flattening during the 2003-2008 time frame, it makes sense that to average out over the next 10 years, we're now in for some warming…especially with the solar minimum behind us.

MikeA
April 10, 2010 2:54 pm

The rand function, in most implementations, is a pseudo-random number generator based on a seed. From the same seed the same sequence of random numbers is generated. I found this confusing at first, but if you’re trying to debug a program or reproduce a result, truly random data is not useful.
To achieve randomness for multiple runs, use some other variable for a seed; I don't know what current practice is for this. We used to use the time when computers were slower.
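A quick sketch of the seeding options being described, using the classic srand()/time() pattern alongside the C++11 <random> approach (both are illustrations, not a recommendation for any particular project):

// seeding.cc -- fixed seed for reproducible runs, varying seed for varied runs
#include <cstdlib>
#include <ctime>
#include <iostream>
#include <random>

int main()
{
    // Classic approach: seed rand() from the clock so each run differs.
    srand((unsigned)time(NULL));
    std::cout << "rand() after time-based seed: " << rand() << "\n";

    // Fixed seed: the same sequence every run, which helps debugging.
    srand(42);
    std::cout << "rand() after fixed seed: " << rand() << "\n";

    // C++11: a hardware/OS entropy source (where available) seeding a better engine.
    std::random_device rd;
    std::mt19937 engine(rd());
    std::cout << "mt19937 draw: " << engine() << "\n";
    return 0;
}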

April 10, 2010 2:58 pm

If climate models are better predictors for a 100 year span than a 1 year span, does that mean they can get it exactly right for 3010?
And as several have previously posted, do any of the models predict the start of the next glacial? If not, it seems to me that they are omitting THE major climate driver.

Svempa
April 10, 2010 3:05 pm

RhudsonL – It's not Jesus, it's Madame Pompadour. You can see that from her bra.

R. Gates
April 10, 2010 3:15 pm

Ibrahim said (13:43:09) :
"Explain to me which forcing caused the warming between 1910 and 1945?
And you get this picture with it:
http://i39.tinypic.com/261p2tu.png
____________
Could be multiple inputs (including GHG's) and/or something that works on a multidecadal scale. Would have to look at all the available charts from the period, including whatever PDO, AMO, volcanism, etc. data is available. I've not studied this in detail, but certainly know it has been a focus of much study and debate. If we take the stance that at least some of the warming during this period was caused by GHG's, then you could ask the reverse question: what caused the cooling during the 1950-1977 period? Perhaps something else caused this period to be cooler than it would have been if whatever caused the forcing of the earlier period had continued.

Curiousgeorge
April 10, 2010 3:29 pm

One of the major issues with this little experiment, is that it is a Frequency distribution; as opposed to a Probability distribution. I should not have to explain the difference, but simply put they are not equivalent. Frequency distributions are historical, whereas Probability distributions are a statement regarding a future state and our knowledge ( inference ) thereof. A frequency distribution may inform a Probability calculation as prior information. But it should be recognized that a Probability is a statement concerning the future.
Probabilities do not exist, else they would be called Reality. A Reality is the point at which P=1 (an Event ).

April 10, 2010 3:43 pm

Steve Goddard;
So… you built a model… and expect THIS audience to accept the results? Of a MODEL?
Now enough with coin flipping. I went out on the bubble on a bad beat in my last 11 poker tournaments in a row. WUWT? I would have thought that global warming caused an even distribution of bad beats, but apparently not so. Is the CO2 changing the ratio of flushes to straights and no one has been studying this? Can I get a grant? Yes… a grant to study the effects of global warming on poker… finally an aspect of climate science that I am qualified for. Who's in? I figure 10 people, one grant apiece, we can each come up with our own theory, apply the theory at the poker table, and whoever has the most grant money at the end gets to present their paper as proof of their theory with the other 9 being cited as "peer review". Having established the relationship between global warming and poker (with no less than TEN scientists attesting to the results) we can then take the next step and challenge the warmists to put their grant money up against ours. I can see some potential problems though. The dealer might be heard to make certain statements:
No Mr Jones, you CANNOT erase the cards and draw your own numbers on them.
Mr Mann, I’m sorry, but you cannot bring your own cards to the table and just substitute some of them when it suits you.
Mr Briffa I can see you have the ace of spades, that’s very nice. But the rules don’t allow you to decide that the other 51 cards no longer count.
Back to you Mr Jones, where did you put those cards you were trying to erase? What do you mean you LOST them? No! We can’t just go on playing without them!
Mr Hansen, I’m sorry, you lost this hand. No Mr Hansen, a straight does not beat a flush. No Mr Hansen, you did not have four of a kind. Look Mr Hansen, you are not an officer of the law, so you can’t put me in jail because you lost the hand. And PREDICTING that you would have four of a kind is not the SAME as having four of a kind.
Put those chips back Mr Ravetz! Poker is about uncertainty, that doesn't mean you can take half the chips "just in case". What? It's "urgent"? Look, if you gotta go pee, then go, but you can't take half the chips with you just because you MIGHT win the hand!
Who the heck are you now? Mugabe? Am I pronouncing that right? Yes you can enter a team late… no… you have to pay for your own chips you can’t make everyone else give you a few of theirs.
Oh for gosh sakes will the interruptions never end? Who are you now? Really? From the UN? I’m impressed…. what… NO! You can’t just decide who the winner is in advance, that’s not how poker works! Well I don’t CARE how many people studied it or how thick your report is… huh? Look, it doesn’t matter if 13,000 professional poker players reviewed your predictions, it doesn’t change how many chips Mr Jones has left!
JONES! Stop that right now! You can’t erase the numbers on the cards and you can’t change the numbers on the chips EITHER. What? You aren’t changing them you’re adjusting them? NO! You can’t adjust them, and you can’t compare your adjustments to Hansen’s adjustments… wait, you’re saying Hansen made adjustments too… Stop that, BOTH of you!
Sigh. Another late entry, sit down young lady. What was the name? Curry? Here are your chips Ms Curry. Now what team are you on, warmist or skeptic? Uh, no, you can’t play for both, you have to pick one. No you can’t wait until one team wins and then decide. OK warmist it is. Ok, skeptic. Ok, warmist…. Ok warmist it is.
Welcome back Mr Ravetz, you are looking much better now, not so uncertain anymore. Uh, yes, I see you brought your own rule book. Well yes, I can see where rule number 24 says it is urgent for you to win. I can see the white out you wrote on top of too… and that’s your rule book not the house rule book, it doesn’t count.
Mr Jones, you have to show me your cards, you can't just declare yourself the winner of the hand. No, I am sorry, you have to show ME the cards. What? No, it doesn't matter if you showed them to Briffa and Mann, I can't just take their word for it, you have to show them to ME. What do you mean why? Because I am the dealer, THAT's why. It's my job to look at the cards to verify them. HEY! stop cutting up those cards! Mr Mann I SAW you slide your own cards onto the table while I was grabbing the scissors from Jones, you can't DO that…. Briffa! Briffa! why are you throwing all the cards in the garbage? NO! keeping the ace of spades does NOT make it the most powerful card in the world, I already TOLD you that. And origami is very nice but you shouldn't fold the cards up like that. Yes I know what you made, you made it look like a hockey stick… NO! that doesn't mean you won!
I QUIT! This is INSANE! I am taking my dog sled team and going back across the ice to Florida where I came from, you bozos can settle this global warming thing on your own!

Steve Goddard
April 10, 2010 4:06 pm

MikeA (14:54:19) :
rand() implementations are deterministic and will always produce the same sequence of numbers from the same seed. Iterating inside a program does not have that problem, though random number generators also have a repeat cycle length after which they produce the same numbers.
People who need extremely good random number generators (like thermonuclear weapons designers) require huge supercomputers to generate the needed randomness.

rbateman
April 10, 2010 4:08 pm

Thanks Steve. The Hansen and MET GCM’s seem to rely on feedbacks acting as endless mirrors. They probably failed because the cause of conditions they attribute to feedback are transitory. Other phenomena are able to displace conditions that cause warming. Such models are sucking all the fun out of learning how the natural cycles work.

Pascvaks
April 10, 2010 4:08 pm

I understand that The Farmer's Almanac has an ultra secret method of weather forecasting; that it's a very 'hands on, kinda get-your-butt-out-of-the-office' thing. Perhaps NOAA & Friends would do well to buy the rights to use it for a few years before spending $500 million on an unproven supercomputer (and another $5 to purchase the latest share-ware 'super duper' program to run the data on it).
To say that Climatology is in its infancy is wrong. It hasn’t even been properly conceived yet.

April 10, 2010 4:17 pm

I am only a simpleminded engineering type but I will say that the models can’t make predictions and never will unless the mechanism which they are trying to describe is understood. Not only is it not understood now, but the mechanism currently in vogue is wrong.

Steve Goddard
April 10, 2010 4:31 pm

George Steiner (16:17:28) :
The bottom line may turn out to be that climate is impossible to forecast on scales shorter than ice age cycles.

Bob Koss
April 10, 2010 5:07 pm

Steve,
Since your Visual Studio chart has no tails and only a spread of ±100 from what would be expected, I suggest there is the possibility an error occurred between keyboard and chair. 🙂
I just ran the same simulation in VB6, which is part of the Visual Studio 6.0 package, and came up with a similar distribution to your gnu implementation. Spread ±200. 4999 heads occurred most often and happened 869 times.

April 10, 2010 5:23 pm

RockyRoad (11:52:08) :
"Love your map of the US for weather projections by the NOAA. What's their projection for next year, and is there any way I can get a color negative of it so it will be more accurate?"
Simply save the image. In the case of this post’s image:
Install the Imagemagick suite of graphic utilities. From the command line, do
gt@koko:~$ convert cpcwinter2010forecast1.jpg -negate cpc-negative.jpg
all on one line. That will produce your negative of the original.
cheers

cotwome
April 10, 2010 5:31 pm

“Coin Flips using the Microsoft random number function”
It looks more like a Rorschach Test than anything ‘random’.

Anticlimactic
April 10, 2010 5:34 pm

GCMs did not / do not predict the future, they do not correspond with real observations, they are scientifically invalid.
We are thus beyond science and in to the art of climastrology. Cross their palms with silver and they will tell you your doom. Quiver before the new climastrologer priests, who can say no wrong. Give up your worldly possessions, turn your back on civilisation, and recreate your ancestors life in the honest toil of serfdom.
[The rich are not affected by these predictions and must make their own arrangements]

Docmartyn
April 10, 2010 5:35 pm

If you like plots, try this random number series: (rand()-0.5) + (rand()-0.5) + … + (rand()-0.5), n terms in all.
This gives a pseudo-random walk; you can get some very nice hockey sticks. This is actually much closer to a real random process that has 'memory'; with coin flips, each time you flip there is a loss of memory, as you return to the same state each time. In my series, there is a memory of the previous state. Take a series of 100 random steps and the value should have a mean of 0; but what sort of distribution do you get? It's rather funky actually.
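A small sketch of the series being described, i.e. the running sum of (rand()-0.5) style steps (here the uniform values come from scaling rand() by RAND_MAX):

// random_walk.cc -- cumulative sum of uniform steps in [-0.5, 0.5)
#include <cstdio>
#include <cstdlib>

int main()
{
    srand(7);                        // arbitrary fixed seed
    const int steps = 100;
    double position = 0.0;

    for (int i = 1; i <= steps; i++)
    {
        double u = (double)rand() / ((double)RAND_MAX + 1.0);  // uniform in [0, 1)
        position += u - 0.5;         // each value depends on the previous one
        if (i % 10 == 0)
            printf("step %3d: position = %f\n", i, position);
    }
    return 0;
}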

Steve Goddard
April 10, 2010 5:40 pm

Bob Koss (17:07:43) :
Both plots came from compilations of the same source code. I’m using Visual C++ 2008 professional. VS6 which you are using is a much older version. Perhaps MS has diminished the quality of their rand() function recently.
Also, there is no reason to believe that the VB libraries are written by the same people who write their C++ libraries.

April 10, 2010 5:49 pm

God does not play dice with the Universe.
There is only one future, not an infinite number of runs. The Law of Large Numbers does not apply.
GC models are GIGO. Running them a zillion times does not change that.
The “You Bet Your Life” alarmist scenario is preposterous, irrational, and hysterical. I reject the proposition utterly.
WARMER IS BETTER!!!!!!!!!!!!!

Mike Davis
April 10, 2010 5:55 pm

Steve Goddard (16:31:16):
You might be right, as this Ice Age, the Pliocene-Quaternary glaciation, started about 2.58 million years ago, and the interglacials do not follow a set pattern that can be predicted.
http://en.wikipedia.org/wiki/Ice_age
Pardon the Wiki link (it was a "quiki" look-up).

Steve Goddard
April 10, 2010 6:00 pm

Bob Koss (17:07:43) :
Tried it on MS.Net 2003 pro and get the same numbers as VS8 pro. Probably a VB/VC++ difference.

barry
April 10, 2010 6:06 pm

The classic climate period is 30 years. Meier’s comments on GCMs refer to global climate models projecting out through the coming century. This post refers to seasonal climate models for a fraction of the globe (US), which is about weather, temporally and spatially. There aren’t enough coin flips to smooth out the distribution here. The people who run models say that weather can be predicted with reasonable accuracy over a couple of weeks at the most, and that long-term GCMs are reasonable over several decades. For anything in between they say there is not much skill, so they agree with Steve Goddard on that point.
But there is a good analogy here regarding the seasons. We can’t tell what the temperature will be on a given day in winter. We might be surprised to find that one warm winter’s day has a higher temperature than a cool summer’s day (in a specific location). But we can tell with a great deal of confidence that winter will be cooler than summer.
As for climate sensitivity – a 23 degree tilt in the Earth’s axis resulting in an annual redistribution of insolation is sufficient to demonstrate that small energy changes are enough to cause significant changes in the biosphere.
Speaking of Arctic ice cover and albedo, the sun is starting to get high in the sky in the Arctic, and ice extent is essentially unchanged from 30 years ago.
Monthly anomalies are, of course, weather. Using the classic 30-year period, we see that there has been a significant trend in Arctic sea ice extent – not to mention volume, which was nowhere near normal last month. If by July the sea ice extent dropped well below normal, it would likewise be invalid to rely on that month’s anomaly to say anything about sea ice climatic trends. We’re talking about 2 or 3 coin flips here. We could easily get all heads or all tails. Weather is not climate.
I don't see much theoretical or empirical evidence that climate models produce meaningful information about the climate in 100 years.
If you take the A1B model runs used for the IPCC AR4, don't smooth them but leave in the interannual variability, and then overlay the instrumental record, you'll see that observations lie within the range of projections to date, even for such a short time span (from 1990). No one model replicates the year-to-year temperature of course, but taken together they capture the distribution, even for the relatively cool 2008. So far, they're doing OK. We'll see what happens down the road.
Climate feedback is based on the idea that today's weather is affected by yesterday's weather, and this year's climate is dependent on last year.
Climate feedback in the context Meier is working with is a long-term process. There's no such thing as 'this year's climate'. It's an oxymoron. Just as with forcings, feedbacks are not expected to dominate interannual variability. The signal emerges over decades. Truncate the data, spatially or temporally, and you're talking about weather. Compare two subsequent days or years and you're only working with two coin flips.

Wren
April 10, 2010 6:37 pm

Steve Goddard (13:44:30) :
Wren (13:13:23) :
Sneak preview of something I am writing up. Here is GISS vs. CO2 concentration.
http://docs.google.com/View?id=ddw82wws_5998xrxhzhc
Steve, you might be interested in this if you haven’t already seen it:
http://moregrumbinescience.blogspot.com/2009/03/does-co2-correlate-with-temperature.html
Regarding your first chart, notice how the line through the GISS temperature historical series usually doesn’t touch the actual values. It does a poor job of describing short-term changes in temperature, but a good job of describing changes over 100+ years.

Ibrahim
April 10, 2010 6:46 pm

R. Gates (15:15:00) :
Ibrahim said (13:43:09) :
"Explain to me which forcing caused the warming between 1910 and 1945?
And you get this picture with it:
http://i39.tinypic.com/261p2tu.png"
____________
….. "I've not studied this in detail, but certainly know it has been a focus of much study and debate …..
—————–
I wouldn't have asked the question if it was already answered.

April 10, 2010 6:49 pm

davidmhoffer (15:43:40),
Very good, I enjoyed that.

April 10, 2010 6:51 pm

Re the difference between tossing a computer “coin” and a real coin, a real coin might have a bias, and its runs might centre on, for example, 5,394 rather than 5,000. But you’ll still get a Gaussian (unless it is so biased that you get almost no examples of one side or the other and you are pushed up against a “wall”, thus deforming the symmetric shape). So no, for a real and unknown coin, I would not take the bet.

Joel Shore
April 10, 2010 6:56 pm

Steve Goddard says:

The bottom line may turn out to be that climate is impossible to forecast on scales shorter than ice age cycles.

Really? Here is a prediction for you: The climate here in Rochester will be at least 20 C colder next January than it is this July.
Much of your post rests on a false analogy between predicting things such as weather this summer will be colder or warmer than normal and predicting future climate in response to forcings. While predicting warmer or colder than average seasons may seem like a climate prediction, it is still basically an initial value problem. The prediction of the future climate 100 years from now in response to a significant change in radiative forcing is a boundary value problem and is much more analogous to predicting the seasonal cycle (as I talked about above).
Admittedly, with the seasonal cycle things are easier in the sense that we have better past data to go on…and can more quickly verify future predictions. Nonetheless, the fact that this can be successfully modeled basically undermines your notion that there is somehow an inherent unpredictability in climate that makes any such prediction impossible.

R. Craigen
April 10, 2010 7:00 pm

Kirk A (11:50:49) :
Your coin flip distribution is conspicuously symmetrical…

Kirk, look up "Central Limit Theorem". Or if you prefer, the binomial distribution (for large numbers of trials). In any experiment in which any given random variable, regardless of how it is distributed, is sampled a large number of times, the resulting sum is essentially normally distributed (i.e., the classical bell-shaped curve). As the number of trials goes to infinity the error in this approximation goes to zero. After 50 or so trials it is considered VERY good. After 10,000, the error is infinitesimal.
The only really silly thing about Mr. Goddard’s experiment (besides using a poor random number generator initially) is that it wastes CPU cycles performing 100,000 x 10,000 = 1,000,000,000 digital coin flips (bet that took a while!). He should have simply plotted the appropriate binomial variable to get the same curve.
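For what it's worth, here is a rough sketch of plotting the binomial directly, as R. Craigen suggests, using the log-gamma function so the factorials never overflow; the range of k values printed is an arbitrary choice.

#include <cmath>
#include <cstdio>

// Probability of exactly k heads in n fair flips: C(n,k) / 2^n,
// computed in log space to avoid enormous intermediate numbers.
double binom_pmf(int n, int k)
{
    double log_p = lgamma(n + 1.0) - lgamma(k + 1.0) - lgamma(n - k + 1.0)
                   - n * log(2.0);
    return exp(log_p);
}

int main()
{
    const int n = 10000;
    // Print the distribution near the peak; multiplying by 100,000 runs
    // makes it directly comparable to the simulated frequency plots.
    for (int k = 4800; k <= 5200; k++)
        printf("%d %f\n", k, 100000.0 * binom_pmf(n, k));
    return 0;
}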

Wren
April 10, 2010 7:08 pm

DirkH (13:51:04) :
"Wren (13:27:02) :
[…]
Hansen's projections made back in the 1980's are looking better and better. No wonder he won an award for his contributions to climate modeling."
Oh, that's funny. Let's look.
http://climateaudit.org/2008/01/16/thoughts-on-hansen-et-al-1988/
Looks… pretty bad for Hansen.
=====
Are you kidding? That evaluation is outdated. Actual temperatures are catching up to Hansen’s projections, already reaching his Scenario C projection and closing in on his Scenario B projection.

DirkH
April 10, 2010 7:10 pm

“Wren (18:37:22) :
[…]
http://moregrumbinescience.blogspot.com/2009/03/does-co2-correlate-with-temperature.html
Regarding your first chart, notice how the line through the GISS temperature historical series usually doesn't touch the actual values. It does a poor job of describing short-term changes in temperature, but a good job of describing changes over 100+ years."
The history of the various GISS temperature adjustments of course doesn’t harm this correlation as they have mostly been upwards…

Wren
April 10, 2010 7:22 pm

"RockyRoad (13:35:24) :
[…]
The US becoming a completely socialist/marxist nation - a recent study showed that countries that did so (and this wasn't a future projection, prediction, or prognostication; it was based on case studies) saw an immediate 40% drop in GDP."
USA GDP/capita in 2006: 43468 USD
Let's see. Switch to socialism: minus 40%. 60% of 43468 is 26080.8 USD.
USA GDP/capita in 1994: 26247 USD
So the introduction of socialism will warp you back to 1994.
Numbers are from
EarthTrends (http://earthtrends.wri.org) Searchable Database Results
=====
Is that a promise ?

CarlPE
April 10, 2010 7:25 pm

Equating coin flips with climate calculations over long time periods is obviously false. Coin flips are truly random, each flip being a unique event independent of what went before and with no influence on the next flip. Climate predictions for next year are strongly influenced by this year, and will strongly influence the next year. If there is a small error, say 10%, the cumulative effect over time would make the results meaningless. Add to that the fact that interactions and amplitudes are to a large extent just guesses, and were established with the desire to show that CO2 is a problem. The idea that all the errors cancel each other out is wishful thinking.
With the large number of parameters in a climate model, some of which have large influence on temperatures over multi year periods, climate models should, I would think, go through some wild gyrations over various time periods. The year to year, decade to decade variations would likely exceed long term climate changes in a model that incorporates a significant number of forcings.
The forcings that caused the last ice age were clearly dominant, and we are due for another ice age. Do any of the models show an ice age? If they don't, shouldn't they?

April 10, 2010 7:26 pm

Wren (19:08:25):
"Actual temperatures are catching up to Hansen's projections, already reaching his Scenario C projection and closing in on his Scenario B projection."
That’s like your stock market argument. With your market analogy, inflation was the unstated variable [and the 14 years of no real stock appreciation is about the same as Phil Jones’ 15 years of no real temperature increase].
With Hansen’s predictions, the natural temperature rebound from the LIA that is still going on is the unstated variable.
James Hansen used the Texas Sharpshooter Fallacy [shoot a hole in a barn door, then draw a bullseye around it] when he made his three separate predictions.
Now, his sidekicks pick the least wrong scenario as an example of how he was almost right. Of course, almost right = wrong.

Wren
April 10, 2010 7:35 pm

DirkH (19:10:59) :
"Wren (18:37:22) :
[…]
http://moregrumbinescience.blogspot.com/2009/03/does-co2-correlate-with-temperature.html
Regarding your first chart, notice how the line through the GISS temperature historical series usually doesn't touch the actual values. It does a poor job of describing short-term changes in temperature, but a good job of describing changes over 100+ years."
The history of the various GISS temperature adjustments of course doesn't harm this correlation as they have mostly been upwards…
====
Really? I wonder how they missed that last La Nina induced temperature spike that UAH and RSS recorded. Wouldn’t that have been reason for an upward adjustment?
Anyway, how do adjustments relate to my point about short-term and long-term temperature trends?

DirkH
April 10, 2010 7:38 pm

“Steve Goddard (16:06:53) :
[…]
People who need extremely good random number generators (like thermonuclear weapons designers) require huge supercomputers to generate the needed randomness.”
No, Steve, computer power doesn’t help you much here, but a simple method for perfect randomness is to get input from a physical process like radioactive decay, so let a Geiger Counter or some other random source (noise from Zener diodes is also good) influence your random number generator and you break out of any cyclicity.
If you need high quality randomness with exact reproducibility, record your physical process’s signal on a large harddisk and use the recording to influence the random number algorithm.
Of course you can also just record physical noise and use that as the random source – this is called a one-time pad.
The inherent cyclicity of even the simplest random number algorithms can be arbitrarily expanded by using longer internal words. For instance, using 8 bits of internal state will make your sequence repeat after at most 256 steps; 16 bits gives a potential maximum of 2^16 or 65536, 32 gives you about 4 billion, 64 gives you 4 billion squared…. Together with a little unpredictability from radioactive decay you're as safe as you want from quick repetitions (just modify your internal state each time the Geiger Counter clicks, for instance by toggling a bit of the state).
Of course, expanding the word size will slow down the random number generation but as you see, a doubling of the word size doubles the x in 2^x where 2^x defines the cycle length of the random sequence (in the absence of an extra physical input).
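A minimal sketch of that idea in C++: a simple generator whose internal state gets a bit toggled whenever an external "click" arrives. The geiger_click() function is a hypothetical stand-in for whatever physical source is actually attached; it is not a real library call.

#include <cstdint>
#include <cstdio>

// Plain 64-bit linear congruential generator (constants from common references).
static uint64_t state = 88172645463325252ULL;

static uint64_t next_random()
{
    state = state * 6364136223846793005ULL + 1442695040888963407ULL;
    return state;
}

// Stand-in for a physical entropy source (Geiger counter, Zener noise, ...).
// In a real setup this would read hardware; here it is only a placeholder.
static bool geiger_click()
{
    return false;   // hypothetical: replace with a real hardware read
}

int main()
{
    for (int i = 0; i < 10; i++)
    {
        if (geiger_click())
            state ^= 1ULL << (i % 64);   // toggle one state bit on each click
        printf("%llu\n", (unsigned long long)next_random());
    }
    return 0;
}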

DirkH
April 10, 2010 7:42 pm

“Wren (19:22:19) :
[…]
So the introduction of socialism will warp you back to 1994.
Numbers are from
EarthTrends (http://earthtrends.wri.org) Searchable Database Results
=====
Is that a promise ?”
No, it’s a prediction. Just introduce socialism and verify it.
But be warned: AFTER the initial 40% reduction in GDP you will find out that your GDP will now shrink further. That’s another prediction. Venezuela is just testing this hypothesis.

Wren
April 10, 2010 7:45 pm

Smokey (19:26:53) :
Wren (19:08:25):
"Actual temperatures are catching up to Hansen's projections, already reaching his Scenario C projection and closing in on his Scenario B projection."
That's like your stock market argument. With your market analogy, inflation was the unstated variable [and the 14 years of no real stock appreciation is about the same as Phil Jones' 15 years of no real temperature increase].
With Hansen's predictions, the natural temperature rebound from the LIA that is still going on is the unstated variable.
James Hansen used the Texas Sharpshooter Fallacy [shoot a hole in a barn door, then draw a bullseye around it] when he made his three separate predictions.
Now, his sidekicks pick the least wrong scenario as an example of how he was almost right. Of course, almost right = wrong.
====
Well, you got me on that one. I will admit I haven't given any thought to using the CPI to take the rising cost of living out of average global temperature.
BTW, I am invested in a S&P 500 index fund that has lost a little over the past 10 years even without an adjustment for inflation. Fortunately, I don’t have a lot in it.

DirkH
April 10, 2010 7:49 pm

“Wren (19:08:25) :
[…]
Looks… pretty bad for Hansen.
=====
Are you kidding? That evaluation is outdated. Actual temperatures are catching up to Hansen's projections, already reaching his Scenario C projection and closing in on his Scenario B projection."
Scenario C was assuming that CO2 emission rise stopped in 2000. This has not happened in reality so we can discard that one.
And in case you have missed it we just had an El Nino so we have a temperature spike. Like in 1998, this will be a short-lived effect. Or as we hobby climatologists say, a minor short-term issue.

DirkH
April 10, 2010 7:55 pm

“Wren (19:35:55) :
[…]
Anyway, how do adjustments relate to my point about short-term and long-term temperature trends?”
You were linking to an article with a chart that scatterplots CO2 concentration against temperature. This chart omits the time dimension. BUT, as OVER TIME upwards adjustments have been made in GISS, this conveniently helps to create a SPURIOUS correlation. So there, on a plate for you. A rogue is he who thinks ill of it.

Paul M. Parks
April 10, 2010 7:59 pm

@Steven Goddard: Your sample code in a comment above does not call the srand() function, so with C runtime libraries that use a pseudo-random generator the output will very likely be the same on every run. Seeding the random number generator with, for example, the return value of the time() function will generate a slightly more random output, though of course still nowhere near true randomness.
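For reference, a minimal sketch of the seeding Paul describes; the flip count here is arbitrary.

#include <cstdio>
#include <cstdlib>
#include <ctime>

int main()
{
    // Seed the generator from the clock so successive runs differ;
    // without this call, rand() starts from the same default seed every time.
    srand((unsigned)time(NULL));

    int heads = 0;
    for (int i = 0; i < 10000; i++)
        if (rand() & 1)   // note: relies on the low bit, which some generators handle poorly
            heads++;
    printf("heads: %d\n", heads);
    return 0;
}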

April 10, 2010 8:02 pm

Wren (19:22:19) :
“Is that a promise ?”
So. That shows you’re arguing not from science, you’re arguing CAGW with a political goal in mind.
Just so we know you’re not being a hypocrite, let’s see you get by on 40% less income. You go first, OK? Show the proletariat how it’s done.
And donate your money to WUWT, so we can verify that you're not just telling the rest of us how we have to live our lives with 40% less.
Walk your talk.

Mike Davis
April 10, 2010 8:23 pm

I think Walt needs to study this for a better understanding of global climate:
http://www.21stcenturysciencetech.com/Articles%202005/ComingPresentIceAge.pdf
Excerpt:
We are now in an ice age and have been for about the past 2 million years. Over the past 800,000 or so years, the Earth's climate has gone through eight distinct cycles of roughly 100,000-year duration. These cycles are driven by regular periodicities in the eccentricity, tilt, and precession of the Earth's orbit. In each of the past eight cycles, a period of glacial buildup has ended with a melt, followed by a roughly 10,000-year period, known as an interglacial, in which relatively warm climates prevail over previously ice-covered northern latitudes.
30 years is only weather 😉

Bob Koss
April 10, 2010 8:36 pm

Steve,
One last comment and then I’ll bow out.
Is it likely that MS would put out two programs five years apart with the same defective commonly used function? I expect they would have been swamped with complaints when the first program was issued. That they would then not fix it seems to be quite a stretch considering my even older program handles it correctly.

Harry Eagar
April 10, 2010 9:01 pm

Let’s ask Meier if he would put ALL his money in a new hedge fund run on exactly the same lines as Long-Term Capital Management, which is the exact parallel to climate models.
(If anybody has forgotten, LTCM looked great for four years, then went bust.)

April 10, 2010 9:07 pm

DirkH (13:51:04) :
"Wren (13:27:02) :
Hansen's projections made back in the 1980's are looking better and better.
………………………………………………………………………………………………………………….
DirkH,
I think Wren won’t like this video:

especially from :51 to 1:22

Noelene
April 10, 2010 9:07 pm

davidmhoffer
I enjoyed it too. Very funny. It captures how stupid some scientists are, or just venal, to back any science Jones, Briffa or Mann espouses.

April 10, 2010 9:15 pm

R. Craigen (19:00:56) :
Kirk A (11:50:49) :
"Your coin flip distribution is conspicuously symmetrical…"
Kirk, look up "Central Limit Theorem".

It is simpler than that. The number of heads plus the number of tails must equal 10,000 in every run.

April 10, 2010 9:21 pm

…..I’m reminded of a climate scientist one time who said, “My model is right. It’s the real world that’s wrong”.
Dave Legates, John Christy, and Sallie Baliunas give some thoughts on climate models:

Toto
April 10, 2010 9:26 pm

The coin tossing is a red herring. Walt Meier said: "Now of course, weather and climate are different than tossing a coin." Agreed, way different.
Walt Meier said: “Whereas coin flips are governed largely by statistical laws, weather and climate are mostly governed by physical laws.”
The coin flips are governed by physical laws. There aren’t any statistical laws. Probability is just a mathematical structure to replace vague concepts like “luck” and “chance”. He is right that often the large scale is easier than the minute scale because we can just deal with averages. Recall that Einstein was quite upset when he found determinism failed (quantum mechanics), so we can’t fault Walt too much for clinging to the belief that climate is simpler than weather because the weather averages out. Yet that belief has not been proven (nor has it been proven to be chaotic).

April 10, 2010 9:27 pm

DirkH (19:38:47) :
"Steve Goddard (16:06:53) : […]
People who need extremely good random number generators

Several years ago I wrote a book on machine coding for IBM’s AS/400 machine. One of the chapters was about how to generate a ‘truly’ random number [without attaching new and special hardware]: http://www.leif.org/as400/mlp022.doc The details are of course machine specific, but the approach is not.

April 10, 2010 9:42 pm

If the Met Office had used a coin flip to predict the weather for the last 10 years, they would have been correct 4-6 times out of 10 rather than 1 in 10. LOL
I get it. He is right!!!!!!!!!!!!!!!!

Digsby
April 10, 2010 9:43 pm

If it is easier to accurately model climate 100 years ahead than it is 10 years ahead – as apparently CAGWers believe – then is it not logically inescapable that it must be even easier to do it 1,000 years ahead and then an absolute doddle 1,000,000 years ahead. Although none of us will be around to actually check the veracity of the last claim 1,000,000 years from now, I hope that there isn’t anyone who would consider it to be anything but utterly absurd. So, if climate models are not accurate in the short term, nor in the extremely long term, why should anyone think that they improve somewhere in between?

R. Gates
April 10, 2010 9:57 pm

Dirk H posted this:
"I think Wren won't like this video:

especially from :51 to 1:22"
———–
I hope the guy on the video is not a pastor of a church (it at least appears he is associated with some form of church), as he tells a huge lie toward the end of the video, claiming that global temperatures have "not even gone up a fraction of a degree" since 1988. In fact they've gone up at least 3/10 of a degree, and the last I checked, that was a fraction of a degree. Do people believe this kind of nonsense just because it is put out by something with the name of "ministries" attached to it?

Steve Goddard
April 10, 2010 10:11 pm

Paul M. Parks (19:59:40) :
The code does produce the same output on every invocation, and would even if I called the srand() function. Then it would just produce slightly different output depending on the seed it started with.
Each time rand() is called, the seed changes – one billion times (10,000 * 100,000) in this case.
Bob Koss (20:36:27) :
The Microsoft rand() function is not technically “defective” it just seems to have a short repeat cycle and possibly some other unwanted deterministic behaviour. If you look closely at the gnu output, it also shows some non-random patterns.

Wren
April 10, 2010 10:12 pm

DirkH (19:55:29) :
"Wren (19:35:55) :
[…]
Anyway, how do adjustments relate to my point about short-term and long-term temperature trends?"
You were linking to an article with a chart that scatterplots CO2 concentration against temperature. This chart omits the time dimension. BUT, as OVER TIME upwards adjustments have been made in GISS, this conveniently helps to create a SPURIOUS correlation. So there, on a plate for you. A rogue is he who thinks ill of it.
====
That’s not what I’m talking about regarding the short- and long terms. I will explain what I mean, but you will have to refer back to Steve’s chart
Steve's chart shows GISS temperatures from 1890 to the current year with a regression line through the series to represent the trend over the entire 100+ years. Suppose back in 1890 you had projected this trend. As you can see from the chart, you would have been wrong about most of the year to year changes in temperature even though you got the long-term trend right.
So about 10 years or so into that 1890-2009 projection, skeptics would have been saying DickH your temperature projection sucks. How do you expect to forecast for a Century when you can’t even get next year right? But you would have had the last laugh if you and your critics weren’t dead by then.

Editor
April 10, 2010 10:16 pm

There's really no need to model this with a simulation; it's readily computable if you don't mind dealing with some really large numbers. One feature of the language Python is that when numbers start getting too big for the registers on the underlying machine, Python changes to extended precision.
To solve the puzzle, the easiest way to approach it is by computing the probability of getting heads 0-3999 times. That will also be the probability of getting heads 6001-10000 times. If that's "p", 2*p is the probability of dying, 1-2*p is the probability of winning.
I wrote a little Python program for that:

# Hack to compute odds of the first k outcomes in row n of a binomial
# distribution.  In more mathematical terms, we need to compute
# sigma(C(n,i)) for i from 0 to k and compare it to 2^n.  According
# Knuth Vol 1, there's no handy short cut for computing that sum,
# but it's easy enough to track it while computing each term.
#
import sys
#
def binsum(n, k):
    coeff = 1      # First term is always 1 (no terms)
    sum = 1
    for i in range(k):
        coeff = coeff * (n - i) / (i + 1)
        sum += coeff
        # print 'coeff', coeff, 'sum', sum
    return sum
#
if len(sys.argv) != 3:
    print 'Usage: binsum binomial-order terms'
    sys.exit(1)
n = int(sys.argv[1])
k = int(sys.argv[2])
chances = binsum(n, k)
combinations = 2 ** n
odds = combinations / chances
print 'Odds of up to %d out of %d are 1:%d' % (k, n, odds)

Does it work? Let’s try some simple cases:
If we flip a coin 5 times, the probability of getting 0, 1, 2 heads is 1/2,
as is the probability of 3, 4, or 5 heads:
$ python binodd.py 5 2
Odds of up to 2 out of 5 are 1:2
Getting no heads is one chance out of the 32 permutations:
$ python binodd.py 5 0
Odds of up to 0 out of 5 are 1:32
And getting up to 5 heads is an absolute certainty:
$ python binodd.py 5 5
Odds of up to 5 out of 5 are 1:1
Back to our test, the odds of getting 0 to 3999 heads out of 10000 flips:
$ python binodd.py 10000 3999
Odds of up to 3999 out of 10000 are 1:172542638129728354324544909641063289590541370504525129218969625689320496886879530815228312
The odds of dying are 1:086271…151. So yeah, I'd go for it.
Oh, one more check – the graph above (the believable one) shows 100 chances out of 100,000 flips to get at least 4900 (a bit less) heads, 4883 is the actual:
$ python binodd.py 10000 4883
Odds of up to 4883 out of 10000 are 1:101
I like to use 1 in a million as a threshold for certain risky activities (like crossing the street), that would be 4762:
$ python binodd.py 10000 4755
Odds of up to 4755 out of 10000 are 1:1992858
(Remember there are equal odds at the other side of the bell curve, so one in two million here plus one in two million there comes to one in a million.)
Since it is my life, and we’re only talking a million bucks, I’d like a bigger buffer, one in a billion is:
$ python binodd.py 10000 4694
Odds of up to 4694 out of 10000 are 1:2030843522

Steve Goddard
April 10, 2010 10:17 pm

DirkH (19:38:47) :
Why do you think that Los Alamos and Livermore have usually had the most powerful supercomputers on the planet?
It was pretty difficult to digitize the output of radioactive decay into input for a computer program in 1947.
I used to work with one of the authors of this paper.
https://docs.google.com/viewer?url=http://www.lanl.gov/history/hbombon/pdf/00285876.pdf

Steve Goddard
April 10, 2010 10:23 pm

Wren (18:37:22) :
Looks like the graph you linked agrees with mine. I don’t see any indication of non-linearity which would lead me to believe climate sensitivity is greater than the extrapolation of that line. In fact, sensitivity should decrease somewhat at higher levels of CO2.

Steve Goddard
April 10, 2010 10:24 pm

Wren (18:37:22) :
One more thing – crutemp vs. CO2 has a considerably lower slope than gistemp.

April 10, 2010 10:25 pm

Smokey
Noelene (21:07:14) :
davidmhoffer
I enjoyed it too. Very funny. It captures how stupid some scientists are,or just venal, to back any science Jones, Briffa or Mann espouses>>
Glad you enjoyed it. Time and inspiration permitting, I might make it longer and throw some characteristics of certain skeptics into it as well. Willis insisting on figuring out how the thermostat works while complaining that his chip stack is logarithmic, Anthony claiming the cards must pass certain landing standards or be discarded. I even figure on Al Gore walking in just as some oil tycoon starts handing out big grants, shouting "Aha! I knew it" and then going silent when all the checks go to CRU scientists. Tomorrow perhaps.
In the meantime, If I made a couple of people laugh today, then I had a very good day.

Wren
April 10, 2010 10:26 pm

Digsby (21:43:11) :
If it is easier to accurately model climate 100 years ahead than it is 10 years ahead - as apparently CAGWers believe - then is it not logically inescapable that it must be even easier to do it 1,000 years ahead and then an absolute doddle 1,000,000 years ahead. Although none of us will be around to actually check the veracity of the last claim 1,000,000 years from now, I hope that there isn't anyone who would consider it to be anything but utterly absurd. So, if climate models are not accurate in the short term, nor in the extremely long term, why should anyone think that they improve somewhere in between?
===
The "do nothings" on CAGW have an implicit forecast of no change in global temperature or a forecast of no change that man could do anything about, depending on which "do-nothing" you ask. These forecasts are based on wishful thinking rather than climate models.

Steve Goddard
April 10, 2010 10:37 pm

Bob Koss (20:36:27) :
You can download a free version of VC Express and try the code (posted above) for yourself. No need to speculate about what Microsoft might or might not be doing.

Wren
April 10, 2010 10:43 pm

Tom in Texas (14:58:44) :
If climate models are better predictors for a 100 year span than a 1 year span, does that mean they can get it exactly right for 3010?
====
Nope. Not exactly right for 3010 or any other year. But the model’s projection should be more accurate than a projection of no change.

Steve Goddard
April 10, 2010 10:48 pm

RockyRoad (14:46:10) :
Don’t know about a ten degree drop in Colorado, but this past winter seemed to go on endlessly. Probably is not over yet.

April 10, 2010 10:52 pm

Can a more complex situation be modeled more easily and accurately than a simpler situation?
I am still with Meier in saying, in the way he means, that YES, very often it is easier. Consider crowd behaviour. When exiting a stadium during a fire alarm it's harder to predict what an individual will do than it is to predict what the crowd will do. Likewise with windblown sand grains and sand dunes, or with the Brownian motion of a gas molecule and a gas's tendency, through that motion, to fill a container, etc.
And, likewise, it seems easier to predict the general frequency of storm fronts hitting Melbourne over a winter than it is to predict whether one will hit on 1 July - and these days we can even predict the increase or decrease depending on the Indian Ocean dipole etc.
There is always greater complexity in the scale, or level, below the one in which we are working - this actually makes the 'simpler' situation more complex. Mandelbrot is very good in discussing these scaling issues, and while they are somewhat arbitrary, and there are emergent effects across them, this does not mean that the answer to our question is always 'NO'.
I would be the last to say that existing GCM can predict +100 years, but that is not the specific point that Meier is bringing into dispute with his example, and thus much of what Goddard says about feedback and model performance is irrelevant.

Wren
April 10, 2010 10:58 pm

DirkH (19:49:01) :
"Wren (19:08:25) :
[…]
Looks… pretty bad for Hansen.
=====
Are you kidding? That evaluation is outdated. Actual temperatures are catching up to Hansen's projections, already reaching his Scenario C projection and closing in on his Scenario B projection."
Scenario C was assuming that CO2 emission rise stopped in 2000. This has not happened in reality so we can discard that one.
And in case you have missed it we just had an El Nino so we have a temperature spike. Like in 1998, this will be a short-lived effect. Or as we hobby climatologists say, a minor short-term issue.
=====
And the 2000-2009 decade was warmer than the 1990-1999 decade despite that 1998 El Nino, wasn’t it? Warmer still will be the 2010-2019 decade.

Steve Goddard
April 10, 2010 11:14 pm

RockyRoad (11:52:08) :
The CPC projection for next winter is very similar to last year’s forecast.
http://www.cpc.ncep.noaa.gov/products/archives/long_lead/gifs/2010/201002temp.gif
So far, their April-June forecast from that same page is inverted.

MaxL
April 10, 2010 11:21 pm

Generating random numbers is at the crux of our work on stochastic boundary layer dispersion modeling. If anyone needs a good random number generator in their code check out Numerical Recipes: http://www.nrbook.com/a/bookcpdf.php
The online version is a bit dated now, but the code is the same.
There are excellent descriptions of the weaknesses and strengths of various methods.

pwl
April 11, 2010 12:10 am

Within the realms of probability.
The coin toss.

Gary Oldman and Tim Roth are excellent in this film. Really funny.

pwl
April 11, 2010 12:35 am

C. Shannon (13:39:20) : @Pwl, Wouldn't it be: 50-(x/2), x, 50-(x/2)?
Yes, I also caught the mistake… I rewrote it on another post as this:
I flipped a coin once and it landed on its side, standing up on a hard wood floor! So it's not 50-50 heads or tails; it's 50-x, x*2, 50-x odds, where x*2 is the chance that the coin will land on its side and stay standing!
Glad to see someone checking for mistakes. If only the Real Climate Deniers would admit their lies, damned lies and statistical follies.

Bart
April 11, 2010 1:42 am

Digsby (21:43:11) :
“So, if climate models are not accurate in the short term, nor in the extremely long term, why should anyone think that they improve somewhere in between?”
Because that is our experience with a wide variety of systems. Short term behavior is typically very difficult to predict, due to limits on computational loads and data sampling requirements. Long term behavior is difficult to predict, because random events accumulate, generally increasing variability as a fractional power of time. But, in the mid-term, for which a valid model is available and whose signal to noise ratio is large compared to the short term variability, we can successfully project behavior. Example applications include navigation of air and naval vessels, and economic modeling.
The key to the above, you will note, is “for which a valid model is available”.

mooli
April 11, 2010 1:52 am

Way to miss the point.
It was an analogy to explain how a complex scenario can be easier to model than a simple one. It was illustrating that it is logically unsound to assert that the complexity of modelling single weather events compounds to make climate even more difficult and complex to predict. Saying that climate is not like coin flipping doesn't exactly rebut his point. Taking short term weather events and saying that averaged predictions for long term trends didn't predict them doesn't rebut his point either.
I can’t believe you bothered to include *2* plots of distribution curves from rand generation…
And comparing observed ice trends with projections and saying they don't match is a bit silly when the projections *don't really start to decline* in the Antarctic for another five years. Perhaps you should overlay the two graphs, no?

Xi Chin
April 11, 2010 1:56 am

Microsoft's predictable "random number generator"…
http://www.theregister.co.uk/2007/11/13/windows_random_number_gen_flawed/

An Inquirer
April 11, 2010 2:22 am

davidmhoffer (15:43:40). Thank you so much. It is good to laugh!

An Inquirer
April 11, 2010 2:29 am

R. Gates (21:57:40) : Concerning your claim of a "lie": the video was made in the summer of 2008. According to UAH (and others), the summer of 2008 was no warmer than the summer of 1988. Yes, anomalies have risen since the summer of 2008, but he was being taped in the summer of 2008, not making predictions in 2010. I would agree that his approach is not the most scientifically pleasing method, but your charge is even more off base.

Vincent
April 11, 2010 3:18 am

Wren,
"The "do nothings" on CAGW have an implicit forecast of no change in global temperature or a forecast of no change that man could do anything about, depending on which "do-nothing" you ask."
Completely wrong, and a misunderstanding of the sceptical position. Lindzen summarized it best. When asked about his prediction for average temperatures in 2100, he said they could be warmer than today, the same as today or cooler than today.
Except in your imagination, no sceptic has ever predicted no change.
” Warmer still will be the 2010-2019 decade.”
Can you please share with me the supplier of your crystal ball? Mine seems to have stopped working.

Daniel H
April 11, 2010 3:19 am

Sometimes I've used the decimal expansion of pi for non-crypto purposes when a good random number source is needed. You just loop over each digit in the expansion and do something like: if (d < 5) then heads, else tails. Or it might be more intuitive and efficient to use the binary expansion of pi. There are lots of fast algorithms available for generating millions of digits (or bits) of pi and it can be a fun programming project! You could just as easily use the decimal expansion of e to accomplish the same thing. This is a personal preference.
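As a rough sketch of that pi-digit trick: the hard-coded string below is just the first 50 decimals of pi, standing in for a much longer expansion you would generate yourself.

#include <cstdio>
#include <cstring>

int main()
{
    // First decimal digits of pi, used as a stand-in random source:
    // digit < 5 counts as heads, otherwise tails.
    const char* pi_digits = "14159265358979323846264338327950288419716939937510";
    int heads = 0, tails = 0;
    for (size_t i = 0; i < strlen(pi_digits); i++)
    {
        if (pi_digits[i] - '0' < 5)
            heads++;
        else
            tails++;
    }
    printf("heads: %d  tails: %d\n", heads, tails);
    return 0;
}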
A possible problem with using a binomial distribution to prove a point about climate models is that you are assuming a Frequentist statistical methodology is applicable when the model projections are probably using a Bayesian approach. Tamino discussed the differences between the Frequentist and Bayesian approaches to coin flipping in his blog a while back:
http://tamino.wordpress.com/2008/07/10/reverend-bayes/

MrPete
April 11, 2010 3:51 am

Re: Bob Koss (Apr 10 20:36),

Is it likely that MS would put out two programs five years apart with the same defective commonly used function? I expect they would have been swamped with complaints when the first program was issued. That they would then not fix it seems to be quite a stretch considering my even older program handles it correctly.

From experience: yes they would, and they have.
How does this happen?
– Neither vendors nor users are careful to test those defective functions.
– A careful test requires understanding and skill
– Those who receive defect reports rarely have that skill
– Bad policy caused my comprehensive report to be ignored
Why was my report ignored? Because “the lab” tried to independently produce the problem, without looking at the details. They couldn’t reproduce, so they ignored my report.
In my case, there was a flaw, for six years, in a fundamental C library math function. I had proof, I even created a binary patch that fixed the problem. (It happened to kill our commercial mapping software for customers who used certain CPU’s.)
We could not get MS to accept our bug report for six years. So we patched their library.
A little humility never hurt anyone, especially vendors.

John Finn
April 11, 2010 4:00 am

Wren (12:41:26) :
Of course Meier is right about the coin flips. It's amazing that some posters seemed to think otherwise. Is it an "appropriate" analogy for GCMs? That's debatable. Analogies usually aren't perfect.
I prefer the stock market analogy. I would have much more confidence in a forecasted rise in the Dow for a 50-year period than a 1-year period. Why? Because I have observed that the market has fluctuated a lot from year to year, but has risen over the long term. Average global temperature, like the stock market, fluctuates from year to year but has been rising over the long term.

I prefer the "throwing a stone off a cliff" analogy. There will be times when the stone bounces back up off the rough cliff face. These will be difficult to predict, but the probability is that it will still end up at the bottom of the cliff. So although we might not be able to track the exact path taken by the stone, we can calculate the right end result.

MrPete
April 11, 2010 4:01 am

By the way, there’s an inherent bias in finite-precision computer calculations that has to be carefully coded for in certain situations involving tiny differences… and I suspect climate models are subject to this bias.
Sadly, I’ve not had time to re-research and recreate a demonstration. A friend proved it mathematically in an unpublished (internal R&D lab) paper that I can’t find (nor can I find him), but I have code that implements the solution somewhere in my ancient archives.
The nature of the problem:
– Imagine a huge sheet of graph paper; all available finite-precision numbers are at the grid points.
– When you do a calculation, by definition the answer will be rounded to a grid point.
– Most of the time, this rounding is not a problem.
– Some of the time, this rounding causes terrible trouble.
My case: the intersection of two sloping lines.
Depending on how you do the calculation, and in what order (sorry, don’t recall the exact details, that’s what I need to go dig up), the rounded intersection-gridpoint will be different for the same two lines.
Which means the intersection gridpoint is shifted in a particular direction.
These errors can rapidly accumulate for highly iterative functions… like mapping… and perhaps climate models.
I'd be happy for someone else to grab the glory on this… my time is quite limited these days 🙂
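A small illustration of the general phenomenon MrPete is pointing at (not his specific line-intersection case, which isn't reconstructed here): two algebraically equivalent orders of accumulation can round to different grid points, and in an iterative calculation that discrepancy feeds into the next step.

#include <cstdio>

int main()
{
    // Add a large value and many tiny ones in two algebraically equivalent orders.
    double big = 1.0e16;

    double forward = big;
    for (int i = 0; i < 1000; i++)
        forward += 0.1;              // each 0.1 is rounded away against the large value

    double tiny_first = 0.0;
    for (int i = 0; i < 1000; i++)
        tiny_first += 0.1;           // accumulate the small terms first
    double backward = big + tiny_first;

    printf("forward:    %.1f\n", forward);
    printf("backward:   %.1f\n", backward);
    printf("difference: %.1f\n", backward - forward);   // roughly 100, not 0
    return 0;
}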

Chris
April 11, 2010 4:09 am

area is not extent!

Digsby
April 11, 2010 4:20 am

@ Wren
FWIW, I am a "do nothing" person unless it can be demonstrated to me beyond reasonable doubt that "doing something" is all of the following: necessary, possible, desirable, and not likely to do more harm than good.

Steve Goddard
April 11, 2010 5:12 am

Xi Chin (01:56:42) :
Thanks for the link! Looks like the poor Microsoft random number generator is a story in itself.
http://www.theregister.co.uk/2007/11/13/windows_random_number_gen_flawed/

Windows random number generator is so not random
The pseudo-random number generator used by Microsoft in Windows is flawed, according to security researchers.
A team of cryptographers led by Dr. Benny Pinkas from the Department of Computer Science at the University of Haifa, Israel were able to unravel how the CryptGenRandom function in Windows 2000 worked, without assistance from Microsoft. This analysis revealed that random number generation in Windows 2000 is far from genuinely random - or even pseudo-random.

Steve Goddard
April 11, 2010 5:15 am

Wren (22:58:27) :
“Warmer” is a relative term. So far we are running well below all three of Hansen’s A,B and C scenarios.

Digsby
April 11, 2010 5:17 am

@Wren
And BTW, I used to be a CAGWist too until fairly recently when, thanks to “Climategate”, I discovered that my trust in climate scientists to give me honest, unbiased information on which to base my opinions and actions had been seriously misplaced.
Since then my eyes have become opened ever wider to the true state of affairs and I now actually feel almost like someone who has escaped from the grasp of Scientology.

Steve Goddard
April 11, 2010 5:37 am

It is worth noting that even in the better g++ plot, there is a lot of non-random behaviour exhibited. There are somewhat symmetrical gaps, bunching and other patterns, particularly near the top.
http://wattsupwiththat.files.wordpress.com/2010/04/meierquestion9probabilityplotgcc.jpg
None of the RNG issues make any difference though in choosing “weather” or not to take the bet.

Steve Goddard
April 11, 2010 5:59 am

Daniel H (03:19:20) :
My objection to the coin toss analogy has nothing to do with choice of statistics. Rather it has to do with physical processes – basically that each iteration of a GCM run is dependent on all previous ones.
Suppose that in year one, the GCM predicts a decrease in Arctic ice and cloud cover, when in fact we see the opposite. This is going to compound into even worse errors in year two. Then again the error gets worse in year three. Suppose you run your model on a granularity of one day – that would be 36,525 iterations of compounded error in 100 years.
I can’t think of a single reason to believe that GCMs have validity 100 years into the future.
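To put a rough number on that compounding argument (a toy illustration only, assuming a fixed fractional error that multiplies at every daily step and never cancels): a bias of 0.1% per iteration would grow as 1.001^36525 ≈ e^36.5, roughly a factor of 7 × 10^15 over a century, and even 0.01% per step still compounds to about a 38-fold error. Whether real model errors behave this way, or largely cancel, is exactly the point in dispute.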

April 11, 2010 6:18 am

I read up on how to get WordPress to ignore things that look like HTML tags. Here is another attempt to post the C++ code, hopefully in its entirety this time.

#include <cstdlib>   // atoi(), rand()
#include <iostream>
#include <vector>
int
main(int argc, char** argv)
{
    size_t sample_size = (size_t)atoi(argv[1]);   // flips per run (e.g. 10000)
    int iterations = atoi(argv[2]);               // number of runs (e.g. 100000)
    // Histogram of how many runs produced each possible heads count,
    // sized sample_size + 1 so a run of all heads stays in bounds.
    std::vector<int> ones_count_vector(sample_size + 1, 0);
    for (int i = 0; i < iterations; i++)
    {
        int number_of_ones = 0;
        for (size_t j = 0; j < sample_size; j++)
        {
            int random_number = rand();
            if ( (random_number & 1) == 1 )   // low bit of rand() decides heads/tails
            {
                number_of_ones++;
            }
        }
        ones_count_vector[number_of_ones]++;
    }
    for (size_t i = 0; i <= sample_size; i++)
    {
        if (ones_count_vector[i])
        {
            std::cout << i << " " << ones_count_vector[i] << std::endl;
        }
    }
}

Editor
April 11, 2010 6:20 am

Steve Goddard (13:10:20) :

for (size_t j = 0; j < sample_size; j++) {
    int random_number = rand();
    if ( (random_number & 1) == 1 )
        number_of_ones++;
}

A common problem with pseudo random number generators is that the low bits often cycle over a small range. If you had looked at all the bits I suspect you would have gotten a better result, e.g. "if (random_number > (RAND_MAX/2))"
Other functions can be embarrassingly bad. When I joined DEC in 1974 I brought with me a program I had written at CMU to display abused forms of a curve called the Rose (see http://wermenh.com/roses/index.html ). Before porting the assembly language version to their graphics hardware, I wrote a Fortran program to show what it looks like. The "0th order" rose is just a circle, and I was astounded to see my circle was lumpy, a sort of cross between an octagon and a circle. It was much closer to a circle than an octagon, but my sine & cosine code in the assembler version simply had a table of sines between 0 and 90 degrees and did linear interpolation to increase resolution to 1/8 degree. The value of the result was only 14 or 15 bits wide. Fast, and far better than what the Fortran library gave me! DEC fixed it in the next release.
Old PDP-11 hackers(*) - while it didn't save me any code, the linear interpolation was just a string of seven add instructions preceded with an "add r0, pc" instruction. I always wanted to use a math instruction on the pc register, but was disappointed that it wasn't tighter code than a more sensible form.
*: Are there any young PDP-11 hackers? Probably not….

Paul Coppin
April 11, 2010 7:05 am

Wren (22:26:02) :
[…]
===
The "do nothings" on CAGW have an implicit forecast of no change in global temperature or a forecast of no change that man could do anything about, depending on which "do-nothing" you ask. These forecasts are based on wishful thinking rather than climate models.

Same difference, I believe…

Dishman
April 11, 2010 7:19 am

There is no such thing as an “unbiased” coin.
There are only coins for which the bias is not detectable within whatever measurement/calculation method used.
Any real coin will have a center of mass which differs from the center of its exterior. There are numerous other bias elements which will vary from test to test.
In effect, Dr. Meier’s wager is whether or not the bias of the coin is detectable with a specific method. It also assumes that the specific method will not introduce bias (ie, deform the coin by repeated tosses).

Tom T
April 11, 2010 7:34 am

If I were to lose $1 million if the coin flip came out of the range, I would not make the bet, because I can't afford $1 million. But that is what those who propose cap and trade or carbon taxes want us to do. Actually they want us to bet trillions of dollars that THEY are right. But if they are wrong we have lost trillions of dollars for nothing.
It is worse than that: the bet is that the computer models are right and that the cost of waiting to see if they are right is higher than the cost of making drastic changes now. As bad as GCMs are, economic models are even worse at knowing what conditions will exist in 50-100 years, especially when a solution to whatever problem we might have then could rest in technology that does not exist now.

Bill Illis
April 11, 2010 7:48 am

Steve Goddard (13:44:30) :
Sneak preview of something I am writing up. Here is GISS vs. CO2 concentration.
http://docs.google.com/View?id=ddw82wws_5998xrxhzhc

Steve, you have to convert the temperature trend fit to the Ln(CO2).

Tom T
April 11, 2010 7:50 am

John Finn (04:00:28) : Quote "I prefer the "throwing a stone off a cliff" analogy. There will be times when the stone bounces back up off the rough cliff face. These will be difficult to predict, but the probability is that it will still end up at the bottom of the cliff. So although we might not be able to track the exact path taken by the stone we can calculate the right end result." End Quote
Yes, that is because we know that the only forcing is gravity and the only feedbacks are air resistance and the cliff. We don't know that the main forcing is only CO2 and we really don't know what the feedbacks are. If the feedbacks are negative, in your analogy the rock could come back and hit you in the face, or in the case of climate change the climate could get colder.

Arn Riewe
April 11, 2010 7:54 am

“The Met Office has now admitted to BBC News that its annual global mean forecast predicted temperatures higher than actual temperatures for nine years out of the last 10.”
I have a suggestion for our Brit friends. You may be aware of “Punxsutawney Phil” our national forecaster of spring arrival. Now Phil is busy on February 2, but available for the other 364 days per year.
According to Wikipedia… “As to his accuracy, according to the StormFax Weather Almanac and records kept since 1887, Phil’s predictions have been correct just 39% of the time.”
http://en.wikipedia.org/wiki/Punxsutawney_Phil
With an accuracy rate almost 4 times greater than the Met's, I'm sure the citizens of Punxsutawney would be willing to rent out Phil for your forecasting needs in exchange for the annual budget of the Met Office. Think about it: almost 4 times the accuracy at no additional cost. And when you think about the added benefit of carbon footprint reduction from replacing that energy-gobbling computer and eliminating staff commuting, how can you resist! Groundhogs of the world.. unite!

Steve Goddard
April 11, 2010 8:01 am

Bill Illis (07:48:53) :
I understand your desire to make the GISS trend look logarithmic, but so far it appears linear with CO2. I’m just plotting the data, not generating it.

Joe
April 11, 2010 8:04 am

Steve,
Ric Werme is correct. You are using the output from a random number generator incorrectly.
Unless you know specifically what algorithm is used, you should never use the lowest bits of the generator's state. The fact that GNU's standard library uses a different algorithm that does not have this shortcoming in no way means that Microsoft's generator is flawed.
Microsoft is using the Linear Congruential Generator (LCG) method, which is fairly standard and robust as long as good constants are used. All LCGs have poorly behaving low bits, regardless of the constants chosen.
In general cases, you should be dividing the output of rand() by (RAND_MAX + 1) to get a value 0 <= x < 1.0, and then multiply this value by the range of values you want (if you want 2 outputs, multiply by 2), finally truncating this resulting value to an integer.
For further reference, see Knuth – The Art of Computer Programming.
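A minimal sketch of the usage Joe describes, with a range of 2 standing in for a coin flip:

#include <cstdio>
#include <cstdlib>
#include <ctime>

int main()
{
    srand((unsigned)time(NULL));
    int heads = 0;
    for (int i = 0; i < 10000; i++)
    {
        // Scale rand() into [0, 1), then into the desired range, so the result
        // depends on the high-order behaviour of the generator, not its low bits.
        double u = (double)rand() / ((double)RAND_MAX + 1.0);
        int flip = (int)(u * 2);   // 0 or 1
        heads += flip;
    }
    printf("heads: %d out of 10000\n", heads);
    return 0;
}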

Steve Goddard
April 11, 2010 8:09 am

Ric Werme (06:20:06) :
I tried using some different bits from rand() in VS. Bit 1 gave a very weird plot, but bit 12 did reasonably well.
http://docs.google.com/View?id=ddw82wws_602dp5ffdtb

Tom_R
April 11, 2010 8:26 am

>> R. Gates (14:48:12) : Except 2010 follows on the heels of the warmest decade on instrument record, and so is part of an ongoing trend. … <<
The instrument record only goes back to 1979. The late 70's were notoriously cold. It's hardly surprising that one could find a three month period of record warmth on a noisy half-sinusoid starting at the bottom, nor that the top of the sinusoid is warmer than the incline.
If you want to use temperatures before 1979, then you need to compare apples to apples, and only use the current temperatures from those areas that were previously covered by thermometers. That eliminates the oceans, as well as much of Africa and South America. The early 2010 temperatures in the US, Europe, and Asia were near historic lows.

April 11, 2010 8:58 am

Joe (08:04:00) :
I expect a random number to be random in all bits. Microsoft doesn’t warn you to expect otherwise.
Do you work for Microsoft?

Steve Goddard
April 11, 2010 9:09 am

Ric Werme (06:20:06) :
I thought some more about the issues of low order bits being flawed. Your suggestion of (random_number > (RAND_MAX/2)) would work for this case, but there are plenty of applications – like this one – which expect randomness in all bits. The distribution below would work with your algorithm, but is not random.
16383, 16382, 16383, 16382, 16384, 16385, 16384, 16385
If a RNG can’t be counted on to generate randomness in even/odd (bit 0) distribution, that definitely is not a good thing.

Steve Goddard
April 11, 2010 9:24 am

Joe (08:04:00) :
Integer division on any microprocessor is a very slow operation, typically taking between 15 and 80 clock cycles. Logical & takes one clock cycle.
Thus, the algorithm you suggest is much slower than what I am using. One billion iterations is too slow already.
I’ll stick with gnu in this case, thanks.

DirkH
April 11, 2010 10:08 am

“Steve Goddard (09:24:57) :
Joe (08:04:00) :
Integer division on any microprocessor is a very slow operation, typically taking between 15 and 80 clock cycles.”
CPU architectures that contain a barrel shifter – amongst them the Core 2 [Duo] – can do very quick multiplications, and divisions are somewhat accelerated, though often not as much as multiplications.
Here’s an interesting table, contrasting the Pentium IV (no barrel shifter) with the Core 2 Duo:
http://www.behardware.com/articles/623-5/intel-core-2-duo-test.html

DirkH
April 11, 2010 10:22 am

“Wren (22:12:28) :
[…]
So about 10 years or so into that 1890-2009 projection, skeptics would have been saying DickH your temperature projection sucks. ”
Oho, I made Wren use invectives… Running out of arguments so fast?

Joe
April 11, 2010 10:36 am

Steve,
No generator gives good randomness in all the bits of its state.
You are using the C language, where the entire state is exposed by the standard library random function. The legacy of the C language is such that things cannot be changed for the better with the standard library function, which is why there are so many alternative functions described both on the internet and in reference material.
The industry standard (used by State Gaming Commissions and so forth) for random number generation is the Multiply-With-Carry methodology using a large number of generators. It is far more heavyweight than you will find in a standard library:
http://en.wikipedia.org/wiki/Multiply-with-carry
As far as integer division goes: yes, it's slow. But I didn't suggest integer division. You cannot get a result 0 <= x < 1.0 with integer division. I suggested floating point work there.
In any case, RAND_MAX is most often of the form (2^n-1) where n is the number of bits of state that the generator keeps.
This means that you could use fixed point methods and shifting, instead of a division operation. The methodology I gave was only an illustration of what needs to be done in order to use rand() safely. There is also the option of multiplying by a floating point constant instead of dividing (all constant divisions can be converted into constant multiplications.)
There are plenty of examples of proper usage in C on the internet, such as:
http://members.cox.net/srice1/random/crandom.html
No, I do not work for Microsoft. The fact is that the weakness of LCGs is common knowledge among many software developers. Like I said, you can refer to Knuth - The Art of Computer Programming. This is the bible for most serious developers. Other references on the subject will cite this reference.
For example, wikipedia:
http://en.wikipedia.org/wiki/Pseudorandom_number_generator
http://en.wikipedia.org/wiki/Linear_congruential_generator
Both cite Knuth; the second even mentions the problem you are experiencing.
As for Microsoft's documentation: it's probably bad. My experience with MSDN is that it's gotten worse every year since the late 90's. I would not be surprised if they only gave the minimum information necessary in order to generate a single random number.
Now, in your use case I do not think that the slowness of division is an issue. There is a much more alarming performance concern, and that is that you are branching down two different code paths randomly. The branch prediction logic in a processor simply cannot cope with random branching, and the worst case for the predictor would in fact be a 50/50 split. I suggest that performance is already severely degraded because of the pipeline dumps that random branching creates, and avoiding that would be first priority if performance is a concern.
An inner loop body something like:
int coin = (int) (rand() * (1.0 / (RAND_MAX + 1)));
number_of_ones += coin;
The properties of this are such that the multiplication is by a constant (1.0, RAND_MAX, and 1 are all constants, so they will be folded into a single coefficient when optimizations are turned on at compile time), but you do incur both an integer-to-float and a float-to-integer conversion (pretty fast on modern processors, actually).
No branching is required, because coin is either 0 or 1.
The only issue is that (RAND_MAX + 1) must be a power of two in order to give an absolute long-term balance of 0.5 heads and 0.5 tails, because of the way floating point approximates fractions whose denominators are not powers of two. After a bit of searching, I couldn’t find a single standard library implementation where RAND_MAX was not 2^n-1, so you should be safe regardless of where you port (the power of two makes sense, after all).

rw
April 11, 2010 10:38 am

(other) rw (12:50:45)

… demonstrates a profound misunderstanding of what climate models do. They are not meant to predict variation in ice extent over any given three year period

This may be so, but it then appears that many of the modellers are guilty of the same misunderstanding. How else to explain the claims of an ice-free Arctic in 2008 or by 2015, etc., as well as the UK Met Office’s persistent seasonal forecasts, which I believe are based on their models?

Joe
April 11, 2010 10:58 am

Whoops, that should have been:
int coin = (int) (rand() * (2.0 / (RAND_MAX + 1)));
number_of_ones += coin;
That will teach me to double-check things before posting.
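For completeness, here is the corrected branchless line dropped into a full, compilable loop (a sketch only; RAND_MAX + 1.0 is written as a double so the expression cannot overflow, and it assumes RAND_MAX+1 is a power of two as discussed above):
<pre>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#define FLIPS 10000
int main(void)
{
    long number_of_ones = 0;
    srand((unsigned) time(NULL));
    for (long i = 0; i < FLIPS; i++) {
        /* scale rand() into {0, 1} with one constant multiply; no branch */
        int coin = (int) (rand() * (2.0 / (RAND_MAX + 1.0)));
        number_of_ones += coin;
    }
    printf("heads: %ld of %d\n", number_of_ones, FLIPS);
    return 0;
}
</pre>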

DaveF
April 11, 2010 11:08 am

Arn Riewe 07:54:58:
Thanks for the offer of the services of Punxsutawney Phil, but if he’s been forecasting since 1887 he’s a very old groundhog by now, so he might not stand the journey! Anyway, we’ve got plenty of seaweed.

R. Gates
April 11, 2010 11:30 am

Wren said:
“And the 2000-2009 decade was warmer than the 1990-1999 decade despite that 1998 El Nino, wasn’t it? Warmer still will be the 2010-2019 decade.”
———-
Indeed it will be (unless we get a series of big Pinatubo-type volcanoes). Trends are what AGW models are all about, not specific events or exact timing. Ocean heat content already has many decades of warming “baked into the system”. Despite the protests of many on this site, the oceans have absorbed 80-90% of the heat from AGW, and this heat must eventually be released.
I actually think the Met Office was a bit bold in predicting that 2010 would be the warmest year on record; it would be more prudent to say one of the next three to five years. I don’t disagree that 2010 likely will be, but these kinds of specific predictions have less to do with climate science than with long-range weather prediction. Likewise, I think we’ll see a new record low summer Arctic sea ice sometime in the next 3 to 5 years. So it’s better just to say the next ten years will be warmer than the last ten and leave it at that…

Mike Haseler
April 11, 2010 2:12 pm

netdr: “The assertion is frequently made that it is easier to correctly model 100 years of climate than it is to model 10 years. The assertion is that errors would cancel out ! For this to be true each event must be totally random and independent of all previous events which is NOT TRUE OF CLIMATE EVENTS !
This is nonsense. A model is like an algebra test with 10 questions where the answer to #1 is the input to question #2 and the answer to question 2 is the input to question 3 and so on.”
Netdr, for me the main evidence that global warming is absolute twaddle is the frequency curve, which indicates a form of noise “with memory”. I don’t need to know why current events are being affected by historic events, or why today’s climate is not independent of last decade’s. Nor do I have to know what causes the climate to change; all I need to know is the normal signature of the climate, and I can tell you it has “memory” and that the signature is still perfectly consistent with normal variation in the climate.

April 11, 2010 2:24 pm

I have a confession to make here.
I think I may be responsible for this discussion about coin flipping.
“A New and effective climate model”
“Politicians cost lives (14:49:22) :
I fail to even see the point in this exercise. This so called model, if indeed it qualifies as such, is based on far too many unfounded suppositions.
The only thing we need be concerned with is whether or not CO2 causes the atmosphere to warm.
The answer of course is NO, it does not.
What else do we need to know?
Climate Models will never be able to predict the unpredictable. It is impossible to consistently predict the flip of a coin even with a million super computers, yet there are only two possible variables involved:
1. Which side the coin is on when you flip it.
2. How hard you flip the coin.
and only two possible outcomes.
Heads or tails!
It’s just climate change, for Christ’s sake. We’ve only had our entire evolutionary existence to get used to it.
As a species we should always be prepared for the onset of cooling or warming. That way we can stop wasting billions in taxpayers money on pseudo science and get on with more important things.”

If so, then I think that I should clarify what I meant with regard to super computers predicting coin tosses.
I was of course making the point that a million super computers would not be able to consistently predict the flip of a coin by a human hand.
I assumed that would be clear enough from the two variables, 1 and 2, which were meant to illustrate that even just two natural and unpredictable variables make such a system impossible to predict accurately, even for the world’s most powerful computers.
The point being that no computer can accurately predict the outcome of a coin toss if it doesn’t even know which way up the coin was beforehand, let alone exactly how hard the coin was actually flipped.
Any more than a computer can know, for example, that:
1. convective parameterisation is a pitiful representation of real convection
or that
2. in order to frame CO2 as a “greenhouse gas”, it was necessary to spend almost 150 years developing the fallacy that Oxygen and Nitrogen are “practically transparent to radiant heat”.

GaryPearse
April 11, 2010 4:18 pm

Walt’s coin toss analogy to climate may not be that bad. If the models are correct, then we would be faced with remediation costs for negatives and bonuses for positives (good crop yields, etc.). If the models are wrong and we set about to cool off the climate, we die (some of us would survive in prehistoric conditions in the tropics). Also, let us note that the coin toss model is infinitely superior to climate models. Going along with them is far less certain than a coin toss. What would be the safest bet with them? Bet on them to be wrong.

April 11, 2010 4:25 pm

I’ve said this before, but the output of GCMs, unless internally constrained in some way, will be a random walk.
I’ve done this with the output of solid-state accelerometers. There is a small amount of output noise which, when integrated (added up) to get velocity, is a random walk. The errors don’t average out to zero even when the thing is sitting on the bench. All inertial navigation systems display this behaviour, and after a certain time, when the error exceeds the user’s desired accuracy, they must be re-initialised with data derived from some absolute source.
Dr Meier has a severe misunderstanding of the problem if he compares this to multiple coin tosses. A better (but still dodgy) coin toss analogy is to assign a value of +1 to heads and -1 to tails and ADD each result to the total so far. Plot the result. Do it multiple times and see the differences in the graphs.
Anyone care to simulate this on a PC and post the results here?
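Something along these lines would do it (a quick sketch, not a validated experiment): accumulate +1 for heads and -1 for tails and print where several independent runs finish; the totals wander rather than settling back to zero.
<pre>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#define STEPS 10000
#define RUNS  5
int main(void)
{
    srand((unsigned) time(NULL));
    for (int run = 0; run < RUNS; run++) {
        long total = 0;
        for (long i = 0; i < STEPS; i++)
            total += (rand() & 1) ? 1 : -1;  /* heads = +1, tails = -1, added to the running sum */
        printf("run %d: position after %d flips = %ld\n", run + 1, STEPS, total);
    }
    return 0;
}
</pre>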

p.g.sharrow "PG"
April 11, 2010 5:21 pm

No matter how costly the computer: GIGO. Climatologists are bad guessers about how things work, so the projections are wrong. This is not flipping coins here, with a 50/50 chance of being right. If you are wrong at the start of a program, it just gets more wrong as it projects into the future. The earth self-corrects; look at the record. The models show runaway heating. Therefore the models are wrong, Wrong, WRONG!
Start over, build new models. And use over 60 years of information. Basing projections on 30 years, or less, of data is lazy and stupid. Use data that is real, and not adjusted to fit preconceptions. Then the computer projections may bear some resemblance to future reality.

Editor
April 11, 2010 5:24 pm

It occurred to me as I headed out for the day that my suggestion of doing “if (random_number > (RAND_MAX/2))” is really just checking the high-order bit and does not count as “if you had looked at all the bits.” However, as you found, using high-order bits is better than using low-order bits.
Don Knuth wrote (Copyright 1969; hey, I paid $18.50 for this book despite it having a tear in the cloth spine): “By far the most successful random number generators known today are special cases of the following scheme:
X[next] = (aX + c) mod m
This is called a linear congruential sequence.”
Assuming that rand() is an LCG, m is RAND_MAX+1. That seems to be a power of 2 on most systems, which is easy to compute.
“c” is almost always odd; otherwise, if “a” is even, we add a zero bit on each call, or if X is seeded with an even number, the low bit will always be zero. If “c” is odd, the last bit simply alternates between 0 and 1. Pretty useless, but better than converging on zero for all bits.
I forget what Knuth says about that in the 160 pages he wrote, and I won’t dig it out. Some systems I’ve seen shift out the low 8-16 bits. The introduction of real randomness, like Leif’s posts, is not mentioned at all. Hey, you were lucky to have a time-of-day clock. One of the bootstrap operations was to enter the date and time on most systems of that era.
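To illustrate the low-bit point (a toy example using the well-known textbook constants, not any particular vendor’s values): with m a power of two and both a and c odd, bit 0 of the state simply alternates.
<pre>
#include <stdio.h>
/* Toy LCG: X[next] = (a*X + c) mod m, with m = 2^31 */
static unsigned int lcg_state = 12345;
static unsigned int lcg_next(void)
{
    lcg_state = (1103515245u * lcg_state + 12345u) & 0x7fffffffu;
    return lcg_state;
}
int main(void)
{
    /* print the low bit of successive outputs: it alternates 0,1,0,1,... */
    for (int i = 0; i < 16; i++)
        printf("%u", lcg_next() & 1u);
    printf("\n");
    return 0;
}
</pre>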
——-
Steve, see http://home.comcast.net/~ewerme/wuwt/ for some notes on using <pre> or <code>. Oops, not there. I’ll add it. Look at http://wattsupwiththat.com/resources/ and search for 15:11.
You’re being done in by blank lines terminating the <pre>. Let me experiment with a couple of things:

Using <pre>, next line is blank.
line  with  text  and  two  spaces  between  words.
Using <pre>, next line has &nbsp;.
 
line  with  text  and  two  spaces  between  words.

Using <code>, next line has &nbsp;.
 
line with text and two spaces between words.

Editor
April 11, 2010 5:27 pm

Hmm, I guess I need the fourth case. Perhaps all you need is to use <pre> for code, and use <code> for, well, I don’t know what it’s good for.
Using <code>, next line is blank.
line with text and two spaces between words.

April 11, 2010 5:29 pm

Joe (10:36:23) :
Seems to me that a random number generator should be able to generate a decent distribution of even/odd numbers, which is what I am using for heads/tails.
gnu can, and there really isn’t a good excuse for Microsoft not to.

Steve Goddard
April 11, 2010 5:32 pm

I don’t buy the idea that somehow climate models magically correct themselves over time. That sounds like religion, not science.

richard verney
April 11, 2010 5:55 pm

Steven Goddard’s example does not demonstrate that it can be easier to model a complex system than a simple system. To model the system of 10,000 coin flips, he would have to tell us in which order each and every one of the coin flips comes up heads and tails. All he is doing is creating such a wide margin of error that the result almost certainly has to fall within the band of error. His example is little more than saying that there is a 50% chance that tomorrow will be at least as warm, if not warmer, than today. Create a wide enough error margin and you will always be able to predict results within that error margin.
If you want a modelling test to see how the system behaves, it is much easier to model the result of one flip. Forget landing on the edge; your computer model will get the right answer approximately 50% of the time. Now model the order of 10,000 flips and I bet you that the model will never get the running order correct.
Complex systems must intrinsically be more complex to model, especially when you do not know all the components within the system, still less how they behave and interact.
An example such as Steve Goddard’s does not assist the debate.

peter ashwood-smith
April 11, 2010 6:12 pm

Just a quick note about doing things like (rand() modulo 2), or modulo anything small for that matter, like the even/odd approach above…
Random number generators can have very short cycles in any given bit, or subset of bits, so it’s unwise to take a single bit to use as your ‘coin’ toss result. You will get better results by checking whether the value is above or below the midpoint of the range of the random number generator.
I discovered this a great many years ago as a student helping verify new adaptive compression algorithms (LZF). We needed random input to feed to the new super-duper compression algorithms my professor was designing (random sequences don’t compress, of course). So I used rand()%2 to generate random bit strings. Well, his compression tool promptly compressed the megabytes of what I thought were random bits into a neat 1024-length bit string and a replication factor. My prof thought he had screwed up the algorithm, but I ran an FFT on the output of rand()%2 and sure enough there were distinct cycles (peaks) in the different bit positions. If I remember correctly, each bit exhibited a cycle of around 1024, with differing phases.
Anyway, the point being that it’s the entire N-bit number that is pseudo-random with the claimed properties. Any given bit or subset of bits will not be so well behaved and may well have short cycles. It’s the combination of all those non-overlapping cycles in each bit that gives the desired behaviour to the full N-bit number.
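A small sketch of the distinction (illustrative code, assumed rather than taken from anyone’s program): derive the coin from a comparison against the middle of the generator’s range instead of from bit 0, and tally both for comparison.
<pre>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
int main(void)
{
    const long N = 100000;
    long heads_mid = 0, heads_bit0 = 0;
    srand((unsigned) time(NULL));
    for (long i = 0; i < N; i++) {
        int r = rand();
        heads_mid  += (r > RAND_MAX / 2);  /* coin from the top of the range */
        heads_bit0 += (r & 1);             /* coin from the lowest bit */
    }
    printf("midpoint: %ld heads, bit 0: %ld heads, out of %ld\n", heads_mid, heads_bit0, N);
    return 0;
}
</pre>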

April 11, 2010 6:39 pm

Ric Werme (17:24:05) :
The introduction of real randomness like Leif’s posts is not mentioned at all.
Knuth is a mathematician, so physical entropy items were beneath his radar back then. I still like my own definition of a [real] random number, namely a number I can compute, but that you cannot.

April 11, 2010 9:14 pm

Joe (10:36:23) :
The branch misprediction in the code is much less expensive than an integer divide. Both the taken and not-taken paths are in the first-level cache, so you are looking at a few cycles, 50% of the time, to flush the front end of the pipe.
Integer divide costs 15-80 clocks, 100% of the time.

kwik
April 11, 2010 11:02 pm

Steve, you are following a red herring, in my opinion.
We can all agree that it’s possible to produce a Gaussian distribution in a computer.
That isn’t the point.
The point is: a GCM is different. It’s not giving a Gaussian distribution of results.
So what is the objective here? Surely it must be to understand how the climate scientists are handling this?
What is their approach? What do they believe will work? After all, back in 2001, the IPCC concluded GCMs couldn’t predict climate. Does the IPCC believe they can now, in 2010? Why?

Steve Goddard
April 12, 2010 8:10 am

kwik (23:02:40) :
After reading your comment, I’m wondering if you read the article.

Crispin in Waterloo
April 12, 2010 11:59 am

R. Gates (14:48:12) :
“Except 2010 follows on the [heels] of the warmest decade on instrument record, and so is part of an ongoing trend. You see, I have no problem with the leveling of the growth in temperatures in 2003-2008 timeframe as I can accept that the solar minimum would do this…”
Depends on your definition of ‘instrument’. The problem with your point, R. Gates, is that 2003-2008 was not really a time of ‘solar minimum’. That is a 5-year stretch of a 13-year cycle. It reads as if there was a solar minimum in the middle of 2003-2008, so the temperatures being ‘high’ somehow must be explained by the CO2, not something solar-rooted.
I frequently find CO2 proponents mis-stating the main solar influence as TSI, though it is actually cloudiness. I find the meme, ‘It was hot 2003-2008 and the sun’s activity was at a minimum, so it can’t be solar, it must be CO2 forcing,’ mischievous. Surely it is well known by now that there is a delay of about 4 years between solar activity and atmospheric temperature? Cooling from solar inactivity should follow on for at least 4 years after the drop in the AA Index. Lo and behold, it does.
I find the reluctance of CO2 proponents to properly discuss all solar and galactic impacts on terrestrial climate as strange as the refusal by some solar proponents to discuss barycentric tidal influences on the sun, as if the sun lives in glorious isolation from its planets. It is patently obvious this is not the case, hence the far greater accuracy of the Farmer’s Almanac compared with the Met Office.
Next year’s main weather events are being calculated right now by the barycentric Almanac staff. Farmers don’t care about CO2 or an isolated sun; they care about the weather, so they take an inclusive view.
Remember 5 feet of snow in Washington this year? “Major US storm 18-22 Feb 2010”. Farmer’s Almanac, written in Feb 2009. Not bad work if you can do it.

Joe
April 12, 2010 12:22 pm

Steve,
I am sorry that you just don’t get it.
First, there is no integer division in either of the algorithms I gave. Really. None at all.
The first algorithm implies a *floating point* division, but that can be avoided when implementing it, while the second is actual code and multiplies by a constant expression (thus a single constant VALUE, with no divisions at all).
Secondly, you should *measure* performance instead of making claims.
Integer division vs flushing the pipeline: your thinking was valid a decade or two ago, but not any longer. It’s not so simple these days.
You are only looking at instruction latency, not instruction throughput. A pipeline flush destroys instruction throughput. Not only does it waste the previous efforts of speculative execution by the execution units, the execution units must then wait for the pipeline to fill back up before doing anything. That’s at least 14 cycles on a Core 2, 20 cycles on P4s, and more than 20 on P4Es.
To quote Agner Fog:
http://www.agner.org/optimize/
(from the Optimizing C++ document)
——-
“There is an important distinction between the latency and the throughput of an execution unit. For example, it may take three clock cycles to do a floating point addition on a modern CPU. But it is possible to start a new floating point addition every clock cycle. This means that if each addition depends on the result of the preceding addition then you will have only one addition every three clock cycles. But if all the additions are independent then you can have one addition every clock cycle.
The highest performance that can possibly be obtained in a computationally intensive program is achieved when none of the time-consumers mentioned in the above sections are dominating and there are no long dependency chains. In this case, the performance is limited by the throughput of the execution units rather than by the latency or by memory access”
——
(from the microarchitecture document)
“The Core 2 microarchitecture allegedly has a pipeline of only fourteen stages in order to reduce power consumption, speculative execution and branch misprediction penalty. However, my measurements indicate that the pipeline is approximately two stages longer in the Core2 than in PM. This estimate is based on the fact that the branch misprediction penalty is measured to at least 15, which is 2 clock cycles more than on PM.”
——
In short: if you are worried about latency, then you aren’t optimizing correctly. Instruction latency is an issue that can be made moot in practice by maximizing instruction throughput when optimizing. Pipeline flushes prevent ALL instruction throughput for many cycles, and EVEN waste previous cycles of work performed by the CPU.
I really don’t care what you believe. I’ve said my piece. If you are really concerned about performance, then you’ve got a lot to learn. You are focused on instruction latency when that’s just about the least important thing to be concerned about these days: instruction throughput and maintaining cache coherence are the #1 performance concerns on modern processors.
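In that spirit, a bare-bones measurement harness for the two tallying styles (a sketch only; the exact numbers will depend on compiler, optimization level, and the library’s rand(), and a good optimizer may well remove the branch by itself):
<pre>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#define N 100000000L
int main(void)
{
    long ones;
    clock_t t0, t1;
    /* branching tally: data-dependent if */
    srand(42);                               /* fixed seed so both runs do the same work */
    ones = 0;
    t0 = clock();
    for (long i = 0; i < N; i++)
        if (rand() & 1)
            ones++;
    t1 = clock();
    printf("branching:  %ld ones, %.2f s\n", ones, (double)(t1 - t0) / CLOCKS_PER_SEC);
    /* branchless tally: add the bit directly */
    srand(42);
    ones = 0;
    t0 = clock();
    for (long i = 0; i < N; i++)
        ones += rand() & 1;
    t1 = clock();
    printf("branchless: %ld ones, %.2f s\n", ones, (double)(t1 - t0) / CLOCKS_PER_SEC);
    return 0;
}
</pre>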

Jason
April 12, 2010 2:17 pm

Well, I don’t see much difference between the last two graphs.

Steve Goddard
April 12, 2010 6:36 pm

Jason,
One graph is going up, and the other graph is going down. Is that different?

Jason
April 13, 2010 8:39 am

Steve,
Well, yeah, but between 2000 and 2010 they are basically the same, and that’s the only stretch where they overlap. At least, so far. I don’t find it very useful to compare a trend obtained from past data to a trend that only takes place in the future in that model.

MikeF
April 13, 2010 10:15 am

I have to agree with Joe. He is absolutely correct, in every way.
The way he suggests using rand() is how you should be using it.
Multiplication by the reciprocal (whenever possible) is what you do in signal processing to avoid division. (Knowledge of this particular trick is how I distinguish an embedded DSP person from a wannabe during a job interview.)
Pentiums can do floating-point math very efficiently, so you save very little by avoiding it. Pipeline flushing would be the major reason for performance degradation here, by an order of magnitude at least.
If you’re worried about performance, you should look into random number generators that are much more efficient than Microsoft’s; you’d get much better bang for the buck that way. I needed a good random number generator that I could use with Matlab and with my own C program a few years back. I found one that was much better than rand() in performance and in quality of the pseudo-random sequence, available in source code. It has been a while, but I could dig it up if you’re interested.
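The reciprocal trick MikeF mentions, in its simplest form (a sketch; the divisor is just an example value): do the division once, outside the loop, and multiply inside it.
<pre>
#include <stdio.h>
int main(void)
{
    double data[5] = { 1.0, 2.0, 3.0, 4.0, 5.0 };
    const double divisor = 7.0;              /* example divisor */
    const double recip   = 1.0 / divisor;    /* one divide, hoisted out of the loop */
    for (int i = 0; i < 5; i++)
        data[i] *= recip;                    /* multiply replaces a per-element divide */
    for (int i = 0; i < 5; i++)
        printf("%f\n", data[i]);
    return 0;
}
</pre>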

Bob Koss
April 13, 2010 12:15 pm

Wow!
My mentioning the possibility of poor implementation of the MS Rand() function rather than Rand() being defective certainly generated a healthy exchange of views. Interesting stuff about instruction costs and cache flushing.
My take-away: MS rand() has not been demonstrated to be worse than the GNU one. It just operates differently than was assumed. Relying on the state of the low bit while using an unfamiliar generator is not wise.