Guest essay by Michael R. Smith, C.C.M.
Forbes, “Absolute Return” column, April 21, 2008, page 246:
Here’s another name you should own, Freddie Mac ($29 per share)…Freddie is cheap at 1.1 times book [value].
Less than five months later, Freddie Mac’s stock was worth 25¢ per share, a loss of 99%. It has since recovered to 70¢ per share, so the loss is “only” 97.6%.
Forecasting the stock of a single company five months into the future seems easy. The company had government backing (it was a federally sponsored corporation). What could go wrong?
Yet, the forecast published by Forbes, short of an outright bankruptcy, could not have been more inaccurate. It is worth examining how a situation that seemed rock solid (government-backed securities!) became catastrophic to see if there are any lessons that might apply to the atmospheric sciences.
The assumption that Freddie Mac (and other financial stocks) was low risk was primarily a result of computer models. As one expert stated (writing under a pseudonym at http://blogs.zdnet.com/Murphy/?p=1265 ),
The problem is inherently complex – imagine being asked to value a portfolio of 10,000 residential mortgages issued to a total of something like 17,652 individuals. Each mortgage balances some issue amount against some payment stream; each has had zero or more payments recorded against it, each has an initial interest rate; an interest computation method; zero or more early payment opportunities; some mention of late or missed payment penalties and conditions, and an expiry, renegotiation, or call date.
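To make the quoted description concrete, here is a minimal sketch of the kind of record such a model has to value. It is purely illustrative: the field names and the payment formula are my own assumptions, not anything taken from the actual valuation systems.

```python
# Purely illustrative sketch of one mortgage record of the kind the quote
# describes.  Field names and the payment formula are assumptions made for
# illustration; they are not drawn from any real valuation system.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Mortgage:
    issue_amount: float                      # original principal
    annual_rate: float                       # initial interest rate
    term_months: int                         # length of the payment stream
    payments_recorded: List[float] = field(default_factory=list)  # zero or more payments so far
    late_penalty: float = 0.0                # late / missed payment penalty terms
    call_date_month: Optional[int] = None    # expiry, renegotiation, or call date

    def scheduled_payment(self) -> float:
        """Level monthly payment for a fully amortizing loan."""
        r = self.annual_rate / 12.0
        return self.issue_amount * r / (1.0 - (1.0 + r) ** -self.term_months)
```

Valuing a portfolio means projecting and discounting the cash flows of roughly 10,000 such records under uncertain interest, prepayment, and default behavior.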
While I do not doubt that is “complex,” the level of complexity is minuscule when compared to the complexity of the earth-atmosphere-ocean system and its interactions. Yet, faith in these model valuations led to a prediction that Freddie Mac stock was “cheap” when a meltdown of the financial system, largely due to the incorrect valuations and risk estimates produced by computer models, was less than 180 days away.
After the meltdown occurred, a second Forbes article stated, “All existing models for calculating risk, he [Nassim Taleb] says, should be thrown out because they underestimate extreme price swings. ‘The track record of economists in predicting events is monstrously bad,’ he says.” (February 2, 2009, p. 21) Of course, we learn this after our home values and the values of our 401(k)s are wrecked.
Given the failure of these models to predict the implosion six months hence, would you invest the remainder of your 401(k) based on what the same model predicts for the next six years or, if you are in your 20s, for the next sixty? I don’t know what your answer might be, but common sense would suggest applying these models’ forecasts to your portfolio only with extreme caution.
On June 1, 2009, we learned from The New York Times that “Models’ Projections for Flu Miss Mark by Wide Margin.” The model predicted, according to the Times, “by the end of May, there would only be 2,000 to 2,500 cases in the United States… On May 15, the Centers for Disease Control and Prevention estimated there were upwards of 100,000 cases in the country…”
Just six months earlier, the models’ predictive capability was touted because of real-time input from Google (www.cidrap.umn.edu/cidrap/content/influenza/panflu/news/nov1308google-jw.html ). Now, the flu has been declared a “pandemic” by the World Health Organization (www.pandemicflu.gov/ ) in spite of the modest number of cases the models projected for June 2009. Another critical short-term modeling failure.
Question: If the model predicts low risk for the next six months, would you decide to forego a flu shot? Again, your answer might be different, but common sense would dictate getting the shot.
How do these examples relate to climate modeling and policy?
We currently have climate models that have missed the fact that atmospheric temperatures peaked 11 years ago and that oceanic heat content has, at best, failed to increase. See: http://climatesci.org/2009/03/04/large-uncertainty-in-the-simulation-of-the-global-average-surface-temperature-by-the-ipcc-models-a-study-reported-on-the-weblog-the-blackboard/ , http://climatesci.org/2009/02/09/update-on-a-comparison-of-upper-ocean-heat-content-changes-with-the-giss-model-predictions/ , among many others.
Given the inadequate performance of these models over the last 5 to 10 years, why do we believe we can make accurate, highly specific forecasts 50 to 100 years in the future? Is it because we are so close to the problem we are blinded to the dangers like the economists who did not see the meltdown coming?
Almost no one familiar with meteorology or climate models would disagree that they are more complex than the mortgage valuation or influenza prediction models. The basic processes of the earth-ocean-atmosphere are incompletely understood and we barely understand many of their interactions.
We also know that forecasting the weather beyond five days is dicey at best. Then why are we making 29,000-day weather forecasts? Don’t think we are doing that? Consider the following:
“By the period 2080-2099, devastating heat waves of the kind that killed more than 700 people in Chicago in 1995 will occur three times per year.” (USCCP, p. 119, citation below)
That is a weather forecast – a forecast of specific meteorological conditions at a specific time and place. The document is filled with similar predictions, along with recommendations based on those predictions.
We are sometimes told that climate forecasts can be made because the “weather” errors will be cancelled out because they are “random.” Here is what was said about the mortgage computer models,
Now, because you can predict roughly the probable range for most of these assumptions but not the actual values the variables involved will have for each of the time periods you have to consider, what you do is write a monte carlo simulation in which you try tens of thousands of value combinations and plot the results to see what, on average expectations, the portfolio might be worth.
Notice, that at this point even something as large as 0.0005% error in the outcome would be completely insignificant – so randomization error should have no effect, right? (op. cit.)
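The quoted approach is easy to sketch. Below is a toy Monte Carlo valuation in Python; every range, rate, and cash-flow figure is invented for illustration, and real models are vastly more elaborate, but the shape of the calculation is the same: draw scenario values from assumed “probable ranges,” revalue the portfolio under each draw, and average.

```python
# Toy Monte Carlo valuation of a mortgage portfolio, in the spirit of the
# quote above.  Every number and distribution here is invented for
# illustration; real models are far more elaborate.
import random

def portfolio_value(face_value, annual_cash_flow, years,
                    default_rate, recovery, discount_rate):
    """Present value of the surviving cash flows plus recoveries on defaults."""
    value = 0.0
    surviving = 1.0
    for t in range(1, years + 1):
        defaults = surviving * default_rate
        surviving -= defaults
        cash = surviving * annual_cash_flow + defaults * face_value * recovery
        value += cash / (1.0 + discount_rate) ** t
    value += surviving * face_value / (1.0 + discount_rate) ** years
    return value

def monte_carlo_value(trials=10000):
    """Average portfolio value over randomly drawn scenarios."""
    results = []
    for _ in range(trials):
        default_rate = random.uniform(0.01, 0.05)   # assumed "probable range"
        discount_rate = random.uniform(0.04, 0.07)  # assumed "probable range"
        results.append(portfolio_value(
            face_value=100.0, annual_cash_flow=7.0, years=30,
            default_rate=default_rate, recovery=0.5,
            discount_rate=discount_rate))
    return sum(results) / len(results)

if __name__ == "__main__":
    print("Average scenario value:", round(monte_carlo_value(), 2))
```

The weakness lives in the assumed ranges and in the assumed independence of the scenarios: if those are wrong, averaging tens of thousands of draws simply averages the same error.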
It was believed by most that the mortgage instruments were safe because the errors (i.e., a higher default rate among subprime borrowers) would cancel out (because the risks were spread) and because, if desired, default insurance could be purchased from institutions like AIG. Of course, AIG used similar models to determine its risk. We just learned how well that worked.
In spite of these spectacular failures of less complex computer modeling in economics and public health, the atmospheric sciences seem to be making similar miscalculations. If your common sense would lead you to disregard these models’ forecasts when planning your portfolio and deciding whether to get a flu shot, I would suggest we adopt a much more modest approach to the use of climate models. While they are useful research tools, the numerous uncertainties (cloud feedback, particulates, volcanic ash, the current quiet sun, etc.) are so great we cannot claim to have forecast skill decades into the future.
Otherwise, when I read, during a period of falling temperatures and ocean heat content, “Global warming is unequivocal,”* I hear, “Freddie Mac is cheap.”
* U.S. Climate Change Program, Key Finding #1, January 2009, http://downloads.climatescience.gov/sap/usp/prd2/usp-prd-executive-summary.pdf
Michael R. Smith is CEO of WeatherData Services, Inc., An AccuWeather Company, and a Fellow of the American Meteorological Society. This weblog represents his personal opinion. AccuWeather’s Global Warming Blog can be accessed at: http://global-warming.accuweather.com/ .

Since this has popped up again and sent me back to my log files to dig out this reply from a prior thread, I’ll probably make a formal posting of it so I can just put a link here. But until then, this is my take on the “subprime mess”:
Per the ‘talking dirt’ about Economics being mostly mumbo jumbo: It isn’t. It does have soft edges as do most social sciences, but it has a numeric core.
I have an econ degree, so maybe I’m biased, but micro econ makes significant use of linear programming (algebra) and other sub fields use a modest amount of calculus. It’s not all supply and demand curves after you get past Econ 1 for non-majors…
On the housing bubble: The CDOs and CDSs were sold worldwide. It started in the US and was caused by the broken notion that we could make houses affordable to all. We then packaged up the mortgages in sausage casings and sold slices worldwide. When these became suspect, it impacted financial companies worldwide.
When our financial institutions went bust, the value of their stock, bond, and preferred stock issues went bust. All the financial institutions worldwide that had sucked up trillions of dollars of these other ‘assets’ suddenly had balance sheet issues. Fannie & Freddie preferred stock was among the most widely held by banks worldwide. This was the way our housing issue went global.
Also, the ‘mark to market’ rule was adopted across much of the world. This was one of the key feedback loops on the path to destruction. Every stock trader knows the wise phrase “The market can remain irrational longer than you can remain solvent” as a warning about the way bear markets can price things to ‘crazy low’ levels. Accountants are now learning this with your money.
Mark to Market forced banks to take massive paper losses because no one wanted to buy a paper asset in huge dollar numbers Right Now. There was about a 5% default risk, but the security was marked down 50% to 80%. This then made the bank ‘insolvent’. Just bogus.
They could hold the mortgages to maturity and get the 90+%, except MtoM said they couldn’t. They had plenty of money in hand and plenty of securities making cash flow, but the accountants said they were broke via paper losses.
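To put rough numbers on that argument, using only the figures already in the text (about a 5% default risk versus a 50% to 80% markdown) plus an assumed recovery rate on defaults:

```python
# Illustrative arithmetic only; the ~5% default figure and the 50-80%
# markdown come from the text above, the 50% recovery on defaulted
# principal is an assumption.
face = 100.0          # face value of a mortgage security
default_rate = 0.05   # roughly 5% of the loans default
recovery = 0.50       # assumed recovery on defaulted principal

hold_to_maturity = face * (1 - default_rate) + face * default_rate * recovery
mark_to_market = face * (1 - 0.65)   # marked down ~65%, mid-range of 50-80%

print(f"hold to maturity: {hold_to_maturity:.1f}")  # ~97.5 eventually collected
print(f"mark to market:   {mark_to_market:.1f}")    # 35.0 booked right now
```

The gap between those two numbers is the “paper loss” the rule forced onto the balance sheet.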
We institutionalized the irrationality of market panic pricing into determining the solvency of our banks. The EU is now reconsidering the lunacy of mandating a MtoM when no market exists…
The housing bubble started with the CRA of 1977, but was of manageable scale. The modifications of 1999, signed by Clinton, took it open loop. The ACORN et al. political push to get everyone a home even if unqualified sent it over the edge. Freddie and Fannie pushed massive money at the (mostly Democratic) congress for political favors, adding fuel to the heap. Then the Republicans decided to join in and let the Gramm–Leach–Bliley Act of 1999 repeal key parts of Glass–Steagall, setting the stage for the financial sausage making of CDOs and CDSs.
The Democrats lit the fire in the basement and started breaking up the furniture for kindling, then the Republicans ran around shutting off the fire sprinklers and smoke alarms. A pox on both their houses.
It took until now to blow up because so much foreign money was willing to fund the scheme. That caused the linkage to the rest of the world and dragged them into the crisis.
The unfortunate side effect of all that global money filtered through CDO’s into our housing market was that prices went bubble high. Simple economics. More money chasing a single product drives the price ever higher. This made housing unaffordable for all the poor folk who the Democrats wanted to help via their silly housing Ponzi scheme of mortgage buy downs.
The root cause was a (mostly Democratic) political movement that thought houses could be made cheaper by increasing the demand and buggering prudent mortgage standards. The (mostly Republican) addition of gasoline to the fire was the deregulation mantra that let banks, insurance companies, and stock brokers play in each other’s pond; BUT kept the banks with access to The Fed discount window and limited to 12:1 leverage, while it left brokers with no lender of last resort (no Fed window) and with functionally unlimited leverage.
Lehman or Bear Stearns with 40:1 leverage made more money, right up until the (inevitable) business cycle downturn came. Then they had no lender of last resort when various predatory short sellers started a run on the “bank” and set off the cascade failure of other institutions holding Bear and Lehman securities and CDSs.
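The leverage arithmetic behind that point is simple enough to show directly. The 12:1 and 40:1 ratios are the ones mentioned above; the 3% asset loss is an arbitrary illustrative figure.

```python
# At 12:1 leverage a bank holds roughly 8.3% equity against its assets; at
# 40:1 a broker holds 2.5%.  The same modest fall in asset values leaves the
# first standing and wipes out the second.
def equity_after_loss(leverage, asset_loss):
    equity = 1.0 / leverage        # equity as a fraction of assets
    return equity - asset_loss     # what remains after the writedown

for leverage in (12, 40):
    remaining = equity_after_loss(leverage, 0.03)   # a 3% fall in asset values
    status = "still solvent" if remaining > 0 else "insolvent"
    print(f"{leverage}:1 leverage, 3% loss -> {remaining:+.1%} equity ({status})")
```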
The Republican administration has ownership of the removal of the “uptick” rule that let the shorts conduct old-fashioned “bear raids” of a type not seen since the 1930s. The argument that the uptick rule would not work in a world with penny prices is bogus. When stocks were priced in 1/4 or 1/8 dollar increments it worked, so just make the ‘new uptick rule’ based on a 25-cent or 12-cent “uptick.” The SEC is either incredibly stupid on this point or wants the bear raids to be done. Take your pick. Malice or stupidity.
Finally, the CDS is a kind of life insurance on a financial asset. That they were forbidden to be regulated is a crime. It lets anyone, even a predatory short seller, write ‘life insurance’ in unlimited quantities on their target backed up by no assets, then short their target into insolvency. Wouldn’t you like to be able to take out life insurance on anyone you didn’t like and then go shoot them?
Now roll this all together and you get what we have now: a free-fire zone for hedge funds and shorts to take out life insurance on investment banks, then short them in classical bear raids until falling stock prices and rising CDS rates cause rating agencies to downgrade their debt; then the run on the bank begins and, with no lender of last resort, the implosion is guaranteed. This then sets up the failed stock and bond collateral of the next target. Repeat until all investment banks are gone. Golly, come to think of it, they are all either gone or converted to ‘regular’ banks now…
And we owe it all to the CRA as amended in 1999 and the ‘financial reform’ of 1999, seasoned with the SEC removing the uptick rule and letting shorts run wild.
From start to end this problem was caused by stupid government behaviors. Period.
Ah, found it. From:
http://www.bloomberg.com/apps/news?pid=newsarchive&sid=ajs7BqG4_X8I
Michael Milken, the junk bond king, created the first CDO in 1987 at now-defunct Drexel Burnham Lambert Inc., says Das, author of ‘Credit Derivatives: CDOs & Structured Credit Products’ (John Wiley & Sons Inc., 850 pages, $120). Until the mid-1990s, CDOs were little known in the global debt market, with issues valued at less than $25 billion a year, according to Morgan Stanley.
Drexel and other investment banks realized that by bundling high-yield bonds and loans and slicing them into different layers of credit risk, they could make more money than they could from holding or selling the individual assets.
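The “slicing into different layers of credit risk” described there is, at its core, a payment waterfall. Here is a toy version with made-up tranche sizes; real CDO structures are far more complicated, but the ordering principle is the same: the senior slice is paid first and the equity slice absorbs the first losses.

```python
# Toy illustration of a CDO-style payment waterfall with invented tranche
# sizes.  Cash collected from the pooled loans pays the senior tranche
# first, then mezzanine, then equity.
def waterfall(collected, tranches):
    """tranches: list of (name, amount_owed) in order of seniority."""
    payouts = []
    for name, owed in tranches:
        paid = min(collected, owed)
        payouts.append((name, paid))
        collected -= paid
    return payouts

tranches = [("senior", 70.0), ("mezzanine", 20.0), ("equity", 10.0)]
print(waterfall(100.0, tranches))  # everyone is paid when the pool performs
print(waterfall(75.0, tranches))   # losses hit equity and mezzanine first
```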
From the wiki: http://en.wikipedia.org/wiki/Michael_Milken
After he was sent to prison on finance-related charges, his detractors cited him as the epitome of Wall Street greed during the 1980s, and nicknamed him the Junk Bond King.
The wiki has an oddly defensive tone to it…
At any rate, we have a CDO device created by a (now) felon. We have a government that is shoving the banking industry kicking and screaming into making loans they KNOW are no good. It’s a match made in heaven.
In order to palm off the “junk” they turned to the trick invented by the “junk bond king” and packaged their sausage just like he did. Worked great until it didn’t…
Frankly, IMHO, the only place that computer models came into it was as a Financial Ponzi Scheme cover story. The FINANCE guys (not Economists) cooked up a computer model to show that the risk was dispersed, so was not relevant any more.
Other than the rocks Michael R. Smith tosses at economists (which I think ought to be tossed at finance peddlers instead), the article is generally good. But I might be biased… one Michael Smith to another 😎
Mark Young (04:33:31) :
Now, with climate modeling, we know that we have a chaotic system. We also know that there is order in chaos, but that, like markets, such systems tend to be self-correcting. Actually, more so, since climate isn’t affected by emotion or leverage. The problem lies in that, in addition to being chaotic, climate is vastly more complex and still poorly understood, at least in places.
Complexity and chaos are at the frontier of research in many disciplines, from physics to biology.
Unfortunately, contrary to expectations, the chaotic nature of climate is given only lip service in the models used by the IPCC. The GCMs themselves use numerical methods and boundary condition assumptions appropriate to solutions of well-behaved equations, in what is in the end a perturbative expansion method.
The lip service to chaos is given by the spaghetti graphs in the report plots: they vary the input/starting parameters so as to “simulate” chaos. It is not errors that those spaghetti graphs show, but the stability of the solutions. Considering the wrong premise used, that the solutions are expandable, it is no wonder that after a number of time steps the models diverge: the higher-order terms kick in.
This also happens with weather predictions, which use the parent models of the climate GCMs, and we see that they cannot be given for more than a week. Climate models, with larger time steps and broader assumptions, diverge after a few years.
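As a toy illustration of that divergence, here is the Lorenz (1963) system integrated for a small ensemble whose members differ only in the sixth decimal place of one initial value. It is emphatically not a climate model; it only shows how nearly identical starting states track each other for a while and then spread apart, which is the “spaghetti” behavior described above.

```python
# Toy illustration of sensitive dependence on initial conditions using the
# Lorenz (1963) system.  This is not a climate model; it only shows how an
# ensemble of nearly identical starting states stays together for a while
# and then diverges.
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

# Ensemble of states differing only in the 6th decimal place of x.
ensemble = [(1.0 + i * 1e-6, 1.0, 1.0) for i in range(5)]

for step in range(1, 5001):
    ensemble = [lorenz_step(*state) for state in ensemble]
    if step % 1000 == 0:
        xs = [s[0] for s in ensemble]
        spread = max(xs) - min(xs)
        print(f"step {step:5d}: spread in x = {spread:.6f}")
```

The spread stays tiny for the first stretch of the run and then grows to the size of the attractor itself, which is the point at which the higher-order terms kick in.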
That the modelers are so arrogant as to assert confidence in prediction isn’t surprising. That smart people believe that they can reliably predict climate, however, is.
Unfortunately, modeling, like video games, tends to be addictive, and addiction pays no heed to intelligence.
>>Australian scientist and campaigner Tim Flannery<<
Tim Flannery is not a scientist, he is a pseudo-scientist. He believes in “Gaia,” which is unscientific nonsense, and he also believes that “our intelligence is here for a purpose.” (I have seen him say that on a TV program that he made.) No person who believes that “our intelligence is here for a purpose” can properly be called a scientist.
acementhead (21:36:57) :
He’s a scientist, but one with nothing to do with climate science. He’s a paleontologist.