Climate Model Deception – See It for Yourself

From a workshop being held at the University of Northern Colorado today: "Teaching About Earth's Climate Using Data and Numerical Models" - click for more info

Guest post by Robert E. Levine, PhD

The two principal claims of climate alarmism are human attribution, which is the assertion that human-caused emissions of carbon dioxide are warming the planet significantly, and climate danger prediction (or projection), which is the assertion that this human-caused warming will reach dangerous levels. Both claims, which rest largely on the results of climate modeling, are deceptive. As shown below, the deception is obvious and requires little scientific knowledge to discern.

The currently authoritative source for these deceptive claims was produced under the direction of the UN-sponsored Intergovernmental Panel on Climate Change (IPCC) and is titled Climate Change 2007: The Physical Science Basis (PSB). Readers can pay an outrageous price for the 996-page bound book, or view and download it by chapter on the IPCC Web site at http://www.ipcc.ch/publications_and_data/ar4/wg1/en/contents.html

Alarming statements of attribution and prediction appear beginning on Page 1 in the widely quoted Summary for Policymakers (SPM).

Each statement is assigned a confidence level denoting the degree of confidence that the statement is correct. Heightened alarm is conveyed by using terms of trust, such as high confidence or very high confidence.

Building on an asserted confidence in climate model estimates, the PSB SPM goes on to project temperature increases under various assumed scenarios that it says will cause heat waves, dangerous melting of snow and ice, severe storms, rising sea levels, disruption of climate-moderating ocean currents, and other calamities. This alarmism, presented by the IPCC as a set of scientific conclusions, has been further amplified by others in general-audience books and films that dramatize and exaggerate the asserted climate threat derived from models.

For over two years, I have worked with other physicists in an effort to induce the American Physical Society (APS) to moderate its discussion-stifling Statement on Climate Change, and begin to facilitate normal scientific interchange on the physics of climate. In connection with this activity, I began investigating the scientific basis for the alarmist claims promulgated by the IPCC. I discovered that the detailed chapters of the IPCC document were filled with disclosures of climate model deficiencies totally at odds with the confident alarmism of the SPM. For example, here is a quote from Section 8.3, on Page 608 in Chapter 8:

“Consequently, for models to predict future climatic conditions reliably, they must simulate the current climatic state with some as yet unknown degree of fidelity.”

For readers inclined to accept the statistical reasoning of alarmist climatologists, here is a disquieting quote from Section 10.1, on Page 754 in Chapter 10:

“Since the ensemble is strictly an ‘ensemble of opportunity’, without sampling protocol, the spread of models does not necessarily span the full possible range of uncertainty, and a statistical interpretation of the model spread is therefore problematic.”
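To make that caveat concrete, here is a toy numerical sketch in Python. The sensitivity values are invented for illustration and are not taken from the PSB or any model archive; the point is only that quoting a mean and a standard error for an unstructured collection of models is questionable, because adding a near-duplicate model tightens the apparent uncertainty without adding any information about the real climate.

import statistics as stats

# Toy illustration only: the sensitivity values below are invented for the
# example and are not taken from the IPCC report or any model archive.
ensemble = [2.1, 2.7, 3.2, 3.4, 4.1, 4.4]   # hypothetical climate sensitivities (deg C)

def naive_summary(values):
    """Mean, spread, and the standard error one would quote IF the models
    were independent random draws from the space of plausible climates."""
    mean = stats.mean(values)
    spread = stats.stdev(values)
    stderr = spread / len(values) ** 0.5
    return round(mean, 2), round(spread, 2), round(stderr, 2)

print("original ensemble:", naive_summary(ensemble))

# Add two near-duplicates of an existing model; real models often share code,
# forcings and parameterizations, so this is not far-fetched.
padded = ensemble + [3.2, 3.3]
print("padded ensemble:  ", naive_summary(padded))
# The naive standard error shrinks, yet nothing new was learned about the real
# climate: the spread of an "ensemble of opportunity" is not a sampling
# distribution, which is exactly the caveat quoted above from Section 10.1.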

The full set of climate model deficiency statements is presented in the table below. Each statement appears in the referenced IPCC document at the indicated location. I selected these particular statements from the detailed chapters of the PSB because they show deficiencies in climate modeling, conflict with the confidently alarming statements of the SPM, and can easily be understood by those who lack expertise in climatology. No special scientific expertise of any kind is required to see the deception in treating climate models as trustworthy, presenting confident statements of climate alarm derived from models in the Summary, and leaving the disclosure of climate model deficiencies hidden away in the detailed chapters of the definitive work on climate change. Climategate gave us the phrase “Hide the decline.” For questionable and untrustworthy climate models, we may need another phrase. I suggest “Conceal the flaws.”

I gratefully acknowledge encouragement and a helpful suggestion given by Dr. S. Fred Singer.

Climate Model Deficiencies in IPCC AR4 PSB
Chapter 6, Section 6.5.1.3, p. 462: “Current spatial coverage, temporal resolution and age control of available Holocene proxy data limit the ability to determine if there were multi-decadal periods of global warmth comparable to the last half of the 20th century.”
Chapter 6, Section 6.7, p. 483: “Knowledge of climate variability over the last 1 to 2 kyr in the SH and tropics is severely limited by the lack of paleoclimatic records. In the NH, the situation is better, but there are important limitations due to a lack of tropical records and ocean records. Differing amplitudes and variability observed in available millennial-length NH temperature reconstructions, and the extent to which these differences relate to choice of proxy data and statistical calibration methods, need to be reconciled. Similarly, the understanding of how climatic extremes (i.e., in temperature and hydro-climatic variables) varied in the past is incomplete. Lastly, this assessment would be improved with extensive networks of proxy data that run up to the present day. This would help measure how the proxies responded to the rapid global warming observed in the last 20 years, and it would also improve the ability to investigate the extent to which other, non-temperature, environmental changes may have biased the climate response of proxies in recent decades.”
Chapter 8, Executive Summary, p. 591: “The possibility that metrics based on observations might be used to constrain model projections of climate change has been explored for the first time, through the analysis of ensembles of model simulations. Nevertheless, a proven set of model metrics that might be used to narrow the range of plausible climate projections has yet to be developed.”
Chapter 8, Executive Summary, p. 593: “Recent studies reaffirm that the spread of climate sensitivity estimates among models arises primarily from inter-model differences in cloud feedbacks. The shortwave impact of changes in boundary-layer clouds, and to a lesser extent mid-level clouds, constitutes the largest contributor to inter-model differences in global cloud feedbacks. The relatively poor simulation of these clouds in the present climate is a reason for some concern. The response to global warming of deep convective clouds is also a substantial source of uncertainty in projections since current models predict different responses of these clouds. Observationally based evaluation of cloud feedbacks indicates that climate models exhibit different strengths and weaknesses, and it is not yet possible to determine which estimates of the climate change cloud feedbacks are the most reliable.”
Chapter 8, Section 8.1.2.2, p. 594: “What does the accuracy of a climate model’s simulation of past or contemporary climate say about the accuracy of its projections of climate change? This question is just beginning to be addressed, exploiting the newly available ensembles of models.”
Chapter 8, Section 8.1.2.2, p. 595: “The above studies show promise that quantitative metrics for the likelihood of model projections may be developed, but because the development of robust metrics is still at an early stage, the model evaluations presented in this chapter are based primarily on experience and physical reasoning, as has been the norm in the past.”
Chapter 8, Section 8.3, p. 608: “Consequently, for models to predict future climatic conditions reliably, they must simulate the current climatic state with some as yet unknown degree of fidelity.”
Chapter 8, Section 8.6.3.2.3, p. 638: “Although the errors in the simulation of the different cloud types may eventually compensate and lead to a prediction of the mean CRF in agreement with observations (see Section 8.3), they cast doubts on the reliability of the model cloud feedbacks.”
Chapter 8, Section 8.6.3.2.3, p. 638: “Modelling assumptions controlling the cloud water phase (liquid, ice or mixed) are known to be critical for the prediction of climate sensitivity. However, the evaluation of these assumptions is just beginning (Doutriaux-Boucher and Quaas, 2004; Naud et al., 2006).”
Chapter 8, Section 8.6.4, p. 640: “A number of diagnostic tests have been proposed since the TAR (see Section 8.6.3), but few of them have been applied to a majority of the models currently in use. Moreover, it is not yet clear which tests are critical for constraining future projections. Consequently, a set of model metrics that might be used to narrow the range of plausible climate change feedbacks and climate sensitivity has yet to be developed.”
Chapter 9, Executive Summary, p. 665: “Difficulties remain in attributing temperature changes on smaller than continental scales and over time scales of less than 50 years. Attribution at these scales, with limited exceptions, has not yet been established.”
Chapter 10, Section 10.1, p. 754: “Since the ensemble is strictly an ‘ensemble of opportunity’, without sampling protocol, the spread of models does not necessarily span the full possible range of uncertainty, and a statistical interpretation of the model spread is therefore problematic.”
Chapter 10, Section 10.5.4.2, p. 805: “The AOGCMs featured in Section 10.5.2 are built by selecting components from a pool of alternative parameterizations, each based on a given set of physical assumptions and including a number of uncertain parameters.”

110 Comments
Pamela Gray
October 21, 2010 6:02 am

Since I believe most of our CO2 increase is being caused by human breath, I propose that those with the hottest air wear the CO2 scrubbers. That would be anyone who offered, penned, edited, or presented a single word in that book. Problem solved. Enjoy the return of cooler temperatures.

A C Osborn
October 21, 2010 6:26 am

At http://nofrakkingconsensus.wordpress.com/
Donna is now looking at the qualifications and experience of the Authors & Lead Authors that the IPCC used.
Quite an interesting approach too.

Chris B
October 21, 2010 6:38 am

Should have been the Summary for Aspiring Policymakers (SPAM).

Richard M
October 21, 2010 7:02 am

John Marshall says:
October 21, 2010 at 1:59 am
I have run a GCM on my computer and by changing the CO2 residence time from that stated by the IPCC, 100-200 years, to what current research has shown it to be, 5-10 years, the supposed critical temperature rise becomes a fall in temperature, everything else being equal. Just shows how you can confuse with rubbish inputs into models. Do they work with chaotic systems? I do not think that they do.

What do you get at 35-40 years? Have you tried doing a plot over values from 5-200 showing the trend? That would be very interesting.
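For readers who want a feel for what such a sweep looks like, here is a deliberately crude sketch in Python. It assumes nothing about John Marshall's actual GCM: it is only a one-box carbon model with made-up emission numbers, swept over a range of residence times, to show how strongly the excess-CO2 trajectory depends on the assumed residence time.

# Toy sketch only; this is NOT a GCM. A one-box carbon model,
#   d(excess)/dt = E - excess / tau,
# integrated with a simple Euler step, shows how the excess CO2 trajectory
# depends on the assumed residence time tau. The emission rate is illustrative.
def excess_co2(tau_years, years=100, emissions_ppm_per_yr=2.0, dt=0.1):
    """Excess CO2 (ppm above the pre-industrial level) after `years` of
    constant emissions, for a given residence time tau (years)."""
    excess = 0.0
    for _ in range(int(years / dt)):
        excess += dt * (emissions_ppm_per_yr - excess / tau_years)
    return excess

for tau in (5, 10, 35, 40, 100, 200):
    print(f"tau = {tau:>3} yr -> excess CO2 after 100 yr = {excess_co2(tau):6.1f} ppm")
# Short residence times cap the excess near emissions*tau and it stops growing;
# long residence times let it keep climbing, which is why the assumed value
# matters so much to any downstream temperature calculation.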

Dave Springer
October 21, 2010 7:07 am

richard telford says:
October 21, 2010 at 12:12 am

“Current spatial coverage, temporal resolution and age control of available Holocene proxy data limit the ability to determine if there were multi-decadal periods of global warmth comparable to the last half of the 20th century.”
This is not a criticism of the models but of the proxy-data.
I can think of much better places to conceal uncertainty than in the IPCC report.

It indirectly indicts the models. The first test any model must pass is an accurate reconstruction of past climate change. Sometimes this is called “training” the model. If it can’t reconstruct the past climate with reasonable accuracy the model or its assumptions or both are changed until it gets the past climate right. If the climate history the model is designed to replicate is wrong then the model, by design, is wrong too.

Dave Springer
October 21, 2010 7:18 am

Readers here should consider that the OP author, Dr. Levine, raised a red flag about scientific integrity to his fellow APS members almost two years ago here:
http://www.aps.org/units/fps/newsletters/200901/levine.cfm
Imagine how Levine must have reacted to the revelations surrounding climategate when almost a year earlier he’d alerted the APS members of his concern and at that time was already alarmed about the potential damage that CAGW advocacy could do to public trust in scientific integrity. In modern parlance it must have been an OMG moment to say the least.

Enneagram
October 21, 2010 7:30 am

The fact is that we do not have tools to deal with actual reality, the real universal laws, represented by analogical symbols transmitted by traditions from the distant past of humanity, now being long forgotten or rejected.
Thus we, self-conceited and self-deluded people, believe ourselves to be members of the most advanced and educated civilization in the history of the world, while not even knowing why the heck zillions of tons of water FLOAT above our heads, defying the most Holy Newton's Law! However, there is now a positive way of knowing how much total energy is released by the earth. You can calculate the actual energy emitted by the whole emission system of the Earth by using the Unified Field equation:
E= (Sin y + Cos y)(V/D)
http://www.scribd.com/doc/38598073/Unified-Field
Where Gravity/10= Sin Y= 0.981
Rest of the Field=Cos Y =-0.019, where it is added 1 (total field)- 0.981 = 0.019 x 10= 0.19 Nm (a positive emission field- 19% of the total field= 10 Nm)
V=Earth velocity around its axis in m/s
D=Earth Diameter in meters.
And, of course, the result is in Joules/second.
Now, you can have, also in consideration the Moon which “sucks” at perigee and emits at apogee:
Moon (a) at eccentricity=0,026
-2,24915291288904 Nm
Moon (b) at eccentricity=0,077
+9,40962149507112 Nm
http://www.scribd.com/doc/39678117/Planets-Moon-Field

artwest
October 21, 2010 7:31 am

More fascinating stuff about the IPCC reports here:
http://nofrakkingconsensus.wordpress.com/2010/10/19/lead-author-lacked-a-masters-degree/
http://nofrakkingconsensus.wordpress.com/2010/10/20/more-grad-student-expertise/
http://nofrakkingconsensus.wordpress.com/2010/10/21/meet-the-ipccs-youngest-lead-author/
Apparently having virtually no experience or qualifications qualified you amply for being a coordinating lead author, e.g. (last link above):
“The coordinating lead authors (usually there are two of them) are a chapter’s most senior personnel. (…) Klein was promoted to the most senior IPCC authorial role when he [was] just 28.
(…) he didn’t earn his PhD until six years after that – in 2003. So Klein served as an IPCC author four times while he was still a graduate student.
The fact that he was comically young didn’t disqualify him. The fact that he’d recently worked for Greenpeace didn’t disqualify him. While still in his twenties, while still years away from completing his doctorate, those in charge of the IPCC decided Klein was one of the world’s top experts.”

Geoff
October 21, 2010 7:44 am

Dear Professor Ryan,
I understand you don’t deal professionally with climate models, so you may be interested in the recent paper by a well known climate modeler on just this issue of ensembles. One representative quote – “averaging models leads to unwanted effects like smoothing of spatially heterogeneous patterns, so it is unclear whether an average across models is physically meaningful at all”.
You can read the full paper (fortunately open access) at http://www.springerlink.com/content/97132434001l7676/fulltext.pdf .
(The End of Model Democracy? by Reto Knutti)

AnonyMoose
October 21, 2010 7:50 am

The IPCC has repeatedly confirmed deficiencies in its science and methods. Just read any of the reports after the first one and notice how much improvement they claim over their previous report. Nobody revisits the decisions made based upon the flaws in the preceding report.

james
October 21, 2010 8:13 am

“How is one supposed to interpret “ensemble of opportunity”?
(English is not my mother tongue)”
my rough and ready translation would be “cherry picked”

james
October 21, 2010 8:16 am

Averaging models where everyone is competing to show the scariest outcome does not yield truth.

Steven Kopits
October 21, 2010 8:32 am

Professor Ryan:
There is a significant difference between financial forecasts and climate forecasts. In financial forecasts, the attitude is positive: no one has a stake in any given outcome. In a climate forecast, the attitude is often–and for modelers, presumed to be uniformly–normative. Those working on these models are presumed to believe that CO2 leads to warming and that man is the cause.
Thus, for example, an equity analyst at Barclays Capital has no stake in whether the shares of Exxon go up or down (except during an offering). By contrast, a climate change modeler has every incentive to show temperatures going up. It would be like asking a broad cross section of analysts working for the Democratic Party whether Republicans are good. I would venture a guess that, even if you dropped the outliers, the answer would be ‘no’–because the sample itself is not random. If the forecasters have a stake in the outcome, the ensemble approach is unlikely to produce reliable results.

bob
October 21, 2010 8:33 am

You list self-identified deficiencies in the AR4 and you say the IPCC is being deceptive???????
With no analysis of what problems you have with the list of deficiencies.
A rather incomplete post IMHO.
Oh, and climate models and weather models are very much beasts of a different taxonomy, if you will.
You know that there were climate models well before there were any decent weather models.

Mike Haseler
October 21, 2010 8:33 am

NS says: “Much of the IPCC’s caveats highlighted in this post are comments about the deficiencies of individual models which could be surmounted by a well constructed ensemble methodology and research design. In my view this is the direction of travel that climate research should follow.”
You sound like a person lost on a hill saying: “I think this way”, and strolling boldly over a cliff.
What climate “science” needs is some basic common sense: just as a walker needs a map and compass (and, even better, someone who can read them), climate “science” needs some basic information, like a reliable measure of local temperatures and someone who can understand what those temperatures mean.
And whether it is one person or a thousand people who haven’t got a clue where they are going because they never got the basic necessary information first, it really doesn’t matter if you average one or the entire “consensus”, the average of those who haven’t got a clue will always be: THEY HAVEN’T GOT A CLUE!

October 21, 2010 8:36 am

From what I have seen on the topic, and from what common sense tells me, no model can be programmed that can possibly account for all the factors that control and dictate climate. I think computer modeling is simply a high-tech guessing game and has no place in science.

October 21, 2010 8:38 am

[snip – off color humor ~mod]

Ken Hall
October 21, 2010 8:39 am

“ensemble of opportunity”
As stated above, it is backing every eventuality, then claiming that you were right all along. So long as they have at least one model that shows cooling in some phase (enough to be safely ignored) and at least one that shows no change in temps in some phase, they can continue the hype about runaway warming, and even if the earth cools, they can ignore that and still claim that they were right all along.

Mike Haseler
October 21, 2010 8:43 am

artwest says:
http://nofrakkingconsensus.wordpress.com/2010/10/21/meet-the-ipccs-youngest-lead-author/
“…The fact that he’d recently worked for Greenpeace didn’t disqualify him. While still in his twenties, while still years away from completing his doctorate, those in charge of the IPCC decided Klein was one of the worlds top experts. ”
That is completely outrageous. It’s like in the cold war, discovering some top Russian spy was working for the CIA – not covertly, but openly having been appointed apparently for no other reason than that they were a Russian spy.
What does that tell you about the top of the organisation that appointed them? … everything!

Jean Parisot
October 21, 2010 9:04 am

There is a top-to-bottom problem in the climate modeling and weather data with regard to the spatial error components: the gridding, the land/sea relationships, the clustering of data sources in inhabited areas, etc. It needs a professional look from several people.

artwest
October 21, 2010 9:55 am

Mike Haseler, it’s not just the conflict of interest, it’s also the lack of experience and qualifications of major contributors.
More examples:
“So, in the 15 years prior to earning her PhD, Kovats served once as a contributing author and twice as a lead author for the IPCC.
Which means governments around the world have been relying on the expertise of grad students when they make multi-billion-dollar climate change decisions.”
http://nofrakkingconsensus.wordpress.com/2010/10/20/more-grad-student-expertise/
and
“(…) a mere two years after Patz achieved his Masters, with no relevant publications whatsoever, he was one of nine people chosen to be a lead author of the IPCC’s health chapter. Remember, this is a report that is supposed to have been written by the world’s top experts.”
http://nofrakkingconsensus.wordpress.com/2010/09/09/the-new-graduate-who-served-as-ipcc-lead-author/
and of course experience as a climate activist was not seen as any hindrance to being an IPCC author:
http://nofrakkingconsensus.wordpress.com/2010/09/03/the-book-the-ipcc-plagiarized/
http://nofrakkingconsensus.wordpress.com/2010/08/25/ipcc-author-profile-alistair-woodward/

AJ
October 21, 2010 9:57 am

If we use Newton’s “Law” of Cooling, then the commitment is very small. This law uses the term e^(-rt) to model the unrealized change. So at t=0, 100% of the change is unrealized, and as time goes to infinity the change is fully realized.
You can estimate the “r” value by analyzing temperature lags to a cyclical forcing. For example, the hottest time of day is not high noon, but a few hours later. The approach is described here:
http://www.math.montana.edu/frankw/ccp/cases/newton/overview.htm
This method shows that with a high “r” value, the lag is essentially zero and the thermal potential is fully met. Conversely, as “r” approaches zero, the lag approaches 1/4 of the cycle period and very little of the thermal potential is met. I’ve modeled this approach in the following spreadsheet:
https://spreadsheets.google.com/ccc?key=0AiP3g3LokjjZdHcxcVhUb2NyQTdXQmpPcjZzVFpYUUE&hl=en
For my lag analysis I chose to look at mid-latitude Sea Surface Temperatures (SSTs). I chose this because at 45 degrees latitude the yearly cycle of daily solar irradiance follows a sinusoidal pattern. By contrast, at the equator there are two peaks at the equinoxes and in the polar regions there is complete darkness in the winter. My analysis shows a seasonal lag of ~75 days which gives -r=-1.80 on a yearly basis. So 99% of the forcing is realized in ln(.01)/-1.80 ~ 2.5 years. Here are my spreadsheets of this analysis:
https://spreadsheets.google.com/ccc?key=0AiP3g3LokjjZdDg3UDNPR21QRk9TUmVZZWlORU5uU3c&hl=en&authkey=CL-ow8wM
https://spreadsheets.google.com/ccc?key=0AiP3g3LokjjZdG8zdW5aZ2xTbWN5Zmd4ZWdMU0ZJM3c&hl=en&authkey=CM-G48cI
This analysis also allows us to ponder two hypothetical cases. If the tilt of the earth became zero degrees, then the temperature between the north and south hemispheres would be 99% equalized within 2.5 years. The second case is that at a 75-day lag, the model amplitude is about 25% of its potential. So if the earth’s tilt became stuck at the summer solstice, the resulting temperature amplitude would be 4 times the actual amplitude. So instead of increasing 3C above mean, the increase would be 12C and again would be 99% realized within 2.5 years. Neither one of these results seems unreasonable to me.
Newton’s “Law” works well with small scale experiments, but does it work at a global scale? As stated here, the IPCC doesn’t think so. To test their argument, however, I would present the Roe paper that shows zero lag between temperature and Milankovitch forcings:
http://motls.blogspot.com/2010/07/in-defense-of-milankovitch-by-gerard.html
The zero lag in these cycles agrees with Newton’s Law. It is given that r=1.80 on a yearly basis will translate into 1.80 * 26,000 over the earth’s precession (wobble) cycle. This high rate agrees with the observation that the lag is essentially zero. On the other hand, I’ll assume the IPCC models have a variable “r” value which diminishes over time. But shouldn’t this predict a measurable lag? Can their models match the observations? Maybe, but I’d be surprised.
Thanks, AJ
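A minimal sketch in Python of the lag arithmetic AJ describes, assuming the standard first-order (Newton's-law) response dT/dt = r(F - T) to a sinusoidal forcing. It reproduces AJ's headline numbers (r of about 1.8 per year, roughly a quarter of the potential amplitude, and about 2.5 years to realize 99% of a step change), but it is only an illustration, not AJ's spreadsheet.

import math

# Assumed model: first-order response dT/dt = r*(F(t) - T) to a sinusoidal
# forcing F(t). Its steady-state response lags the forcing by
# arctan(omega/r)/omega and is damped in amplitude by r/sqrt(r^2 + omega^2).
period_days = 365.25
omega = 2 * math.pi / period_days      # angular frequency of the yearly cycle (1/day)
lag_days = 75.0                        # AJ's observed mid-latitude SST lag

phase = omega * lag_days               # phase lag in radians
r_per_day = omega / math.tan(phase)    # invert tan(phase) = omega / r
r_per_year = r_per_day * period_days

amplitude_ratio = r_per_year / math.hypot(r_per_year, 2 * math.pi)
t99_years = math.log(0.01) / -r_per_year   # time to realize 99% of a step change

print(f"r is about {r_per_year:.2f} per year")                        # ~1.8
print(f"amplitude is about {amplitude_ratio:.0%} of its potential")   # ~28%, close to AJ's ~25%
print(f"99% of a step response is realized in about {t99_years:.1f} years")  # ~2.5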

james
October 21, 2010 10:58 am

“You list self identified deficiencies listed in the AR4 and you say the IPCC is being deceptive???????”
The gap between the main text and the summary allows us to “peer into the heart” of the authors. The summary allows one to pick and choose what to emphasize. The fact that the authors choose to emphasize certainty as opposed to uncertainty in the summary does seem deceptive and provides prima facie evidence for the direction of the IPCC’s bias imo.

Noblesse Oblige
October 21, 2010 11:15 am

As often happens with IPCC, you have to keep your eye on the pea. Let’s take the passage on the model ensemble approach (p 754) and go a little deeper. The model ensemble approach allows modelers and the IPCC to skirt the fact that different models contain different physics. These differences account for the wide variation between the calculated climate sensitivities. Normally in science, such differences are used in combination with observations to learn which assumptions, as represented by the choice of various parameters, do a better job of reproducing nature. Not so in climate modeling. In IPCC, these differences are thrown aside, and all models are treated equally. In effect, you can say that they are circling the wagons around all the models.

stumpy
October 21, 2010 11:30 am

Whilst this document is intended for hydrological / hydraulic modellers like myself, the IPCC climate modellers should also give it a read since they have no standards of their own:
http://www.ciwem.org/media/144832/WAPUG_User_Note_13.pdf
