[note, footnote links will only work if you go to original article ~ ctm]
The following is Patrick Frank’s controversial article challenging data and climate models on global warming. Patrick Frank is a Ph.D. chemist with more than 50 peer-reviewed articles. He has previously published in Skeptic on the noble savage myth, as well as in Theology and Science on the designer universe myth and in Free Inquiry, with Thomas H. Ray, on the science is philosophy myth.

A Climate of Belief
The claim that anthropogenic CO2 is responsible for the current warming of Earth climate is scientifically insupportable because climate models are unreliable
by Patrick Frank
“He who refuses to do arithmetic is doomed to talk nonsense.”
— John McCarthy1
“The latest scientific data confirm that the earth’s climate is rapidly changing. … The cause? A thickening layer of carbon dioxide pollution, mostly from power plants and automobiles, that traps heat in the atmosphere. … [A]verage U.S. temperatures could rise another 3 to 9 degrees by the end of the century … Sea levels will rise, [and h]eat waves will be more frequent and more intense. Droughts and wildfires will occur more often. Disease-carrying mosquitoes will expand their range. And species will be pushed to extinction.”
So says the Natural Resources Defense Council,2 with agreement by the Sierra Club,3 Greenpeace,4 National Geographic,5 the US National Academy of Sciences,6 and the US Congressional House leadership.7 Concurrent views are widespread,8 as a visit to the internet or any good bookstore will verify.
Since at least the 1995 Second Assessment Report, the UN Intergovernmental Panel on Climate Change (IPCC) has been making increasingly assured statements that human-produced carbon dioxide (CO2) is influencing the climate, and is the chief cause of the global warming trend in evidence since about 1900. The current level of atmospheric CO2 is about 390 parts per million by volume (ppmv), or 0.039% by volume of the atmosphere, and in 1900 was about 295 ppmv. If the 20th century trend continues unabated, by about 2050 atmospheric CO2 will have doubled to about 600 ppmv. This is the basis for the usual “doubled CO2” scenario.
Doubled CO2 is a benchmark for climate scientists in evaluating greenhouse warming. Earth receives about 342 watts per square meter (W/m2) of incoming solar energy, and all of this energy eventually finds its way back out into space. However, CO2 and other greenhouse gases, most notably water vapor, absorb some of the outgoing energy and warm the atmosphere. This is the greenhouse effect. Without it Earth’s average surface temperature would be a frigid -19°C (-2.2°F). With it, the surface warms to about +14°C (57°F) overall, making Earth habitable.9
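The numbers above are easy to check. A minimal back-of-envelope sketch, assuming the textbook albedo of roughly 0.30 (a value not given in the article), recovers the no-greenhouse temperature from the 342 W/m2 figure via the Stefan-Boltzmann law:

```python
# Minimal sketch: the no-greenhouse "effective" temperature implied by the
# article's 342 W/m^2 figure. The ~0.30 albedo is a standard textbook
# assumption, not a number taken from the article itself.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
S_AVG = 342.0      # incoming solar energy averaged over Earth, W/m^2
ALBEDO = 0.30      # assumed fraction of sunlight reflected back to space

absorbed = S_AVG * (1.0 - ALBEDO)      # ~239 W/m^2 actually absorbed
t_eff = (absorbed / SIGMA) ** 0.25     # radiative-balance temperature, K
print(f"Effective temperature: {t_eff:.0f} K = {t_eff - 273.15:.0f} C")
# -> about 255 K, i.e. roughly -18 C, close to the -19 C quoted above
```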
With more CO2, more outgoing radiant energy is absorbed, changing the thermal dynamics of the atmosphere. All the extra greenhouse gases that have entered the atmosphere since 1900, including CO2, equate to an extra 2.7 W/m2 of energy absorption by the atmosphere.10 This is the worrisome greenhouse effect.
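For readers wondering where numbers like 2.7 W/m2 come from, a common first approximation for the CO2-only portion is the logarithmic formula of Myhre et al. (1998), ΔF = 5.35 ln(C/C0) W/m2. The formula is from the wider literature, not from the article, whose 2.7 W/m2 figure covers all added greenhouse gases since 1900, not CO2 alone:

```python
import math

# Sketch of the standard logarithmic approximation for CO2 radiative
# forcing (Myhre et al., 1998): dF = 5.35 * ln(C / C0) W/m^2. The
# article's 2.7 W/m^2 includes all added greenhouse gases, so the
# CO2-only number below is expected to be smaller.

def co2_forcing(c_now_ppmv: float, c_then_ppmv: float) -> float:
    """Approximate radiative forcing (W/m^2) from a change in CO2."""
    return 5.35 * math.log(c_now_ppmv / c_then_ppmv)

print(f"295 -> 390 ppmv : {co2_forcing(390.0, 295.0):.2f} W/m^2")  # ~1.5
print(f"doubled CO2     : {co2_forcing(590.0, 295.0):.2f} W/m^2")  # ~3.7
```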
On February 2, 2007, the IPCC released the Working Group I (WGI) “Summary for Policymakers” (SPM) report on Earth climate,11 which is an executive summary of the science supporting the predictions quoted above. The full “Fourth Assessment Report” (4AR) came out in sections during 2007.

Figure 1 shows a black-and-white version of the “Special Report on Emission Scenarios” (SRES) Figure SPM-5 of the IPCC WGI, which projects the future of global average temperatures. These projections12 were made using General Circulation Models (GCMs). GCMs are computer programs that calculate the physical manifestations of climate, including how Earth systems such as the world oceans, the polar ice caps, and the atmosphere dynamically respond to various forcings. Forcings and feedbacks are the elements that inject or mediate energy flux in the climate system, and include sunlight, ocean currents, storms and clouds, the albedo (the reflectivity of Earth), and the greenhouse gases water vapor, CO2, methane, nitrous oxide, and chlorofluorocarbons.
In Figure 1, the B1 scenario assumes that atmospheric CO2 will level off at 600 ppmv, A1B assumes growth to 850 ppmv, and A2 reaches its maximum at a pessimistic 1250 ppmv. The “Year 2000” scenario optimistically reflects CO2 stabilized at 390 ppmv.
The original caption to Figure SPM-5 said, in part: “Solid lines are multi-model global averages of surface warming (relative to 1980–99) for the scenarios A2, A1B and B1, shown as continuations of the 20th century simulations. Shading denotes the plus/minus one standard deviation range of individual model annual averages.”
Well and good. We look at the projections and see that the error bars don’t make much difference. No matter what, global temperatures are predicted to increase significantly during the 21st century. A little cloud of despair impinges with the realization that there is no way at all that atmospheric CO2 will be stabilized at its present level. The Year 2000 scenario is there only for contrast. The science is in order here, and we can look forward to a 21st century of human-made climate warming, with all its attendant dangers. Are you feeling guilty yet?
But maybe things aren’t so cut-and-dried. In 2001, a paper published in the journal Climate Research13 candidly discussed uncertainties in the physics that informs the GCMs. This paper was very controversial and incited a debate.14 But for all that was controverted, the basic physical uncertainties were not disputed. It turns out that uncertainties in the energetic responses of Earth climate systems are more than 10 times larger than the entire energetic effect of increased CO2.15 If the uncertainty is larger than the effect, the effect itself becomes moot. If the effect itself is debatable, then what is the IPCC talking about? And from where comes the certainty of a large CO2 impact on climate?
With that in mind, look again at the IPCC Legend for Figure SPM-5. It reports that the “[s]hading denotes the plus/minus one standard deviation range of individual model annual averages.” The lines on the Figure represent averages of the annual GCM projected temperatures. The Legend is saying that 68% of the time (one standard deviation), the projections of the models will fall within the shaded regions. It’s not saying that the shaded regions display the physical reliability of the projections. The shaded regions aren’t telling us anything about the physical uncertainty of temperature predictions. They’re telling us about the numerical instability of climate models. The message of the Legend is that climate models won’t produce exactly the same trend twice. They’re just guaranteed to get within the shadings 68% of the time.16
This point is so important that it bears a simple illustration to make it very clear. Suppose I had a computer model of common arithmetic that said 2+2=5±0.1. Every time I ran the model, there was a 68% chance that the result of 2+2 would be within 0.1 unit of 5. My shaded region would be ±0.1 unit wide. If 40 research groups had 40 slightly different computer models of arithmetic that gave similar results, we could all congratulate ourselves on a consensus. Suppose that after much work, we improved our models so that they gave 2+2=5±0.01. We could then claim our models were 10 times better than before. But they’d all be exactly as wrong as before, too, because exact arithmetic proves that 2+2=4. This example illustrates the critical difference between precision and accuracy.
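The arithmetic analogy is easy to simulate. A toy version, with all values invented for illustration, shows an ensemble of biased “models” agreeing closely with one another while every run remains wrong by the same offset:

```python
import random

# Toy version of the 2+2 = 5 +/- 0.1 analogy: many runs of a biased
# "model" agree closely with each other (high precision) while every
# one of them misses the true answer by about 1 (low accuracy).

def biased_arithmetic_model() -> float:
    """A 'model' of 2+2 that is precise but systematically wrong."""
    return 5.0 + random.gauss(0.0, 0.1)   # biased mean 5, spread 0.1

random.seed(0)
runs = [biased_arithmetic_model() for _ in range(1000)]
mean = sum(runs) / len(runs)
spread = (sum((r - mean) ** 2 for r in runs) / len(runs)) ** 0.5

print(f"ensemble mean   : {mean:.3f}  (the true answer is 4)")
print(f"ensemble spread : {spread:.3f} (says nothing about the bias)")
```

Shrinking the 0.1 to 0.01 tightens the spread tenfold without moving the mean one bit closer to 4.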
In Figure 1, the shaded regions are about the calculational imprecision of the computer models. They are not about the physical accuracy of the projections. They don’t tell us anything about physical accuracy. But physical accuracy — reliability — is always what we’re looking for in a prediction about future real-world events. It’s on this point — the physical accuracy of General Circulation Models — that the rest of this article will dwell.
h/t dbstealey
What a beautiful and utterly complete destruction of the alleged usefulness of the “holy” GCMs of the IPCC and any predictions deduced from them. Great paper!
Such detailed and informed commentary, skewering IPCC climate-models that reduce to change = .1188C x TF/BF (Total Forcing / Base Forcing in degrees Centigrade), is no more than a confirmation of Edward Lorenz’s 1960s insight that gave rise to Chaos Theory: Complex dynamic systems such as Earth’s atmosphere are sensitively dependent on initial conditions (the “butterfly effect”), whereby factors indiscernible in principle produce non-random but wholly indeterminate outcomes on all potential scales.
“Chaotic” results in fact fall into patterns known as “strange attractors,” self-similar on every scale (what Benoit Mandelbrot called “fractal-geometric” figures). This bears on Newton’s classic “three-body problem,” introducing Lorenz’s “complexity” to any mutually interacting/feedback-mechanism of three or more components.
Mathematically and physically, “global climate models” (GCMs) are thus inherently worthless, treating meaningless “precision” (sic) as synonymous with accuracy per our author’s arithmetic analogy of 2 + 2 = 5 +/- 0.1: Whatever the spurious error-factor, the result is just plain wrong.
No researcher of integrity would endorse the Raj Pachauri/IPCC’s GIGO projections for one second. By definition, every climate hysteric’s GCM is an exercise in real-world futility, made “more sinister and perhaps more protracted by the lights of perverted Science” (Churchill).
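The sensitive dependence this comment describes is easy to demonstrate. A minimal sketch of the Lorenz-63 system, using Lorenz’s textbook parameters and a simple Euler step (choices assumed here for illustration, not taken from the comment or from any GCM):

```python
# Two trajectories of the Lorenz-63 system started a billionth apart.
# Parameters sigma=10, rho=28, beta=8/3 are Lorenz's classic values;
# the crude Euler integrator is for illustration only.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)   # perturbed by one part in a billion

for step in range(1, 3001):
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    if step % 1000 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"t = {step * 0.01:5.1f}  separation = {gap:.2e}")
# The separation grows by orders of magnitude: a one-part-per-billion
# difference in initial conditions swamps the trajectory well before t=30.
```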
I enjoyed the article. It seems like a good effort to calculate uncertainty in the climate models. However, as a lay person, I am not sure I can believe that the uncertainty is really so great as they say. If one extrapolates a few centuries, according to their methodology, the lower margin of error will get to absolute zero and beyond. Is it really appropriate to assume that this is a possible temperature for earth, just based on changes in cloud cover? Likewise, on the upper end, temperatures could get to boiling and beyond. Could changes in cloud cover really have that effect, even theoretically? I’m skeptical. However, I do think that at least this article is a good start.
So, no one knows what the future climate will be, but we had better tax everyone just in case it turns out worse than we think (and we want your money anyway). Oh boy! What a mess the whole world is in because of politicians and their cronies.
Oh dear. Does this mean Michael Shermer is on the verge of yet another flipping point?
SciAm will toss him out on his ear if he does, just like they threw my friend Forrest Mims under the bus for failing to properly recognize the mud-to-man evolutionary narrative as indisputable fact. But hey, the magazine would be an improvement without Shermer’s column in it. Fanatic Darwin worshippers like Shermer get so bloody tiresome. Don’t get me wrong: I’m reasonably certain evolution happened over the course of billions of years, but the cause, like the cause of global warming, is an unfalsifiable mystery.
@Charles
Sorry, but Patrick cannot remove more than 100% of the clouds. Other than that, it would have been a nice topic, since the SRES emission scenarios indeed play with different aerosol (human-caused cloud) scenarios but don’t include any uncertainties in effective climate sensitivities throughout the 21st century. Then, if we add natural variability, it’ll be blurry enough to be embarrassing.
+2°C more human warmth is considered bad – even if it increases agricultural production worldwide or even prevents the onset of a new ice age.
For readers who don’t read IPCC reports: “2010” means the period 1990–2009 is being compared to 1980–1999, resulting in +0.17°C of warming (HadCRUT3). The IPCC scenarios just show the trends of 20-year averages, not yearly averages as implied in the above article.
With a sceptical, not “denialist” smiley from the Philippines 🙂
It’s nice to see Skeptic mag picking up on real skepticism, even though it’s 20 years late. Most of the time Skeptic mag is “skeptical” in the same Orwellian way that Krokodil was “dissident”, or the same Orwellian way that Bill Maher is “politically incorrect.” These servants of the Establishment do a great job of ridiculing ideas and people currently considered heretical by the Establishment, but rarely poke holes in the current orthodoxy.
The debate has always been about the accuracy of the models in being able to predict temperature changes and climate feedback sensitivities.
We as a technological society are so deep into the “BELIEF” that all of our solutions are best handled by the technicians that we have almost disabled our hearts and higher reasoning.
Thankfully there is an awakening that we do have the ability to find out the truth for ourselves in a collective mass. This website is one of the most important examples of that collective power to amass a true consensus.
2+2 does not equal 5
Ruthlessly logical. The most rewarding paper I’ve read yet on “A”GW.
Peter Pearson says:
June 23, 2010 at 8:04 am
Ugh: I think he means 2+2=5 ± 1, not ±0.1.
===============================
Peter,
He is making the point that the model can never produce a correct answer, however much its precision improves.
Most large scale models have stochastic elements that simulate natural variability. This has nothing to do with “numerical instability” but rather is a way of taking into account natural variation. The model is then run many times to get a measure of the expected level of natural variation plus the overall trend. The standard deviation is a measure of natural variability. Dr. Frank is, of course, correct that any errors in the underlying formulation of the model are not included in the variability (standard deviation).
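A toy sketch of that last point, with every number invented for illustration: run a “model” consisting of a built-in trend plus random weather noise many times, and the ensemble standard deviation captures only the noise, never a wrong trend:

```python
import random

# All values are invented. A toy "model" with a mistaken built-in trend
# is run 200 times; the ensemble spread measures simulated natural
# variability, but no number of runs exposes the trend error itself.

TRUE_TREND = 0.010    # hypothetical real warming, deg C per year
MODEL_TREND = 0.025   # the toy model's mistaken built-in trend
NOISE_SD = 0.15       # year-to-year "weather" noise, deg C

def run_model(years: int = 100) -> float:
    """Accumulate the toy model's anomaly over `years`."""
    t = 0.0
    for _ in range(years):
        t += MODEL_TREND + random.gauss(0.0, NOISE_SD)
    return t

random.seed(1)
ensemble = [run_model() for _ in range(200)]
mean = sum(ensemble) / len(ensemble)
sd = (sum((e - mean) ** 2 for e in ensemble) / len(ensemble)) ** 0.5

print(f"ensemble mean after 100 yr : {mean:+.2f} C")   # ~ +2.5
print(f"ensemble std deviation     : {sd:.2f} C")
print(f"the 'true' trend would give: {TRUE_TREND * 100:+.2f} C")
```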
Latitude: “Everyone breathes, so therefore breathing causes auto accidents.” No, while correlation is not causation, you need to have variation in both variables to test for correlation. In your example, presumably, one would try to relate the average number of cups of coffee consumed with the frequency of auto accidents. In this example, one would probably control for alcohol drinking and age before trying to correlate coffee with accidents (perhaps using multiple regression or analysis of covariance). I understand that this is a silly example, but at least one should run the statistics in a way that makes sense.
Just look how poorly the GCMs model the 20th century. There is none of the sinusoidal pattern related to oceanic oscillations; all those models simply follow the increasing CO2. The models do not agree with the earlier part of the 20th century, and they do not agree with the present cooling. They are just extrapolating the 1978–2005 trend to the year 2100.
I want to see the models run backwards and mimic the CET record since 1659.
Peter Pearson says:
June 23, 2010 at 8:04 am
Ugh: I think he means 2+2=5 ± 1, not ±0.1.
____________________________________________________________________
No, he meant ±0.1. He did it intentionally to show the models do not “model” reality in any way, shape, or form.
Garbage in = garbage out
stevengoddard says:
June 23, 2010 at 7:35 am
Justice done in the end, deserved group winners, good luck in the next round.
Roald says:
June 23, 2010 at 6:50 am
“Perhaps it’s worth mentioning that some changes (e.g. sea level rise, Arctic melt) are running faster than predicted by the 2007 Assessment Report.”
Roald;
Arctic melt will not increase Sea Levels. Archimedes, remember?
The Antarctic is not melting due to CO2. Remember the post where scientists said that all the sea level rise is from that ice shelf in Antarctica, with a volcano underneath? Remember?
stevengoddard says:
June 23, 2010 at 7:35 am
That’s football. USA is there. And playing well and strong. Reviewed by a Brazilian. I think America can dream.
John Blake says:
June 23, 2010 at 8:53 am
Such detailed and informed commentary, skewering IPCC climate-models that reduce to change = .1188C x TF/BF (Total Forcing / Base Forcing in degrees Centigrade), is no more than a confirmation of Edward Lorenz’s 1960s insight that gave rise to Chaos Theory: Complex dynamic systems such as Earth’s atmosphere are sensitively dependent on initial conditions (the “butterfly effect”), whereby factors indiscernible in principle produce non-random but wholly indeterminate outcomes on all potential scales…..
_______________________________________________________________
John, can you explain your whole comment in simpler terms for those of us who are mathematically “challenged”? Perhaps an article for WUWT would be appropriate since this is a key point, but please consider your audience has only a junior high – high school education. This is too important to have people dismiss it because they do not understand what you are talking about.
CodeTech says:
June 23, 2010 at 8:21 am
From the article:
Earth’s climate is warming and no one knows exactly why.
—
Nah. But very likely the steady increase of a certain important greenhouse gas is to blame. At least, this increase cannot have no effect at all, which is what easy skeptics assume when they try alternative explanations: they forget that in such cases they need to prove there is ALSO a cooling mechanism at work that exactly compensates for the increase of carbon dioxide; only then would the alternative explanation for the warming be in the clear.
Even easier skeptics simply [SNIP – get with the program] the existence of greenhouse gases.
Check this out, people. It’s easy, elementary scientific thinking.
This is a brilliant article and well worth the read. For those of us who instinctively feel that AGW is a hoax, it is confirmation that we are being bamboozled by those with an agenda.
NOT ANY MORE
It is only a matter of an intelligent choice:
a) Warm your feet with a bottle filled with hot air.
b) Warm your feet with a bottle filled with hot water.
Bedwetters choose (a); that’s why they peed the bed.
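For the curious, a back-of-envelope check of the air-versus-water quip, using textbook densities and specific heats (values assumed here, not from the comment):

```python
# Heat released by a 2 L bottle of hot air vs hot water cooling by 30 C.
# Densities and specific heats are standard textbook values.

LITRES = 2.0    # bottle volume
DELTA_T = 30.0  # cooling from ~60 C down to ~30 C

air = {"rho": 1.2, "c": 1.005}       # density g/L, specific heat J/(g K)
water = {"rho": 1000.0, "c": 4.186}  # density g/L, specific heat J/(g K)

def stored_heat(substance: dict) -> float:
    """Heat (J) released as the bottle's contents cool by DELTA_T."""
    return LITRES * substance["rho"] * substance["c"] * DELTA_T

print(f"hot air   : {stored_heat(air):10.0f} J")     # ~72 J
print(f"hot water : {stored_heat(water):10.0f} J")   # ~250,000 J
# Water stores roughly 3,500 times more heat per bottle: choose (b).
```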
What we have to fear is another ice age
kwik says:
June 23, 2010 at 9:50 am
“Roald;
Arctic melt will not increase Sea Levels. Archimedes, remember?”
I’d appreciate it if you stopped putting words in my mouth. I know perfectly well that Arctic melt doesn’t contribute to sea level rise, but thermal expansion and glacier melt do.
TA says: June 23, 2010 at 8:57 am
“I am not sure I can believe that the uncertainty is really so great as they say. If one extrapolates a few centuries, according to their methodology, the lower margin of error will get to absolute zero and beyond”
Yes, that is the symptom of the problem. If a model allows the projection of the impossible at some point in the future, then at some point prior to that future the model has obviously failed to reflect reality.
A model of a cyclical chaotic system often fails when the ratio of actual to projection exceeds 400:3. Many people do not understand (or they choose to ignore) the cumulative error inherent in modeling a cyclical chaotic system.
Climate is a cyclical chaotic system.
The AO people over at NOAA are actually doing very well with their very, very small component of the overall chaotic system, projecting at ±30% seven days out and ±50% fourteen days out. I wonder how they might do one, two, or a hundred years out?
AO people over at NOAA:
http://www.cpc.noaa.gov/products/precip/CWlink/daily_ao_index/ao_index_ensm.shtml
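A sketch of the cumulative-error point, with the 10%-per-step figure purely illustrative (it is not taken from the comment or from NOAA): if per-step uncertainties compound rather than cancel, the envelope grows geometrically with lead time, which is why short-range skill says little about century-scale projections:

```python
# Illustrative only: a +/-10% relative uncertainty per projection step,
# compounding multiplicatively, balloons with lead time.

PER_STEP_UNCERTAINTY = 0.10   # assumed, purely for illustration

for steps in (1, 7, 14, 30, 100):
    envelope = (1.0 + PER_STEP_UNCERTAINTY) ** steps - 1.0
    print(f"{steps:3d} steps out: +/-{envelope * 100:10.0f}% envelope")
```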
Arctic sea ice melt, of course.
Accuracy in climates: To model a system you don’t understand, select from a universe of unknowns only those assumptions that fit with a desired result, and continue to tweak your model until you get your desired fit. Do not empirically test any of your assumptions, as this may cause model problems. Once you are sure you have sufficiently tweaked the model to achieve the desired result, run it multiple times. If the outputs of the multiple model runs are in the same ballpark, you have proved the validity of the input assumptions.
The most important step, however, is to prevent any discussion of the fact that you pulled all your assumptions out of your ass!