Note: two events at AGU13 this morning dovetail with this essay. The first was a slide from Dr. Judith Lean that said: “There are no operational forecasts of global climate change”.
The second was a tweet by Dr. Gavin Schmidt from Lenny Smith’s lecture (which I couldn’t attend, as I needed to file a radio news report from the AGU press room):
Smith: usefulness of climate models for mitigation 'as good as it gets', usefulness for adaptation? Not so much #AGU13
— Gavin Schmidt (@ClimateOfGavin) December 10, 2013
With those events in mind, this essay from Dr. William Gray (of hurricane forecasting fame) is prescient.
Guest essay by Dr. William M. Gray
My 60-year experience in meteorology has led me to develop a profound disrespect for the philosophy and science behind numerical climate modeling. The simulations that have been directed at determining the influence of a doubling of CO2 on Earth’s temperature have been made with flawed and oversimplified internal physical assumptions. These modeling scenarios have shown near uniformity in a CO2 doubling causing a warming of 2-5°C (4-9°F). There is no physical way, however, that an atmospheric doubling of the very small amount of background CO2 gas could ever bring about such large global temperature increases.
It is no surprise that the global temperature in recent decades has not been rising as the climate models have predicted. Reliable long-range climate modeling is not possible and may never be possible. It is in our nation’s best interest that this mode of prophecy be exposed for its inherent futility. Belief in these climate model predictions has had a profound deleterious influence on our country’s (and foreign) governmental policies on the environment and energy.
The still-strong—but false—belief that skillful long-range climate prediction is possible is thus a dangerous idea. The results of the climate models have helped foster the current political clamor for greatly reducing fossil fuel use even though electricity generation costs from wind and solar are typically three to five times higher than generation from fossil fuels. The excuse for this clamor for renewable energy is to a large extent the strongly expressed views of the five Intergovernmental Panel on Climate Change (IPCC) reports, which are based on the large (and unrealistic) catastrophic global warming projections from climate models.
The pervasive influence of these IPCC reports (from 1990 to 2013) derives from the near-universal lack of climate knowledge among the general population. Overly biased and sensational media reports have been able to brainwash a high percentage of the public. A very similar lack of sophisticated climate knowledge exists among our top government officials, environmentalists, and most of the world’s prestigious scientists. Holding a high government position or having excelled in a non-climate scientific specialty does not automatically confer a superior understanding of climate.
Lack of climate understanding, however, has not prevented our government leaders and others from using the public’s fear of detrimental climate change as a political or social tool to further some of their other desired goals. Climate modeling output lends an air of authority that is not warranted by the unrealistic model input physics and the overly simplified and inadequate numerical techniques. (Model grids cannot resolve cumulus convective elements, for example.) It is impossible for climate models to predict the globe’s future climate for at least three basic reasons.
One, decadal and century-scale deep-ocean circulation changes (likely related to long-time-scale ocean salinity variations), such as the global Meridional Overturning Circulation (MOC) and the Atlantic Thermohaline Circulation (THC), are very difficult to measure and are not yet well enough understood to be included realistically in the climate models. The global warming of ~0.6°C over the last century and a half appears to be a result of a general slowdown of the oceans’ MOC over this period. The multidecadal up-and-down swings in global mean temperature also appear to have been driven by the multidecadal MOC. Models do not yet incorporate this fundamental physical component.
Two, the very large climate-model overestimates of global warming are primarily a result of the assumed positive water-vapor feedback processes (about 2°C of extra global warming with a CO2 doubling in most models). Models assume that any upper-tropospheric warming brings about an upper-tropospheric water-vapor increase as well, because they assume atmospheric relative humidity (RH) remains quasi-constant. But measurements and theoretical considerations of deep cumulonimbus (Cb) convective clouds indicate that any increase of CO2, and its associated increase in global rainfall, would lead to a reduction of upper-tropospheric RH and a consequent enhancement (not curtailment) of Outgoing Longwave Radiation (OLR) to space.
The water-vapor feedback loop, in reality, is weakly negative, not strongly positive as nearly all the model CO2 doubling simulations indicate. The climate models are not able to resolve or correctly parameterize the fundamentally important climate forcing influences of the deep penetrating cumulonimbus (Cb) convection elements. This is a fundamental deficiency.
Three, the CO2 global warming question has so far been treated from a “radiation only” perspective. Disregarding water-vapor feedback changes, it has been assumed that a doubling of CO2 will cause a blockage of Outgoing Longwave Radiation (OLR) of 3.7 W/m². To compensate for this blockage without feedback, it has been assumed that global warming of about 1°C would be required for counterbalance. But global energy budget considerations indicate that only about half (0.5°C, not 1°C) of the 3.7 W/m² OLR blockage should be expected to be expended on temperature compensation. The other half of the compensation will come from the extra energy that must go into surface evaporation (~1.85 W/m²) to sustain the needed increase of the global hydrologic cycle by about 2 percent.
Earth experiences a unique climate because of its 70 percent water surface and its continuously functioning hydrologic cycle. The stronger the globe’s hydrologic cycle, the greater the globe’s cooling potential. All the global energy used for surface evaporation and tropospheric condensation warming is lost to space through OLR flux.
Thus, with zero water-vapor feedback we should expect a doubling of CO2 to cause no more than about 0.5°C (not 1°C) of global warming, with the rest of the compensation coming from enhanced surface evaporation, atmospheric condensation warming, and enhanced OLR to space. If there is a small negative water-vapor feedback of only -0.1 to -0.3°C (as I believe to be the case), then a doubling of CO2 should be expected to cause a global warming of no more than about 0.2-0.4°C. Such a small temperature change should be of little societal concern during the remainder of this century.
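[A quick arithmetic sketch of the split described in point three, using the essay’s stated numbers. The ~80 W/m² global-mean surface latent-heat flux is an assumed round figure for checking the “about 2 percent” claim; it is not from the essay.]

```python
# Sketch of the essay's energy-budget split (Gray's numbers as stated;
# the 80 W/m^2 mean latent-heat flux is an assumed round value).
olr_blockage = 3.7                  # W/m^2, assumed OLR reduction for a CO2 doubling
no_feedback_warming = 1.0           # deg C, the conventional no-feedback figure

# Gray's claim: half the blockage is balanced by warming,
# the other half by extra surface evaporation.
warming_flux = 0.5 * olr_blockage        # 1.85 W/m^2
evaporation_flux = 0.5 * olr_blockage    # 1.85 W/m^2
gray_warming = 0.5 * no_feedback_warming # 0.5 deg C

# Check against the stated ~2 percent hydrologic-cycle increase:
mean_latent_heat_flux = 80.0             # W/m^2, assumed global-mean value
cycle_increase_pct = 100 * evaporation_flux / mean_latent_heat_flux
print(round(cycle_increase_pct, 1))      # ~2.3, consistent with "about 2 percent"
```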
It is the height of foolishness for the United States or any foreign government to base any energy or environmental policy decisions on the results of long-range numerical climate model predictions, or of the recommendations emanating from the biased, politically driven reports of the IPCC.
###
William M. Gray, Ph.D. (gray@atmos.colostate.edu) is professor emeritus of atmospheric science at Colorado State University and head of the Tropical Meteorology Project at CSU’s Department of Atmospheric Sciences.

Thank you, Dr Gray – telling it like it is. This needs to find its way into the hands of the general public. Even reporters ought to understand this.
But they can cut off his e-mail account and/or office space, as an Australian university did to a big-name skeptic last year (Bob Carter?).
Reliable long-range forecasts aren’t reliable – I’ll say. Listen to any weather forecast: no one can get it right five days in advance. And thank you, Dr. Gray, for putting yourself out there. Those who attack you will be exactly the ones you expect – hopefully you’re ready for the inevitable reaction coming your way.
Smith: usefulness of climate models for mitigation ‘as good as it gets’, usefulness for adaptation? Not so much #AGU13
Global Panicker 1; Ohmigod! Ohmigod!
GP2; What!? What!?
GP1; These models say something is going to change!
GP2; What? What’s going to change?
GP1; The models don’t say! But it will be bad!
GP2; Well that’s it then! We have to do something to stop the thing that we don’t know what it is!
GP1; Exactly! Gimme all your money so I can get started!
GP2; Wait a second….
The funny thing is that most people do not understand exactly how LITTLE CO2 makes up the atmosphere! They read 400ppm (parts per million) and see 400, a big, big number. When you show them it reduced to a percentage, that’s 0.04%. “No, no, that’s not right, your math is wrong.” Sorry, it’s not “my” math, it is what it is.
O.K. Another illustration of 400ppm. Let’s assume a grain of rice is a molecule of CO2, and recall that a million is 1,000 x 1,000. Out of the first 1,000 grains, 400 are coloured Red, the other 600 are White. Now mix those 1,000 grains into a BIG bowl with another 999,000 White grains and imagine how many Red grains you will see. It’s those Red grains in “some” volume that “reflect” heat back to Earth and cause “Global Warming”.
I still get confused looks followed by, “But THEY say CO2 is bad!”.
Face palm.
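[The ppm-to-percent arithmetic in the comment above, spelled out as a trivial sketch:]

```python
# The arithmetic from the comment above: parts-per-million to percent.
ppm = 400
fraction = ppm / 1_000_000       # 400 parts in a million, as a fraction
percent = 100 * fraction         # 0.04% -- i.e. 4 molecules in every 10,000
print(round(percent, 4))
```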
Dr. Gray is the savior of my sanity. I became a skeptic of CAGW in the early 90s, but I thought I was all alone. The more papers on climate change I read, the more flaws and poor assumptions I found, but no one around me seemed to notice or care about the poor science. Then I heard Dr. Gray speak at a Hurricane Conference. He broke from his usual talk about the tropics and launched into a talk about Global Warming, cleverly equating climate modelers to barbarians and unabashedly predicting that we would see global cooling in the first half of the 21st Century. He highlighted all the flaws in AGW theory that I had noticed, then informed me of a bunch more. I looked around the conference hall and found many people nodding in agreement! I wasn’t alone and I wasn’t insane.
Over the years, Dr. Gray has taken a lot of flak for his open stance against the global warming industry. His important research in tropical climatology was defunded. He was called a crazy old man and a relic of the past, but he has never backed down. He found other ways to continue his tropical research while always taking the opportunity to speak plainly about the hijacking of the science he loves.
In some ways, he is a relic of the past; a past in which people spoke clearly and honestly and let the chips fall where they may; a past where ‘spin’ in atmospheric science was associated with vorticity and circulations, not with how conclusions were worded to ensure further funding; a past where the scientific method was followed and being a skeptic was a noble calling, not a term of derision; a past where a scientist could admit he was wrong and immediately embrace a new course based on new knowledge; a past where it was respectable to be an atmospheric scientist.
Dr. Gray’s detractors would love to relegate him to the past, but he obviously has every intention of confronting them in the present. I thank Dr. Gray for his unending contribution to the true science of climate, and I thank Anthony for highlighting Dr. Gray’s plain speech, knowledge and wisdom in this forum.
Wonderful command of language, and I suspect command of himself too. We need this strength of personality to speak truth unambiguously and loudly.
Aussiebear;
Now mix those 1,000 grains into a BIG bowl with other 999,000 grains and imagine how many Red grains you will see.
>>>>>>>>>>>>>>>>>>
Let’s modify the 400 red grains in a bowl of 1,000,000 white ones just a bit.
Assume instead a square glass jar that is 10 grains on a side. That’s 100 grains in a single layer. Now imagine the jar is 100 grains tall. 10,000 grains in all. Make 99,996 white and just 4 red, same ratio as your example above. Suppose the jar is about 10 cm tall. Now, instantly make all the white grains invisible. What would you see?
Well, you’d see a 10 cm tall jar that is mostly empty, with a fleck of red here and there. You could easily draw a vertical line from the bottom of the jar to the top without hitting any of those red flecks. In fact, you could draw a lot of them.
Now, stack those jars on top of each other in a tower 14 kilometers high. At 10 cm per jar, you’ll need a stack of 140,000 jars. Now try drawing a line from bottom to top without hitting a red grain. You can’t. In fact, you can’t do it without hitting thousands of red grains.
I’m a confirmed skeptic, but radiative physics is a bit more complex than simply drawing conclusions from concentration ratios.
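[The stacked-jar thought experiment can be sketched numerically. This assumes the 4 red grains per jar land in the jar’s 100 vertical grain-columns uniformly at random, and notes that 14 km of 10-cm jars is 140,000 jars:]

```python
# Sketch of the stacked-jar thought experiment, under a uniform-random
# placement assumption for the red grains.
jar_height_cm = 10
stack_height_cm = 14 * 100_000             # 14 km expressed in cm
n_jars = stack_height_cm // jar_height_cm  # 14 km / 10 cm = 140,000 jars

columns = 10 * 10                          # vertical grain-columns per jar
red_per_jar = 4                            # 4 red per 10,000 grains = 400 ppm

# Chance a fixed vertical line clears one jar without touching red:
p_clear_jar = (1 - 1 / columns) ** red_per_jar

# Expected red grains hit over the whole stack, and chance of hitting none:
expected_hits = n_jars * red_per_jar / columns
p_clear_stack = p_clear_jar ** n_jars

print(n_jars)          # 140000
print(expected_hits)   # 5600.0
print(p_clear_stack)   # underflows to 0.0: a clear line is essentially impossible
```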
Game-set-match!
All we need now is Bob Costas to give it as much coverage as he did for gun control on ESPN.
I sent it to many warmistas friends, sure to irritate many.
A great essay. Where is a reliable 100-hour weather forecast? Why should anybody trust a 100-year climate forecast? It might be possible, with much faster computers and a much better understanding of solar physics, volcano physics, atmospheric physics, oceanography, and biology, to name just a few – but surely we are not there yet.
Of course a 100-year forecast has a great attraction: There is a ZERO risk that someone would tell you next week “You fool, your predictions are all wrong, get out of here!”
Wonderful……but will not get on the BBC or into the Guardian and the Independent newspapers.
Hell will freeze over before that happens.
Thanks Dr. Gray, and in support of:
“But measurements and theoretical considerations of deep cumulonimbus (Cb) convective clouds indicate any increase of CO2 and its associated increase in global rainfall would lead to a reduction of upper tropospheric RH and a consequent enhancement (not curtailment) of Outgoing Longwave Radiation (OLR) to space.”
OLR to space has indeed increased over the past 62 years, opposite of climate model predictions
http://hockeyschtick.blogspot.com/2010/09/scientist-there-is-no-observational.html
“The climate models are not able to resolve or correctly parameterize the fundamentally important climate forcing influences of the deep penetrating cumulonimbus (Cb) convection elements. This is a fundamental deficiency.”
proof the models do not realistically simulate convection: http://hockeyschtick.blogspot.com/2013/09/new-paper-finds-ipcc-climate-models.html
Water-vapor feedback is negative:
http://hockeyschtick.blogspot.com/search?q=water+vapor+negative+feedback&max-results=100&by-date=true
Father of chaos theory explains why it is impossible to predict weather & climate beyond 3 weeks
http://hockeyschtick.blogspot.com/2013/10/father-of-chaos-theory-explains-why-it.html
“Reliable long-range climate modeling is not possible and may never be possible.”
Yep, that’s correct.
In engineering there are many things that cannot be reliably modeled and many things that can be reliably modeled.
For example, an Earth-orbiting satellite has many parts that are bolted, riveted, welded and glued (yes, glued) together. No one attempts to create a numerical model to tell if the parts will stay together when it’s launched. Launch vibration loads are quite extreme and exceed anything typically seen in airplanes or cars (except for crashes). The customers prefer that their satellite survive the launch (they are so darn picky). So instead of a numerical computer model, a physical qualification model, or Qual Model (QM), is built. This is done in a meticulous fashion where all the dimensions, assembly procedures, bolt torques, etc. are recorded – even to the extent of tracking individual lot numbers of materials (you want to use metal that is all poured from the same melt (pot) of aluminum, for example).
Then you take your QM and put it on a shaker table, a big metal slab with powerful hydraulic cylinders that can shake holy heck out of your model. You program the shaker to shake your model twice as hard as you expect it to be shaken when you launch it. If it survives, you then make a Flight Model, or FM using exactly the same documented processes. This method of using a model yields very high success rates in satellite launches.
In another aspect of modeling we need to make reliable predictions of how long a satellite will function once launched. Other than mission-determined things like how high it will fly and how much fuel (used to steer the unit) is on board, the largest determining factor is the rate at which the individual components degrade. Yes, the parts degrade and fail over time. Optical coatings “fog”, micrometeorites sever cabling, high-energy particles damage/destroy semiconductor junctions, etc. Ironically, the huge vibration forces experienced during launch have no effect once in space.
Here numerical calculations (a model in a sense) take the place of physical Qual Models.
Estimates of how often a micrometeorite may strike, how many high energy particles to expect (depends on the orbit of course) and how long an optical coating will last are gathered from prior mission histories and experiments. One long term Space Shuttle experiment “left” a whole bunch of material samples up in orbit for several years before retrieving it and bringing it back down to Earth for evaluation.
Knowing the expected degradation effects and the rate of occurrence of events beyond the engineers’ control, it is possible to design a satellite with a design life from a few years to a decade or more (Hubble). With this knowledge it is possible to design enough redundancy into the system to have high confidence that the system will last 5 years (for example) before failing. For example, if you know that a power cable (from the solar cells to a data processor) is likely to be hit and severed by a micrometeorite every 5 years, you can calculate that having 4 power cables in parallel should supply power for ten years with an expected reliability of 99.99% (a numerical example to illustrate the concept, not to be taken as exact values, please). Now you may launch a satellite and have two power cables cut in the first year and have the other 2 last until the unit is out of fuel. Or, you may launch a bird and have all the cables intact for years until the unit runs out of fuel. But your model predicts that the chances the unit will operate for 10 years are greater than 99.99%. Another model may say that a unit with only one power cable has only a 50% chance of operating for 10 years; perhaps that is acceptable for a 2-year mission.
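[The parallel-cable redundancy argument can be sketched with a minimal Poisson survival model. This is my own toy model, not the commenter’s actual reliability calculation; the 99.99% figure in the comment was explicitly illustrative, and under this simple model the numbers come out differently.]

```python
import math

# Toy Poisson model: each cable is severed by micrometeorite strikes at an
# average rate of once per 5 years, independently of the other cables.
def p_system_alive(n_cables, years, mean_years_between_hits=5.0):
    """Probability at least one of n independent parallel cables survives."""
    p_one_dead = 1 - math.exp(-years / mean_years_between_hits)
    return 1 - p_one_dead ** n_cables

print(round(p_system_alive(1, 10), 3))  # one cable over 10 years  -> 0.135
print(round(p_system_alive(4, 10), 3))  # four cables over 10 years -> 0.441
```

The design point is the ratio, not the absolute numbers: adding parallel cables multiplies the failure probabilities together, so redundancy buys reliability far faster than making any single cable tougher.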
Given the long history of successful satellite launches this modeling process works quite well. But space is a nasty place and satellites will still occasionally fail without much warning, this is why the ultra high reliability GPS system has entire “spare” satellites “up there” ready to be repositioned just in case that one in a million event occurs. Several examples of units that “lost their minds” and tumbled out of control exist, and unfortunately most satellites have antennas that must be pointed back to Earth so they can be commanded and controlled. Once they tumble you cannot “talk” to them to see what’s going on. This is generally believed to be caused by high energy particles that exceeded the capabilities of the unit. In fact, many satellites have areas of heavy shielding which can be turned towards the Sun to hopefully block particles from solar flares.
Models, physical and numerical, are indispensable tools in Science and Engineering, but they need to be carefully verified and applied judiciously. Reprogramming a whole society to turn away from the life-enhancing possibilities of “fossil” fuels based on climate “models” is madness and demonstrates extreme hubris on the part of those folks “pushing” these models. History will ultimately show they are in fact full of hooey. Climate models will turn out to be the biggest scientific folly of their era.
Cheers, Kevin.
Eric Simpson says:
“My take on CO2 is that there is no actual evidence that CO2 does… anything!”
I agree. All the evidence suggests that atmospheric CO2 concentration changes are a result of temperature changes, not a cause.
Dr. Gray’s essay gets off to a bad start in the first sentence of the 2nd paragraph, where it says that the general circulation models “predict.” That a model predicts implies the existence of a statistical population underlying the model. For the general circulation models there isn’t one. To state that the models “project” avoids this problem.
REPLY: You know, your semantic whining about predict -vs- project on just about every thread that uses either word is really becoming tiresome. Dr. Gray has his view, and people understand what he’s talking about. Give it a rest. – Anthony
The author makes an excellent point that about half the CO2 forcing is expended by latent heat of evaporation from the oceans to the upper troposphere where it radiates to space. This increased heat loss offsets surface warming. Furthermore more evaporation reduces the lapse rate – another negative feedback for CO2 GHE.
In some sense this happens every day in the tropics: as solar heating increases, sea surface temperatures are limited to < ~30°C.
I’m a confirmed skeptic, but radiative physics is a bit more complex than simply drawing conclusions from concentration ratios.
Thank you David, for saving me the trouble. This is the umptieth time this false analogy has showed up on WUWT and yes, it needs to get squashed every time. The relevant fact is that the atmosphere is measurably opaque in the LWIR bands associated with CO_2, water vapor, and ozone. It is seriously opaque when there are clouds.
rgb
Jer0me said:
December 10, 2013 at 4:47 pm
Mark and two Cats says:
December 10, 2013 at 3:55 pm
William M. Gray, Ph.D. … is professor emeritus of atmospheric science at Colorado State University
——————
Thanks to Dr Gray! I hope he doesn’t lose his job for daring to speak such heresy though.
A ‘professor emeritus’ is already retired. However, it is obvious that the dangers you mention are real, because it is generally only the retired who are willing to say these kinds of things that go against the orthodox party line.
—————————–
Yeah I truncated the section I was quoting too soon; he is also:
“head of the Tropical Meteorology Project at CSU’s Department of Atmospheric Sciences”.
Even were he completely retired, it would not be out of character for the warmunists to attack his reputation.
No matter his situation, I am grateful to him for taking a stand!
Temperatures always follow the energy, and the energy in the ocean is thousands of times greater than the energy in the atmosphere. Not surprising that minor changes in ocean circulation can have a large influence on air temps.
Dr. Gray’s three points have always been true. For the modelers I’ll add a fourth. The atmosphere has been changing due to biology for over a billion years; anaerobic bacteria have been around a lot longer. The result of a couple billion years of biological chemistry is that the composition of the atmosphere is entirely biological. If you don’t understand the global biogeochemical carbon cycle then you can’t possibly understand the atmosphere.
Biological properties can’t be understood by models written by physicists.
“FORECASTING CLIMATE AND WEATHER ARE FUNDAMENTALLY DIFFERENT”
While this is basically true… it is a straw-man argument. Forecasting models integrate initial inputs over time. Each integration gives ‘answers’ that are then used to calculate the next integration. Incomplete or inaccurate initial data, plus the inherent uncertainties in the integration process (no matter how small), eventually balloon, and the output trends toward useless. Even as we improve the observational input, it is estimated that the error inherent in the integration process alone will prohibit any real skill in numerical weather forecasting beyond 10 days. For longer-range forecasts, pattern recognition becomes the more reliable method.
Climate models do not integrate initial observations like the forecast models do, so they are fundamentally different. Instead, they integrate drivers and processes over time. Lord knows that we do not have a good handle on all of the drivers and processes in the atmosphere, but even if we did, we would still have the error of integration as we ran those processes into the future. Like the forecast model, the numerical climate model will trend towards useless. While they are fundamentally different in some ways, both the climate and forecast models contain the same error generation that makes them useless over time. Such is the nature of Calculus and any non-linear, chaotic system.
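[The error-ballooning described above can be illustrated with the logistic map, a standard toy chaotic system; it is not a weather or climate model, but it shows the same mechanism: a tiny initial error grows exponentially under repeated integration until the two runs are unrelated.]

```python
# Toy illustration of integration error growth, using the logistic map at r=4.
def trajectory(x0, steps, r=4.0):
    """Iterate x -> r*x*(1-x), returning the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.400000, 60)
b = trajectory(0.400001, 60)     # initial state off by one part in a million

early_error = abs(a[5] - b[5])   # a few steps in, still tiny
late_error = max(abs(p - q) for p, q in zip(a[40:], b[40:]))  # order one

print(early_error < 0.01, late_error > 0.1)   # True True
```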
Once again, pattern recognition becomes the more useful method for climate prediction. Warmists will argue that we cannot use pattern recognition because the current release of CO2 into the atmosphere is unprecedented. While that may be true, it is still no reason to ignore the evidence of climate patterns, or to assume that this unprecedented release of CO2 trumps the pattern. The natural climate pattern is still alive and well, as the lack of warming in the 21st Century has confirmed. The true impact of CO2 cannot be determined without an understanding of the natural pattern of climate change; and the natural pattern is something that the warmists have worked hard to deny.
Perhaps the readers of WUWT would be interested in helping Dr Gray’s Hurricane forecasting group out? They recently lost some private backing and I believe are still looking for a new funding source. Note that without funding, they say they may have to shut the project down. This would be a great loss.
Anthony:
You mischaracterize a “fallacious argument” as a “semantic issue.” Hopefully, your editorial policy does not favor fallacious arguments. Please clarify.
“The relevant fact is that the atmosphere is measurably opaque in the LWIR bands”
Yes, indeed it is, no doubt about that. And IF the surface was a narrow band emitter that ONLY gave off EM radiation in those bands the “Greenhouse Effect” would be a no-brainer, the energy would all be “trapped” and it would never leave and things would roast.
But, in reality the surface is a broadband emitter, and any individual photon has chances of being emitted at almost any wavelength outside of the “opaque” bands. And then it makes its way out to the mostly energy free void of space.
The much vaunted “Greenhouse Effect” merely delays the flow of energy through the Sun/Earth’s Surface/Atmosphere/Universe system by a few tens of milliseconds. It is like an optical integrating sphere, which demonstrates what a climate scientist would call “nearly” 100% radiative forcing. But the integrating sphere does not supply any “net energy gain” or “extra energy”; it simply acts as an optical delay line.
When a pulse of light is input to an integrating sphere you can observe this delay as a “pulse stretching” effect. When a steady state input (like the EM radiation output from the Sun) is applied you cannot (with the currently available tools) observe this delay.
Very high scientific standards displayed here; “it needs to get squashed every time”.
Yes indeed, let’s “squash” anyone that disagrees with us.
For your information there is a whole field of engineering known as “optical engineering” which applies all of the theories of “radiative physics” to solve real problems. AND NONE of my textbooks (going back 5 decades) acknowledge anything as silly as the “Greenhouse Effect”. But we only design things that actually work, not models of hypothesized “effects”.
Please explain again how a gas present in minuscule amounts in the atmosphere with a thermal capacity nearly equal to nothing is “forcing” the massive thermal capacity of the oceans into “thermal equilibrium” and increasing their temperature by 20 some degrees, but this effect cannot be detected beyond the natural temperatures of the complex natural system ?
Squash away good man, squash away…..
Cheers, Kevin.
Dr. Gray provided a paper with a little more detail for those interested.
http://typhoon.atmos.colostate.edu/Includes/Documents/Publications/gray2012.pdf
So much of it just makes logical sense and it jumps right out at you. Here’s another paper that gets into the MOC and some of the possible ramifications.
http://www.nature.com/scitable/knowledge/library/deep-atlantic-circulation-during-the-last-glacial-25858002
It seems pretty obvious to me the longer term warming and cooling periods are driven by the long term ocean circulation. The MWP, the LIA the current warm period are all likely forced by changes in the speed of the MOC. The bipolar seesaw makes perfect sense when you understand the MOC. Whether the MOC is behind the PDO/AMO cycles is another interesting question. The oceans are likely the reason it is difficult to see large scale climate changes. They smooth out all the short term forcings which makes it very difficult to determine the scales.
Forecast models are designed to help people in the short term.
Climate models are designed to control people in the long term.
Thank you, Dr. Gray, and all those others in this arena that are just trying to help people.