Study: lack of cloud physics biased climate models high

The Hockey Schtick brings this to our attention. It seems Dr. Roy Spencer was prescient with his observation:

“The most obvious way for warming to be caused naturally is for small, natural fluctuations in the circulation patterns of the atmosphere and ocean to result in a 1% or 2% decrease in global cloud cover. Clouds are the Earth’s sunshade, and if cloud cover changes for any reason, you have global warming — or global cooling.”

This view of Earth’s horizon as the sun sets over the Pacific Ocean was taken by an Expedition 7 crew member onboard the International Space Station (ISS). Anvil tops of thunderclouds are also visible. The image is also part of the header at WUWT. (Photo credit: Wikipedia)

Readers might also recall that evidence has been found for Spencer’s 1-2% cloud fluctuation. Even the National Science Foundation recognizes the role of clouds is uncertain: NSF Releases Online, Multimedia Package Titled, “Clouds: The Wild Card of Climate Change”

WUWT readers may recall the recent paper by Suckling and Smith covered at WUWT: New paper: climate models short on ‘physics required for realistic simulation of the Earth system’

The Suckling and Smith paper concluded that the models they reviewed do not yet capture the physical processes of the dynamic and complex Earth. This paper by de Szoeke et al., published in the Journal of Climate, finds that climate models grossly underestimate the cooling of the Earth’s surface due to clouds, by approximately 50%.

According to the authors, “Coupled model intercomparison project (CMIP3) simulations of the climate of the 20th century show 40±20 W m−2 too little net cloud radiative cooling at the surface. Simulated clouds have correct radiative forcing when present, but models have ~50% too few clouds.”

Let that 40 watts per square meter sink in for a moment.

The 40 watts per square meter underestimate of cooling from clouds is more than 10 times the alleged forcing from a doubling of CO2 concentrations, which the IPCC puts at 3.7 watts per square meter (AR4 Section 2.3.1).

So the cloud error in models is an order of magnitude greater than the forcing effect of CO2 claimed by the IPCC. That’s no small potatoes. The de Szoeke et al. paper also speaks to what Willis Eschenbach has been saying about clouds in the tropics.
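For readers who want to check the arithmetic, here is a minimal sketch. The 5.35 ln(C/C0) expression is the standard simplified CO2 forcing formula behind the IPCC’s 3.7 W/m² figure; the 40 W/m² value is the surface cloud-cooling bias quoted from the abstract below; nothing else in the snippet comes from the paper itself.

```python
# Minimal sketch of the comparison made above (illustrative, not from the paper).
import math

co2_doubling_forcing = 5.35 * math.log(2.0)   # ~3.71 W/m^2 per doubling of CO2
cloud_cooling_bias = 40.0                     # W/m^2 (CMIP3 surface bias, from the abstract below)

print(f"CO2 doubling forcing: {co2_doubling_forcing:.2f} W/m^2")
print(f"Cloud cooling bias:   {cloud_cooling_bias:.1f} W/m^2")
print(f"Ratio:                {cloud_cooling_bias / co2_doubling_forcing:.1f}x")
# -> roughly an 11-fold difference, i.e. "more than 10 times"
```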

Here is the paper:

Observations of stratocumulus clouds and their effect on the eastern Pacific surface heat budget along 20°S

Simon P. de Szoeke, Sandra Yuter, David Mechem, Chris W. Fairall, Casey Burleyson, and Paquita Zuidema. Journal of Climate, 2012. doi: http://dx.doi.org/10.1175/JCLI-D-11-00618.1

Abstract:

Widespread stratocumulus clouds were observed on 9 transects from 7 research cruises to the southeastern tropical Pacific Ocean along 20°S, 75°-85°W in October-November 2001-2008. The nine transects sample a unique combination of synoptic and interannual variability affecting the clouds; their ensemble diagnoses longitude-vertical sections of the atmosphere, diurnal cycles of cloud properties and drizzle statistics, and the effect of stratocumulus clouds on surface radiation. Mean cloud fraction was 0.88 and 67% of 10-minute overhead cloud fraction observations were overcast. Clouds cleared in the afternoon (15 h local) to a minimum fraction of 0.7. Precipitation radar found strong drizzle with reflectivity above 40 dBZ.

Cloud base heights rise with longitude from 1.0 km at 75°W to 1.2 km at 85°W in the mean, but the slope varies from cruise to cruise. Cloud base-lifting condensation level (CB-LCL) displacement, a measure of decoupling, increases westward. At night CB-LCL is 0-200 m, and increases 400 m from dawn to 16 h local time, before collapsing in the evening.

Despite zonal gradients in boundary layer and cloud vertical structure, surface radiation and cloud radiative forcing are relatively uniform in longitude. When present, clouds reduce solar radiation by 160 W m−2 and radiate 70 W m−2 more downward longwave radiation than clear skies. Coupled model intercomparison project (CMIP3) simulations of the climate of the 20th century show 40±20 W m−2 too little net cloud radiative cooling at the surface. Simulated clouds have correct radiative forcing when present, but models have ~50% too few clouds.

===============================================================

Given this order of magnitude blunder on clouds, it seems like an opportune time to plug Dr. Spencer’s book where he pointed out the 1-2% cloud forcing issue. Click to review and/or buy at Amazon.

102 Comments
November 29, 2013 9:15 pm

Likening the Model T to climate models is lame; a better comparison would be if Henry Ford had insisted, against all evidence and prior to manufacture, that the Model T would fly, operate under water, and never be in an accident.
The models are fixed; this is the problem. A computer model does what the programmer instructs it to do, and any programmer who pretends otherwise might need a new career.
Current climate models predict the future climate about as well as scrying the future by studying the guts of small animals.

November 29, 2013 9:16 pm

Nick Stokes;
You have no idea how they are programmed. No such assumption is made.
>>>>>>>>>>>>
ROFLMAO
Go ahead Nick. Explain how models don’t attribute increased forcing, an increased ERL, and increased surface temps to increasing CO2. Go ahead. Explain.

Martin 457
November 29, 2013 9:17 pm

Real science should proceed. Political science should not be recognized as “Real science”.

Nick Stokes
November 29, 2013 9:26 pm

davidmhoffer says: November 29, 2013 at 9:16 pm
“Explain how models don’t attribute increased forcing, an increased ERL, and increased surface temps to increasing CO2.”

I thought you might have some idea of how they are programmed, but apparently not. Models don’t attribute forcing to anything. They are supplied forcing as data. In the case of CO2, it enters as a GHG into the radiative model. There’s no assumption in the programming about how it affects surface temps. There’s only the measured absorption spectrum, which goes back 150 years to Tyndall.
I think it’s time for loudmouth experts on GCM programming to point to the part of a GCM program where such an assumption is made.

November 29, 2013 9:53 pm

Nick Stokes;
In the case of CO2, it enters as a GHG into the radiative model. There’s no assumption in the programming about how it affects surface temps
>>>>>>>>>>>>>>>>>>
You’re playing with words. Technically what you said is correct. The assumptions made, however, about how the radiative properties of CO2 play out in the system as a whole result in increased surface temps. That this is arrived at via thousands of calculations attempting to model all the atmospheric processes, rather than by a direct calculation, changes nothing. Pick any set of SRES scenarios you wish, and any model you wish, and the higher the CO2, the higher the temps the model will calculate based on the assumptions built into the system as a whole. Saying otherwise is just silly.

dp
November 29, 2013 9:58 pm

I can believe Nick or I can believe the model results vs observed. Can’t believe both. Sorry, Nick – no room in the tent for faith-based warming.

wrecktafire
November 29, 2013 10:04 pm

AR4, Chapter 8 (Models and their evaluation) was very explicit: they acknowledged a poor understanding of clouds. It was at that moment that I suspected that all the stuff in the Summary for Policymakers was completely politically driven, unconnected to the science. This was confirmed by their further admission in Chapter 8 that the team did not have any idea how to evaluate a particular model for accuracy. My jaw hit the floor. I went back and re-read the SPM to see if there was any acknowledgement of the serious weakness of the models–there wasn’t any. That day, I lost all respect for the IPCC process.

sophocles
November 29, 2013 10:12 pm

For every futuristic view from the models, we can all chorus
“only if the cloud cover doesn’t change.”

ferdberple
November 29, 2013 10:22 pm

Nick Stokes says:
November 29, 2013 at 8:50 pm
You have no idea how they are programmed.
==========
they are programmed to predict what the model builders believe the future looks like. and when the models get the prediction wrong, the model builders change the model until the model gives the correct answer.
and how does the model builder know when the model has given the correct answer? when the model delivers the prediction the model builder believes to be correct for the future.
because if the models had predicted temperatures were going to flat-line in 2000, all the model builders would have seen this as an error, and would have adjusted the models until they showed rising temperatures.
that is why the models are hopeless at predicting the future. if the models actually showed us the future, unless that future was exactly what the model builders expected to see, they would assume the model was in error and needed to be fixed. and fix it they would, until it finally gave the “correct” answer.

November 29, 2013 10:23 pm

Nick Stokes;
I think it’s time for loudmouth experts on GCM programming to point to the part of a GCM program where such an assumption is made.
>>>>>>>>>>>>>>>
And I think it is time for people like YOU to stop issuing such challenges. If you can produce a single model that produces anything BUT higher temperatures in response to higher CO2, then your argument would be substantiated. Not to mention that the WUWT readership would be giddy. As for pointing out WHERE in the models this assumption is made, you know well that’s a fool’s errand, as the assumptions don’t exist in any one single place. In fact, tracing out all the pieces of code that wind up delivering an end result of higher temps for higher CO2 is near impossible for any single person to do. I think you know this very well. I reproduce below a comment from Dr Robert Brown from a few threads ago which lays out the challenge and exposes your little trick for what it is. But bottom line is that we don’t need to perform the analysis you demand to prove dbstealey’s assertion. All we need do is run the models with different levels of CO2 and see if any of them, ANY of them, produce ANYTHING other than higher temps for higher CO2. Here is Dr Brown’s comment:
>>>>>>>>>>>
GCMs without any question contain all sorts of information on pressure, density, and temperature of the air parcels they manipulate. If you are interested in seeing what goes on in at least one GCM, there is an open source one (CAM 3) with online documentation here:
http://www.cesm.ucar.edu/models/atm-cam/docs/description/
This is not one of the most detailed GCMs, but it is one where you can download the actual source and look at it, with program documentation in hand on the side. One can fault the design (one can ALWAYS fault a design:-) in a number of places, but the assertion that they don’t handle elementary parameters like pressure, density, and temperature is simply not correct. Indeed, to fault this one has to get pretty specific — point to a particular place in section 4 where they do something wrong. The top article basically suggests that there may well be something seriously wrong in CAM 3′s (and other GCMs’) treatments of aerosols, cloud formation, and vertical transport based on actual measurements. I suggest equally specifically that there may be something wrong with using a discrete latitude/longitude grid, especially with comparatively weak or missing adaptivity (CAM 3 actually has a tiny bit of adaptivity in it to handle polar regions better) for the specific reason that radiation rates on thermally averaged cells will be strict lower bounds — not upper bounds — on the true rate, so that CAM 3 and any other GCM that assigns a single temperature to a comparatively large horizontal area (in any given slab or layer) will underestimate the cooling via the unblocked channels and underestimate the rate of radiative energy transport between slabs in the blocked layers (basically, overestimating the “radiative insulation” properties of any given slab) because more uniform temperatures lead to more warming with exactly the same insolation at all scales.
This is the game, if one wants to criticize the GCMs. One can perfectly legitimately point out that they aren’t working without specifying why, as that is a posterior conclusion based on comparison of their predictions and the actual data, but if one wishes to assert that they aren’t working for a specific reason, to be responsible one has to look at the actual code and see if the specific reason you suggest is implemented in the actual code in a way that is (or more properly, may be for some evidence-supported reason) incorrect. So it isn’t that GCMs don’t include the effects of latent heat transport — they obviously do (see “shallow/middle troposphere moist convection” in the CAM 3 documentation, for example). It MIGHT be that they don’t include it CORRECTLY.
It is an open question as to whether or not they are leaving some physics out entirely that ends up being important. The galactic cosmic ray hypothesis, for example, has some empirical support but it is so far not a slam dunk or sufficiently compelling to warrant inclusion in a model on anything other than a trial basis. It would actually be interesting to include it ON a trial basis — one can always insert provisional physics into a model just to test the model and see if it does better with it or without it, or if it gives the model additional explanatory power. This is itself a form of weak evidence, if it does. In a highly multivariate model, however, it is probably WEAK evidence because model predictions, especially of single outputs, are very probably highly covariate in the physical parameters, so that one can turn up CO_2 sensitivity and turn up the effect of aerosols at the same time and maintain good agreement with GASTA across some training set, but end up with highly disparate long-term predictions as eventually CO_2 continues increasing but aerosols don’t.
I suspect that it is this alone that is largely responsible for much of the error in the GCMs relative to the present — they’ve systematically exaggerated CO_2 sensitivity and maintained agreement with data across the 50′s through the 90′s by asserting a larger effect to pollution and volcanic aerosols, but as we moved past the fit region and CO_2 continued up with aerosols not increasing to match (and with volcanism if anything a bit diminished) the highest sensitivity models have started to systematically diverge from the observed temperatures.
Is this indeed the explanation? Hard to say. There need not be ONE explanation. There is no doubt that GCMs contain both positive and negative forcing terms and achieve agreement by balancing them. There is little doubt that they assign quite different values to the effect of aerosols as there is no consensus value or model (and the top article shows how nonlinear any model must be to correctly account for all of the observable physics!) The sad truth is that while there is only one way for a program to be right there are countless ways for it to be wrong. Once a program has the complexity of something like CAM 3, not only are there countless ways for it to be wrong but they get to where no single human knows the entire code and few humans are willing to take even a major component of that code and monkey with it as you have to START by learning it all. It gets to be very “expensive” to make changes — one can spend most of a postdoctoral position just getting to understand what the existing code does and have little time to even THINK of making serious changes, retraining the code parameters, and then spending two years of CPU time running the program all over again to see if the changes don’t egregiously break the existing code and (perhaps) lead to some improvement.
I’ve downloaded and looked over the CAM 3 code. Sadly, however good or bad the code itself may be, the packaging of the code truly sucks. It would be a matter of weeks of work (for me) just to get it to BUILD, let alone build and run on some small test program, and I’m a pretty damn good programmer (although I do hate Fortran, sigh:-). It just isn’t worth it — I have no grant for working on climate (so it is by definition a “hobby”, not a profession), I’m not getting paid to do it, I am getting paid to do a lot of other stuff that is very time consuming and have lots of other hobbies/projects that languish for lack of work on my part.
Porting a rather large Fortran program to C, organizing it so that it will automagically build across a range of platforms in both parallel and serial versions, replacing the lat/long tessellation with a scalable icosahedral tessellation, determining the granulation error in radiative transfer rates as a function of scale and estimates of per-cell spatiotemporal noise, correcting the aerosol, cloud, and vertical transport component (or somehow parameterizing it so that one can experiment with different rates based on empirical evidence as it comes in), adding an “optional” (parametric) component for GCR-modulated cloud nucleation rates, transforming the initialization data from lat/long to the icosahedral grid, fixing the single-slab ocean model to account for oceanic transport and more, better, figuring out how to correctly include projections of solar state (out as far as such projections themselves have some reasonable chance to be right) — I could spend the rest of my life working on this WITH A TEAM and a million dollars a year in grant money. A bit much to tackle for free and if I want to have a life of some sort on the side.
rgb
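A quick way to see rgb’s point about thermally averaged grid cells: because emission scales as T^4 (a convex function), a cell that radiates at its single mean temperature always emits less than the average emission of the temperatures it lumps together (Jensen’s inequality). The numbers below are purely illustrative and are not taken from CAM 3 or any model output.

```python
# Illustrative only: four hypothetical sub-cell temperatures inside one grid cell.
import numpy as np

SIGMA = 5.670e-8                                  # Stefan-Boltzmann constant, W m^-2 K^-4
temps = np.array([275.0, 285.0, 295.0, 305.0])    # assumed sub-cell temperatures, K

mean_of_emission = SIGMA * np.mean(temps ** 4)    # average of the actual emissions
emission_of_mean = SIGMA * np.mean(temps) ** 4    # emission at the single cell-mean temperature

print(f"mean of sigma*T^4: {mean_of_emission:.1f} W/m^2")   # ~404.6
print(f"sigma*(mean T)^4:  {emission_of_mean:.1f} W/m^2")   # ~401.0
# The first is always >= the second, so a one-temperature cell underestimates
# outgoing radiation, which is rgb's "strict lower bound" argument above.
```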

AB
November 29, 2013 10:27 pm

ferdberple
November 29, 2013 10:30 pm

and how did the model builders know that the models were wrong to predict temperatures were going to flat-line in 2000, and therefore needed to be fixed? because CO2 levels were increasing and thus temperatures could not flat-line.
thus, the models were programmed to predict that rising CO2 levels would lead to rising temperatures, because that was the expectation of the model builders. formally this is known as the experimenter expectation effect.
unless experiments are designed very carefully, the experimenter always finds what they are looking for, because they search until they find the effect, and then stop searching. and by allowing the models to be adjustable, the act of adjustment is a form of search.
the model builders are searching for the correct parametric adjustment so that the models show exactly the future they believe they should show, which they then take as confirmation that the model is adjusted correctly.

ferdberple
November 29, 2013 11:16 pm

In a highly multivariate model…and the top article shows how nonlinear any model must be
=============
non-linear multivariate time series analysis as climate models are attempting to perform is beyond the ability of modern mathematics to solve by numerical methods, except in trivial cases. the non-linearity of the problem ensures that round off errors quickly overwhelm the solution, leading to a nonsense result.
the notion that model errors will “even out” over time, such that climate models will be more accurate the longer the forecast horizon, is a fundamental mathematical error. a misapplication of statistics. an example of why this does not work for forecasting the future is the inertial guidance system, as widely used before GPS.
An inertial guidance system predicts the future position of the vehicle based on sensor readings of various forcings and feedbacks. This prediction for the future is then fed into the vehicle steering controls such that the vehicle will maintain its future course. However, no matter how precise you make the system, it will drift.
The errors left and right of course, even though they are random and should by the law of large numbers even out to zero over time, in fact do not average out. The vehicle drifts off course, sometimes to the left, sometimes to the right, and the longer the vehicle stays on inertial guidance, the more likely it is to be off course.
This is completely contrary to the basic beliefs of climate modelling, that the model errors will cancel out over time, making long-term forecasts more accurate. Inertial guidance systems demonstrate that the problem is fundamental. you can reduce the error by making the equipment more precise, but no matter what you do, the error increases with time. the models will drift.
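The inertial-guidance analogy is easy to check with a toy random walk: zero-mean step errors do not shrink the accumulated error, whose root-mean-square size grows roughly as the square root of the number of steps. A minimal sketch, with made-up error sizes:

```python
# Toy random walk: unbiased per-step errors still produce drift that grows with time.
import numpy as np

rng = np.random.default_rng(0)
n_runs, n_steps = 2000, 1000
errors = rng.normal(0.0, 1.0, size=(n_runs, n_steps))   # zero-mean step errors
drift = errors.cumsum(axis=1)                            # accumulated course error

print(f"mean step error:            {errors.mean():+.4f} (essentially zero)")
print(f"RMS drift after 10 steps:   {np.sqrt(np.mean(drift[:, 9] ** 2)):.2f}")
print(f"RMS drift after 1000 steps: {np.sqrt(np.mean(drift[:, -1] ** 2)):.2f}")
# The per-step errors cancel on average, but the accumulated error does not:
# it keeps growing, which is the drift the commenter describes.
```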

GAT
November 29, 2013 11:44 pm

In local forecasts the mets talk about how relative humidity, cloud cover (or winter snow cover) and the sun will affect the next day’s local weather. I really don’t see how it’s such a revelation that it’s any different on a global scale. (Not once have I heard a local met discuss CO2 conditions as a factor.)

Steve Reddish
November 29, 2013 11:52 pm

Martin Hertzberg says:
November 29, 2013 at 1:20 pm
“it is hard to understand how either clouds or clear skies at a lower temperature than the surface or the atmosphere below can radiate anything downward..”
Perhaps you are thinking of the 2nd law of thermodynamics, which is often described by “heat always flows from warmer objects to colder objects”. This is actually a simplification. A slightly better description (still not the best) is “Net heat flow is always from warmer objects to colder objects and rate of net heat flow is directly proportional to the temperature difference.”
Matter always radiates heat at a rate corresponding to its temperature. If the first description of the 2nd law was valid, a red dwarf star near a hotter yellow G star would have to cease radiating on the side toward the G star, or the G star would have to turn away the energy being radiated in its direction by the red dwarf. In actuality, the red dwarf radiates energy in all directions, some of which strikes the G star. Simultaneously, the G star is radiating a far greater amount of energy in all directions, some of which strikes the red dwarf. Since energy from the G star striking the red dwarf exceeds the energy from the red dwarf striking the G star, net heat flow is from the G star to the red dwarf.
In the case of a cloud overhead, the cloud is radiating in all directions, at a rate determined by its temperature, even towards the warmer ground below. Simultaneously, the ground is radiating at a rate determined by its temperature. Net heat flow is still from ground to cloud. The ground temperature drops at a rate determined by the difference in rate of heat flow from ground to cloud versus the lesser cloud to ground rate.
Since clouds are usually warmer than clear air due to heat of condensation, and because water absorbs IR strongly (mostly coming from the ground below), the ground receives more radiated heat from a cloudy sky than from clear air. This is why the temperature drops quicker on a clear night than on a cloudy night, even though the clouds are colder than the ground. This is as close as we can come to a real world green house effect.
Consider also that clouds could not be keeping the ground warmer (than it would be in their absence) by reducing either convective heat loss or conductive heat loss by the ground.
SR
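Steve Reddish’s two-way-radiation, one-way-net-flow picture can be put into rough numbers with the Stefan-Boltzmann law. The temperatures below are illustrative, and both surfaces are treated as ideal black bodies, which is a deliberate simplification:

```python
# Rough sketch of net radiative exchange between the ground, a cloud base, and a clear sky.
SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W m^-2 K^-4

T_ground = 288.0        # ground at ~15 C (illustrative)
T_cloud = 275.0         # cloud base, colder than the ground (illustrative)
T_clear_sky = 230.0     # effective radiating temperature of a clear sky (illustrative)

up = SIGMA * T_ground ** 4               # ground radiates upward
down_cloud = SIGMA * T_cloud ** 4        # the colder cloud still radiates downward
down_clear = SIGMA * T_clear_sky ** 4    # a clear sky radiates much less downward

print(f"ground emission:          {up:.0f} W/m^2")
print(f"cloud back-radiation:     {down_cloud:.0f} W/m^2")
print(f"net loss under cloud:     {up - down_cloud:.0f} W/m^2")
print(f"net loss under clear sky: {up - down_clear:.0f} W/m^2")
# Net flow is ground -> sky in both cases, but the ground loses energy far faster
# under a clear sky, which is why clear nights cool more quickly than cloudy ones.
```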

Pippen Kool
November 30, 2013 12:25 am

dbsteely says: “Because models are not a hundred years old like a Model T; they are current, and extremely expensive, and they are still completely wrong!”
Stupid #1. Sorry. You, and many others here, are wrong. The models are getting better, each year, just like cars did, or spaceships did, or computers did. It’s the way that scienceny stuff works, and if you don’t like it, get over it. If you don’t understand the changes, either listen to those that do or go to school.
Stupid #2: “GCMs are always wrong for one simple reason: they are programmed with the assumption that CO2 causes rising temperatures, when in reality, it is ∆T that causes ∆CO2.”
Wow. You are living in your own little imaginary bubble land. Enjoy yourself and be happy.
!!! pop !!!

Jimbo
November 30, 2013 1:31 am

Nick Stokes’ world (and modeling career) is falling apart. At the end of the day it’s observations compared to “what if”, scenarios, story lines, projections, predictions that matter. So far 95% of the models are failing, and failing badly. If this carries on much longer the referee will have to blow the final whistle.
Nick, what will it take for you to re-assess AGW as stated by the IPCC AR5 on global surface temperatures? That is how science works, as you say.

Jimbo
November 30, 2013 1:43 am

Pippen Kool says:
November 30, 2013 at 12:25 am
………………..
Stupid #1. Sorry. You, and many others here, are wrong. The models are getting better, each year, just like cars did, or spaceships did, or computers did. It’s the way that scienceny stuff works, and if you don’t like it, get over it. If you don’t understand the changes, either listen to those that do or go to school.

They are getting ‘better’ because they are coming down to the sceptics’ point of view – lower climate sensitivity. And you are right, that is how science works: OBSERVATIONS. Thanks for coming over to the sceptic side.

What Are Climate Models Missing?
Bjorn Stevens, Sandrine Bony
Fifty years ago, Joseph Smagorinsky published a landmark paper (1) describing numerical experiments using the primitive equations (a set of fluid equations that describe global atmospheric flows). In so doing, he introduced what later became known as a General Circulation Model (GCM). GCMs have come to provide a compelling framework for coupling the atmospheric circulation to a great variety of processes. Although early GCMs could only consider a small subset of these processes, it was widely appreciated that a more comprehensive treatment was necessary to adequately represent the drivers of the circulation. But how comprehensive this treatment must be was unclear and, as Smagorinsky realized (2), could only be determined through numerical experimentation. These types of experiments have since shown that an adequate description of basic processes like cloud formation, moist convection, and mixing is what climate models miss most.
http://www.sciencemag.org/content/340/6136/1053.summary

Abstract
Between these conflicting tendencies, 12 projections show drier annual conditions by the 2060s and 13 show wetter.
These results are obtained from sixteen global general circulation models downscaled with different combinations of dynamical methods……
http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00766.1

Abstract – 3 June 2013
[1] In contrast to Arctic sea ice, average Antarctic sea ice area is not retreating but has slowly increased since satellite measurements began in 1979. While most climate models from the Coupled Model Intercomparison Project Phase 5 (CMIP5) archive simulate a decrease in Antarctic sea ice area over the recent past, whether these models can be dismissed as being wrong depends on more than just the sign of change compared to observations. We show that internal sea ice variability is large in the Antarctic region, and both the observed and modeled trends may represent natural variations along with external forcing. While several models show a negative trend, only a few of them actually show a trend that is significant compared to their internal variability on the time scales of available observational data. Furthermore, the ability of the models to simulate the mean state of sea ice is also important. The representations of Antarctic sea ice in CMIP5 models have not improved compared to CMIP3 and show an unrealistic spread in the mean state that may influence future sea ice behavior. Finally, Antarctic climate and sea ice area will be affected not only by ocean and air temperature changes but also by changes in the winds. The majority of the CMIP5 models simulate a shift that is too weak compared to observations. Thus, this study identifies several foci for consideration in evaluating and improving the modeling of climate and climate change in the Antarctic region.
http://onlinelibrary.wiley.com/doi/10.1002/jgrd.50443/abstract

Bryan
November 30, 2013 1:54 am

Steve Reddish says:
“Perhaps you are thinking of the 2nd law of thermodynamics, which is often described by “heat always flows from warmer objects to colder objects”. This is actually a simplification. A slightly better description (still not the best) is “Net heat flow is always from warmer objects to colder objects and rate of net heat flow is directly proportional to the temperature difference.”
Yes, ‘better’ would be to avoid calling radiation heat.
Electromagnetic radiation energy is correct.
Heat can be transformed into thermodynamic work, so cooler-to-hotter radiation is not heat.
Generalized statements about the second law sometimes require further clarification for particular circumstances.
Take today’s local weather
Air temperature about 4C
Ground temperature about -2C
Sun not ‘up’ yet.
The local ground temperature will continue to drop despite the higher air temperature.
The ground, being much more dense, will radiate a much higher continuous flux than the thin downward radiative flux of the warmer air.
The spectral flux of the air has many gaps, particularly around the atmospheric window at 10 µm.
The energy gain of the surface from conductive interaction with the air is insufficient to make much difference.
Net result is heat loss by land surface is greater than energy gain from atmosphere.
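Bryan’s example (ground at -2 C under air at +4 C, yet still cooling) can be sketched the same way if the clear air is given a low effective emissivity to stand in for its spectral gaps around the 10 µm window. The emissivity value is an assumption chosen only for illustration:

```python
# Sketch of Bryan's example: a colder ground can still lose energy to warmer air
# if the air's effective emissivity is low (its emission spectrum has gaps).
SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W m^-2 K^-4

T_ground = 271.0      # -2 C; ground treated as a near-black body
T_air = 277.0         # +4 C
eps_air = 0.70        # assumed effective emissivity of the clear air (illustrative)

up = SIGMA * T_ground ** 4
down = eps_air * SIGMA * T_air ** 4

print(f"ground emission:    {up:.0f} W/m^2")
print(f"air back-radiation: {down:.0f} W/m^2")
print(f"net surface loss:   {up - down:.0f} W/m^2")
# Although the air is warmer, its weak, gappy emission returns less than the dense
# ground radiates, so the surface keeps cooling until the sun comes up.
```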

Leon0112
November 30, 2013 3:51 am

Mosher seems to consistently argue that the “science is not settled” and scientists are working constantly to improve their models and understanding. At least that is my understanding of his comments.
If so, good on him.

old engineer
November 30, 2013 4:21 am

The fact that GCMs don’t “do clouds”, but that clouds are an important variable, has been known for a long time. I first became aware of this from a February 1993 article in “R&D Magazine” titled “Climate Researchers Look to the Clouds.” That was 20 years ago, and only 5 years after Hansen’s 1988 paper.
A couple of quotes from that article:
“Satellite measurements have found that the actual (as opposed to modeled) net effect of clouds in the present climate is to cool the planet. The Earth Radiation Budget Experiment recorded, in April 1985, both SW and LW effects of clouds.
That month an average of 342 W/m^2 shone on the earth in the form of SW radiation, also known as sunlight. Clouds reflected about 45 W/m^2 of this energy, while trapping 31 W/m^2 in the form of LW radiation.”
and:
“All of these facts have led Veerabhadran Ramanathan, of the Scripps Institution of Oceanography, La Jolla, CA, to suggest that cirrus anvils might ‘act like a thermostat’ over tropical oceans to arrest warming.”
I doubt that Willis had heard of Dr. Ramanathan when he proposed something similar.
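The sign of the net cloud effect falls straight out of the ERBE figures quoted above; a quick check using only those numbers:

```python
# Net cloud radiative effect implied by the quoted ERBE figures (April 1985).
sw_reflected = 45.0   # W/m^2 of incoming sunlight reflected away by clouds
lw_trapped = 31.0     # W/m^2 of outgoing longwave trapped by clouds

print(f"net cloud effect: {lw_trapped - sw_reflected:+.0f} W/m^2")   # -14 W/m^2, a net cooling
```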

November 30, 2013 5:08 am

There are fundamental philosophical problems with modelling, that need attention:
“Everything simple is false. Everything which is complex is unusable.” (Paul Valéry)
The diminishing returns of map making were discovered long ago:
“We now use the country itself, as its own map, and I assure you it does nearly as well.” (Lewis Carroll)
No genuine scientific theory can hang its hat on a model. Claiming that we will understand the science of something when we can model it is absurd. We might model it perfectly but not understand it. However, it is rational to assume that we will be able to model it when we understand it! This type of science (Modelling for the truth!) is politically motivated at the outset and intended to fail. The answer from the model will be “42”, then we will have to start all over, searching for the question 😉 The fundamental assumptions still have yet to be proved and no model, no matter how accurate, can reveal them. If you want an accurate process, look out the window!
Fun aside, just stop for a moment and really think what these models are attempting to do.
They seek to model global weather (Perhaps that should be climate ;-) in order to extrapolate the temperature in the future!
Big brahamic in-breath!
I’m a big believer in the power of computers but then that is just the point!
We pick CO2 to blame because it is the biggest economically and politically, but unfortunately it is a bit player in the atmosphere*.
Occam might just have cut his own throat, to end the irrationality.
* With qualification, modellers would say: “Sure, he is a tiny man but he is piggybacking a giant and he’s the brains of the relationship” (Think Master Blaster from Beyond Thunderdome: CO2 (a gas) riding on the shoulders of H2O (a solid, liquid, and gas that gets around in great evaporative machines and powerful precipitative heat exchangers called clouds! 😉

ferdberple
November 30, 2013 5:11 am

old engineer says:
November 30, 2013 at 4:21 am
That month an average of 342 W/m^2 shone on the earth in the form of SW radiation, also known as sunlight. Clouds reflected about 45 W/m^2 of this energy, while trapping 31 W/m^2 in the form of LW radiation.”
==================
which is the opposite of the 3x positive feedback assumed for increased water vapor. which is why the famous “hot spot” predicted by all climate models does not happen.
water in the atmosphere makes the day cooler and the night warmer. This is because water “blocks” radiation in both directions. It blocks energy from the sun reaching the earth, and blocks radiation from the surface reaching space. however the net effect is negative – not positive.
It is the negative feedback for water in the atmosphere that stabilizes the temperature of the earth, such that over the past 600 million years the range of average temps has remained at about 16C ± 6C, regardless of CO2 levels.

Mervyn
November 30, 2013 5:38 am

The work of Dr Henrik Svensmark has been invaluable. People should watch the documentary titled “Svensmark: The Cloud Mystery” (2008) about his groundbreaking research relating to the astonishing correlation between solar activity, galactic cosmic rays and cloud cover.

Bill Illis
November 30, 2013 5:49 am

0.7C of the 3.0C per doubling proposition is based on a reduction in clouds as a feedback from the initial warming produced by CO2.
But let’s say it is actually zero feedback instead. The 3.0C per doubling falls to 2.3C.
Or let’s say the sign of the feedback is actually opposite. Now the 3.0C falls to 1.6C.
Or let’s say the feedback is a large negative instead. Now the 3.0C falls below 1.0C.
Is getting clouds right important?
It’s the difference between little warming and large warming impacts, so it is obviously very important to get it right.
But the simple fact is that we do not know which one of the above scenarios is right. We have no clue. The observational evidence is relatively scant and variously provides some evidence supporting all of the scenarios.
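Bill Illis’s scenarios are simple additive arithmetic on the 0.7 C cloud-feedback share of the 3.0 C figure. Spelling the same arithmetic out (the scenarios are the commenter’s own, and the “large negative” step size below is just one illustrative way to land below 1 C):

```python
# The commenter's additive arithmetic on cloud feedback and climate sensitivity.
base_sensitivity = 3.0   # C per CO2 doubling, the commenter's starting proposition
cloud_share = 0.7        # C of that attributed to a positive cloud feedback

scenarios = {
    "positive cloud feedback as assumed": base_sensitivity,                     # 3.0 C
    "zero cloud feedback":                base_sensitivity - cloud_share,       # 2.3 C
    "cloud feedback of opposite sign":    base_sensitivity - 2 * cloud_share,   # 1.6 C
    "large negative cloud feedback":      base_sensitivity - 3 * cloud_share,   # 0.9 C, below 1.0 C
}

for label, sensitivity in scenarios.items():
    print(f"{label:36s}: {sensitivity:.1f} C per doubling")
```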