Real pollution (not 'carbon' pollution) increases storm clouds

From the Pacific Northwest National Laboratory: A common theme among climate alarmists is that the wrongly named 'carbon pollution', aka carbon dioxide, increases the frequency and intensity of storms. Observational data show that NOT to be true, and this new study shows why real pollution (aerosols) results in larger, deeper and longer-lasting storm clouds, leading to colder days and warmer nights – Anthony

Cloud and Ice
Pollution decreases the size of cloud and ice particles and increases their lifespans, making clouds grow bigger.

RICHLAND, Wash. – A new study reveals how pollution causes thunderstorms to leave behind larger, deeper, longer lasting clouds. Appearing in the Proceedings of the National Academy of Sciences November 26, the results solve a long-standing debate and reveal how pollution plays into climate warming. The work can also provide a gauge for the accuracy of weather and climate models.

Researchers had thought that pollution causes larger and longer-lasting storm clouds by making thunderheads draftier through a process known as convection. But atmospheric scientist Jiwen Fan and her colleagues show that pollution instead makes clouds linger by decreasing the size and increasing the lifespan of cloud and ice particles. The difference affects how scientists represent clouds in climate models.

“This study reconciles what we see in real life to what computer models show us,” said Fan of the Department of Energy’s Pacific Northwest National Laboratory. “Observations consistently show taller and bigger anvil-shaped clouds in storm systems with pollution, but the models don’t always show stronger convection. Now we know why.”

Also, pollution can decrease the daily temperature range via such clouds: High clouds left after a thunderstorm spread out across the sky and look like anvils. These clouds cool the earth during the day with their shadows but trap heat like a blanket at night. Pollution can cause clouds from late afternoon thunderstorms to last long into the night rather than dissipate, causing warmer nights.

Secret Life of Clouds

Models that predict weather and climate don’t reconstruct the lives of clouds well, especially storm clouds. Usually these models replace storm clouds with simple equations that fail to capture the whole picture.

Because of the poor reconstructions, researchers have been faced with a dilemma: Pollution causes the anvil-shaped clouds to linger longer than they would in clean skies — but why?

Possible reasons revolve around tiny natural and manmade particles called aerosols that serve as seeds for cloud droplets to form around. A polluted sky has many more aerosols than a clean sky — think haze and smog — and that means less water for each seed. Pollution makes more cloud droplets, but each droplet is smaller.

More and smaller droplets change things for the clouds. Researchers have long thought that smaller droplets start a chain reaction that leads to bigger, longer-lasting clouds: Instead of raining down, the lighter droplets carry their water higher, where they freeze. The freezing squeezes out the heat the droplets carry with them and causes the thunder cloud to become draftier. The stronger convection lifts more water droplets, building up the cloud.
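The "same water, more seeds, smaller droplets" step above is easy to check with a back-of-the-envelope sketch (not from the paper; it idealizes the cloud as equal-sized droplets, and the liquid water content and droplet counts below are illustrative textbook-scale values): for a fixed liquid water content, droplet radius scales as the inverse cube root of droplet number.

```python
import math

def droplet_radius_m(lwc_kg_per_m3, n_droplets_per_m3, rho_water=1000.0):
    """Mean droplet radius when a fixed liquid water content is shared
    equally among n droplets (idealized monodisperse assumption)."""
    mass_per_droplet = lwc_kg_per_m3 / n_droplets_per_m3
    volume = mass_per_droplet / rho_water          # m^3 of water per droplet
    return (3.0 * volume / (4.0 * math.pi)) ** (1.0 / 3.0)

LWC = 0.5e-3  # kg of liquid water per m^3, a typical convective-cloud value

r_clean = droplet_radius_m(LWC, 100e6)      # ~100 droplets/cm^3: clean air
r_polluted = droplet_radius_m(LWC, 1000e6)  # ~1000 droplets/cm^3: polluted air
# Tenfold more seeds shrinks the radius by a factor of 10^(1/3), about 2.15x,
# roughly 10.6 micrometres down to 4.9 micrometres here.
```

Smaller droplets fall more slowly and freeze higher, which is exactly the chain of consequences the paragraph above describes.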

But researchers don’t always see stronger convection every time they see larger and longer-lasting clouds in polluted environments, indicating a piece of the puzzle was missing.

To solve this dilemma, Fan and colleagues decided to compare real-life summer storm clouds to a computer model that zooms deep into simulated clouds. The model included physical properties of the cloud particles as well as the ability to track whether convection gets stronger or weaker. Most models run in days or weeks, but the simulations in this study took up to six months.

“Modeling the details of cloud microphysical properties is very computationally intensive, so models don’t usually include them,” said Fan.

Convection Vexation

The researchers started with cloud data from three locations that differ in how polluted, humid and windy they typically are: the tropics in the western Pacific, southeastern China and the Great Plains in Oklahoma. The data had been collected through DOE’s ARM Climate Research Facility.

With support from DOE’s Regional and Global Climate Model program, the researchers ran simulations on PNNL’s hometown supercomputer Olympus. Their simulations of a month of storms ended up looking very similar to the actual observed clouds, validating that the models re-created the storm clouds well.

The team found that in all cases, pollution increased the size, thickness and duration of the anvil-shaped clouds. However, only two locations — the tropics and China — showed stronger convection. The opposite happened in Oklahoma — pollution made for weaker convection.

This inconsistency suggested that stronger convection isn’t the reason. Taking a closer look at the properties of water droplets and ice crystals within clouds, the team found that pollution resulted in smaller droplets and ice crystals, regardless of location.

In addition, the team found that in clean skies, the heavier ice particles fall faster out of the anvil-shaped clouds, causing the clouds to dissipate. However, the ice crystals in polluted skies were smaller and too light to fall out of the clouds, leading to the larger, longer-lasting clouds.

Lastly, the team estimated how much warming or cooling the storm clouds contributed. Overall, the polluted clouds cooled the day and warmed the night, decreasing the daily temperature range.

Most models don’t simulate convection well, don’t take into account the microphysical processes of storm clouds, and don’t address how pollution interacts with those processes. Accounting for pollution effects on storm clouds in this way could affect the ultimate amount of warming predicted for the earth in the next few decades. Accurately representing clouds in climate models is key to improving the accuracy of predicted changes to the climate.

This work was supported by the DOE’s Office of Science.


Jiwen Fan, L. Ruby Leung, Daniel Rosenfeld, Qian Chen, Zhanqing Li, Jinqiang Zhang, and Hongru Yan. Microphysical effects determine macrophysical response for aerosol impacts on deep convective clouds, Proc Natl Acad Sci U S A, Early Edition online the week of November 11–15, 2013, DOI: 10.1073/pnas.1316830110.


Deep convective clouds (DCCs) play a key role in atmospheric circulation and the hydrological and energy cycle. How aerosol particles affect DCCs is poorly understood, making it difficult to understand current and future weather and climate. Our work showed that in addition to the invigoration of convection, which has been unanimously cited for explaining the observed results, the microphysical effects induced by aerosols are a fundamental reason for the observed increases in cloud fraction, cloud top height, and cloud thickness in the polluted environment, even when invigoration is absent. The finding calls for an augmented focus on understanding the changes in stratiform/anvils associated with convective life cycle.


Deep convective clouds (DCCs) play a crucial role in the general circulation, energy, and hydrological cycle of our climate system. Aerosol particles can influence DCCs by altering cloud properties, precipitation regimes, and radiation balance. Previous studies reported both invigoration and suppression of DCCs by aerosols, but few were concerned with the whole life cycle of DCC. By conducting multiple monthlong cloud-resolving simulations with spectral-bin cloud microphysics that capture the observed macrophysical and microphysical properties of summer convective clouds and precipitation in the tropics and midlatitudes, this study provides a comprehensive view of how aerosols affect cloud cover, cloud top height, and radiative forcing. We found that although the widely accepted theory of DCC invigoration due to aerosol’s thermodynamic effect (additional latent heat release from freezing of greater amount of cloud water) may work during the growing stage, it is the microphysical effect induced by aerosols that drives the dramatic increase in cloud cover, cloud top height, and cloud thickness at the mature and dissipation stages by inducing larger amounts of smaller but longer-lasting ice particles in the stratiform/anvils of DCCs, even when thermodynamic invigoration of convection is absent. The thermodynamic invigoration effect contributes up to ∼27% of total increase in cloud cover. The overall aerosol indirect effect is an atmospheric radiative warming (3–5 W⋅m−2) and a surface cooling (−5 to −8 W⋅m−2). The modeling findings are confirmed by the analyses of ample measurements made at three sites of distinctly different environments.

November 26, 2013 1:20 pm

Science, I believe we may have some here.

November 26, 2013 1:26 pm

What happens when there’s a high-pressure weather system (dry, I guess? i.e. no rain) and an inversion layer holding the exhaust/combustion gasses close to the surface, leading to unhealthful air and burn bans? I can smell the car exhaust and smoke from heating fires. Hazy, dirty clouds, and it doesn’t get very warm. Daytime highs around 39°F. Shouldn’t that temp be higher? Or maybe it should actually be lower, and it’s higher because of the pollutants in the air?
My current temps/conditions are: Temp = 39°F, Humidity = 65%, Dew point = 28°F, Barometer = 30.2 → no storm or extreme weather near the surface. Maybe extreme pollutants and CO2 (there is a burn ban and air quality warnings in effect since last week and continuing this week)… WUWT!?
Thanks for the interesting articles .

November 26, 2013 1:31 pm

Also, look at what happens when the heating fires/combustion gasses get out of control and there are clouds:
And I thought the science was settled?

November 26, 2013 1:37 pm

Great stuff, AND falsifiable through observations in the real world, which means it’s real science.

November 26, 2013 1:42 pm

Wow, those are serious numbers for the changes in forcing. One has to wonder if GCR-modulation models (if they are correct) have similarly amplified effects.
But overall, it is yet another key component in the physics in GCMs that could not only be incorrect, but egregiously incorrect. Even small changes in albedo make big changes in surface temperature averages, as it slices insolation “off the top” as it were, although the abstract above suggests that the change in surface forcing due to enhanced vertical heat transport through most of the opaque GHG layer of the atmosphere may be even more important.
It’s already been suggested that the inconsistency in the treatment and effect of aerosols is a major factor in the apparent failure of GCMs to reproduce the current flat climate (or each other). The article is just another piece of evidence supporting the idea that it is time to go back and rebuild the GCMs nearly from scratch, altering lots of things within them.
As I continue to meditate upon them, I cannot help but think that a second place they are likely to be making a serious problem is in their gridding and granularity. A lat/long grid is horribly biased due to the spherical polar coordinate Jacobian, and this alone can lead to large, and difficult to control, errors. It is also very difficult to rescale, making adaptive solutions that rescale to convergence difficult to write. I would very much want to see a uniform rescalable grid used — an icosahedral tessellation, for example — and implemented in a completely rescalable way for full adaptive operation.
Why, you might ask? Because of the T^4 used in radiative cooling computations. At any scale of the estimation of cooling due to thermal radiation, and in any sub-band of LWIR, spatiotemporal averaging of temperatures will produce a spurious warming. This is because the radiation rate from 1 square kilometer, half at 300 K and half at 290 K is strictly greater than the radiation rate from 1 square kilometer at the mean temperature of 295K. It’s not clear to me that there is any scale where this effect stops. The front yard of my house is often 5 to even 10 C warmer than the back yard — the front yard faces southwest with little shade and lots of concrete and asphalt, the back yard faces northwest (in the shade of the house in the afternoon) and has lots of trees. The radiation rate from the property is dominated by the hot front yard, not the cool back yard, and is not equal to the rate of an equal sized plot of land at the average temperature.
GCMs impose a granularity of (typically) 1 degree, without direct compensation for the Jacobian. The average temperature of that 1 degree not-quite-squared will strictly underestimate its radiative cooling unless the temperature is truly homogeneous across the area, yet that whole square is necessarily assigned a single average temperature in the GCMs.
The interesting question is: What is the renormalized correction factor, given the presumably measurable average spatiotemporal temperature inhomogeneity? Is it 1%? 5%? 0.1%? My feeling is that it is around 1%, perhaps a bit less, but that is order of a W/m^2, enough (applied as it would be nearly everywhere) to make a pretty substantial difference. Perhaps enough to explain the growing deficit in warming predicted by the GCMs? Don’t know. I might be able to do the computation, though, and find out subject to some assumptions.
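The averaging point made above is easy to check numerically with the Stefan-Boltzmann law. This is a minimal sketch using the comment's own 300 K / 290 K example; it only illustrates the convexity of T^4, not any particular GCM's treatment:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def flux(temp_k):
    """Blackbody radiative flux (W/m^2) at temperature temp_k (kelvin)."""
    return SIGMA * temp_k ** 4

# Half the area at 300 K, half at 290 K, vs. a uniform area at the 295 K mean.
mixed = 0.5 * flux(300.0) + 0.5 * flux(290.0)
uniform = flux(295.0)
# mixed is strictly larger than uniform (by roughly 0.74 W/m^2 here),
# because T^4 is convex: averaging temperatures before taking the fourth
# power always understates the true area-averaged emission.
```

The gap here is only a fraction of a percent, but as the comment notes, applied over every grid cell of a global model even a sub-1% bias is of the order of a W/m^2.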

November 26, 2013 2:00 pm

Is the science settled yet? And they wonder why the models keep on failing.

Mike Smith
November 26, 2013 2:11 pm

Okay, so the models did a lousy job on cloud formation. But we’re still 95% certain the planet is subject to catastrophic warming. Well, okay, it hasn’t actually warmed in the past 20 years, but we still have a warming trend. Yeah, okay, it was much warmer in the past, but the current warmth has a “human fingerprint”. 97% (or was it 0.3%?) of scientists agree.
If that’s all they got, the gig is over!

Peter Melia
November 26, 2013 2:34 pm

Obviously the findings must be accepted, especially by lay-people such as me, unless or until someone later shows otherwise.
However, we’ve had this pollution (which cools) for 2, perhaps 3 generations and for a greater part of that period the planet has been warming.
How so?

November 26, 2013 2:53 pm

@Peter Melia: “However, we’ve had this pollution (which cools) for 2, perhaps 3 generations and for a greater part of that period the planet has been warming.”
Call it the EPAGW hypothesis. As pollution controls got better, albedo lowered, and the planet warmed. That saved us from the catastrophic coming-ice-age predictions and gave us catastrophic warming predictions. By extension, we can credit China’s recently hot economy and lax pollution controls with saving us from a climate disaster by undoing the harm the EPA has wrought.
Ballpark. Have fun with the notion anyways.

Alec aka Daffy Duck
November 26, 2013 2:55 pm

The first thing that came to mind is the article from back in August:

November 26, 2013 3:11 pm

rgbatduke: “But overall, it is yet another key component in the physics in GCMs that could not only be incorrect, but egregiously incorrect. … suggests that the change in surface forcing due to enhanced vertical heat transport through most of the opaque GHG layer of the atmosphere may be even more important.”
All is as you say in your post. An additional concern I have is that there is a lot of discussion about line spectra and almost no discussion about the effects of pressure and density at points in the atmospheric column. This may simply be something that isn’t discussed, well, anywhere. But so far as I know it hasn’t been looked at either. Certainly when temperature increases, pressure does too, and the density drops, which reduces conductive transfer and increases radiative losses.
There seems to be a wholesale negligence on modeling the atmosphere as an atmosphere and not a chunk of dry ice.

Rhoda R
November 26, 2013 3:16 pm

Alec aka Daffy; then global warming WAS the result of human activity — just not the culprit originally envisioned.

November 26, 2013 3:48 pm

Perhaps if the team had gone a little deeper into the research and asked themselves why and how those droplet-forming ice crystals were originally created, they might have come across some quite old research showing that a lot of the ice-crystal formation in the upper troposphere is the direct outcome of ice-forming bacteria and bacteria-created chemical compounds, which act as the nuclei around which supercooled water vapor in the upper atmosphere condenses to form cloud droplets.
All these natural phenomena aren’t pollution-created, as seems to be the currently fashionable claim amongst the believers. Such a claim is naturally a direct spin-off from mankind’s supposed grievous sinning against Gaia.
Nature is far, far more subtle about its secrets, and about revealing them, than that.
Still more of the unknown unknowns or the deeper they go into the science, the deeper it gets.

November 26, 2013 5:27 pm

Secret life of clouds? I knew we couldn’t trust the trees, and now it’s the clouds that are up to something on the sly. Aristophanes was right.

November 26, 2013 6:58 pm

Well, we shall see, maybe, maybe not, but I am sure it is a mistake to be sure of anything based on one study, and most people here should understand why.

November 26, 2013 8:01 pm

This is because the radiation rate from 1 square kilometer, half at 300 K and half at 290 K is strictly greater than the radiation rate from 1 square kilometer at the mean temperature of 295K.
Lost track of how many times I’ve made that point, have almost given up drawing attention to it. Glad to see that someone else is beating that drum.

November 26, 2013 9:22 pm

TheLastDemocrat sez:
feedback is a bi*c*

Brian H
November 26, 2013 11:34 pm

Homeostasis, anyone?

Stephen Richards
November 27, 2013 1:05 am

Is this new ? Some 40 years ago we were seeding clouds with silver iodide. Isn’t that the same physics ?

Otter (ClimateOtter on Twitter)
November 27, 2013 1:50 am

If that works out to be accurate, then would it also be accurate to say that China’s massive increase in coal and oil use is the major contributor to typhoons in the western Pacific?

Ben Wouters
November 27, 2013 2:41 am

This is because the radiation rate from 1 square kilometer, half at 300 K and half at 290 K is strictly greater than the radiation rate from 1 square kilometer at the mean temperature of 295K.
Have a look at our moon: albedo 0.11, so it absorbs more solar than Earth.
Its effective temperature is calculated to be ~270 K (Earth ~255 K).
Actually measured by the Diviner project: 197 K!!!
Sun radiating on only half a planet makes a difference!
So instead of trying to believe the atmosphere can increase the average surface temperature ~33 K,
let’s try to explain why the average surface temperature on Earth is > 90 K higher than on the moon in spite of the difference in albedo.
The correct answer is not the atmosphere.
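The ~270 K and ~255 K figures quoted above come from the standard uniform-temperature effective-temperature formula. A short sketch reproduces them (the solar constant and albedo values are typical textbook numbers, not taken from the comment):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0             # solar constant at 1 AU, W/m^2

def t_effective(albedo):
    """Equilibrium temperature of a gray body that spreads absorbed
    sunlight evenly over its entire spherical surface (factor of 4)."""
    return ((1.0 - albedo) * S0 / (4.0 * SIGMA)) ** 0.25

t_earth = t_effective(0.30)  # about 255 K, as quoted above
t_moon = t_effective(0.11)   # about 270 K, as quoted above
# The Diviner-measured lunar mean (~197 K) is far below t_moon because the
# slowly rotating Moon is nowhere near uniform in temperature: its hot
# dayside radiates disproportionately (the same T^4 averaging effect noted
# earlier in the thread), so the uniform-temperature formula overestimates.
```

This is the same convexity argument as the 300 K / 290 K example quoted at the top of the comment, just applied to an entire hemisphere.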

November 27, 2013 8:53 am

Whether it’s carbon or other pollution, most of it comes from the development and use of fossil fuels. That is the elephant in the room on earth, and it is most unfortunate that it controls most governments with its almighty dollar.

November 27, 2013 9:32 am

the effects on the pressure and density at points in the atmospheric column. This may simply be something that isn’t discussed, well, anywhere. But so far as I know it hasn’t been looked at either. As certainly when temp increases, pressure does, and the density drops. Which reduces conductive transfer and increases radiative losses.
There seems to be a wholesale negligence on modeling the atmosphere as an atmosphere and not a chunk of dry ice.

I do not think this is true. IIRC one of the reasons that one expects the GHE to increase (still) with CO_2 even though CO_2’s effect is long since saturated is line broadening at the higher partial pressures, especially at altitude. I’m not so sure I buy this argument but if you look at Grant Petty’s book on atmospheric radiation you’ll see that all of these sorts of effects are accounted for “as best they can” given that one has to sum/integrate over dense bands of levels and a range of pressures and temperatures. Some of the effects are indeed negligible — the time required for a CO_2 molecule to thermalize to the surrounding gas after absorbing an IR photon is much, much shorter than the mean emission time, for example, which justifies treating the absorption and emission processes as simple thermal rates rather than resonant absorption/emission (although no doubt some of the latter occurs, it is orders of magnitude less important).
GCMs without any question contain all sorts of information on pressure, density, and temperature of the air parcels they manipulate. If you are interested in seeing what goes on in at least one GCM, there is an open-source one (CAM 3) with online documentation here:
This is not one of the most detailed GCMs, but it is one where you can download the actual source and look at it, with program documentation in hand on the side. One can fault the design (one can ALWAYS fault a design:-) in a number of places, but the assertion that they don’t handle elementary parameters like pressure, density, and temperature is simply not correct. Indeed, to fault this one has to get pretty specific — point to a particular place in section 4 where they do something wrong. The top article basically suggests that there may well be something seriously wrong in CAM 3’s (and other GCMs’) treatments of aerosols, cloud formation, and vertical transport based on actual measurements. I suggest equally specifically that there may be something wrong with using a discrete latitude/longitude grid, especially with comparatively weak or missing adaptivity (CAM 3 actually has a tiny bit of adaptivity in it to handle polar regions better) for the specific reason that radiation rates on thermally averaged cells will be strict lower bounds — not upper bounds — on the true rate, so that CAM 3 and any other GCM that assigns a single temperature to a comparatively large horizontal area (in any given slab or layer) will underestimate the cooling via the unblocked channels and underestimate the rate of radiative energy transport between slabs in the blocked layers (basically, overestimating the “radiative insulation” properties of any given slab) because more uniform temperatures lead to more warming with exactly the same insolation at all scales.
This is the game, if one wants to criticize the GCMs. One can perfectly legitimately point out that they aren’t working without specifying why, as that is a posterior conclusion based on comparison of their predictions and the actual data, but if one wishes to assert that they aren’t working for a specific reason, to be responsible one has to look at the actual code and see if the specific reason you suggest is implemented in the actual code in a way that is (or more properly, may be for some evidence-supported reason) incorrect. So it isn’t that GCMs don’t include the effects of latent heat transport — they obviously do (see “shallow/middle troposphere moist convection” in the CAM 3 documentation, for example). It MIGHT be that they don’t include it CORRECTLY.
It is an open question as to whether or not they are leaving some physics out entirely that ends up being important. The galactic cosmic ray hypothesis, for example, has some empirical support but it is so far not a slam dunk or sufficiently compelling to warrant inclusion in a model on anything other than a trial basis. It would actually be interesting to include it ON a trial basis — one can always insert provisional physics into a model just to test the model and see if it does better with it or without it, or if it gives the model additional explanatory power. This is itself a form of weak evidence, if it does. In a highly multivariate model, however, it is probably WEAK evidence because model predictions, especially of single outputs, are very probably highly covariate in the physical parameters, so that one can turn up CO_2 sensitivity and turn up the effect of aerosols at the same time and maintain good agreement with GASTA across some training set, but end up with highly disparate long-term predictions as eventually CO_2 continues increasing but aerosols don’t.
I suspect that it is this alone that is largely responsible for much of the error in the GCMs relative to the present — they’ve systematically exaggerated CO_2 sensitivity and maintained agreement with data across the 50’s through the 90’s by asserting a larger effect to pollution and volcanic aerosols, but as we moved past the fit region and CO_2 continued up with aerosols not increasing to match (and with volcanism if anything a bit diminished) the highest senstivity models have started to systematically diverge from the observed temperatures.
Is this indeed the explanation? Hard to say. There need not be ONE explanation. There is no doubt that GCMs contain both positive and negative forcing terms and achieve agreement by balancing them. There is little doubt that they assign quite different values to the effect of aerosols as there is no consensus value or model (and the top article shows how nonlinear any model must be to correctly account for all of the observable physics!) The sad truth is that while there is only one way for a program to be right there are countless ways for it to be wrong. Once a program has the complexity of something like CAM 3, not only are there countless ways for it to be wrong but they get to where no single human knows the entire code and few humans are willing to take even a major component of that code and monkey with it as you have to START by learning it all. It gets to be very “expensive” to make changes — one can spend most of a postdoctoral position just getting to understand what the existing code does and have little time to even THINK of making serious changes, retraining the code parameters, and then spending two years of CPU time running the program all over again to see if the changes don’t egregiously break the existing code and (perhaps) lead to some improvement.
I’ve downloaded and looked over the CAM 3 code. Sadly, however good or bad the code itself may be, the packaging of the code truly sucks. It would be a matter of weeks of work (for me) just to get it to BUILD, let alone build and run on some small test program, and I’m a pretty damn good programmer (although I do hate Fortran, sigh:-). It just isn’t worth it — I have no grant for working on climate (so it is by definition a “hobby”, not a profession), I’m not getting paid to do it, I am getting paid to do a lot of other stuff that is very time consuming and have lots of other hobbies/projects that languish for lack of work on my part.
Porting a rather large Fortran program to C, organizing it so that it will automagically build across a range of platforms in both parallel and serial versions, replacing the lat/long tessellation with a scalable icosahedral tessellation, determining the granulation error in radiative transfer rates as a function of scale and estimates of per-cell spatiotemporal noise, correcting the aerosol, cloud, and vertical transport component (or somehow parameterizing it so that one can experiment with different rates based on empirical evidence as it comes in), adding an “optional” (parametric) component for GCR-modulated cloud nucleation rates, transforming the initialization data from lat/long to the icosahedral grid, fixing the single-slab ocean model to account for oceanic transport and more, better, figuring out how to correctly include projections of solar state (out as far as such projections themselves have some reasonable chance to be right) — I could spend the rest of my life working on this WITH A TEAM and a million dollars a year in grant money. A bit much to tackle for free and if I want to have a life of some sort on the side.

Brian H
November 27, 2013 6:54 pm

But, rgb, Think of the Children!

November 30, 2013 3:26 am

Is this study compatible with the work of Dr Jasper Kirkby (CERN) and his research findings?

