Climate models missing black carbon and resultant CO2 emissions

Here’s a look at what black carbon does to radiation flux according to GISS. So it appears they are aware of it, but maybe not using the right numbers.

This is for Asia; I’d really like to see Russia. Also see below the “read more” for an interesting experiment that Mike Smith of WeatherData Inc. did last year to show the effect of carbon on snow. It is a simple experiment that you can do at home. I wonder how much of that soot from Asia finds its way to snow at high latitudes?

And here is the article that has been making the rounds this week, h/t to Leif Svalgaard.

Savanna fires occur almost every year in northern Australia, leaving behind black carbon that remains in soil for thousands of years. Photo provided by Grant Stone, QCCCE.

(PhysOrg.com) — A detailed analysis of black carbon — the residue of burned organic matter — in computer climate models suggests that those models may be overestimating global warming predictions.

A new Cornell study, published online in Nature Geoscience, quantified the amount of black carbon in Australian soils and found that there was far more than expected, said Johannes Lehmann, the paper’s lead author and a Cornell professor of biogeochemistry. The survey was the largest of black carbon ever published.

As a result of global warming, soils are expected to release more carbon dioxide, the major greenhouse gas, into the atmosphere, which, in turn, creates more warming. Climate models try to incorporate these increases of carbon dioxide from soils as the planet warms, but results vary greatly when realistic estimates of black carbon in soils are included in the predictions, the study found.

Soils include many forms of carbon, including organic carbon from leaf litter and vegetation and black carbon from the burning of organic matter. It takes a few years for organic carbon to decompose, as microbes eat it and convert it to carbon dioxide. But black carbon can take 1,000-2,000 years, on average, to convert to carbon dioxide.

By entering realistic estimates of stocks of black carbon in soil from two Australian savannas into a computer model that calculates carbon dioxide release from soil, the researchers found that carbon dioxide emissions from soils were reduced by about 20 percent over 100 years, as compared with simulations that did not take black carbon’s long shelf life into account.
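The mechanism described above can be illustrated with a minimal two-pool, first-order decay sketch. To be clear, this is not the model used in the Cornell study; the stock split, residence times, and decay constants below are illustrative assumptions only:

```python
# Minimal two-pool, first-order decay sketch (illustrative only; NOT the
# model used in the Cornell study). Organic carbon turns over in decades,
# black carbon over millennia.
import math

def co2_released(stock, mean_residence_years, horizon_years):
    """Carbon released over the horizon, assuming first-order decay."""
    return stock * (1.0 - math.exp(-horizon_years / mean_residence_years))

# Hypothetical 100-unit soil carbon stock, 100-year horizon.
naive = co2_released(100.0, 25.0, 100.0)        # all carbon treated as ordinary organic C
split = (co2_released(80.0, 25.0, 100.0)        # 80% organic (assumed split)
         + co2_released(20.0, 1500.0, 100.0))   # 20% black carbon, ~1,500-yr residence
reduction_pct = 100.0 * (naive - split) / naive
print(f"naive: {naive:.1f}, with black carbon pool: {split:.1f}")
print(f"emissions reduced by {reduction_pct:.0f}% over 100 years")
```

With these made-up numbers the reduction comes out near 19 percent, in the same ballpark as the study's roughly 20 percent figure: treating part of the stock as nearly inert is what shrinks the projected emissions.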

The findings are significant because soils are by far the world’s largest source of carbon dioxide, producing 10 times more carbon dioxide each year than all the carbon dioxide emissions from human activities combined. Small changes in how carbon emissions from soils are estimated, therefore, can have a large impact.

“We know from measurements that climate change today is worse than people have predicted,” said Lehmann. “But this particular aspect, black carbon’s stability in soil, if incorporated in climate models, would actually decrease climate predictions.”

The study quantified the amount of black carbon in 452 Australian soils across two savannas. Black carbon content varied widely, between zero and more than 80 percent, in soils across Australia.

“It’s a mistake to look at soil as one blob of carbon,” said Lehmann. “Rather, it has different chemical components with different characteristics. In this way, soil will interact differently to warming based on what’s in it.”

Provided by Cornell University

This from Brett Anderson’s AccuWeather Global Warming blog last year:

Here is a photo of fresh snow cover in my backyard over which I had tossed some eight-month-old fireplace ash under a totally blue sky.

Keeping in mind this demonstration is occurring just two days after the winter solstice (meaning the albedo effect is less than it would have been under clear skies in February or March), in just one hour, the greater melting in the ash-covered areas is already apparent:

After four hours, the ash-free area has a depth of 5.5 inches.

At the same time, the ash-covered areas have a depth of about 2.5 inches. Multiple measurements were taken (note the ruler hole about an inch in front of the ruler), which yielded an average depth of 2.5 inches.

The areas without soot melted about 0.5 inches of snow during this four-hour period, while the soot-covered areas melted 3.5 inches.
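A back-of-envelope energy budget suggests those melt rates are physically plausible. The snow density below is an assumed typical value, not one of Smith's measurements:

```python
# Back-of-envelope energy budget for the extra melt in the ash-covered
# patches. Snow density is an assumed typical value, not a measurement.
RHO_SNOW = 100.0     # kg/m^3, typical fresh-snow density (assumed)
L_FUSION = 334e3     # J/kg, latent heat of fusion of ice

extra_melt_m = (3.5 - 0.5) * 0.0254   # extra depth melted, inches -> metres
duration_s = 4 * 3600                 # four-hour observation window

energy_per_m2 = extra_melt_m * RHO_SNOW * L_FUSION  # J per square metre
mean_flux = energy_per_m2 / duration_s              # average extra absorption, W/m^2
print(f"extra absorbed flux ~ {mean_flux:.0f} W/m^2")
```

That works out to roughly 177 W/m² of extra absorbed flux, comparable to midwinter midday insolation, so a large albedo drop from the ash could plausibly supply the energy.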

For visual comparison purposes, note the ruler hole in the non-ash-covered snow above the shadow.

Even tiny amounts of soot pollution can induce high amounts of melting. There is little or no ash at upper right. Small amounts of ash in the lower and left areas of the photo cause significant melting at the two-hour mark in the demonstration.

Any discussion pertaining to melting glaciers or icecaps must consider the accelerated melting caused by soot pollution in addition to any contribution from changing ambient temperatures.

Photos: Copyright 2007, Michael R. Smith

Mike Smith is CEO of WeatherData Services, Inc., An AccuWeather Company. Smith is a Fellow of the American Meteorological Society and a Certified Consulting Meteorologist.

Curt
November 20, 2008 9:39 pm

SteveSadlov (15:07:01) :
“Any time you force snow or ice to melt “artificially” (i.e. without raising the ambient temperature) is it a highly endothermic reaction. Even “natural” melting is net endothermic.”
While true, I don’t think this implies that the melting in this case has any cooling effect on the air. The water must absorb energy in the phase change from solid to liquid, but the absorbed energy comes from the reduced albedo due to the introduction of the black carbon on the snow/ice. It is the absorbed radiative energy that provides the required energy for the phase change (at least on the margin). I don’t see any reason why this would lead to additional cooling of the air.

Editor
November 20, 2008 10:02 pm

SteveSadlov (15:07:01) :

RE:
Old Coach (10:41:57) :
SteveSadlov,
Can you explain the mechanism by which dirt and other dark particulates on ice and snow are a positive feedback loop for cooling? I am having a hard time picturing it.
thanks
Any time you force snow or ice to melt “artificially” (i.e. without raising the ambient temperature) is it a highly endothermic reaction. Even “natural” melting is net endothermic.

The experiment, if I understand it correctly, was intended to show how sunlight is absorbed and that heat melts the snow. The ambient temperature would not be raised (at least not beyond freezing). No endothermic reaction, though I share the concern about the KOH-forced melting, no exothermic reaction, just melting ice.
At my New Hampshire latitude, black road crud on snow piles in March melts in interesting fashions. The crud often sinks into the snowbank and melts voids. The meltwater can freeze on the remaining snow, leaving a fragile structure that doesn’t stand up to feet or cars.

Richard111
November 21, 2008 12:18 am

What if a certain power with an intense desire to operate in the Arctic were to deploy a blanket of soot during the dark months? This would be hidden by later snows but would have a profound effect on the summer melt season.

Mike Bryant
November 21, 2008 2:05 am

Speaking of complicated, I believe that this is as complicated as any model, and it actually accomplishes something:
http://www.break.com/index/best_rube_goldberg_ever.html

November 21, 2008 2:57 am

Has anyone stopped to observe cause/effect of cloud seeding, or am I alone in this regard?
Would you or would you not be upset if tomorrow morning you jumped in your car with precious little time to get to work only to find out that someone had siphoned your gas tank?

November 21, 2008 6:21 am

Now I learn that you burn a forest and black carbon accumulates in the soil, taking forever to release its charge of CO2. But what gushes into the atmosphere during the burning process? Surely copious volumes of CO2? And from the next fire that occurs the next day?

Harold Ambler
November 21, 2008 11:35 am

Leif Svalgaard (20:22:51):
To give you a feeling for the sophistication of the models, look at these two pages from the 800-page book “Fundamentals of Atmospheric Modeling” by Mark Z. Jacobson, 2nd Ed. found here http://www.cambridge.org/catalogue/catalogue.asp?isbn=9780521548656

The scientist alluded to as a master modeler is Dr. Mark Jacobson, who is on the Stanford faculty, as a professor of Civil and Environmental Engineering and may well know Leif personally. He is a smart guy. On the other hand, this is what he has published with regard to the Sun in a textbook widely used in the study of the atmosphere at American universities:
A sunspot is a large magnetic solar storm that consists of a dark, cool central core (umbra) surrounded by a ring of dark fibrils (penumbra). Sunspots are hot and emit more energy than does the rest of the Sun.
Sunspot number and size peak every 11 years. Because the Sun’s magnetic field reverses itself every 11 years, a complete sunspot cycle is 22 years.
The difference in solar intensity at the top of the Earth’s atmosphere between times of sunspot maxima and minima is about 1.4 W/m2.
Because sunspot intensity varies relatively consistently from cycle to cycle, sunspots cannot be responsible for the rapid increase in recent temperatures.

The textbook is Atmospheric Pollution: History, Science and Regulation; Cambridge University Press (2002)
Besides the ignorance of the Sun itself, there is a straw man argument here. Those who argue in favor of a strong Sun-Earth climate connection do not maintain that sunspots are responsible for increased temperatures during the late 20th century. Jacobson is among those who deny the global nature of the Maunder Minimum, pointing to evidence of warming in Antarctica. But Svensmark, among others, has shown that the increased clouds associated with solar minima lower Antarctica’s albedo and have an anomalous warming effect on the southern continent.
Archibald’s work showing the effect of the length of solar cycles comes to mind. The length of Cycle 23 alone, never mind what may or may not be coming down the pike, means that further cooling in the short term is money in the bank.
Dr. Jacobson’s knowledge of particulate pollution, on the other hand, is widely admired, and rightly so. Perhaps the incipient cooling will prompt him to study the Sun more and draw the world’s attention to the kinds of real pollution that are degrading air and water throughout Asia — and in many other places around the globe.

Bruce
November 21, 2008 11:57 am

Peat fires in Indonesia in 1997/1998
http://www.iht.com/articles/2002/12/13/fires.php
“The scientists from Indonesia and Europe who carried out the research reported that up to 2.6 million metric tons of carbon entered the atmosphere as a result of widespread fires in Indonesia in 1997, contributing as much as 40 percent to the biggest annual increase in carbon emissions since records began being kept in 1957.”
Bet there was lots of soot too.

Fernando
November 21, 2008 1:44 pm

Leif;
In his opinion:
We are in: equilibrium (last 300 years)
Quasistatic
or
metastable

George E. Smith
November 21, 2008 4:21 pm

So we know about the diamond-lattice form of carbon, and we know about the graphite form of carbon, and then there’s the latest geek toy, the Bucky-ball form of carbon, and its carbon-nanotube spinoffs; but what the hell are black carbon and organic carbon?
I thought the fact that it has carbon in it made it organic.
Enquiring minds want to know the difference between organic carbon and black carbon.

November 21, 2008 4:43 pm

Bruce,
Someone didn’t calculate right here, or they used a restricted comparison.

But the researcher team concluded that the most likely amount of carbon released from the burning of peat and its surface vegetation in Indonesia in 1997 was 810,000 tons to 2.57 million tons.

In reality, the upper limit of the wildfires is less than 1/1000th of the yearly global emissions of 9 GtC, or 9,000 million tons, from burning fossil fuels.
Moreover, most of the wildfires are from vegetation that has a limited age: from one year to a few decades. That doesn’t add much CO2 to the atmosphere, as it was sequestered a few years to a few decades before the release. The same goes for humans (and animals) exhaling CO2, which was sequestered a few months to a few years earlier by plants, before being eaten and digested by animals (or first by other animals)…
The more interesting point than CO2 emissions from burning vegetation is that aerosols are used in models to (indirectly) imply a huge impact for CO2. Without the cooling from (reflective sulphate) aerosols, models are not capable of matching the temperature trend in the period 1945-1975. But as I already suspected, the magnitude and even the sign of the total aerosol effect (white, brown, black) is not known to any accuracy. Thus the impact of CO2 on temperature is probably a lot lower than implemented…
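The “less than 1/1000th” claim is easy to check from the tonnage figures quoted in this thread (taken at face value as quoted, not independently verified):

```python
# Sanity check on the fraction, using the figures quoted in the thread.
peat_fire_tC = 2.57e6   # tonnes C, upper estimate quoted from the IHT article
fossil_tC = 9.0e9       # tonnes C/yr from fossil fuels (~9 GtC)

fraction = peat_fire_tC / fossil_tC
print(f"fraction of yearly fossil emissions: {fraction:.2e}")
```

About 2.9 × 10⁻⁴, so indeed well under 1/1000 of yearly fossil emissions on those numbers.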

George E. Smith
November 21, 2008 5:01 pm

“” Harold Ambler (11:35:33) :
>>deletions…
“A sunspot is a large magnetic solar storm that consists of a dark, cool central core (umbra) surrounded by a ring of dark fibrils (penumbra). Sunspots are hot and emit more energy than does the rest of the Sun.”
Oh yeah; I thought sunspots are the sun’s refrigerators; that’s how an optical pyrometer works; right ?: the “hotspots show up as dark patches against the colder brighter background” ?? is there something wrong with this picture ?
“Sunspot number and size peak every 11 years. Because the Sun’s magnetic field reverses itself every 11 years, a complete sunspot cycle is 22 years.”
No complaint there.
“The difference in solar intensity at the top of the Earth’s atmosphere between times of sunspot maxima and minima is about 1.4 W/m2.”
or here.
“Because sunspot intensity varies relatively consistently from cycle to cycle, sunspots cannot be responsible for the rapid increase in recent temperatures.”
Wow! Where’s the data to support that claim? The sunspot peak of around 1605 had about 120 sunspot number, and the 1740-ish peak was around 110. Between those it never got over 80 and was less than 10 from about 1645 to 1715, the Maunder minimum. Then it climbed to around 100-150 for the few cycles just before the Dalton minimum, and was about 60 for three cycles in the Dalton minimum, 1795-1823. The next four cycles were in the 100-140 range and then dived down to the 70 range for the 5 cycles from 1882 to 1930. The next two cycles went up to 120 then 150, leading to the placement of the IGY at the 1957/58 sunspot peak.
Nobody could have foreseen that the IGY sunspot peak would be the highest peak ever in the entire history of sunspots, at about 190 spot number.
The recent peaks since IGY were about 110, 160, 160, and then the cycle 23 peak of the recent past (don’t have the number).
So sorry to report but since the IGY of 1957/58, the sunspot peak numbers have been the highest in sunspot history.
“Besides the ignorance of the Sun itself, there is a straw man argument here. Those who argue in favor of a strong Sun-Earth climate connection do not maintain that sunspots are responsible for increased temperatures during the late 20th century. Jacobson is among those who deny the global nature of the Maunder Minimum, pointing to evidence of warming in Antarctica. But Svensmark, among others, has shown that the increased clouds associated with solar minima lower Antarctica’s albedo and have an anomalous warming effect on the southern continent. ”
Well count me as one who DOES attribute the warming period basically following the IGY to those high sunspot peaks. No not to the 1.4 W/m^2 p-p change in the solar constant during a sunspot cycle, but to the magnetic field effect associated with sunspots, and the effect of that on cosmic rays, and solar charged particles arriving at earth.
It is the magnetic link that influences cloud formation that accounts for the late 20th century warming; and that most definitely was associated with the sunspots; and the lack of those spots now, and the resulting change in the magnetic environment, is what is giving us the new period of cooling.
Nobody seriously points to the 0.1% p-p solar constant cycle associated with sunspots as a reason for warming and cooling; but the associated magnetic field/cosmic ray/cloud linkage certainly can explain everything we have seen happen since IGY.
That’s my story and I’m sticking with it.
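For what it’s worth, the 0.1% peak-to-peak figure mentioned above checks out against the textbook’s 1.4 W/m² number (the TSI value below is an assumed round figure):

```python
# Checking the ~0.1% peak-to-peak figure against the textbook's 1.4 W/m^2.
TSI = 1366.0   # W/m^2, assumed approximate total solar irradiance
DELTA = 1.4    # W/m^2, solar-cycle variation quoted in the textbook excerpt

pct = 100.0 * DELTA / TSI
print(f"{pct:.2f}% peak-to-peak")
```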

Bruce
November 21, 2008 5:26 pm

Ferdinand,
I guess my quote source sucked.
However: “In 2002, Rieley and his colleagues estimated that during 1997 and 1998 smouldering peat beneath the Borneo forests released between 0.8 and 2.6 billion tonnes of carbon into the atmosphere. That is equivalent to 13 to 40 per cent of all emissions from burning fossil fuels, and contributed to the CO2 peak in 1998.”
http://www.newscientist.com/article/dn6613

George E. Smith
November 21, 2008 5:26 pm

“”” SteveSadlov (15:07:01) :
“Any time you force snow or ice to melt “artificially” (i.e. without raising the ambient temperature) is it a highly endothermic reaction. Even “natural” melting is net endothermic.” “””
I’m having a hard time trying to decipher who actually said this from all the times it is posted above.
So the statement asks a question; “is it a highly endothermic reaction.” But no question mark, so is the question mark missing or was it really a statement; “IT IS a highly endothermic reaction. ” Anybody know?
The latent heat of freezing associated with the water/ice phase change is 80 calories per gram. When floating sea ice melts, it must absorb that 80 calories from the ocean water it is floating on, and that will cool a huge amount of sea water, which will shrink, since water with more than 2.47% salinity has no maximum density before it freezes (unlike fresh water). So the sea level will go down when the floating sea ice melts: see Physics Today for January 2005, Letters section, by George E. Smith. The prediction was confirmed by a British-Dutch team in mid-2006, who reported on ten years of Arctic Ocean observations using a European satellite. The result was a 2 mm per year drop, confirming that in fact the ice was melting during those ten years.
BUT, in order for water to freeze, that 80 calories per gram has to flow the other way; and sadly the system is not symmetrical. The second law of thermodynamics still insists that the flow must be from a warmer source to a colder sink, so the water will NOT freeze, unless the surroundings (air or water) are already colder than that water. The ocean sea water could be colder than zero since it has about 3.5% salinity; but nothing is going to freeze unless the surroundings are already colder than the freezing water; so nothing is going to warm up.
The most you can claim is that the surroundings which are removing the energy from the water won’t get as cold as they otherwise would be, in the absence of water to freeze.
Grape and citrus growers routinely spray their crops with water during a freeze. For one thing, pumped ground water is going to be in the 60-degree range, but so long as ice keeps forming on the crops, the temperature won’t drop below freezing, so the fruit will not freeze (because of the dissolved sugars and such).
So it is misleading to claim that freezing water warms the surroundings; it isn’t going to freeze unless the surroundings are, and remain colder than the water.
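The 80 cal/g figure above can be put in SI terms with a quick sketch (the specific heat of seawater below is my assumed value, not from the comment):

```python
# Quick unit conversion for the latent-heat argument (specific heat of
# seawater is an assumed approximate value, not from the comment).
L_FUSION_J_PER_G = 80.0 * 4.184   # 80 cal/g -> ~334.7 J/g
CP_SEAWATER = 3.99                # J/(g*K), approx. specific heat of seawater

# Melting 1 g of floating ice draws enough heat from the surrounding
# seawater to cool roughly this many grams of it by 1 K:
grams_cooled_per_K = L_FUSION_J_PER_G / CP_SEAWATER
print(f"~{grams_cooled_per_K:.0f} g of seawater cooled 1 K per gram of ice melted")
```

On those assumptions, each gram of melting ice can chill some 84 grams of surrounding seawater by a degree, which is why the latent-heat term dominates these arguments.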

Fernando
November 21, 2008 6:45 pm

We’re trying to study an open thermodynamic system. In this system, a decrease of entropy is possible. (Macromolecule, water >>> ice, urban heat island.)
Caution: this is not a cylinder in a closed system.
FM

Pete
November 21, 2008 7:13 pm

George E. Smith (17:26:50) :
“So it is misleading to claim that freezing water warms the surroundings; it isn’t going to freeze unless the surroundings are, and remain colder than the water.”
I don’t think it is technically misleading, but I do think that it is misleading if this statement is made to a non-technical reader.
Can you imagine a reporter being given this statement and making a headline out of it: “Freezing Causes Warming of Atmosphere – Skeptics Say”. That would be it for the skeptics.

E.M.Smith
Editor
November 21, 2008 8:01 pm

From Robert Bateman: No doubt very taxing on computational power. Mother Nature makes it look so easy.
end quote.
Remembering wayback… In the late 80s, early 90s I ran a supercomputer center. At the end when we were shutting down I got a call from a PhD student at Stanford asking if he could get some time for cloud modeling, but he couldn’t pay much (or at all was the impression). He was very surprised when I basically said “Sure, all you want; free”. I had the machine, my users were laid off, it was idle, why not?
The “bottom line”? To model a single small cloud-forming event took several hours to days on a Cray YMP 2-32. While that would be about the same power as the Mac I’m using to type this today, it was a lot then. I suspect it is part of why they still don’t model clouds in the GCMs. If one cloud takes a fast processor for a few days, what would a few million clouds take…
FWIW, somewhere in the back of some doctoral thesis is a footnote thanking Apple for the machine time. I like to think of it as our Cray spending its last days dreaming of clouds in the sky…

November 22, 2008 4:44 am

Bruce,
Thanks for the link; that sounds better and indeed should be around 13 to 40% of the human emissions of the years 1997-1998.
The only remaining question is how much that affected the increase in reality: 1997 shows a relatively low increase (below average), while 1998 was a peak year, but also a strong El Niño year, when there is a strong (temporary) effect of temperature on the rate of CO2 increase.

Ron Cram
November 22, 2008 5:15 am

Anthony, this is slightly off topic but not that much off because the automatic generator of news articles above links to this article:
http://money.aol.com/news/articles/_a/bbdp/nasas-carbon-sniffing-satellite-sleuth/246144
You have made big contributions to improving the instrumental record. Evidently scientists also want to improve the measurement of atmospheric carbon dioxide. I just came across this:
http://www.biokurs.de/treibhaus/180CO2/bayreuth/menuee.htm
Perhaps you have discussed this in the past. Dr. Beck is claiming atmospheric CO2 was higher early in the 20th century than it is now. He claims we have just stopped measuring it using the more precise methods. Is this accurate? Are we just relying on the one site in Mauna Loa these days?
REPLY: My position on those measurements cited by Beck is the following:

-The inherent variability of the chemical analysis methods used adds a significant error bar.
-The locations of the measurements were almost all in western cities, thus likely to be hotspots for CO2 just as they are today.
-The measurements vary so greatly over short time spans that they seem unlikely to be representative of any global trend given that we know CO2 variability has not been much more than a few PPM/yr in the last 50 years.

So, I don’t put much credence in them. – Anthony

Ron Cram
November 22, 2008 10:31 am

Anthony, thank you for the reply. You are correct that the measurements vary a great deal over short time spans, but it is unclear to me how that discredits the measurements. You seem to be assuming a global trend exists and has to be within a very narrow range, yet that is the very thing we are trying to determine.
You mention most of the measurements were taken in western cities, hotspots for CO2. Assuming you are correct that these cities are hotspots, they still show considerable variability. Some might take that to mean the carbon cycle is more active in certain regions or CO2 residence time is much less than estimated by the IPCC using the Bern Carbon Model. Willis E says there is no way to know which method for measuring residence time fits the data better.
http://www.climateaudit.org/?p=4370#comment-312489
Why would Mauna Loa measure global atmospheric CO2 when other methods measure local CO2?
I am still at a loss on this. If our data on atmospheric CO2 were really reliable, I don’t think we would be sending up a satellite to improve data quality.

anna v
November 22, 2008 11:37 am

REPLY: My position on those measurements cited by Beck is the following:
-The inherent variability of the chemical analysis methods used adds a significant error bar.
-The locations of the measurements were almost all in western cities, thus likely to be hotspots for CO2 just as they are today.
-The measurements vary so greatly over short time spans that they seem unlikely to be representative of any global trend given that we know CO2 variability has not been much more than a few PPM/yr in the last 50 years.
So, I don’t put much credence in them. – Anthony

I just came across this reference and would appreciate links for the criticisms.
I am so disillusioned about the temperature measurements and the purported accuracies of the anomaly plots that I am suspicious of CO2 measurements which have not been scrutinized, and where corrections are also applied as the spirit moves them. For example, the recent NOAA plots no longer have the upswing on the last averaged point. Go figure.
In addition, AIRS data, which become less clear as time goes on (one has to search for the color codes, I suspect there has been a lot of flak flying around) show that there is a 15ppm non uniformity in the CO2 due to season, latitude and longitude, and, I suspect, volcano outlets.
Mauna Loa is in a volcanic region and right on the hot stream of the north hemisphere, which most of the earth is not, nevertheless they claim that the sea data and the Mauna Loa data agree, within 2ppm.
Would the chemical data errors be worse than the 2ppm errors in the Mauna Loa/sea comparison?
Also, your argument about cities: we have to decide whether CO2 is homogeneous, as the GCMs claim, in which case proximity to cities should not matter, or inhomogeneous, in which case we have to rethink everything from the beginning.
in this link http://www.biokurs.de/treibhaus/180CO2/bayreuth/bayreuth2e.htm
many locations seem open enough and away from habitations ( slide 2).
From the AIRS data it also seems to me that the ice-core records have to be rethought, since they may come from regions with CO2 depleted (according to the above entry about black carbon also) by at least 15 ppm. The whole CO2 field needs a lot of scrutiny.

peer
November 22, 2008 3:13 pm

re: ron cram
CO2 measurements are still being made in Europe in the vicinity of cities, and although they differ from Mauna Loa a bit, they’re not radically higher.
they used to be here:
http://cdiac.esd.ornl.gov/trends/co2/contents.htm
Therefore the test has been done. Current city CO2 is not 30 percent higher than Mauna Loa, and therefore the old data can be considered potentially accurate.
If anyone has access to the proper links or even historical measures, I would appreciate it.

jorgekafkazar
November 22, 2008 3:25 pm

Anthony, Anna, Ron: Regarding Beck:
I’m curious as to how wide that error bar is. (“The inherent variability of the chemical analysis methods used adds a significant error bar.”)
Were the measurement locations all in high-CO2-emission areas in the early 1800s?
Anna: Supposedly, the Mauna Loa measurements take into account errors due to release of volcanic gases. I wonder exactly how they take their measurements. etc.

Mike Bryant
November 22, 2008 4:52 pm

You know what? It seems like the AGW people want us to believe that measuring and recording CO2 levels is an ancient art passed down by Dr. Keeling to his acolytes. Hmmm… it appears that measuring CO2 is not really very mysterious or complicated after all. Maybe Mauna Loa and the other CO2 labs could be eliminated by devices like this one:
http://www.hhydro.com/cgi-bin/hhydro/XHH0036.html

Pamela Gray
November 22, 2008 5:10 pm

Prior to 1920, natural fires in Oregon were allowed to burn themselves out, and fields were burned sometimes twice a year. Prior to 1850, the Indians routinely burned nearly the entire Willamette Valley every year. There must have been tons of carbon in the air. Much more than today.