Climate models missing black carbon and resultant CO2 emission

Here’s a look at what black carbon does to radiation flux according to GISS, so it appears they are aware of the effect, but maybe not using the right numbers.

This is for Asia, I’d really like to see Russia. Also see below the “read more” for an interesting experiment that Mike Smith of WeatherData Inc. did last year to show the effect of carbon on snow. It is a simple experiment that you can do at home. I wonder how much of that soot from Asia finds its way to snow at high latitudes?

And here is the article that has been making the rounds this week (h/t to Leif Svalgaard).

Savanna fires occur almost every year in northern Australia, leaving behind black carbon that remains in soil for thousands of years. Image provided by Grant Stone, QCCCE

(PhysOrg.com) — A detailed analysis of black carbon — the residue of burned organic matter — in computer climate models suggests that those models may be overestimating global warming predictions.

A new Cornell study, published online in Nature Geoscience, quantified the amount of black carbon in Australian soils and found that there was far more than expected, said Johannes Lehmann, the paper’s lead author and a Cornell professor of biogeochemistry. The survey was the largest of black carbon ever published.

As a result of global warming, soils are expected to release more carbon dioxide, the major greenhouse gas, into the atmosphere, which, in turn, creates more warming. Climate models try to incorporate these increases of carbon dioxide from soils as the planet warms, but results vary greatly when realistic estimates of black carbon in soils are included in the predictions, the study found.

Soils include many forms of carbon, including organic carbon from leaf litter and vegetation and black carbon from the burning of organic matter. It takes a few years for organic carbon to decompose, as microbes eat it and convert it to carbon dioxide. But black carbon can take 1,000-2,000 years, on average, to convert to carbon dioxide.

By entering realistic estimates of stocks of black carbon in soil from two Australian savannas into a computer model that calculates carbon dioxide release from soil, the researchers found that carbon dioxide emissions from soils were reduced by about 20 percent over 100 years, as compared with simulations that did not take black carbon’s long shelf life into account.
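
To make the study's mechanism concrete, here is a minimal sketch of the bookkeeping involved: split soil carbon into a fast-cycling organic pool and a slow black-carbon pool, decay each with first-order kinetics, and compare cumulative CO2 release against a single fast pool. The pool sizes and turnover times below are illustrative assumptions, not the numbers used in the Cornell model.

```python
import numpy as np

# Illustrative two-pool, first-order soil carbon model (NOT the study's model).
# Assumptions: 100 units of soil carbon, 20% of it black carbon, with mean
# residence times of ~10 years (ordinary organic C) and ~1500 years (black carbon).
total_c = 100.0
bc_fraction = 0.20                       # assumed black-carbon share
tau_fast, tau_slow = 10.0, 1500.0        # assumed residence times (years)

years = np.arange(0, 101)                # 100-year horizon

def cumulative_release(pools):
    """Cumulative CO2-C released over time from first-order decaying pools."""
    released = np.zeros(len(years))
    for size, tau in pools:
        released += size * (1.0 - np.exp(-years / tau))
    return released

one_pool  = cumulative_release([(total_c, tau_fast)])
two_pools = cumulative_release([(total_c * (1 - bc_fraction), tau_fast),
                                (total_c * bc_fraction, tau_slow)])

print(f"Released after 100 yr, single fast pool:  {one_pool[-1]:.1f}")
print(f"Released after 100 yr, with slow BC pool: {two_pools[-1]:.1f}")
print(f"Reduction: {100 * (1 - two_pools[-1] / one_pool[-1]):.0f}%")
```

With these made-up numbers the slow pool trims cumulative release by roughly a fifth, which is the basic point of the study: how soil carbon is partitioned changes the projected CO2 flux from soils, not the physics of warming itself.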

The findings are significant because soils are by far the world’s largest source of carbon dioxide, producing 10 times more carbon dioxide each year than all the carbon dioxide emissions from human activities combined. Small changes in how carbon emissions from soils are estimated, therefore, can have a large impact.

“We know from measurements that climate change today is worse than people have predicted,” said Lehmann. “But this particular aspect, black carbon’s stability in soil, if incorporated in climate models, would actually decrease climate predictions.”

The study quantified the amount of black carbon in 452 Australian soils across two savannas. Black carbon content varied widely, between zero and more than 80 percent, in soils across Australia.

“It’s a mistake to look at soil as one blob of carbon,” said Lehmann. “Rather, it has different chemical components with different characteristics. In this way, soil will interact differently to warming based on what’s in it.”

Provided by Cornell University

This from Brett Anderson’s AccuWeather Global Warming blog last year:

Here is a photo of fresh snow cover in my backyard over which I had tossed some eight-month-old fireplace ash, under a totally blue sky.

Keeping in mind this demonstration is occurring just two days after the winter solstice (meaning the albedo effect is less than it would have been under clear skies in February or March), in just one hour, the greater melting in the ash-covered areas is already apparent:

After four hours, the ash-free area has a depth of 5.5 inches.

At the same time, the ash-covered areas have a depth of about 2.5 inches. Multiple measurements were taken (note the ruler hole about an inch in front of the ruler), which yielded an average depth of 2.5 inches.

The areas without soot melted about 0.5 inches of snow during this 4-hour period, while the soot-covered areas melted 3.5 inches.

For visual comparison purposes, note the ruler hole in the non-ash-covered snow above the shadow.

Even tiny amounts of soot pollution can induce high amounts of melting. There is little or no ash at upper right. Small amounts of ash in the lower and left areas of the photo cause significant melting at the two-hour mark in the demonstration.

Any discussion pertaining to melting glaciers or icecaps must consider the accelerated melting caused by soot pollution in addition to any contribution from changing ambient temperatures.
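
As a rough sanity check on those numbers (a back-of-envelope sketch only; the snow density, insolation and albedos below are assumptions, not measurements from the demonstration), the extra three inches of melt under the ash implies an energy uptake of the same order as what a darkened snow surface could plausibly absorb from low winter sun over four hours:

```python
# Energy needed for the extra melt observed in the ash-covered patches.
# All inputs are illustrative assumptions, not measured values.
extra_melt_in = 3.5 - 0.5        # extra inches of snow depth lost under the ash
snow_density  = 150.0            # kg/m^3, assumed for settled fresh snow
latent_fusion = 334e3            # J/kg, latent heat of fusion of ice
hours         = 4.0

depth_m = extra_melt_in * 0.0254
energy  = depth_m * snow_density * latent_fusion     # J per m^2 of surface
power   = energy / (hours * 3600)                    # average W/m^2 required

print(f"Extra energy needed:       {energy / 1e6:.1f} MJ/m^2")
print(f"Average extra flux needed: {power:.0f} W/m^2")

# Compare with the extra absorption from darkening the surface:
solar_flux  = 400.0              # W/m^2, assumed midday winter insolation
albedo_snow = 0.85               # assumed clean-snow albedo
albedo_ash  = 0.30               # assumed ash-darkened albedo
print(f"Extra absorption from albedo drop: {solar_flux * (albedo_snow - albedo_ash):.0f} W/m^2")
```

The two figures land in the same ballpark (a couple of hundred W/m^2 either way), which is consistent with what the photos show: the albedo drop alone can supply most of the energy for the extra melt, even in December sun.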

Photos: Copyright 2007, Michael R. Smith

Mike Smith is CEO of WeatherData Services, Inc., An AccuWeather Company. Smith is a Fellow of the American Meteorological Society and a Certified Consulting Meteorologist.

97 Comments
November 22, 2008 5:21 pm

I would agree that Beck cannot be dismissed too easily. I have done a lot of research on his figures over the past month and many of them stand up to some scrutiny. The following is an attempt to put climate change, and Beck's work, into a broader historic context. As most sites have many non-experts who are often confused by terminology, it has been written in a straightforward manner, so apologies in advance to the experienced.
Background
Put simply, the official IPCC view is that current temperatures are ‘unprecedented’ and that this has been caused by man-made CO2 emissions from burning fossil fuels adding to the existing levels of ‘natural’ emissions. This has disrupted atmospheric ‘equilibrium’ (whereby CO2 had previously been absorbed into natural ‘sinks’ such as oceans) and subsequently created a dramatic rise in atmospheric CO2 concentrations. These levels range from 280ppm (parts per million) before 1750, the start of the industrial age, as measured by ice core samples, through a 1900 figure of 295ppm (according to the work of GS Callendar) and 315ppm in 1958 as measured by Charles Keeling at Mauna Loa, to today’s reading of 380ppm, thereby ‘proving’ the relentless rise of warming man-made CO2.
The following graph compares known historic temperatures with known levels of human CO2 emissions and consists of information from two separate data sets. Dr Mann has spent 15 years working on his ‘hockey sticks’ and ‘spaghetti derivatives’, so readers should understand this is very much a first try!
Graph 1 (temperatures in red) is the one chosen as our baseline. It shows Hadley CET (Central England Temperatures) dating from 1660 to the current date (in degrees C). This is the longest temperature series in the world and covers much of what we know as the Little Ice Age, approximately 1350 to 1880. The figures have not been adjusted or smoothed, so they show actual peaks and troughs very well. Whilst CET is pretty good, the four stations represented within it do change, and we would estimate a UHI (Urban Heat Island) effect over the last fifty years of at least 0.5C. However, the graph has NOT been adjusted for this probable bias, as that makes it easier to spot potentially rogue temperatures from other series. Consequently we consider this to be our benchmark.
The blue line in the bottom right-hand corner shows the actual CDIAC/IPCC total emissions of CO2 by humans since 1750. Any man-made emissions are said to go straight to the ‘bottom line’ as an increase in concentration. Because of this immediate cause and effect we have therefore translated these additional emissions directly to an equivalent ppm amount, a rather crude reference point. We had as a guideline, however, the known ppm values from today back to 1958, when Charles Keeling first took his measurements. N.B. Natural emissions are annually 20 times greater than man’s.
The lower green line illustrates the actual levels of man-made CO2 still in the atmosphere in any year, allowing for the decomposition of CO2 over a 50-year period.
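For readers wanting to reproduce that green line, here is a rough sketch of the two steps described: convert cumulative emissions to a ppm equivalent (using the standard conversion of roughly 2.13 GtC per ppm of CO2) and decay each year's pulse with the assumed 50-year lifetime. The yearly emission figures below are invented placeholders for illustration, not the CDIAC series.

```python
import math

GTC_PER_PPM = 2.13    # approximate conversion: gigatonnes of carbon per ppm of CO2
TAU = 50.0            # assumed atmospheric lifetime in years (the figure used above)

# Placeholder emission pulses (GtC in a given year); the real input would be
# the full CDIAC annual series.
emissions = {1750: 0.01, 1850: 0.1, 1900: 0.5, 1950: 1.6, 2000: 6.8}

def ppm_still_airborne(target_year):
    """Man-made ppm remaining at target_year, with each pulse decaying exponentially."""
    total = 0.0
    for year, gtc in emissions.items():
        if year <= target_year:
            total += (gtc / GTC_PER_PPM) * math.exp(-(target_year - year) / TAU)
    return total

for y in (1900, 1950, 2000):
    print(y, round(ppm_still_airborne(y), 2), "ppm remaining from these pulses")
```

Whether a single 50-year exponential is the right description of CO2 decay is, of course, exactly what is argued over later in this thread.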
After plotting all the known fixed points (please refer to the graph) it appeared to show that either:
1. CO2 has no relationship whatsoever to temperature; it can be equally as warm at the 280ppm pre-industrial level as it can at today’s 380ppm level.
2. Alternatively, the estimate of 280ppm taken from ice cores is false and other CO2 peaks and troughs need to be factored in to create any sort of CO2/temperature relationship.
3. Alternatively, CO2 lags temperature rise by up to 800 years (as suggested by ice cores), so the current rise in temperatures is a response to the Medieval Warm Period, not the modern warm period.
It can also be seen that temperature rises appear to predate CO2 increases, and there were various times, even in the Little Ice Age, when temperatures were as warm as today. Hadley CET is said to be ‘indicative’ of the Northern Hemisphere.
http://cadenzapress.co.uk/download/mencken.xls
It seemed worth investigating proposition 2 further, as the graph seems to cry out for additional CO2 spikes to be inserted in order to provide some correlation with fluctuating temperatures.
Consequently the work of Beck was inspected in considerable detail (Beck believes there are many reliable CO2 readings prior to 1958 and that these demonstrate much greater variability than the ice cores suggest). Some of Beck’s historic measurements were factored in, and this appeared to show some interesting correlation. There has been much criticism of Beck, and his claims have been somewhat derided by warmists and sceptics alike. Consequently a detailed history of pre-1958 CO2 measurements was also researched, in the context of the widespread taking of measurements from 1820 onwards. These were needed for scientific and practical reasons, for example within mines, hospitals and workplaces. The British Factories Act of 1889 set a limit of 900ppm inside cotton factories.
From this research it is difficult to come to any conclusion other than that very many high-quality scientists, some of them Nobel winners, from Victorian times onwards were perfectly capable of taking very accurate measurements, which showed variations from around 280 to 400ppm or so (being capable and actually achieving it are two different things).
Of course, if this interpretation is accepted it means the ice core measurements are incorrect.
I subsequently re-examined Beck’s information as I wanted to insert some of the readings, so I took the following jpeg from Beck’s paper http://www.biokurs.de/treibhaus/180CO2/Bad_Honnef/bhonnef1e.htm
and put it over my graph (yellow dots are his readings). I think I have much better temperature accuracy than he does, as the fit is rather close.
http://cadenzapress.co.uk/download/beck_mencken_with_decomposition.xls
Especially note the rise in CO2, at a rate very similar to current levels, during the period 1920 to 1950. Also note we do not have a measurement for each year, so we will be missing some peaks and troughs of CO2 readings. We are also missing the lower CO2 readings, which I intend to insert shortly.
Now it may be that Beck cherry-picked measurements BECAUSE he knew they correlated with various temperatures. However, I am not sure this is so, as his jpeg would have reflected this relationship more closely than it does.
It is my intention to work on other graphs and readings to examine Beck’s findings more accurately.
In the meantime others should read Ferdinand Engelbeen’s excellent website debunking Beck so they can see a technical critique of Beck’s claims; mine is purely looking at the historical and social context, inasmuch as measurements were taken frequently for a variety of reasons. Some of the higher ones may have been taken to measure a known CO2 hot spot in a factory with a view to prosecution.
TonyB

evanjones
Editor
November 22, 2008 6:05 pm

If soils release 10 times more CO2 than all human activities combined, perhaps we should just pave over more raw land. That would easily solve the CO2 problem. No-brainer.
Sure, soils emit all that CO2.
The catch is that the soils also absorb more than they emit, including a share of what man emits.

anna v
November 22, 2008 9:54 pm

Tony B
Your plots are not displaying since one cannot show plots here. You should give links.
Do you have a link for Ferdinand Engelbeen’s web page? Nothing came up as such when I searched with Yahoo.

November 23, 2008 12:50 am

Anna V
Here is the link to Ferdinand’s critique of Beck
http://www.ferdinand-engelbeen.be/klimaat/beck_data.html
I have just clicked on the links in my initial email and everything is coming up as it should and they are displaying correctly. Anyone else having problems?
TonyB

anna v
November 23, 2008 4:17 am

TonyB
I can see links that come paragraphs after you mention Graph 1, to which one should refer. Which link has Graph 1?
Thanks for the Engelbeen link.
My main “point” is the total acceptance that ice core data are a good background estimate. Quote:
“But the averages measured over land in the period 1935-1950 (15 years) is about 100 ppmv higher than in the ice core. That proves that the land based measurements show positive biased values, which have no resemblance to the real historical background (which was already obvious in the previous chapter).”
In view of
a) the Airs data http://airs.jpl.nasa.gov/
b) this present post about “missing black carbon and resultant CO2 emission”
I would find it not surprising for the background values over land in the northern hemisphere to be 100 ppm higher than the values in arctic regions.
And this, without checking into how all these values that are accepted as bible truth are calibrated.

November 23, 2008 7:14 am

Hi Anna
This is Graph 1, which clearly showed that ‘something’ seemed to be missing:
http://cadenzapress.co.uk/download/mencken.xls
I took the material from Beck, as per the link below:
http://www.biokurs.de/treibhaus/180CO2/Bad_Honnef/bhonnef1e.htm
and superimposed it over Graph 1 to produce the file linked below. In addition I took into account the 50-year half-life of CO2 and put that in as a green line. (It needs to be plotted back to the 1750 line.)
http://cadenzapress.co.uk/download/beck_mencken_with_decomposition.xls
The Beck plots are shown as yellow dots. It must be pointed out that Beck has stressed to me that not all the available data are reliable, due to flaws in measurement methods, too few readings taken over a specified period, etc.
I am satisfied the scientists of the era were perfectly capable of taking accurate measurements, but as in all walks of scientific life some would not stand close scrutiny. Consequently I am investigating in detail a small number of what are said to be the more reliable plots to determine the exact circumstances of the readings.
Three of the people I respect most in climate work are Ferdinand E, Steve M and Anthony Watts. It is somewhat daunting therefore to say that I disagree with ALL of them and think there is more to Beck’s work than perhaps they think!!
I distrust nice smooth lines such as we have for the Keeling curve, and having investigated the history of how the measurements came about, around 1955, I think Keeling relied too much on Callendar’s work in setting the 1900 benchmark at 295ppm. If anyone is interested (and I appreciate this is a science blog) I can post some of the history of the matter that persuaded Keeling to accept 295ppm as his benchmark, and which helped persuade him there was a Keeling ‘curve’, rather than accepting that his work in 1958 was merely another plot of 315ppm which fitted into natural CO2 variability.
I am afraid I am also rather sceptical about ice core readings and think previous atmospheric variability up to modern levels is a more rational explanation for the lack of correlation I perceive in the graph. Otherwise we are saying higher temperatures than the present are possible at 280ppm pre-industrial levels back to 1660. Factor in the MWP, the Roman warm period and the Holocene warm periods, and in effect temperatures could be warmer at 280ppm than they are at 380ppm.
Chemical measurements would be accurate to 3% (or better), so would be close enough to clearly demonstrate previous levels.
Perhaps the 100ppm we are said to have added since 1750 doesn’t matter as the climate doesn’t react to anything much over 300?
TonyB

beng
November 23, 2008 7:31 am

SteveSadlov, while I agree that melting from black carbon is an endothermic action, I don’t think you’re seeing the big picture. The overall picture is that less sunlight is reflected back into space by the carbon compared to pristine snow, so more net energy is absorbed by the surface. Comparing the two cases, even at the same temp (say freezing), the black carbon situation has more liquid water than the pristine-snow case, and hence more total “heat”.

November 23, 2008 11:11 am

I always enjoy reading Ferdinand Engelbeen’s skeptical comments here, and he makes some very good points regarding the Beck analysis. But I have a few quibbles. Engelbeen states:

Well, the release of 210 GtC (= 100 ppmv) in 7 years time is theoretically possible as result of a huge release from volcanoes, (undersea) vents, meteorite impacts, etc… Or burning 1/3th of all vegetation on earth… There is no sign that something like that happened, but it is possible. But the opposite way: that 210 GtC were absorbed in ten years time, either by vegetation (that is one third of all vegetation as extra growth) or oceans, is physically impossible. There simply is no process in the natural world which can absorb such a quantity of CO2 in such a short time. This in fact refutes the probability of such a peak value around 1943.

Why is there no mention of the fact that in 1943 the world was at war, which resulted in very rapid industrialization, along with the incineration of entire cities? Omitting that world event seems to have been a major oversight in the Beck critique, particularly since the critique assumes, as an unarguable fact, that human production of CO2 is the reason for its rise — even though it has been shown that rising CO2 levels follow centuries-earlier rises in temperature.
As the war wound down, CO2 levels of course diminished; daily bombing runs by multi-thousands of heavy aircraft stopped, fires were extinguished, factories were taken off 24/7/365 schedules, etc.
To unequivocally state that there is ‘simply no process’ that could possibly account for declining CO2 levels is to assume that all knowledge of CO2, including its persistence in the atmosphere, is currently known.
As Prof. Freeman Dyson has pointed out, we know little about the effect of topsoil in absorbing carbon dioxide. Microbes in the soil multiply rapidly, many times per day. Their growth could easily increase if sufficient food were available. CO2 is, of course, plant food.
Engelbeen’s critique also ignores the much larger CO2 peak and subsequent decline in the very early 1800s, when the industrial age was just beginning. That also appears to be a glaring omission.
Not all of Beck’s data was collected from cities. His site shows pictures of locations where CO2 data was collected. Many of those locations were very isolated places like the rural Scottish coast and islands in the Baltic Sea, with very little human habitation. Data was also collected from ships crossing the open ocean. To arbitrarily discard Beck’s entire data set simply because some locations probably had higher than normal levels of CO2 is almost as sloppy as reversing the critique’s CO2/year chart’s x and y axes.
Beck’s reconstruction has problems. But it also has value. It is one of the few historical CO2 data sources available. And as we know, the Vostok source has accuracy problems, too.
Finally, scientists of the day did not receive lucrative financial grants in return for peddling their case; they were people who were truly interested in the atmosphere, and they recorded data for their own knowledge. That in itself is quite different than the current state of affairs. The work of Einstein’s contemporaries had more integrity than what government entities produce today.

Guido
November 23, 2008 12:23 pm

Engelbeen [quoting]: “In reality, the upper limit of the wildfires is less than 1/1000th of the yearly global emissions of 9 GtC or 9,000 million tons by burning fossil fuels.”
Not sure how you determined this upper limit, but most studies indicate a number close to 2 Pg C / year for wildfires and anthropogenic fires combined. That is almost 1/4 instead of 1/1000. True, real “wild” wildfires are a relatively small fraction.
Good point about most of it being just “fast respiration” (CO2 emitted that was previously captured by photosynthesis). Still, the fraction associated with deforestation or an increase in fire activity is considerable. This is a net source.
Fires also play a role in explaining interannual variability in growth rates of CO2 and CH4; for example, the large 1997/1998 fires in Indonesia mentioned earlier in the thread explain part of the high CO2 and CH4 growth rates in that period.

AnyMouse
November 23, 2008 7:15 pm

Now I learn that you burn a forest and black carbon accumulates in the soil, and takes forever to release its charge of CO2. But what gushes into the atmosphere during the burning process? Surely copious volumes of CO2?? And from the next fire that occurs the next day?

What is being measured is the carbon which remains after burning. Plants take carbon from the air while they’re growing. Eventually they burn or rot, both releasing some carbon to the atmosphere. Frequent burning leaves a lot more carbon in the soil than trying to stop burning, because a fire after 40 years is both more destructive and leaves less carbon in the soil (indeed, an intense fire burns carbon out of the top layer). Much land used to be burned often, trapping large amounts of carbon. In the Americas this was reduced after 1492 and actively stopped in much of the world around 1900 for various reasons.

Marcus
November 24, 2008 7:31 pm

Re: The PhysOrg article and what it means for “overprediction” by climate models:
Reading the actual text of the article, for a 4.5 degrees C warming by 2100, the cumulative carbon emission overestimation from Australia for the century would be 135 Tg of carbon. Australia is about 5% of the world’s land mass, so let’s be generous and assume that this overestimation is both valid and globally applicable. 20*135Tg = 2.7GtC. That’s, um, less than half of this year’s total carbon emissions – spread out over 100 years!
Whoopee. That will reduce projections a whole heck of a lot.
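For anyone checking the arithmetic above, here is the same back-of-envelope calculation spelled out (the ~8 GtC/yr figure for current annual fossil emissions is an assumption added here; the rest comes from the numbers above):

```python
# Back-of-envelope scaling of the Australian overestimate to the globe.
australia_overestimate_tgc = 135.0   # Tg C over 100 years (from the comment above)
australia_land_share       = 0.05    # Australia ~5% of global land area
annual_fossil_gtc          = 8.0     # assumed recent annual fossil-fuel emissions, GtC/yr

global_overestimate_gtc = (australia_overestimate_tgc / australia_land_share) / 1000.0
print(f"Global 100-year overestimate: {global_overestimate_gtc:.1f} GtC")
print(f"Fraction of one year's fossil emissions: "
      f"{global_overestimate_gtc / annual_fossil_gtc:.0%}")
```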

Marcus
November 24, 2008 7:48 pm

And on the Beck argument: Listen to Anthony. Look at the scatter in the chemical measurements before the Mauna Loa record: you see jumps of 100+ ppm in a year or two. Is it just coincidence that scatter of this magnitude completely disappears when we develop a more reliable monitoring system? And there are stations in the southern hemisphere that show pretty much the same signal as Mauna Loa, just with an opposite, smaller seasonal pattern and a slightly lower CO2 concentration (which makes sense: smaller land mass = less vegetation doing the grow/die cycle, less industry and more ocean sink meaning slightly lower CO2 because it does take a couple years for the atmosphere to mix between hemispheres).
Regarding the original question: the reason for satellite monitoring of CO2 is not because we don’t know global mean CO2, or that the rise is anthropogenic, but rather because there are important details we still don’t have a good grasp of. The biggest of which are natural emissions and uptake: where is the ocean taking up carbon? Releasing it? Which forests are sucking carbon up? How much CO2 does land use change cause to be released? How much carbon does no-tillage agriculture sequester? We have a good bound on how much we emit from fossil fuels, because that is an easy calculation: just weigh the oil + coal + natural gas that you burn, and you know how much CO2 you produce. And yes, we can measure the atmosphere quite well at multiple sites around the globe. So we know what the sum of the ocean + ecosystem + land use change is pretty well, but we don’t know how to partition between them.
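The bookkeeping described there is just mass balance: the combined natural sink is whatever is left after subtracting the observed atmospheric rise from the known emissions. A sketch with round, assumed numbers (not an official budget):

```python
# Simple global carbon mass balance with rough, assumed numbers of
# roughly-2008 vintage (GtC per year); not an authoritative budget.
fossil_fuel_emissions = 8.0   # well constrained from fuel statistics
land_use_emissions    = 1.5   # deforestation etc., much less certain
atmospheric_increase  = 4.0   # ~2 ppm/yr observed, at ~2.13 GtC per ppm

# Whatever was emitted but did not stay in the air went into the ocean
# and the land ecosystem combined.
combined_sink = fossil_fuel_emissions + land_use_emissions - atmospheric_increase
print(f"Combined ocean + land sink: {combined_sink:.1f} GtC/yr")
```

How that combined number splits between ocean uptake, forest regrowth and so on is the part that atmospheric measurements alone cannot resolve, which is the argument for satellite monitoring made above.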
I think that all the “skeptics” who jump on the “CO2 rise isn’t anthropogenic” bandwagon and similar poorly-supported memes really don’t help the cause of those skeptics who actually understand the science.
I personally happen to believe that the mainstream climate science consensus is fairly good, but I also believe there is value in websites to keep the consensus honest by documenting poorly located weather stations etc. I’m looking forward to seeing the analysis on the top quality sites, and seeing how it matches the standard temperature records. Also I’m looking forward to seeing comparisons between the Climate Reference Network and USHCN when that is on line. But I think the noise level at these sites can get pretty high, and sometimes I wish the skeptic community could filter out the really junk stuff so we can argue about the stuff that is actually worthwhile…

Marcus
November 24, 2008 7:57 pm

Finally, on Steve Sadlov and the “cooling by melting” argument:
Think systems. If you have an ice cube in an insulated jar and the ice cube melts, yes, the air temperature in the jar will drop. But the total energy in the system stays the same. Which means you are just shifting energy around in the system. On Earth, that means you won’t get long term cooling just by throwing salt on all your ice and melting it. You’ll get short term cooling, but the next winter the ice will all refreeze and release all the heat right back out again.
Long term heating (or cooling) requires somehow changing the energy balance of the system. Black carbon does this by lowering the albedo, thereby absorbing more sunlight instead of reflecting it out to space. In this case, you probably won’t even get short-term cooling, because the heat of fusion is being supplied by the heat that the black carbon is capturing, which wouldn’t have been in the system otherwise. And if enough snow melts to expose dark ground… well, there’s your positive feedback.

November 25, 2008 1:06 am

Marcus
Let us step back a bit.
Recorded temperatures back to 1660 show similar values to today’s without added CO2. The previous warm periods rose to greater levels without added CO2. So is man-made CO2 the ‘guilty ingredient’ when the suspect was actually missing from the scene of the crime?
CO2 is a small part of overall greenhouse gases, and man-made emissions are a small part of that. It is the notion of ‘equilibrium’ that has decided that the increase in man-made CO2 levels has nothing to do with the greater natural amount being emitted annually, and is the tipping point.
CO2 is said to degrade with a half-life of anything from 5 to 50 years, depending on who you believe, so much of what we have put into the air has gone into a sink.
If man has added 100 ppm over 250 years that equates to 10 parts per 100,000. In UK terms it means we have added half a part since we started industrialisation in 1750, much of which has subsequently gone into a sink.
Now, CO2 is stored in all sorts of places, according to Eli Rabett’s concept of ‘boxes’. Do we really know exactly how big those ‘boxes’ are and the rate at which they release CO2 back, according to, say, the temperature of the ocean? Can we be certain we have identified all the boxes, let alone their size?
The thing about Beck’s work is that it does provide us with a historic record, and the historic temperature spikes make no sense unless there are historically higher and lower CO2 readings.
Even in Victorian times CO2 analysing methods could be perfectly accurate, and measurements were commonly taken for medical and employment reasons; the cotton industry had particular problems with CO2 because of ventilation concerns, and the first Factories Act set a limit of 900ppm in 1889. This had been debated by the British Parliament for twenty years prior to that, and its success was reviewed in Parliament 15 years after.
Levels aren’t set unless they can be enforced.
Levels were commonly taken, including by many highly experienced scientists, some of them Nobel winners. Those by Buchanan and Benedict seem particularly interesting. Not all measurements would be correct, of course. Many that were correct may seem high to us but may have been taken in a known ‘polluted’ area precisely because it was polluted, in order to see if CO2 conformed to agreed limits.
We greatly underestimate our forefathers’ scientific abilities, on which the modern age is founded, if we think that NONE of the hundreds of thousands of measurements from around 1820, when Saussure took readings around Geneva, can be accurate.
The modern level of 295ppm set by Callendar as the 1900 benchmark was an arbitrary figure, set to support his own theory that Man was responsible for causing the ‘greenhouse’ effect.
I am currently investigating some 15 readings which Beck believes can be termed ‘reliable’ in order to determine if they really are. Until then I would say his work is very interesting but needs to be fully proven.
TonyB

Marcus
November 25, 2008 4:01 pm

TonyB: Let’s see:
First: the CET temperature record does not come close to being a good representation for global temperatures. 1660 was _not_ at today’s temperatures globally.
Second: CO2 is _not_ the only factor involved in climate change. Everything from volcanoes to solar changes to ocean current changes to orbital changes to aerosols to methane to… well, the list goes on. You need to look at all the known forcings to decide what is responsible for what temperature change. (recently, CO2 has been the largest positive change in forcing, followed by methane and black carbon, with aerosols and the occasional volcano being large negative forcings)
Third: Man-made emissions are small compared to _gross_ natural emissions, but are large compared to _net_ natural emissions. Man is emitting, on average, twice as much CO2 as we see in terms of atmospheric increase, with the balance ending up in the ecosystem and oceans.
Fourth: CO2 does not degrade with a “5 to 50 year” half-life. It goes into three places: the atmosphere, the ocean, and the ecosystem (with a very small component being lost to the system through weathering and deep ocean sediment formation). In the whole system, additional carbon (from fossil fuel combustion) has a half-life, due to the weathering component, of thousands of years. If you care about the atmosphere, the half-life is complicated, because it will start out fast and then slow down as the various uptake (ocean + ecosystem) mechanisms approach equilibrium with the atmosphere. The common simplification is around 100 years, though again, a half-life is not a good measure (a sketch follows below).
And again, we know we have accurate measurements at various isolated places around the globe ( http://cdiac.ornl.gov/trends/co2/sio-keel-flask/sio-keel-flask.html ) which all match each other. In none of them do we see more than a few ppm seasonal variation year to year during a 5 decade steady rise. Compare this to Beck’s record, where CO2 jumps all over the place. And there is no theoretical explanation for Beck’s record… other than the obvious one, which is that there were a large number of bad measurements back then.
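On the half-life point above: a common way to capture “fast at first, then slow” is to write the airborne fraction of a CO2 pulse as a sum of exponentials rather than a single decay. The coefficients below are assumptions chosen for illustration, in the spirit of Bern-type impulse-response fits, not an authoritative parameterisation.

```python
import math

# Illustrative impulse response for a 1-unit CO2 pulse: several uptake processes
# with different time scales, plus a fraction that persists on century scales.
# The numbers are assumed for illustration only.
terms = [
    (0.22, None),    # effectively permanent on the time scales shown here
    (0.26, 170.0),   # slow deep-ocean mixing (years)
    (0.34, 20.0),    # upper ocean and biosphere uptake
    (0.18, 2.0),     # fast initial equilibration
]

def airborne_fraction(t_years):
    """Fraction of the original pulse still in the atmosphere after t years."""
    frac = 0.0
    for amplitude, tau in terms:
        frac += amplitude if tau is None else amplitude * math.exp(-t_years / tau)
    return frac

for t in (1, 10, 50, 100, 500):
    print(f"after {t:>3} yr: {airborne_fraction(t):.2f} of the pulse remains")
```

On a curve like this there is no single half-life: the pulse drops below half within a few decades, but a fifth or so of it is still airborne centuries later, which is why a single “5 to 50 year” figure misleads.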

November 26, 2008 12:08 am

Marcus
Let’s look again:
Holocene warm periods at 280ppm: warmer than present
Roman warm period at 280ppm: warmer than present
MWP at 280ppm: warmer than present
The middle of the LIA (I did not say 1660; look at the other periods in the graph): around as warm as present
I think history is telling us something about:
Either
a) The overall importance of CO2 related to temperature
OR
b) Our knowledge of past CO2 readings is faulty
You can’t have it both ways.
In all previous warming instances CO2 wasn’t even at the scene of the crime.
I do not agree with your ‘obvious one’. Have you actually studied the history of how and why measurements were taken, and the credentials of the numerous scientists who took them? Do you really think our forefathers were so dumb that they couldn’t take ANY accurate measurements?
I would be interested in hearing from you how much CO2 is sequestered and where, the relative ‘half-life’ of man-made and natural emissions, and also why ‘OUR’ emissions, although smaller than nature’s, are the ones that disturb the equilibrium.
You can also tell me the source of your 1660 ‘global’ temperatures; the ones produced these days are a farce, let alone any concocted back to 1660.
TonyB.

anna v
November 26, 2008 4:35 am

Marcus (16:01:52) :
And again, we know we have accurate measurements at various isolated places around the globe ( http://cdiac.ornl.gov/trends/co2/sio-keel-flask/sio-keel-flask.html ) which all match each other. In none of them do we see more than a few ppm seasonal variation year to year during a 5 decade steady rise.
In the AIRS data, http://airs.jpl.nasa.gov/ there are up to 15ppm variations of CO2 with longitude and latitude at a given time, and these measurements are taken rather high up, not near the surface.
I do not trust measurements of CO2 going through the same channels that give us measurements of ground temperatures. If they have managed to make such a mess of such a simple reading as a temperature, I would triple audit any CO2 results.

November 26, 2008 11:48 am

Anna
I do not trust ice cores that tell us that pre-industrial levels were 280ppm, for the same reason I don’t trust global temperatures or the Mann hockey stick: they are all there to give us their version of the narrative of global warming.
It is a fact that temperatures can be as high as or higher than today’s at levels said to be 280ppm. If we find that CO2 levels could be as high as today’s, that destroys the ice core readings and the narrative. Until I check the 15 readings that Beck has given me I can’t comment on their accuracy. However, I can certainly state categorically that the idea that NONE of the thousands of readings taken prior to 1958 are accurate is fanciful in the extreme.
The figures produced by GS Callendar around 1955 were highly selective, in order to support his own theory of AGW, and Charles Keeling did not have the knowledge at the time to dispute them.
One thing in Beck’s favour is that he publishes all his references online, unlike those who refuse to divulge their information and have caused a number of requests to be made under the Freedom of Information Acts to try and dig the information out.
TonyB

anna v
November 26, 2008 8:44 pm

TonyB
“I do not trust ice cores that tell us that pre industrial levels were 280ppm”
What I am trying to say, referring continuously to the AIRS maps, is that maybe the ice core measurements were 280 because they were in regions where there was a dearth of CO2, a la the AIRS maps. In addition I would expect that close to the ground/ocean surface the latitude/longitude differences would be much greater, and, considering that ice forms where the temperature is less than 0 centigrade, I would expect a huge ocean sink next to the ice that was being formed (and later sampled and measured).
So the ice cores might be accurate, for the polar regions.

November 27, 2008 12:09 am

Anna
Interesting thoughts. Have you ever come across any references to confirm your ideas?
Low CO2 readings are recorded at times of historically lower temperatures, so would we automatically expect ice cores to reflect that, or are we talking about two completely different issues here?
Any links to articles will be read with interest
TonyB

anna v
November 27, 2008 9:20 pm

TonyB
Low CO2 readings are recorded at times of historically lower temperatures, so would we automatically expect ice cores to reflect that, or are we talking about two completely different issues here?
I think it is two different things, and it depends on whether CO2 is homogeneous over the globe and in the atmosphere. We know the temperatures are not. Up to the publication of the AIRS maps I have not read of anybody seriously challenging this. There have been challenges on the ice core method and whether it retains information from the past or is contaminated by the process of extraction.
When we look at the ice core temperature record do we accept that this is the temperature of the whole globe? Below 4 degrees C? It is the temperature of the region where the ice formed. The same holds for CO2, is all I am saying.
I think that the whole CO2 record is at the stage the temperature records were before the scrutiny of people like Anthony: people trusted the scientific expertise and integrity of the climate science community.
It needs auditing.

anna v
November 27, 2008 9:55 pm

TonyB
there is a discussion on CO2 over at Lucy’s forum:
http://www.greenworldtrust.org.uk/Forum/phpBB2/viewforum.php?f=22
The format is easier than a blog because recent items come up on the list and it is easier to follow.