CO2 report – estimated to be "highest in 15 million years"

Another paper for the Copenhagen train. This is an estimate, according to the abstract. Here’s the abstract and the supplemental information; of course, the publicly funded paper itself is behind the AAAS paywall.

From UCLA News: Last time carbon dioxide levels were this high: 15 million years ago, scientists report

By Stuart Wolpert October 08, 2009 Category: Research
More ice hockey - last 1000 years of CO2 from Vostok
You would have to go back at least 15 million years to find carbon dioxide levels on Earth as high as they are today, a UCLA scientist and colleagues report Oct. 8 in the online edition of the journal Science.
“The last time carbon dioxide levels were apparently as high as they are today — and were sustained at those levels — global temperatures were 5 to 10 degrees Fahrenheit higher than they are today, the sea level was approximately 75 to 120 feet higher than today, there was no permanent sea ice cap in the Arctic and very little ice on Antarctica and Greenland,” said the paper’s lead author, Aradhna Tripati, a UCLA assistant professor in the department of Earth and space sciences and the department of atmospheric and oceanic sciences.
“Carbon dioxide is a potent greenhouse gas, and geological observations that we now have for the last 20 million years lend strong support to the idea that carbon dioxide is an important agent for driving climate change throughout Earth’s history,” she said.
By analyzing the chemistry of bubbles of ancient air trapped in Antarctic ice, scientists have been able to determine the composition of Earth’s atmosphere going back as far as 800,000 years, and they have developed a good understanding of how carbon dioxide levels have varied in the atmosphere since that time. But there has been little agreement before this study on how to reconstruct carbon dioxide levels prior to 800,000 years ago.
Tripati, before joining UCLA’s faculty, was part of a research team at England’s University of Cambridge that developed a new technique to assess carbon dioxide levels in the much more distant past — by studying the ratio of the chemical element boron to calcium in the shells of ancient single-celled marine algae. Tripati has now used this method to determine the amount of carbon dioxide in Earth’s atmosphere as far back as 20 million years ago.
Aradhna Tripati
“We are able, for the first time, to accurately reproduce the ice-core record for the last 800,000 years — the record of atmospheric CO2 based on measurements of carbon dioxide in gas bubbles in ice,” Tripati said. “This suggests that the technique we are using is valid.
“We then applied this technique to study the history of carbon dioxide from 800,000 years ago to 20 million years ago,” she said. “We report evidence for a very close coupling between carbon dioxide levels and climate. When there is evidence for the growth of a large ice sheet on Antarctica or on Greenland or the growth of sea ice in the Arctic Ocean, we see evidence for a dramatic change in carbon dioxide levels over the last 20 million years.
“A slightly shocking finding,” Tripati said, “is that the only time in the last 20 million years that we find evidence for carbon dioxide levels similar to the modern level of 387 parts per million was 15 to 20 million years ago, when the planet was dramatically different.”
Levels of carbon dioxide have varied only between 180 and 300 parts per million over the last 800,000 years — until recent decades, said Tripati, who is also a member of UCLA’s Institute of Geophysics and Planetary Physics. It has been known that modern-day levels of carbon dioxide are unprecedented over the last 800,000 years, but the finding that modern levels have not been reached in the last 15 million years is new.
Prior to the Industrial Revolution of the late 18th and early 19th centuries, the carbon dioxide level was about 280 parts per million, Tripati said. That figure had changed very little over the previous 1,000 years. But since the Industrial Revolution, the carbon dioxide level has been rising and is likely to soar unless action is taken to reverse the trend, Tripati said.
“During the Middle Miocene (the time period approximately 14 to 20 million years ago), carbon dioxide levels were sustained at about 400 parts per million, which is about where we are today,” Tripati said. “Globally, temperatures were 5 to 10 degrees Fahrenheit warmer, a huge amount.”
Tripati’s new chemical technique has an average uncertainty of only 14 parts per million.
“We can now have confidence in making statements about how carbon dioxide has varied throughout history,” Tripati said.
In the last 20 million years, key features of the climate record include the sudden appearance of ice on Antarctica about 14 million years ago and a rise in sea level of approximately 75 to 120 feet.
“We have shown that this dramatic rise in sea level is associated with an increase in carbon dioxide levels of about 100 parts per million, a huge change,” Tripati said. “This record is the first evidence that carbon dioxide may be linked with environmental changes, such as changes in the terrestrial ecosystem, distribution of ice, sea level and monsoon intensity.”
Today, the Arctic Ocean is covered with ice all year long, an ice cap that has been there for about 14 million years.
“Prior to that, there was no permanent sea ice cap in the Arctic,” Tripati said.
Some projections show carbon dioxide levels rising as high as 600 or even 900 parts per million in the next century if no action is taken to reduce carbon dioxide, Tripati said. Such levels may have been reached on Earth 50 million years ago or earlier, said Tripati, who is working to push her data back much farther than 20 million years and to study the last 20 million years in detail.
More than 50 million years ago, there were no ice sheets on Earth, and there were expanded deserts in the subtropics, Tripati noted. The planet was radically different.
Co-authors on the Science paper are Christopher Roberts, a Ph.D. student in the department of Earth sciences at the University of Cambridge, and Robert Eagle, a postdoctoral scholar in the division of geological and planetary sciences at the California Institute of Technology.
The research was funded by UCLA’s Division of Physical Sciences and the United Kingdom’s Natural Environment Research Council.
Tripati’s research focuses on the development and application of chemical tools to study climate change throughout history. She studies the evolution of climate and seawater chemistry through time.
“I’m interested in understanding how the carbon cycle and climate have been coupled, and why they have been coupled, over a range of time-scales, from hundreds of years to tens of millions of years,” Tripati said.
In addition to being published on the Science Express website, the paper will be published in the print edition of Science at a later date.
UPDATE: Bill Illis adds this graph in comments, which brings up the obvious correlation questions.
346 Comments

Richard
October 18, 2009 1:36 pm

The gang at RC “referee” each other’s PAPERS in an old boys’ network – that should have been

cba
October 18, 2009 2:29 pm

Joel,
I’ve got to prepare for a conference this week and I don’t have access to money-locked papers at home, so I will not be able to look at 33 and 34 for a while. I did note that the paper is based upon the first few years of ERBE and is one of the first to be done. I find Kiehl & Trenberth 97 a bit more interesting from that time frame, as they actually attempted to determine an energy budget and compare it to various measurements, including ERBE. It’s interesting that they differ (evidently) by a substantial amount for their atmospheric effects as compared to Ramanathan’s cloud number. Back then, though, aerosols, dust, and scattering were very much unknowns and were assumed to have more significant effects than is believed now.
Enjoy untangling and following the previous post. It would seem that a number of my assumptions about your reference were spot on – such as the use of averages rather than actual effects in those numbers. I didn’t have time to verify whether all my assumptions about the paper were correct, though.

P Wilson
October 18, 2009 11:00 pm

Joel.
The fallacy comes mainly from NASA and the IPCC. Of course, you see what you want to see, but the claims made by RC are quite convoluted. Not that there aren’t enough institutions here in the UK that interpret science and physics however they find expedient. The chief problem is that the effects they infer just aren’t happening.
I was rather hoping, since you mentioned my ignorance of the history of science, to move it onto the scientist Kelvin, who was quite robust on the physics of heat – which is energy transfer. It is not an absolute or a constant, any more than resistance and force are constants from a fulcrum, or any more than wind is a constant. It changes form, so there is no reason to assume that heat leaving a system has to equalise with heat coming in – although Kelvin claimed that heat might be lost to man but not to the universe (since it changes form) – which includes biological energy transformation. Earth simply doesn’t radiate as much heat as is claimed, so these numbers have been concocted without being able to justify them.
By your own admission somewhere else, air doesn’t increase ocean heat quickly (although evidence says not at all), so if precipitation occurs it can only be solar forcing – it is oceans which cause precipitation, not the atmosphere. This is how the AGW farrago is in a mess. It is reasoned that if air is heating quickly by ‘ghg forcing’, the oceans are not. Therefore little precipitation would occur, save for the occasional extreme tempest. This isn’t happening either. In truth, precipitation is increasing everywhere – even in the Sahara. On the basis of this it was concocted that CO2 forcing heats the oceans, although no mechanism is provided for it, because that isn’t happening either, so the theory is in a mess. Even
Anyhow, it is still a mystery how it could be maintained that the upper troposphere is the region of greatest importance for ghg forcing. It is -50 to -70 C in this region. No explanation is offered of how these subzero temperatures can cause increased heat at lower tropospheric regions. There is none, unless one concocts another series of logic-defying manipulations with an esoteric set of a priori assumptions to contrive it – however, one would be going into the realms of pure science fiction, and presenting it as fact.
What I think we should do is look at the evidence on the basis of just that. I presume you didn’t read Monckton’s brief – it is all verified, particularly the section on how the IPCC have quietly changed numbers regarding CO2 ‘forcing’, after the original claims were so absurd. They will never get it right nonetheless, until they understand that water vapour is a far greater feedback than CO2, and CO2 is a rather extraneous feedback.

cba
October 19, 2009 5:59 am

Joel,
Talking to you, I keep getting the feeling that AGW proponents think that nature will not continue to do that which nature does – unless we do something to help it along. Things like lower-density water or air failing to rise … that sort of thing.

_Jim
October 19, 2009 6:52 am


cba (05:59:05) :
Joel,
Talking to you, I keep getting the feeling that AGW proponents think that nature will not continue to do that which nature does – unless we do something to help it along. Things like lower-density water or air failing to rise … that sort of thing.

I get the impression Joel et al. are simply advanced ‘script kiddies’ without the ability to interpret the technical material posted or referenced; sans ability to interpolate or extrapolate anything except simple linear relationships when values/situations appear outside ‘published bounds’, probably as a result of continued, merciless poundings at RC, exemplifying and continuing the ‘you are not allowed to think for yourself’ mentality.

cba
October 19, 2009 5:25 pm

Perhaps – but then it’s the sort of thing I expect from those single-specialty types in a multidiscipline environment as well.

Joel Shore
October 19, 2009 6:30 pm

_Jim: I do read RealClimate but I actually read serious texts on climate and atmospheric science and a lot of the original literature, the IPCC reports etc. And, since I have a PhD in physics, I can interpret it just fine…although I will admit that I still have a lot to learn about atmospheric and climate science.
By the way, I took the liberty of following your link to your “FreeRepublic” page (which may say something about you!) and I would note that your label of that graph as “incoming solar radiation” and your statement that “Earth radiates 160,000 times less than the sun” are both somewhat subject to misinterpretation. First, while it is true that the amount the sun radiates per square meter of its surface is ~160,000 times the amount the earth radiates, the earth actually intercepts only a small fraction of this, because by the time the radiation from the sun reaches the earth’s orbit, that radiation is spread out over a sphere of radius ~215 times the radius of the sun, which means the irradiance in W/m^2 is down by a factor of ~46,000 from what it is at the sun’s surface. And thus your plot showing the “incoming solar radiation” is not correctly normalized if, by incoming, you mean incoming to the Earth.
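For concreteness, here is that arithmetic as a short Python sketch, using standard textbook values for the solar temperature, solar radius, and Earth-Sun distance (these constants are assumptions of the sketch, not taken from the comment):

```python
# Back-of-the-envelope check of the inverse-square dilution (sketch only;
# T_SUN, R_SUN, and AU are standard values, not taken from the thread).
SIGMA = 5.67e-8         # Stefan-Boltzmann constant, W/m^2/K^4
T_SUN = 5772.0          # effective solar surface temperature, K
R_SUN = 6.96e8          # solar radius, m
AU = 1.496e11           # mean Earth-Sun distance, m
EARTH_EMISSION = 390.0  # surface emission at ~288 K, W/m^2

surface_flux = SIGMA * T_SUN**4     # ~6.3e7 W/m^2 at the Sun's photosphere
dilution = (AU / R_SUN)**2          # ~46,000: spread over a sphere at 1 AU
at_earth = surface_flux / dilution  # ~1360 W/m^2, the familiar solar constant

print(f"Sun/Earth surface-flux ratio: {surface_flux / EARTH_EMISSION:,.0f}")  # ~160,000
print(f"dilution factor: {dilution:,.0f}")
print(f"solar irradiance at 1 AU: {at_earth:.0f} W/m^2")
```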
Of course, to a very good approximation, the amount that the Earth radiates is in fact equal to the amount of radiation that it receives (i.e., absorbs) from the sun because even with the current increases in greenhouse gases, the Earth is still only a fraction of a percent out of radiative balance relative to the amount of incoming radiation.

cba
October 20, 2009 6:35 am

Joel Shore
“By the way, I took the liberty of following your link to your “FreeRepublic” page (which may say something about you!)”
That statement shows an awful lot more about you than it does about Jim … and not in any positive light whatsoever.
I gather you’ve been too busy to work your way through my post above.

Joel Shore
October 20, 2009 2:53 pm

cba says:

I gather you’ve been too busy to work your way through my post above.

Yeah … a combination of too busy and it going off into more details than I wanted to get into. My basic point was that, while cloud feedbacks are certainly a large source of uncertainty, things aren’t poised quite as sensitively with respect to clouds as one might naively think.
For example, I once had a person tell me that a 1% change in cloudiness would produce as much radiative forcing as doubling CO2. This was presumably obtained, sloppily, from the idea that a change in albedo by 0.01 – without any change to the outgoing longwave radiation – would amount to about a 3.5 W/m^2 change. And my point is that it is not really nearly that sensitive, both because a change of 0.01 in albedo would amount to about a 5%, not 1%, change in the albedo due to clouds, and also because there will tend to be partly-compensating effects from clouds’ effects on the outgoing longwave radiation.
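A minimal sketch of that arithmetic, assuming a mean insolation of ~340 W/m^2 and a cloud contribution of roughly 0.2 to the ~0.3 planetary albedo (round figures, not taken from the comment):

```python
# Forcing from a small albedo change vs. the relative size of that change
# (sketch; S0 and CLOUD_ALBEDO are assumed round numbers).
S0 = 1361.0                 # solar constant, W/m^2
MEAN_INSOLATION = S0 / 4.0  # ~340 W/m^2 averaged over the sphere
CLOUD_ALBEDO = 0.2          # assumed cloud share of the ~0.3 planetary albedo

d_albedo = 0.01
forcing = d_albedo * MEAN_INSOLATION  # ~3.4 W/m^2, about the 2xCO2 figure
relative = d_albedo / CLOUD_ALBEDO    # 0.01 is ~5% of the cloud albedo

print(f"forcing from a 0.01 albedo change: {forcing:.1f} W/m^2")
print(f"relative change in cloud albedo: {relative:.0%}")
```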
You seem to be suggesting by your back-of-the-envelope calculations that the value for the net radiative forcing of about -18 W/m^2 for clouds computed from the ERBE data by Ramanathan et al. may be somewhat on the low (in magnitude) side. Perhaps that is the case; I would have to look into the literature more to see what the range of current best values is. But, at any rate, for the sort of very basic qualitative point I was making, it would not make a huge difference. Clearly, it would make a difference if one wanted to get very quantitative about it.

cba
October 20, 2009 6:53 pm

As stated, those numbers from Ramanathan are cloudy/clear averages – and they are quite old, from the earliest efforts at analyzing ERBE, which as I recall looks crudely at specific bands. It seems there are a number of newer instruments with higher resolution, etc. The back of the envelope suggests they are way off on both sides. You’ve got lots of radiative output at lower rates from clouds, as they are emitting continuum radiation. According to one measurement methodology, around 1997 there was a massive temperature spike related to a massive drop in albedo – ostensibly due to internal ENSO oscillations. The albedo effect dropped by nearly 10%, which amounts to over 10 W/m^2, and that is associated with a drop in cloud cover (Palle & Goode 07).
One of the problems when you start to talk about truly serious cloud cover – as in low overcast clouds, rather than the thin semi-transparent stuff near the stratosphere that has little to no effect on albedo anyway – is that you can lose 90%+ of the incoming solar power. Remember that there is more power in the incoming SW in the near IR than there is in the visible as well. Some of that power is reflected in albedo and some is absorbed. It doesn’t wind up going to ground. In the tropical areas where there really is significant incoming solar, a lot of that absorbed energy goes into daytime cloud and thunderstorm formation – and the energy is high up in the atmosphere, ripe to radiate out.
What you can do is create a couple of linear graphs (x axis is cloud cover fraction, 0-1.0) for what could be called an averaged cloud effect. No clouds results in almost 340 W/m^2 incoming, while 1.0 cloud cover is around 10% of that. Outgoing can be (back-of-the-envelope) estimated at 390 - 150 = 240 W/m^2 for clear skies, and then estimated for cloud tops at an altitude around freezing, with radiation by Stefan’s law for near 273 K. Draw the incoming and outgoing lines, as it should be rather linear with surface area. Look for the intersection – it should be in the general vicinity of just over 60%, or 0.6 cloud cover fraction. You should also note that the energy balance between in and out is not strictly a function of ghg absorption but is also a function of cloud cover fraction. Changing the cloud cover fraction can result in changing the balance point without changing temperatures. Another little tidbit there is the implicit suggestion that there is a strong feedback mechanism (actually a setpoint control mechanism using negative feedback) that determines the average cloud cover because otherwise there would be no common average temperature as it would randomly drift around. Unfortunately, it doesn’t really suggest what that mechanism is, though one can infer that it is the water cycle at work – power heats the surface, heat of evaporation is taken into the air, lighter warmer parcels rise, radiating out energy, forming clouds, dropping precipitation, cooling off, dropping out H2O content, falling down due to higher density, etc. – a classical cycle just like what one sees inside an individual thunderstorm or rainstorm.
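Here is that two-line exercise as a short script, using the end-point numbers quoted above. Worth noting: with these particular end points the lines cross nearer a cloud fraction of 0.26 than 0.6, and the crossing moves a long way with the assumed full-cloud absorption and cloud-top temperature, which rather underlines the sensitivity being described:

```python
# cba's averaged-cloud-effect graph as a calculation (sketch; end-point
# values are the ones quoted in the comment above).
import numpy as np

SIGMA = 5.67e-8
f = np.linspace(0.0, 1.0, 101)  # cloud cover fraction

ABSORBED_CLEAR = 340.0              # W/m^2 absorbed with no cloud
ABSORBED_CLOUDY = 0.10 * 340.0      # "around 10% of that" at full cover
OUTGOING_CLEAR = 390.0 - 150.0      # 240 W/m^2 escaping in clear skies
OUTGOING_CLOUDY = SIGMA * 273.0**4  # ~315 W/m^2 from 273 K cloud tops

incoming = ABSORBED_CLEAR * (1 - f) + ABSORBED_CLOUDY * f
outgoing = OUTGOING_CLEAR * (1 - f) + OUTGOING_CLOUDY * f

crossing = f[np.argmin(np.abs(incoming - outgoing))]
print(f"lines cross at cloud fraction ~{crossing:.2f}")
# With these numbers the balance sits near f ~ 0.26 rather than 0.6; small
# changes to the assumed end points move it substantially.
```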

Joel Shore
October 21, 2009 6:16 pm

cba says:

Some of that power is reflected in albedo and some is absorbed. It doesn’t wind up going to ground. In the tropical areas where there really is significant incoming solar, a lot of that absorbed energy goes into daytime cloud and thunderstorm formation – and the energy is high up in the atmosphere, ripe to radiate out.

I’m a little confused about what you are suggesting here. If you are going to claim that this affects the final radiative balance, the argument would presumably have to be that the upper troposphere ends up warmer than the models would expect relative to the surface … and yet most people here are quite insistent that the data actually show the opposite, i.e., that the amplification in the tropical atmosphere is not occurring, or at least not to the degree predicted by the models. I think that this is probably mainly a data-quality issue (along with correctly taking into account the error bars on the model predictions), but still, the idea that the upper troposphere is actually significantly warmer relative to the surface than the models predict seems like a bit of a stretch!

Changing the cloud cover fraction can result in changing the balance point without changing temperatures.

I agree.

Another little tidbit there is the implicit suggestion that there is a strong feedback mechanism (actually a setpoint control mechanism using negative feedback) that determines the average cloud cover because otherwise there would be no common average temperature as it would randomly drift around.

I don’t understand this claim at all. Actually, I am not really sure what you are saying exactly anyway. However, to the extent that you think that the lack of negative feedback would cause too much change in temperatures, it seems to me that this would then be apparent in the climate models…i.e., they would exhibit too much variability in temperature relative to the observations. However, I don’t think this is actually found to be the case.
And, with a strong negative feedback mechanism from clouds, it becomes difficult to explain the ice age – interglacial cycles without proposing some very big additional forcing that is currently being left out.

cba
October 21, 2009 9:10 pm

This is an area where Stefan’s law is adept at providing an understandable explanation. When you increase the ghg absorption in a layer of the atmosphere, you are increasing the emissivity of that layer as well. That means you’re radiating more power downward and more power upward, as Stefan’s law is integrated from a surface out in all directions away from that surface. You’ve increased the absorbed power of that layer by E and you’ve increased the emitted power by 2*E, so conservation of energy demands a decrease in that layer’s temperature. Actual arguments are more sophisticated than this, but that doesn’t really change the outcome and it makes the concept harder to grasp.
One can get a glimpse of positive or negative feedback from the average sensitivity of the atmosphere, based upon how much the average Earth temperature is above that of a blackbody with the same albedo (a 33 K rise for 150 to 160 W/m^2 of power gives a response of around 0.22 K rise per W/m^2). One can also get a glimpse of this value, at least for clear skies, by taking the increase in surface temperature required to overcome an increase of 1 W/m^2 in absorption with an increase of a little over 1 W/m^2 in emissions, using Stefan’s law. In clear skies, one has 390 - 150 = 240 W/m^2 escaping, or a fraction of emitted 240/390 = 0.62. That indicates 1/0.62 = 1.6 W/m^2 of emission results in 1 W/m^2 leaving the atmosphere. For 288.2 K, that means the 391.2 W/m^2 would have to increase to 392.8 in order for balance to be restored, ignoring convection. This corresponds to a temperature of 288.5 K, or a 0.3 K increase in temperature without convection. Note that convection at the surface is close to 100 W/m^2 but reduces to essentially nothing at the tropopause. It is what is needed to overcome all that water vapor located near the surface that is almost nonexistent way high up. Any increase in surface temperature creates an increase in convection, which is a negative feedback. The expectation of a large water vapor feedback not only ignores the water vapor cycle but must be based on a very paltry variation.
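The clear-sky numbers in that paragraph can be reproduced in a few lines (a sketch of the same Stefan’s-law arithmetic, ignoring convection as the comment does):

```python
# Reproducing the 0.3 K clear-sky, no-convection warming estimate (sketch).
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

T0 = 288.2
emit0 = SIGMA * T0**4  # ~391.2 W/m^2 emitted by the surface
escape = 240.0 / 390.0 # ~0.62 of emitted power escapes to space
extra = 1.0 / escape   # ~1.6 W/m^2 more emission per 1 W/m^2 leaving

T1 = ((emit0 + extra) / SIGMA) ** 0.25  # temperature giving ~392.8 W/m^2
print(f"required warming, no convection: {T1 - T0:.2f} K")  # ~0.30 K
```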
The comment you say you don’t understand simply states that the negative feedback is actually what is referred to as setpoint control using negative feedback. The statement of feedback is not actually correct conceptually.
Assuming an absence of negative feedback is to presume that convection doesn’t occur or isn’t a function of temperature differential or change. You’ll note that the above example shows the average effect per W/m^2, 0.22 K, is less than the 0.3 K rise of the differential temperature necessary for radiation alone. It also precludes the presence of some sort of net positive feedback other than more water vapor.
As for climate models showing anything actually related to the real world, they remind me of the old punch-card sorting equipment they used to have in the old carnival sideshows, billed as computer-based fortune telling.
The cloud feedback is simple to explain with a minimal bit of thought. Clouds are high albedo; liquid water at low angles of incidence (w.r.t. the normal) has tremendously low albedo, and water makes up most of the surface where there is a low incidence angle. When one develops a high-albedo surface condition, such as fresh snow, then one has a totally short-circuited situation where the setpoint system no longer functions at all, because there’s no difference with and without the clouds – not to mention the likelihood that there is far less H2O vapor present, so the actual ghg absorption drops as well: a double whammy of higher albedo and lower ghg absorption occurs. Something like massive volcanic eruptions and/or asteroid impacts and/or a massive increase in cloud cover, perhaps due to cosmic radiation variations and/or gamma-ray burst effects, could, especially when combined with Milankovitch orbital effects, trigger such a condition, and such a condition as a glaciation period could potentially last until precipitation dropped below the sublimation rate and/or the ice/snow cover became very dirty with much lower albedo and/or some event variation with warming (rather than cooling) effects occurred.

P Wilson
October 22, 2009 7:04 am

To begin with, it makes no sense to talk of watts per square metre for air or clouds – it only applies to two-dimensional surfaces. You could calculate it as watts per cubic metre. However, there is no way of telling how many W/m^2 leave the earth – it is only an assumption based on a mathematical assumption. Also, Stefan’s law applies to blackbody radiation, so the figure of 33 C is likewise an unmeasured assumption from which the rest follows – again only to be applied to two-dimensional surfaces – so the equation doesn’t represent nature, as it isn’t logical to assume that molecular weight and bonding play no part in the process of heat transfer. Heat is a variable. It’s often forgotten that energy can be neither created nor destroyed (OK, that part isn’t forgotten) but can be changed from one form to another. The result is that objects don’t emit as much IR as they receive in other forms. To represent nature, the S-B curve would be an S curve, reducing the amount of radiation given off by normal matter whilst equating with that of the sun. The smooth curve simply assumes too much radiation from normal matter at normal temperatures, and is too simple an exponent.

cba
October 22, 2009 8:45 am

It does make sense to talk of W/m^2 for those who understand it, as that is what impinges on the boundary layer; it isn’t what is absorbed in that layer. There are measurements that show not only what the SB average values are but also the radiation and/or reflectivity as a function of wavelength. The figure of 33 K is the difference between what is measured and what we know the nature of a blackbody to produce. Fortunately, Earth’s temperature is low enough that its emissivity is quite high at such long wavelengths, whereas its emissivity at much shorter wavelengths tends to drop significantly. The distribution of energy associated with temperature is what gives one the blackbody distribution curve. The smoothness of the curve, or continuum emission, is related to the nature of solids and liquids and the practically infinite possible lines combining to give the curve, whereas the spectrum of gases is far more limited, yielding bands and lines.
I’m not sure what you mean by an S curve or how it applies to SB. SB was empirical in origin, but it is derivable from Planck’s equation, integrated over angle and wavelength. Emissivity as a single number is merely an engineering approximation, but within limits it works well. There are many commercial products that measure temperature by analyzing the IR signature of normal matter. As for the Sun, it essentially radiates a continuum from a gas that we would consider a rather good vacuum, at around 6000 K.
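As a sketch of that derivation point: integrating Planck’s law over wavelength and then over the hemisphere does recover the Stefan-Boltzmann flux numerically (scipy assumed available; the constants are rounded, so agreement is only to a fraction of a percent):

```python
# Stefan's law recovered by integrating Planck's law (sketch).
import numpy as np
from scipy.integrate import quad

h, c, k = 6.626e-34, 2.998e8, 1.381e-23  # SI values, rounded
SIGMA = 5.67e-8

def planck(lam, T):
    """Spectral radiance B(lambda, T) in W/m^2/sr/m."""
    return (2.0 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

T = 288.0
radiance, _ = quad(planck, 1e-6, 1e-2, args=(T,), limit=200)  # 1 um to 1 cm
flux = np.pi * radiance  # hemispheric integral over an isotropic radiator

print(f"integrated: {flux:.1f} W/m^2 vs sigma*T^4 = {SIGMA * T**4:.1f} W/m^2")
```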

P Wilson
October 23, 2009 2:39 pm

The constant can have the right characteristics for incandescent metals but be wrong for normal matter, as a lot of change occurs at normal temperatures – whereas the SB constant requires horizontal change from the temperature required to sustain life through to absolute zero. Chemical bonds reduce the ability to emit heat, and if the molecular weight increases there is less velocity. Metals give off far more heat than non-metals at a particular temperature – free electrons amplify emitted heat – so it’s hard to justify a simple equation for nature.
I’ve just posted on the basal metabolic rate of a human (in the “A borehole in Antarctica produces evidence of sudden warming” thread), which is an average 85 W/m^2 – equipment calibrated on the SB constant records a higher wattage of emission for normal-temperature matter, yet thermal imaging technology records a human at a greater radiative magnitude than normal-temperature matter.

cba
October 24, 2009 7:33 am

Most IR temperature-measurement equipment uses a couple of IR wavelengths to make a measurement. If the emissivity is not the same at both wavelengths, then the reading may be higher or lower than the actual temperature. After all, emissivity in reality is a function of wavelength.
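A toy illustration of that point: simulate a two-wavelength (ratio) reading of a 288 K body whose emissivity differs slightly between the two bands, then invert assuming a greybody. The band choices and emissivities are made-up example values, and the inversion uses the Wien approximation:

```python
# Two-color pyrometry skewed by wavelength-dependent emissivity (sketch;
# lam1, lam2, eps1, eps2 are hypothetical example values).
import numpy as np

C2 = 1.4388e-2            # second radiation constant hc/k, m*K
lam1, lam2 = 8e-6, 12e-6  # sensing wavelengths, m

def wien(lam, T, eps):
    """Wien-approximation spectral radiance, up to a common constant."""
    return eps * lam**-5 * np.exp(-C2 / (lam * T))

T_true = 288.0
eps1, eps2 = 0.95, 0.90  # emissivity differs between the two bands

ratio = wien(lam1, T_true, eps1) / wien(lam2, T_true, eps2)
# Invert the measured ratio assuming a greybody (equal emissivities):
T_read = C2 * (1/lam1 - 1/lam2) / (5 * np.log(lam2/lam1) - np.log(ratio))
print(f"true {T_true:.1f} K, instrument reads {T_read:.1f} K")  # ~7-8 K high
```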

P Wilson
October 25, 2009 7:50 pm

The most common detectors are from 7-14 microns, which would record heat loss from buildings and humans (we’re about 9 microns optimum). 3-5 microns would be radiators, boilers, kilns, etc.
If it’s true we generate an optimum 100 W/m^2 at 15 C, it’s logical that the radiation exiting is a great deal less than 50 W/m^2. All things earthly would be in the 12-16 micron range, whilst S-B overestimates it 10 times.
Simply stated: human heat loss is 75-100 W/m^2. Earth’s heat loss is much less; otherwise we’d see it via telemetry, which covers all the *assumed* wavelengths that our venerable climate officials say it is

cba
October 26, 2009 5:35 am

The curves are much wider than that. Detectors only need enough width to read significantly above the noise and then compare at two different wavelengths. The nature of the curve is to peak close to the maximum; that doesn’t mean all of the power is in the peak. For Earth, at 288 K, around 1-2% of the total is still beyond 65 microns. As for 9 microns – there’s a big absorption band around there.
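That 1-2% figure is easy to check by integrating the Planck curve at 288 K (sketch, scipy assumed available):

```python
# Fraction of 288 K blackbody power emitted beyond 65 microns (sketch).
import numpy as np
from scipy.integrate import quad

h, c, k = 6.626e-34, 2.998e8, 1.381e-23

def planck(lam, T):
    return (2.0 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

T = 288.0
total, _ = quad(planck, 1e-6, 1e-2, args=(T,), limit=200)
tail, _ = quad(planck, 65e-6, 1e-2, args=(T,), limit=200)
print(f"fraction beyond 65 microns: {tail / total:.1%}")  # ~1.7%
```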
Actual heat loss is going to be the difference between the heat radiated (plus evaporative cooling from sweat) and the heat absorbed by radiative means.
As stated above, Stefan’s law is an integration of Planck’s law over angle and wavelength.
I don’t know what you mean by seeing it via telemetry. Also, radiative transfer is the only science of substance that those politicians can legitimately claim is fairly well understood.

Roy
November 8, 2009 10:18 am

The new “drive 5 miles less” advert for Act on CO2 is laughable. It’s just ridiculous.
How about people in America stop driving 5.0-litre cars on a regular basis? Driving 5 miles less each week in the UK is hardly going to amount to anything. It’s these stupid little things that annoy me a great deal.
The fact is, if it’s such a big problem, then do some big things to solve it. Not these stupid things like “drive 5 miles less”. Drive 5 miles less forever? Will it go up to 10? 20? Drive 50 miles less? Where does it stop? It doesn’t provide a long-term solution. Therefore it’s laughable.
I’ve also heard that if the atmosphere becomes that bad they can fire rockets (ridiculous I know, but so a scientist said) into the atmosphere and balance it out. Sorry, I don’t know what was in the rockets; it sounded insane. But so does driving 5 miles less.

Roy
November 8, 2009 10:20 am

Also, the ridiculous averaging and calculations from what must amount to rocks and soil hardly provide an accurate reading of what CO2 existed 15 million years ago. You cannot say it’s accurate no matter what, unless you stood there 15 million years ago and took “a reading”, not the averaged guesswork produced now. Then again, most of the stats are hardly founded in fair tests.

AirFiero
January 5, 2011 12:39 am

Did we ever get a source for Bill’s graphs?
