The Logarithmic Effect of Carbon Dioxide

Guest post by David Archibald

The greenhouse gases keep the Earth about 30° C warmer than it would be without them, so instead of the average surface temperature being -15° C, it is 15° C. Carbon dioxide contributes 10% of the effect, so that is 3° C. The pre-industrial level of carbon dioxide in the atmosphere was 280 ppm. So roughly, if the heating effect were a linear relationship, each 100 ppm would contribute 1° C. With the atmospheric concentration rising by 2 ppm annually, it would go up by 100 ppm every 50 years and we would all fry, as per the IPCC predictions.

But the relationship isn’t linear, it is logarithmic. In 2006, Willis Eschenbach posted this graph on Climate Audit showing the logarithmic heating effect of carbon dioxide relative to atmospheric concentration:
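For reference, a widely used simplified form of this logarithmic relationship (Myhre et al., 1998) is ΔF = 5.35 ln(C/C0) W/m^2. A quick sketch shows each additional 100 ppm adding less forcing than the last:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al. 1998):
    delta-F = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Doubling pre-industrial CO2 (280 -> 560 ppm) gives ~3.7 W/m^2.
print(round(co2_forcing(560), 2))   # 3.71

# Each successive 100 ppm adds less forcing than the previous 100 ppm:
for c in (380, 480, 580):
    print(c, round(co2_forcing(c) - co2_forcing(c - 100), 2))
```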

And this graphic of his shows carbon dioxide’s contribution to the whole greenhouse effect:

I recast Willis’ first graph as a bar chart to make the concept easier to understand to the layman:

Lo and behold, the first 20 ppm accounts for over half of the heating effect to the pre-industrial level of 280 ppm, by which time carbon dioxide is tuckered out as a greenhouse gas. One thing to bear in mind is that the atmospheric concentration of CO2 got down to 180 ppm during the glacial periods of the ice age the Earth is currently in (the Holocene is an interglacial in that ice age, which started three million years ago).
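The "over half from the first 20 ppm" figure can be roughly reproduced by assuming a pure logarithmic dependence all the way down to a 1 ppm floor. This is an idealization (a pure logarithm diverges at zero, and real radiative-transfer models behave differently at very low concentrations), but it gives the flavour:

```python
import math

# Idealized: heating effect proportional to ln(C), measured from a
# 1 ppm floor. Real line-by-line radiative models deviate from a pure
# logarithm at very low concentrations, so these shares are a sketch.
FLOOR = 1.0
total = math.log(280.0 / FLOOR)

bands = [(lo, lo + 20) for lo in range(0, 280, 20)]
for lo, hi in bands[:3]:
    share = math.log(hi / max(lo, FLOOR)) / total
    print(f"{lo:3d}-{hi:3d} ppm: {share:5.1%}")
```

The first band works out to roughly 53% of the total, with each subsequent 20 ppm band contributing progressively less.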

Plant growth shuts down at 150 ppm, so the Earth was within 30 ppm of disaster. Terrestrial life came close to being wiped out by a lack of CO2 in the atmosphere. If plants were doing climate science instead of us humans, they would have a different opinion about what is a dangerous carbon dioxide level.

Some of the IPCC climate models predict that temperature will rise by up to 6° C as a consequence of a doubling of the pre-industrial level of 280 ppm. So let’s add that to the graph above and see what it looks like:

The IPCC models water vapour-driven positive feedback as starting from the pre-industrial level. Somehow the carbon dioxide below the pre-industrial level does not cause this water vapour-driven positive feedback. If their water vapour feedback were a linear relationship with carbon dioxide, then we should have seen over 2° C of warming by now. We are told that the Earth warmed by 0.7° C over the 20th century. Perth, Western Australia, where I live, missed out on a lot of that warming.

Nothing happened up to the Great Pacific Climate Shift of 1976, which gave us 0.4° of warming, and temperatures have been flat since then.

Let’s see what the IPCC model warming looks like when it is plotted as a cumulative bar graph:

The natural heating effect of carbon dioxide is the blue bars and the IPCC projected anthropogenic effect is the red bars. Each 20 ppm increment above 280 ppm provides about 0.03° C of naturally occurring warming and 0.43° C of anthropogenic warming. That is a multiplier effect of over thirteen times. This is the leap of faith required to believe in global warming.
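A rough reproduction of the two bar heights, under stated assumptions: the post's 3° C natural CO2 contribution is treated as purely logarithmic (k = 3/ln 280, an idealization), and the IPCC upper-end 6° C per doubling is spread evenly over the fourteen 20 ppm increments from 280 to 560 ppm. The exact multiplier depends on these choices, but the sketch lands in the same ballpark as the post's thirteen-times figure:

```python
import math

# Treat the post's 3 deg C CO2 contribution at 280 ppm as k * ln(280).
K_NATURAL = 3.0 / math.log(280.0)   # deg C per unit of ln(CO2)

increments = [(c, c + 20) for c in range(280, 560, 20)]  # 14 steps

natural = [K_NATURAL * math.log(hi / lo) for lo, hi in increments]
anthro = 6.0 / len(increments)      # 6 deg C spread over 14 increments

avg_natural = sum(natural) / len(natural)
print(f"natural per 20 ppm: ~{avg_natural:.3f} deg C")
print(f"IPCC per 20 ppm:    ~{anthro:.2f} deg C")
print(f"multiplier:         well over 10x")
```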

The whole AGW belief system is based upon positive water vapour feedback starting from the pre-industrial level of 280 ppm and not before. To paraphrase George Orwell, anthropogenic carbon dioxide molecules are more equal than the naturally occurring ones. Much, much more equal.




jorgekafkazar
March 8, 2010 10:11 am

Richard Telford (01:00:29) : “I don’t know if this post was supposed to be misleading and confusing but it certainly is.”
Obviously, at least one person has been misled and confused. This is a good thread, and perhaps, based on reader comment, the post can be supplemented with additional material that will lessen Mr. Telford’s confusion. Dismissing his comments out of hand is neither productive nor polite. Clarification seems warranted.

Slartibartfast
March 8, 2010 10:12 am

A better link than above and a nice short description of the origins of this discussion

Not a good experiment, IMO. There are escape paths for internal heat other than back through the window, and the Earth isn’t accurately modelable as a small, fully-enclosed greenhouse lined with black cardboard.
Oh, and he’s using sunlight already filtered by the atmosphere as input. But that’s probably just a quibble.

Michael
March 8, 2010 10:13 am

Is the Sun going back to sleep? It tried to get out of bed for a little while and has been dragging its ass with just a few sun specks. Take a look, the sunspot number is zero again.
Solar wind
speed: 325.8 km/sec
Sunspot number: 0
Updated 07 Mar 2010
Spotless Days
Current Stretch: 2 days
2010 total: 3 days (5%)
2009 total: 260 days (71%)
Since 2004: 773 days
Typical Solar Min: 485 days
http://www.spaceweather.com/

March 8, 2010 10:15 am

Larry (08:18:31) :
“Any computer model with a little bit of excess positive feedback is going to predict whatever it is modelling is going to hell in a handcart over sufficient iterations. These modellers should be made to repeat before they go to work “any natural system that had unrestrained positive feedback would have destroyed itself before I got to model it”.”
Generally, positive feedback is a very bad thing as it causes instability and oscillation. Biological systems have strong negative feedbacks otherwise life would have died out long ago.
A small amount of positive feedback can sometimes be accommodated, but there is then a need for ‘intelligent’ control, monitoring and possible intervention to ensure that a runaway problem doesn’t develop.

We run incandescent lightbulbs from voltage sources, relying on a NEGATIVE feedback mechanism: as the filament heats, its resistance increases, reducing the current and so throttling back on the I^2R heating losses. Equilibrium is quickly established, and the system is intrinsically stable even if one varies the voltage. Not so when driving an incandescent lightbulb with a current source, where the system then has POSITIVE feedback: as the filament heats, its resistance increases, so I^2R heating losses increase, so it gets hotter, so its resistance increases more, so I^2R losses increase even more. Even with this regeneration, there will be a low value of current at which a (not-very-stable) equilibrium will be attained. However, there is a critical current above which the system becomes unstable and thermal runaway ensues until the filament blows. Depending on thermal inertias, a small glitch that takes the current momentarily over the critical current can prove fatal to the system by pushing it into a region of instability from which there is no recovery (without external intervention).

So, as a designer, you would never knowingly run a filament lamp from a current source. As a designer, you wouldn’t design a biosphere with positive feedbacks either, if you wanted it to be stable.
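A toy numerical version of the lamp example, with purely illustrative parameters (a linear resistance-temperature law, not real tungsten values), shows the two behaviours:

```python
# Toy model of the lamp example: a filament whose resistance rises with
# temperature, R(T) = R0 * (1 + ALPHA * dT), losing heat at K_LOSS * dT.
# Parameters are illustrative only, not real tungsten values.
R0, ALPHA = 10.0, 0.0045     # ohms, per-kelvin resistance coefficient
K_LOSS, HEAT_CAP = 0.5, 1.0  # W/K heat loss, J/K thermal mass
DT, STEPS = 0.01, 5000       # Euler step (s), 50 s of simulation

def simulate(power):
    """power(resistance) -> watts dissipated; returns final temp rise."""
    d_temp = 0.0
    for _ in range(STEPS):
        r = R0 * (1.0 + ALPHA * d_temp)
        d_temp += DT * (power(r) - K_LOSS * d_temp) / HEAT_CAP
    return d_temp

# Voltage source: P = V^2 / R falls as R rises -> negative feedback.
stable = simulate(lambda r: 40.0**2 / r)
# Current source: P = I^2 * R rises as R rises -> positive feedback;
# here I^2 * R0 * ALPHA exceeds K_LOSS, so the filament runs away.
runaway = simulate(lambda r: 4.0**2 * r)

print(f"voltage-driven settles near {stable:.0f} K above ambient")
print(f"current-driven reaches {runaway:.2e} K (thermal runaway)")
```

The voltage-driven case settles at a finite equilibrium; the current-driven case grows without bound, just as the comment describes.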
As Richard Lindzen has remarked, any significant positive feedback in the climate system is a problem for the theist because it would be evidence of ‘UnIntelligent Design’. That’s not compelling for the atheist, but even there one could consider the argument that of all possible earths that could exist, the only ones that can persist are those that don’t exhibit positive feedbacks. Since this earth exists with a flourishing biosphere and has without doubt persisted for a very long time, then this evidence would tend to militate against the presence of positive feedbacks as well. Of course, these are metaphysical arguments, but when someone propounds a crazy idea like AGW it’s worth doing a reality check by taking a look outside the box.

Larry Huldén
March 8, 2010 10:15 am

One feedback which has not been discussed here (unless I have missed a comment) is the presumed increase of outgoing CO2 from the soil when the climate is getting warmer. During this winter there was a study where the authors claimed to have quantified this effect: for each degree °C of warming there would be 7% extra warming because of the extra release of CO2. According to climate models, a standard value for this feedback has been fixed at 40% extra warming. The difference would be 1.07 °C versus 1.40 °C per degree of initial warming.
Thus we have in fact two uncertain factors in IPCC models, this outgoing CO2 and the clouds.
I wonder if Hans Erren could comment on this.
Larry Huldén
Finnish Museum of Natural History

Tim Clark
March 8, 2010 10:22 am

Richard Telford (04:39:45) :
The natural CO2 forcing is shown without any feedbacks, whereas the anthropogenic forcing is shown with feedbacks. This is misleading. Nobody would argue that the natural changes in CO2 are not magnified by feedbacks (try to explain the glaciations without feedbacks) One can argue about the magnitude of the feedbacks. Perhaps the IPCC has them too high. Perhaps too low.

Well… actually, we would. Not only the absolute value of the “magnification” of feedbacks, but the direction. As for your glaciation analogy, look up orbital cycles. See if that helps your understanding.

March 8, 2010 10:22 am

Hi JonesII
Thanks for the ‘Magnetic drain’ link.
Here is a graph showing huge drop in the intensity of the Earth’s GeoMagnetic Field (GMF, vertical component) at latitude of 36 degrees South.
http://www.vukcevic.talktalk.net/LFC13.htm
At a location east of Concepcion (in the Andes near the Argentinean border) the GMF has one of the largest drops anywhere on Earth (54 microtesla in 1600, 14.6 microtesla in 2010), i.e. in 1600 the GMF was about 3.7 times as strong as it is today.
You can find the South Atlantic GMF sweep on: http://www.vukcevic.talktalk.net/GandF.htm

Murray
March 8, 2010 10:27 am

Congrats David. That is an excellent, clear and easy-to-understand exposition, and it illustrates beautifully the nonsense of water vapor positive feedback. I did a check of all the NH high-latitude temp stations a couple of years ago, and the 1976 PDO shift shows up very clearly, especially for Alaska and Siberia. In Greenland it gets obscured to some degree by NAO shifts. For Alaska, if you factor out other impacts, like paved runways in the ’90s and UHI increases, you also have the “no further warming”.
Clearly, the process is:
– hypothesize warming
– build a model to prove warming
– tune the model so it gives the desired warming
– invent a mechanism that explains the fudge factor used to tune the model
WV positive feedback is the mechanism for post 2000 warming.
The fudge factors used to make the models backcast are also interesting. Very slow mixing between the surface and deeper layers of the ocean is necessary to give the needed CO2 pulse lifetime, and then the cooling from ca. 1945 to 1975 is aerosols. No one explains the source of the aerosols, nor why they ceased to be effective after 1975. With two undemonstrated mechanisms you can make the models backcast pretty well, and then with a third one you can get a “catastrophic” (also undemonstrated) forecast.
Ain’t AGW wonderful??

John Carter
March 8, 2010 10:33 am

If you want to see an excellent performance by sceptics and a disaster by alarmists you would find this difficult to beat.
A must see.

Readfearn – What a plonker.

Toto
March 8, 2010 10:36 am

See Tony Brown’s excellent post at The Air Vent:
“Historic variations in CO2 measurements.”
http://noconsensus.wordpress.com/2010/03/06/historic-variations-in-co2-measurements/

RockyRoad
March 8, 2010 10:46 am

Larry (08:18:31) :
(…)
These modellers should be made to repeat before they go to work “any natural system that had unrestrained positive feedback would have destroyed itself before I got to model it”.
————-
Reply:
You’re absolutely right, Larry. And without going into a whole lot of theoretical or graphical persuasion, what you say is obvious. Why? From a geologist’s standpoint, if there were truly a “tipping point”, the earth would have tipped a long time ago, when CO2 levels were many times what they are today. Yet we don’t see the oceans boiled up into the sky in a soup so thick you could cut it with a knife. And those who propose “global warming” as the cause of earth’s 5 major extinction episodes, rather than catastrophic impacts, are simply ignoring the obvious.
Before those proposing a “tipping point” gain any credibility whatsoever, they need to propose an “untipping point”. Please identify the mechanism. Barring that, it’s all pretty much fantasy. I simply grab a photo of the earth taken from space as Exhibit A. And I’m pretty certain that’s how the earth appeared before the industrialized era began 150 years ago.

March 8, 2010 10:48 am

mathman (08:32:29) :
“Is anyone out there thinking?
CO2 is a gas found in the atmosphere. As is true of many molecules, CO2 responds to certain incoming radiation (photons) by jumping to an excited state. I am citing the obvious here. What then? Does the CO2 remain excited forever? No. The excited state decays back to an unexcited state, releasing the quantum of energy (this time without regard to direction).”
Hey, hang on: I’m getting uncomfortable with some of the comments here. CO2 is a MOLECULE like water is a molecule. Molecules can be excited into resonance in a way that atoms can’t. Niels Bohr was right about atoms, for example sodium absorption and emission spectra, but molecules are different: they can have resonances between the atoms that atoms themselves can’t have. When you put your food in the microwave oven the frequency of the magnetron is tuned to excite a molecular resonance for water, not an atomic one: the non-ionizing radiation is not exciting electrons into different states in the individual atoms. Air is made up of O2, N2, CO2 and CO2 molecules, and Ar atoms.
In the IR spectrum of emission from the earth’s surface, the only atmospheric absorption that takes place worth considering are molecular absorption mechanisms, as the atomic absorption/emission lines are way out of the spectrum. If that’s the case then what Niels Bohr said was not relevant to this case as we are talking about a different mechanism – molecular resonance, not quantum mechanics.

JonesII
March 8, 2010 10:48 am

Vukcevic (10:22:58) Thanks Vuk! Really surprising, and as the line of force goes from the strongest to the lowest…there is a “pressure” on the spot.

son of mulder
March 8, 2010 10:51 am

“Mike (04:54:41) :
Let’s see, usually we are told by AGW skeptics that the atmosphere is too complex to understand.”
This relates to the chaotic nature of the climate. Within Newtonian mechanics the famous 3 body problem is chaotic and the paths of the individual bodies cannot be calculated exactly because of the uncertainty of the exact initial conditions. But it is known that the maximum possible distance between any 2 bodies will be determined by the balance between kinetic and gravitational potential energy.
Similarly, the climate is governed overall by an energy balance, but the detailed configuration of the atmosphere is unpredictable. Hence we know the earth can’t get warmer than the sun, as an extreme example, but we can’t predict how winds and ocean patterns would change.
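This "bounded but unpredictable" behaviour can be sketched with the classic Lorenz '63 system (a stand-in here, not a climate model): two trajectories started a hair apart diverge completely, yet both remain confined to the same bounded attractor:

```python
# Classic Lorenz '63 system: trajectories from nearly identical initial
# conditions diverge rapidly, yet both remain confined to a bounded
# attractor -- unpredictable paths inside a predictable envelope.
SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0
DT, STEPS = 0.005, 4000  # forward Euler, 20 time units

def step(s):
    x, y, z = s
    return (x + DT * SIGMA * (y - x),
            y + DT * (x * (RHO - z) - y),
            z + DT * (x * y - BETA * z))

a, b = (1.0001, 1.0, 1.0), (1.0, 1.0, 1.0)  # one part in 10^4 apart
max_sep, bound = 0.0, 0.0
for _ in range(STEPS):
    a, b = step(a), step(b)
    sep = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    max_sep = max(max_sep, sep)
    bound = max(bound, max(abs(v) for v in a + b))

print(f"max separation between trajectories: {max_sep:.1f}")
print(f"largest coordinate seen: {bound:.0f}")  # stays bounded
```

The separation grows from one part in ten thousand to the full size of the attractor, while neither trajectory ever leaves the bounded region.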

March 8, 2010 10:52 am

Should have read “Air is made up of O2, N2, CO2 and H20 molecules, and Ar atoms”

March 8, 2010 10:59 am

Jaye (09:10:13) :
John Finn…your application of “appeal to authority” is humorous, entirely fallacious from a logical pov, but definitely humorous.

Whereas you appear prepared to believe any old rubbish as long as it supports your fervent wish that CO2 should have no effect. Well, suit yourself, but when you find that AGWers are able to ridicule sceptic arguments don’t start whining.

March 8, 2010 11:12 am

Any computer model with a little bit of excess positive feedback is going to predict whatever it is modelling is going to hell in a handcart over sufficient iterations. These modellers should be made to repeat before they go to work “any natural system that had unrestrained positive feedback would have destroyed itself before I got to model it”.
Which is why I believe they do not show their results for more than 100 years out. Not that the results would be useful, but they would show whether the models are stable.
Funny thing is that you don’t hear much about tipping points from the climate “scientists” these days, although it was all the rage for a while. Perhaps they were afraid someone would tumble to the fact that the models are fundamentally broken.

March 8, 2010 11:22 am

Murray (10:27:29) :
Very slow mixing between the surface and deeper layers of the ocean are necessary to give the needed CO2 pulse lifetime and then the cooling from ca 1945 to 1975 is aerosols. Noone explains the source of the aerosols, nor why they ceased to be effective after 1975.

Couldn’t have anything to do with a world war and over 500 atmospheric nuclear bomb tests, could it?

Editor
March 8, 2010 11:26 am

Can Mr. Archibald please include some citations? I have no trouble believing that the IPCC models assume that water vapor feedback effects start at 280 ppm. They do much worse. For instance, they parameterize total solar effects as having 1/14th the forcing effect of CO2, when numerous studies show correlations of 0.6–0.8 between solar activity and past temperature change.
Want to see where solar effects are assumed (not found) to be tiny compared to CO2? Look on figure 2.4 on page 39 of the Synthesis Report: http://www.ipcc.ch/pdf/assessment-report/ar4/syr/ar4_syr.pdf
The only solar variable included is TSI, which is accorded a forcing of 0.12 W/m^2, compared to 1.66 W/m^2 for CO2.
Providing the citation allows other people to USE the information because they can back it up. Similarly, we need the citation in order to use Mr. Archibald’s information. He is making a pretty important claim. It would be nice to be able to verify and cite it.

Don Pooley
March 8, 2010 11:31 am

What percentage of the CO2 increase can be ascribed to our increasing use of fermented beverages? More people = more booze drinkers; is this a significant part of the CO2 problem?

George E. Smith
March 8, 2010 11:32 am

Well I for one have a problem with the entire premise of this essay. For a start, “Climate Sensitivity” is defined as the permanent increase in the mean global surface temperature of the earth for a doubling of the CO2 abundance in the atmosphere. I googled dozens of papers; in fact dozens of pages of papers which cite this definition or the equivalent in slightly different words. The IPCC evidently even gives a value for it, namely 3.0 deg C per doubling. Well actually, to be more accurate, they say 3.0 +/- 1.5 deg C, a 3:1 spread in value. I don’t know whether that is a GCM-modelled value, or an actual planet-earth observed value.
In any case they have a Temperature versus log CO2 relationship; that’s not the same thing as a “Forcing” in Watts per metre squared versus log CO2.
Now your curves look very pretty, especially that first one, the Modtran logarithmic plot.
Now toss in a 3:1 spread about that nice curve, and then try to convince me that the relationship is still logarithmic; well, more likely to be logarithmic than, say, linear.
Well, the relationship between mean global surface temperature and “Forcing” in Watts per m^2 is not even linear. The simplest assumption, that the connection exactly follows black-body radiation laws, would make the “forcing” go as the 4th power of the absolute temperature. Luckily, if the relationship were logarithmic, that 4th power would merely change the scale.
Unfortunately the thermal processes that remove heat from the earth are a lot more complex than simply black body radiation; and the rate of heat loss globally is not simply related to the mean surface temperature.
But the proof of the pudding is in the climate data over a longer period of time. From your pre-industrial level of CO2 to today is not even one half of a doubling, and from pre-industrial to twice today’s level is less than 1 1/2 doublings.
How about five doublings (well, halvings anyway) that have taken place over the last 600 million years?
See http://www.geocraft.com/WVFossils/PageMill_Images/image277.gif
Now there you have CO2 dropping from 7000 ppm (25 times the pre-industrial value) down to your 180 ppm low. Yet over most of that 600 million years, the temperature remained constant at 22 deg C.
So currently, the earth is in an anomalous cold phase, the likes of which we haven’t seen in over 300 million years; well maybe 260.
It would be nice if somebody was able to show us some believable observational data, either direct measurements or credible proxy data, covering at least one full doubling of CO2, which confirms a logarithmic relationship as more likely than a simple linear relation.
It would be even nicer if someone would offer even a simple physics model for why the earth’s mean surface temperature should be expected to vary as the logarithm of the atmospheric CO2. There’s no such physical connection that I am aware of; so I’d like someone to point to such a theory.
Well a lot of people like to point to Beer’s Law, which governs the transmission of light through absorptive media; optical glasses for example.
The net transmission decays exponentially with the thickness of the (assumed homogeneous) absorbing medium. The concept is simple: the probability of absorption of any single photon is a constant for a particular wavelength and thickness of the sample, and of course depends on the nature of the material. The probability of the photon passing through many such layers is simply the product of the (transmission) probabilities of all those layers.
Well, Beer’s law works quite well if you measure the transmitted energy with a monochromator. Some “fast-cut” long-pass colour filter glasses can easily reduce the transmission to 0.001% in say 3 mm of glass, just a few hundred nanometres longer in wavelength than the wavelength which passes 50% (the cut-off wavelength for that sample).
But don’t expect to get only 0.001% of the energy transmitted through that same sample. Replace the monochromator with a wide-bandwidth detector, and you will find orders of magnitude more energy than the absorption curves claim. The problem is that such materials fluoresce, and the energy absorbed by the glass at one wavelength is then re-emitted at a longer wavelength, and emerges out the other side simply shifted to a longer wavelength. You can stack up a whole series of such glasses with increasing cut-off wavelengths, and each will sequentially shift the wavelength so that it passes through the next layer too. So Beer’s law is NOT always followed, in the case where the energy can be re-emitted at longer wavelengths.
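A toy version of this stacked-filter argument, with purely illustrative numbers (50% transmission per layer, half of the absorbed energy re-emitted forward and assumed to pass the remaining layers unimpeded, the most favourable case):

```python
# Toy illustration of the stacked-filter argument. Each layer transmits
# a fraction T_LAYER at the original wavelength, so Beer's law gives a
# total transmission of T_LAYER ** N_LAYERS. If a fraction F_REEMIT of
# absorbed energy fluoresces forward at a longer wavelength that the
# remaining layers pass freely (the most favourable assumption), much
# more total energy gets through than Beer's law alone predicts.
T_LAYER = 0.5     # per-layer transmission at the original wavelength
F_REEMIT = 0.5    # fraction of absorbed energy re-emitted forward
N_LAYERS = 10

beer = T_LAYER ** N_LAYERS                        # ~0.1% of the energy
with_fluorescence = beer + F_REEMIT * (1 - beer)  # ~50% of the energy

print(f"Beer's law alone: {beer:.4%} of energy transmitted")
print(f"with re-emission: {with_fluorescence:.1%}")
```

Under these assumptions the wide-band detector sees roughly 500 times more energy than the monochromatic Beer's-law prediction.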
The best one can hope for in say visible light absorption filtering, is that the visible wavelength energy that is absorbed by the filter results in heating, and the final emission is in the LWIR spectrum; hopefully a long way away from an area that can influence whatever system the filter is part of.
Well, guess what happens in the atmosphere when GHG molecules, e.g. CO2, absorb specific photon energies in the 13.5 to 16.5 micron range. That energy becomes thermalized and transmitted to the ordinary atmospheric gas molecules, which eventually radiate a thermal continuum LWIR spectrum based on the temperature of the atmosphere, not the temperature of the original emitting surface.
There’s no evidence that such processes conform to Beer’s Law, or in any other way exhibit a logarithmic response as to the resulting mean earth surface temperature rise.
Al Gore’s famous graphs in his book, of ice core temperatures and CO2, both have the same general shape, as he makes clear in his book, and apparently did so in his movie as he waved them in front of the audience and suggested that they were the same. If one were the logarithm of the other they certainly wouldn’t look the same.
The 600 million years of proxy data cited above certainly do not support a logarithmic connection between either mean global surface temperature, or “Forcings” and CO2 abundance in the atmosphere.
Earth’s comfort temperature range, is being controlled by something a whole lot more influential than atmospheric CO2.
For one thing, over the earth’s extreme total temperature range from -90 C to over +60 C (all of which could be co-existing simultaneously), the maximum possible surface emittance bounded by black-body radiation laws covers an 11-to-one range of Watts per square metre (“forcings” if you like). That sets the maximum energy available to be captured by CO2, depending on location on earth. There isn’t any global network sampling the value of “Climate Sensitivity” all over the earth to arrive at the IPCC’s 3.0 +/- 1.5 deg C per doubling of CO2.
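The 11-to-one figure checks out directly against the Stefan-Boltzmann law:

```python
# Check of the comment's 11-to-one figure: ratio of black-body
# emittance (Stefan-Boltzmann, j = sigma * T^4) between the warmest
# and coldest surface temperatures mentioned, +60 C and -90 C.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def emittance(t_celsius):
    t = t_celsius + 273.15  # convert to kelvin
    return SIGMA * t ** 4

hot, cold = emittance(60.0), emittance(-90.0)
print(f"+60 C: {hot:6.0f} W/m^2")
print(f"-90 C: {cold:6.0f} W/m^2")
print(f"ratio: {hot / cold:.1f}")  # roughly 11 to 1
```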
And the definition says nothing about CO2 getting any assistance or support from any other GHG; that is the result of doubling CO2; period. Other GHGs each have their own effect and they are unrelated to how much CO2 is present.
To me, the whole idea of “Climate Sensitivity” and a logarithmic relationship between mean global surface temperature and CO2 abundance in the atmosphere simply doesn’t hold water, either experimentally or theoretically.
But that is just my opinion of course; I’d be happy to learn of either a theory or measured data showing otherwise.

Mike M
March 8, 2010 11:34 am

Alan D McIntire (07:56:20) :… OBVIOUSLY the feedbacks must be mostly negative, probably due to clouds,…
Well, if the geologic record of earth’s temperature is correct then temperature definitely hits some sort of a severe negative feedback ‘wall’ around +22C. It’s as though we have a giant air conditioner out there with its thermostat set at +22C.
http://geocraft.com/WVFossils/PageMill_Images/image277.gif (According to Al Gore earth should have burnt up 500-600 MYA when there was at least 3000 PPM CO2.)
If not because of water vapor then what else could it possibly be? We already know that warming happens the least in the tropics and most at the poles so the key to the negative feedback from water vapor appears to be found in whatever is happening in the tropics – IMO large quantities of water vapor’s latent heat being convected way up there above most of the GHG’s. As the earth gets warmer – tropical conditions would expand to higher latitudes thus enhancing the negative feedback over a larger area.

March 8, 2010 11:36 am

Ralph (10:04:17) :
And I like this paper, which says that CO2 concentrations were as high as 480 ppm in the 1940s. Figures derived through chemical gas analysis.
http://www.friendsofscience.org/assets/files/documents/CO2%20Gas%20Analysis-Ernst-Georg%20Beck.pdf
Any problems with this paper too?

There’s a lot wrong with it. The measurements are clearly taken at locations which are contaminated by local effects. There are plenty of places where you can measure 500 ppm, but they do not provide a global representation of CO2 in the atmosphere, i.e. at those locations the CO2 is not “well-mixed”. The Beck numbers make no sense whatsoever. According to Beck, there is an increase of around 150 ppm between 1930 and 1940. Where did that lot come from? Modern fossil fuel emissions are about 7.5 Gt carbon per year, which corresponds to ~3.5 ppm. Around half of that is absorbed by natural sinks in the ocean and the terrestrial biosphere (there was a post on WUWT about this a while back). This gives us an average increase of just under 2 ppm per year. What do you imagine caused the 150 ppm increase?
It gets worse. According to Beck there is a similar rise in the 1820s. There are also annual increases (and decreases) of 50 ppm in a single year. How this paper is being treated seriously by anyone is beyond me.
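The arithmetic in this comment can be checked with the standard conversion of about 2.13 Gt of carbon per ppm of atmospheric CO2:

```python
# Sanity check of the emissions arithmetic. About 2.13 Gt of carbon
# corresponds to 1 ppm of CO2 in the atmosphere (standard conversion).
GTC_PER_PPM = 2.13

emissions_gtc = 7.5        # fossil-fuel emissions, Gt carbon per year
airborne_fraction = 0.5    # roughly half stays in the atmosphere

gross_ppm = emissions_gtc / GTC_PER_PPM
net_ppm = gross_ppm * airborne_fraction
print(f"gross: ~{gross_ppm:.1f} ppm/yr, net: ~{net_ppm:.1f} ppm/yr")

# Beck's claimed ~150 ppm rise over 1930-1940 would require sustained
# net additions of ~15 ppm/yr, roughly eight times the modern rate.
print(f"Beck implies ~{150 / 10:.0f} ppm/yr")
```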

Steve Goddard
March 8, 2010 11:41 am

I think everyone agrees that the direct effect of doubling CO2 is less than 1.5C. The area of disagreement has more to do with feedbacks in the climate models.

A C Osborn
March 8, 2010 11:43 am

Ralph (10:04:17) :
Any problems with this paper too?
I don’t see how there can be; they are actual measured values, not guesses from ice cores, tree rings, leaves etc.
But the IPCC & Climate Scientists seem to want to ignore written History, because it is Inconvenient.
Just look at Australia’s history compared to current “Unprecedented” temperatures, droughts & Rainfall etc. They just love to apply that “Unprecedented” when it is obviously not so.
