New paper finds transient climate sensitivity to CO2 is ~35% less than IPCC claims

From The Hockey Schtick:

A paper under discussion for Earth System Dynamics finds the transient climate response [TCR] to CO2 is 1.3°C per doubling of CO2, about 35% less than the IPCC mean estimate and the same value found by another recent paper, Otto et al., which also determined a TCR of 1.3°C.

The authors say:

 “assess the origin of these differences [between the IPCC high TCR estimates and lower estimates from this paper and others] and highlight the inverse relation between the derived anthropogenic temperature trend of the past 30 years and the weight given to the [natural] Atlantic Multidecadal Oscillation (AMO) as an explanatory factor in the multiple linear regression (MLR) tool that is usually employed. We highlight that robust MLR outcomes require a better understanding of the AMO in general and more specifically its characterization. Our results indicate that both the high- and low end of the anthropogenic trend over the past 30 years found in previous studies are unlikely and that a transient climate response with best estimates centred around 1.3°C per CO2 doubling best captures the historic instrumental temperature record.”

Earth Syst. Dynam. Discuss., 5, 529-544, 2014

www.earth-syst-dynam-discuss.net/5/529/2014/

doi:10.5194/esdd-5-529-2014

Impact of the Atlantic Multidecadal Oscillation (AMO) on deriving anthropogenic warming rates from the instrumental temperature record

G. R. van der Werf and A. J. Dolman

VU University Amsterdam, Faculty of Earth and Life Sciences, Amsterdam, the Netherlands

Abstract. The instrumental surface air temperature record has been used in several statistical studies to assess the relative role of natural and anthropogenic drivers of climate change. The results of those studies varied considerably, with anthropogenic temperature trends over the past 25–30 years suggested to range from 0.07 to 0.20 °C decade−1. In this short communication we assess the origin of these differences and highlight the inverse relation between the derived anthropogenic temperature trend of the past 30 years and the weight given to the [natural] Atlantic Multidecadal Oscillation (AMO) as an explanatory factor in the multiple linear regression (MLR) tool that is usually employed. We highlight that robust MLR outcomes require a better understanding of the AMO in general and more specifically its characterization. Our results indicate that both the high- and low end of the anthropogenic trend over the past 30 years found in previous studies are unlikely and that a transient climate response with best estimates centred around 1.3 °C per CO2 doubling best captures the historic instrumental temperature record.
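
The paper's approach, as described in the abstract, is a multiple linear regression (MLR) of the instrumental temperature record on candidate explanatory factors, with the derived anthropogenic trend depending on how much weight the regression assigns to the AMO. As a rough illustration of that mechanism only, here is a minimal sketch using synthetic stand-in series; the predictors, coefficients, and data are placeholders, not the series or method the authors actually used.

```python
# Minimal sketch of the multiple-linear-regression (MLR) approach the paper
# discusses: regress a temperature series on an anthropogenic predictor plus
# an AMO index and read the anthropogenic trend off the fitted coefficient.
# All data below are synthetic placeholders, not the observational series
# the authors actually used.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2014)
n = years.size

# Hypothetical predictors (stand-ins for the real regressors):
anthro = 0.008 * (years - years[0])                      # slowly rising "anthropogenic" term, degC
amo = 0.1 * np.sin(2 * np.pi * (years - 1900) / 65.0)    # ~65-yr oscillation, degC
noise = 0.08 * rng.standard_normal(n)

# Synthetic "observed" temperature built from the same ingredients.
temps = anthro + amo + noise

# Design matrix: intercept, anthropogenic predictor, AMO index.
X = np.column_stack([np.ones(n), anthro, amo])
coef, *_ = np.linalg.lstsq(X, temps, rcond=None)

# Anthropogenic trend over the last 30 years implied by the fit.
anthro_component = coef[1] * anthro
trend_per_decade = np.polyfit(years[-30:], anthro_component[-30:], 1)[0] * 10
print(f"derived anthropogenic trend, last 30 yr: {trend_per_decade:.3f} degC/decade")
```

The paper's point about the inverse relation is visible in this framework: the more variance the regression attributes to the AMO term, the less is left for the anthropogenic term, and vice versa.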

milodonharlani
May 8, 2014 4:24 pm

I’m shocked! Shocked to learn that yet more real science has actually been going on!
Equilibrium CS is probably about the same, i.e. scarcely more than the effect from doubling CO2 alone (280–560 ppm), over half of which has already occurred since c. AD 1850. The supposed huge positive feedbacks from water vapor & any other source have always been unsupported phantom fantasies.

Otter (ClimateOtter on Twitter)
May 8, 2014 4:29 pm

Dana to the misrepresentation!

Editor
May 8, 2014 4:29 pm

Nice to see there are no disagreements in climate science (sarc off).

Editor
May 8, 2014 4:30 pm

Otter (ClimateOtter on Twitter) says: “Dana to the misrepresentation!”
Thanks. Made me laugh.

MattN
May 8, 2014 4:43 pm

It’s really probably more like 60-70% less than what the IPCC says.

NikFromNYC
May 8, 2014 5:02 pm

It still makes no sense since the recent temperature variation in our high emissions era is so closely mirrored in the former one, and temperature saw-toothed *down* just as emissions finally boomed after WWII:
http://www.woodfortrees.org/plot/hadcrut4gl/from:1955/to:2013/plot/hadcrut4gl/from:1895/to:1954
Since CO2 has risen about 30% since WWII, that would add about 0.35° of extra warming since then, but even the up-adjusted Climategate University global average shows no such big boost in warming versus the low-CO2 spike before it, only perhaps 0.2° extra, which, working backwards, indicates a sensitivity of only about 0.74° rather than 1.3°.
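
[For readers who want to redo this kind of back-of-envelope arithmetic, the usual shortcut is to assume warming scales with the logarithm of the CO2 ratio, ΔT ≈ TCR × log2(C/C0). The sketch below simply implements that relation; the 30% rise and 0.2 °C figures are the commenter's rough numbers, used only to show the calculation, and a straight log2 scaling gives slightly different values than those quoted above.]

```python
# Back-of-envelope check of the kind of estimate in the comment above:
# with a logarithmic forcing, expected transient warming is
#   dT ~= TCR * log2(C / C0).
# The 30% CO2 rise and the ~0.2 degC "extra" warming are the commenter's
# rough figures, used here only to show the arithmetic.
import math

def warming_from_tcr(tcr_per_doubling, co2_ratio):
    """Transient warming for a given CO2 ratio, assuming log2 scaling."""
    return tcr_per_doubling * math.log2(co2_ratio)

def tcr_from_warming(delta_t, co2_ratio):
    """Invert the same relation to back out an implied TCR."""
    return delta_t / math.log2(co2_ratio)

print(warming_from_tcr(1.3, 1.30))   # ~0.49 degC expected from a 30% rise at TCR = 1.3
print(tcr_from_warming(0.2, 1.30))   # ~0.53 degC/doubling implied by only 0.2 degC observed
```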

Orson
May 8, 2014 5:11 pm

Bob says “Nice to see there are no disagreements in climate science (sarc off).” YUP, we wouldn’t want anybody to get the strange idea that there was no Imperial Consensus about the Truth of AGW, would we?

NikFromNYC
May 8, 2014 5:11 pm

Correction: I have to discount the 10% boost in CO2 prior to 1940, which would have added 0.13° to that early spike, so the new spike really can be said to be 0.33° extra big compared to the *natural* part of the older one, quite close to their sensitivity claim after all.

NikFromNYC
May 8, 2014 5:19 pm

…then again, sensitivity could still be zero due to negative feedbacks or unappreciated physics, but natural variation may still have caused most or all of the curve, given that even bigger transient warm spikes litter the last 30K years in the Greenland ice core, the last 3500 years being playfully plotted here:
http://s6.postimg.org/zatdndwq9/image.jpg

Matthew R Marler
May 8, 2014 5:23 pm

What exactly did they use for their “linear” anthropogenic term? CO2 concentration? Log of CO2 concentration? If it’s there I missed it.

Latitude
May 8, 2014 5:38 pm

The instrumental surface air temperature record…has been so jiggered to show faster warming
…and this is still all we could squeeze out of it

David Ball
May 8, 2014 6:02 pm

Let me see,…… remove all the unjustified “adjustments”, factor in the poor station siting, include known geological data, and presto: zero CS. (Oh yes, I forgot to add: ignore the ridiculously inadequate GCMs, as per rgbatduke.)

john robertson
May 8, 2014 6:14 pm

Genuflection in. Gospel out?
So if the next 30 odd years are cooler than today, will the sensitivity drop to zero or go negative?
Given the quality of our current temperature record, can we honestly claim warming, [cooling] or no can say?
The claimed trends being less than the noise on the signal, it takes a lot of gall to make any definitive claims from this data.

RoHa
May 8, 2014 6:16 pm

1.3C? We’re doomedish.

john robertson
May 8, 2014 6:16 pm

spellcheck loves me.. cooling..or no can say.

milodonharlani
May 8, 2014 6:26 pm

NikFromNYC says:
May 8, 2014 at 5:02 pm
That’s why in real science, rather than Climate Science (TM), CACA has been repeatedly what we call “falsified” (in both the technical & ordinary senses of the word). In fact, it was born falsified, since data from c. 1945 to its birth c. 1977 already showed the hypothesis false.

John Eggert
May 8, 2014 7:13 pm

Anthony.
This sensitivity thing is something that has confused me (I’m easily confused) for a long time. When I studied energy transfer at university, we used curves based on work by Leckner (Leckner, 1972) and earlier work by Hottel (1927) and Hottel and Sarofim (1967). These models for [AB sorption] of EMR versus concentration of CO2 showed that the [AB sorption] for CO2 (unlike water) plateaued (well nearly plateaued as there is always an increase in absorbance with an increase in concentration). This would be a sensitivity of almost 0. The CO2 level was at a path length of about 200 bar.cm which corresponds to atmospheric CO2 of about 300 ppm. So the question I have for Mosher and his buddy the telescope guy. Can they explain to a dimwit like me, what conditions are necessary to move from:
the [AB sorption] as modeled by Leckner, which shows a somewhat hyperbolic relation between CO2 and [AB sorption], with the asymptote nearly parallel to the concentration axis,
to the [AB sorption] as modeled by Ramanathan, which shows a more parabolic/logarithmic relation?
At lower concentrations (400 ppm and lower) the results are so close, they can be called identical. At higher concentrations, Leckner’s work shows near 0 sensitivity. Leckner’s curves are used by combustion, mechanical, chemical and metallurgical engineers to design things that work. Ramanathan’s are used by climate scientists to design climate models that over-estimate global warming. I think (could be wrong, often am) Myhre et al. (1998) is more recent than Ramanathan’s work and is the source of the 5.35ln[CO2] equation.
For those interested in how all the physics of this stuff works, I direct them to the blog to the right called “scienceofdoom”. Excellent reference for radiative physics.
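
[The 5.35ln[CO2] expression Eggert mentions is the simplified forcing fit from Myhre et al. (1998). A minimal sketch of that formula (not of Leckner's or Hottel's emissivity charts, which are empirical curves read off their published figures) shows where the commonly quoted ~3.7 W/m² per doubling comes from.]

```python
# Sketch of the simplified CO2 forcing expression referred to above
# (Myhre et al., 1998): dF = 5.35 * ln(C / C0) W/m^2. This is the fit used
# in many climate calculations; it is not Leckner's or Hottel's emissivity
# data, which would have to be taken from their charts.
import math

def co2_forcing(c_ppm, c0_ppm=280.0, alpha=5.35):
    """Approximate top-of-atmosphere forcing change in W/m^2."""
    return alpha * math.log(c_ppm / c0_ppm)

for c in (300, 400, 560, 800):
    print(f"{c:4d} ppm: {co2_forcing(c):5.2f} W/m^2")
# Doubling (280 -> 560 ppm) gives 5.35 * ln(2) ~= 3.7 W/m^2.
```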

John Eggert
May 8, 2014 7:16 pm

Mods: I’ve made a minor mistake in a technical term above. Wherever I say “adsorption”, one should substitute AB sorption. It will be interesting to see if anyone picks up on this.
[Like that [AB sorption] or [absorption] ?? First doesn’t look right. Mod]

joeldshore
May 8, 2014 7:51 pm

John Eggert:
You seem to be stuck on a basic point that Roy Spencer addresses here – http://www.drroyspencer.com/2014/04/american-thinker-publishes-a-stinker/
The basic point is this: For determining forcing, the issue is not one of the absorption saturating. The issue is one of how high you have to go in the atmosphere until it is optically transparent enough that radiation emitted can escape to space without being absorbed again. This height rises as you increase CO2 levels and, along with the lapse rate in the troposphere, this means emission occurs from higher, colder regions of the atmosphere, which (because the intensity of emission is an increasing function of temperature) means that the emission to space is less.
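
[A toy calculation can make this emission-height argument concrete: raise the effective emission level, apply the tropospheric lapse rate to get a colder emitting temperature, and compare the outgoing radiation before and after. The specific numbers below (255 K emission temperature, 6.5 K/km lapse rate, a 150 m rise) are illustrative assumptions chosen for the sketch, not values taken from Spencer's post or from the paper.]

```python
# Toy illustration of the "effective emission height" argument above: if more
# CO2 raises the mean emission level by dh, the emitting air is colder by
# (lapse rate * dh), so outgoing longwave radiation ~ sigma*T^4 drops until
# the surface warms enough to restore balance. All numbers are illustrative
# assumptions, not values from the paper.
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
T_EMIT = 255.0            # nominal effective emission temperature, K
LAPSE = 6.5e-3            # tropospheric lapse rate, K per m
DELTA_H = 150.0           # assumed rise in emission height, m

olr_before = SIGMA * T_EMIT**4
t_new = T_EMIT - LAPSE * DELTA_H
olr_after = SIGMA * t_new**4

print(f"OLR before: {olr_before:6.1f} W/m^2")
print(f"OLR after:  {olr_after:6.1f} W/m^2")
print(f"imbalance:  {olr_before - olr_after:6.2f} W/m^2")
```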

cementafriend
May 8, 2014 8:09 pm

Nice one, John Eggert: at least one commenter who has some understanding of heat transfer. (Adsorption and absorption are different chemical engineering subjects.)
I make another point: no one has proved that the 280 ppm CO2 atmospheric concentration prior to 1960 is correct. Everyone is ignoring the measurements made by well-qualified scientists (some recognised with Nobel prizes) with instruments whose accuracy had been checked and errors understood (see here http://www.biomind.de/realCO2/realCO2-1.htm). If the concentration as measured by W Kreutz and others around 1941 was about 390 ppm, then it points to a sensitivity in the period 1940 to 2014 of zero. Measurements of CO2 and temperature from around 1850 are of doubtful global representativeness and uncertain error, but there is equally no evidence the CO2 level was 280 ppm.
Discussing a subject such as sensitivity on the basis of guesses or hypotheses is futile. Lots of factual information, with knowledge of the errors, is required before one can make a sound analysis.

Forrest
May 8, 2014 8:12 pm

Well, this is more in line with what I think we should be talking about when it comes to climate change. Again, this is still a little on the conjecture end, as there may be numerous other factors at play in temperature that have amplified the signal we have seen so far, but it is getting closer to where I understand the numbers to be.
It is funny to see people say that skeptics are deniers of climate change when in reality I think the correct term is realist. Now I am happy to talk to people about the benefits of increasing both temperature and CO2 versus the trade-offs when we are not talking 6–10 degrees Celsius of warming…

Jared
May 8, 2014 8:14 pm

The science is settled. These guys must be deniers to still be working on climate sensitivity to CO2. Let me guess they were paid by the Koch brothers and Big Oil. [/sarc]

Dr. Strangelove
May 8, 2014 8:31 pm

Eggert
What IR wavelengths are they measuring in Leckner et al models? What are the full absorption lengths? Show me the graphs and equations used in these models. You can observe “saturation” in lab experiments but it is not true for the atmosphere because it is 10,000 meters thick while the lab apparatus is one or a few meters in length.

James McCowan
May 8, 2014 10:12 pm

Hi Cementafriend:
do you know why there was a large peak in the CO2 concentration in 1941 on the website of Ernst-Georg Beck?

May 8, 2014 10:20 pm

“One other outcome of these MLR analyses is that most of the temperature increase over the past 100 years is of anthropogenic origin, whether the AMO is included or not and whether the anthropogenic shape is linear or follows the forcing estimates. This indicates that there is no combination of natural factors that can better match the observed temperature pattern than one with a large anthropogenic influence.”
AMO and anthropogenic warming rates, G. R. van der Werf and A. J. Dolman

Pethefin
May 8, 2014 10:29 pm

An explanation to the lower sensitivity of the CO2 is discussed in relation to another new paper:
http://hockeyschtick.blogspot.fi/2014/05/new-paper-questions-basic-physics.html
“A forthcoming paper published in Progress in Physics has important implications for the ‘basic physics’ of climate change. Physicist Dr. Pierre-Marie Robitaille’s paper(s) show the assumption that greenhouse gases and other non-blackbody materials follow the blackbody laws of Kirchhoff, Planck, and Stefan-Boltzmann is incorrect, that the laws and constants of Planck and Boltzmann are not universal and widely vary by material or different gases”

Joel O'Bryan
May 8, 2014 10:37 pm

whew!!! 1.3 C per CO2 doubling. 2.6 C higher by the year 2500. Considering that’s about where the Roman Warm Period was, we can expect about 500 years of climate optimum in which to gradually research and engineer away from reduced carbon fuels to whatever comes next: hydrogen-oxy fuel cells, deuterium fusion, geothermal, thorium fission, etc.
Our descendants will look upon our “settled science” NCA 2014 the way we look at the 1615–1625 Vatican trying to maintain a Ptolemaic orthodoxy by labeling the skeptics with data (not epicycle models) as heretics.

May 8, 2014 10:42 pm

Looks like the science is settled then?

Berényi Péter
May 9, 2014 12:05 am

An undeniable Transient Climate Response related hockey stick is detected.

Evan Jones
Editor
May 9, 2014 12:49 am

Notice the procedures.
Now there’s your top-down model. That’s the way to do it.

John Peter
May 9, 2014 12:56 am

So we are moving ever closer to Lindzen’s 1.1°C or so climate sensitivity for a doubling of CO2. I think we can live with that. Whether it is right or not, or whether CO2 has a climate effect at all, is of minor importance compared to this relatively minor impact that more and more researchers are now finding, backed by the so-called “pause”. Maybe we can soon move on to more important matters then?

Somebody
May 9, 2014 1:21 am

The old electron charge story at a bigger error scale. When they get a value close to the one from the doctrine, it is good and they won’t look for errors in the pseudo-measurement. When they find a difference, they look for errors. Or they simply drop the result as being contrary to their religion.
For the electron charge, they even provided error bars. Guess what? The currently accepted electron charge is way outside the first error bars. And measurements in the laboratory are far more accurate and precise than the pseudo-measurements and guesswork of the pseudo-science of climate.

lgl
May 9, 2014 2:51 am

joeldshore
“(because the intensity of emission is an increasing function of temperature)”
and an increasing function of concentration…

Somebody
May 9, 2014 4:01 am

“The basic point is this: For determining forcing, the issue is not one of the absorption saturating. The issue is one of how high you have to go in the atmosphere until it is optically transparent enough that radiation emitted can escape to space without being absorbed again.”
I think the answer to this was given by two theoretical physicists in an old ‘denialist’ paper: There is no energy conservation law for a particular form of energy, only for the total energy. The radiative energy is partially thermalized in atmosphere. As a consequence of those, there is no need for radiative energy to stay at the same frequency from the ground until it leaves the atmosphere. And indeed, if you compare the spectrum of the ground with the one above the atmosphere, that is the case.

James Strom
May 9, 2014 5:45 am

Joel O’Bryan says:
May 8, 2014 at 10:37 pm
whew!!! 1.3 C per CO2 doubling. 2.6 C higher by the year 2500.
_____
To add to your remarks, it’s also good to bear in mind that by that time human use of fossil fuels will be trailing off drastically, at least because of their much greater cost if not because of simple depletion.

cementafriend
May 9, 2014 6:26 am

James McCowan asks about the CO2 peak around 1941. Afraid I do not have a complete answer, but information I have seen is that a) measured atmospheric temperatures were very high in the late 1930s around the world, and b) sea surface temperatures were high and peaked around 1940–41 (see Beck’s second graph http://www.biomind.de/realCO2/realCO2-1.htm )
Dr Strangelove asks about Leckner et al: you must have models stuck in your mind. Leckner and Hottel did not play around with models. There were no computers around at the time. They made measurements in furnaces built at MIT and in furnaces in industrial plants. They built a databank and developed empirical equations which could be used for design. They tested new designs to verify the work they did. A summary of equations and design parameters can be found in Section 5 (Heat and Mass Transfer) of Perry’s Chemical Engineers’ Handbook. You might have to read Section 3 (Mathematics) and Section 4 (Thermodynamics) first, but these may be beyond the capabilities of most so-called scientists.

george e. smith
May 9, 2014 7:53 am

“””””……joeldshore says:
May 8, 2014 at 7:51 pm
John Eggert:
You seem to be stuck on a basic point that Roy Spencer addresses here – http://www.drroyspencer.com/2014/04/american-thinker-publishes-a-stinker/
The basic point is this: For determining forcing, the issue is not one of the absorption saturating. The issue is one of how high you have to go in the atmosphere until it is optically transparent enough that radiation emitted can escape to space without being absorbed again. This height rises as you increase CO2 levels and, along with the lapse rate in the troposphere, this means emission occurs from higher, colder regions of the atmosphere, which (because the intensity of emission is an increasing function of temperature) means that the emission to space is less…….””””””
Joel, I’m trying to follow this argument that you present here.
First there is the claim by some that the ordinary atmospheric gases do not radiate a thermal continuum spectrum that is temperature dependent. That would require that the CO2 is emitting such a spectrum. What is that radiative mechanism? Also, at that higher, colder altitude, the density is lower, so the capture rate would be lower.
The lower altitudes would presumably be warmer as a result of the higher CO2, so that would presumably move the radiation to shorter wavelengths. Do observations show any spectral shift, with increased CO2 ??
I’m trying to understand whether the LWIR emitted by the atmosphere exhibits a BB like thermal spectrum signature, rather than a different essentially temperature independent CO2 band (molecular lines) spectrum. At the lower temperatures of that higher altitude, the CO2 lines would be narrower, because of less Doppler, and less density broadening, so the emission would be expected to be lower because the capture is lower.
I think your argument is a red herring. The satellite spectra I’ve seen (even here recently) show only a surface Temperature spectral signature; not an upper atmospheric temperature signature.

mpainter
May 9, 2014 8:01 am

As cementafriend points out, all climate sensitivity studies share the same fault: they are based on unwarranted or unverifiable assumptions. They also ignore the empirical and observational evidence that CO2 has little or no influence on climate.

MikeB
May 9, 2014 11:01 am

George e. smith says:
May 9, 2014 at 7:53 am

So that [CO2] would presumably move the radiation to shorter wavelengths.

No!
George, you seem a fairly intelligent chap, you manage to put the commas in the right places. What is your difficulty? The spectral emission from the earth is measured and is on record. What exactly do you find hard to understand about it?

The satellite spectra I’ve seen (even here recently) show only a surface Temperature spectral signature

Look again, and try to understand it. What’s hard? Why do you fail to understand clear empirical evidence ? Wishful thinking?

Gary Pearse
May 9, 2014 12:11 pm

Do these guys know that their empirical derivation comes from use of ever changing temperature records? Do they know that the earlier temps are being reduced as we speak to steepen the warming (they are constrained by satellite data in the recent end so can’t raise those temps upwards after the last work by HadCrut (4) which would violate decency laws or something). With CO2 sensitivity going down and the temp curve being pivoted anticlockwise, eventually back calculations will show that there was no CO2 in the atmosphere in 1850.

george e. smith
May 9, 2014 1:31 pm

“””””…..
MikeB says:
May 9, 2014 at 11:01 am
George e. smith says:
May 9, 2014 at 7:53 am
So that [CO2] would presumably move the radiation to shorter wavelengths.
No!
George, you seem a fairly intelligent chap, you manage to put the commas in the right places. What is your difficulty? The spectral emission from the earth is measured and is on record. What exactly do you find hard to understand about it?
The satellite spectra I’ve seen (even here recently) show only a surface Temperature spectral signature
Look again, and try to understand it. What’s hard? Why do you fail to understand clear empirical evidence ? Wishful thinking?…..”””””
A lot of Mike-assumptions going on there.
What’s YOUR evidence that I “fail to understand clear empirical evidence.”
The “satellite spectra” I mentioned, if my memory serves me correctly, is from a Nimbus-4 satellite looking down, presumably from outside essentially all of the atmosphere, and I had no trouble understanding it. I even commented that it matched very closely the completely theoretical calculated spectra published in “The Infra-red Handbook.” And indeed it does have a mean earth surface (288 K) spectral signature; and not a sub-zero high-altitude signature.
On a recent flight to LHR by a very direct southerly route (not over the pole) (mid March), the outside air Temp. at 34,000 ft was -68 deg C. Emission spectra at that temperature would not match earth surface Temperature.
I’m too long in the tooth to be bothered by your snide personal remarks. You have NO inkling of what I understand. What does putting commas in the right place have to do with intelligence? It used to be customary to actually teach such things in school; it has nothing to do with intelligence, simply proper instruction.
For the record, in my final high school graduating class, I was “top of the class” in the year final English exams. (out of 23 students; most of whom, were “fine arts” students). I still have the class English prize, attesting to that fact; a book called; “Standard Stories from the Operas.”
So if you ever want to know the story of the Opera Trilogy, “The Cauldron of Annwyn”, I’ll write you a brief synopsis. Well you could just go and watch it (them) for yourself.
Oh; almost forgot. All the way through my Technical High School education, English was by far my worst subject.
My original puzzlement was how Joel gets a low temperature spectral signal from a non radiative atmosphere, that is just passing through parts of a thermal emission spectrum from the surface. (well of course with holes in it from temperature independent molecular resonance absorptions by GHGs)
I don’t know how many times I have pointed out that isotropic LWIR radiative emissions from whatever atmospheric component, should split half up and half down, but the upward escape route is favored over the downward return path, by both Doppler and density (pressure) broadening of the GHG molecular spectral lines, as a function of temperature and pressure change with altitude.
And I’m sure everyone understands that when I say “temperature independent”, I am talking about the intrinsic frequencies of those molecular spectral lines; and not the Doppler broadening of those lines.
The surface BB like thermal radiation, of course has the usual Wien displacement law dependence on surface Temperature, that is impressed on the satellite measured radiation spectrum. (Nimbus-4)

Somebody
May 9, 2014 2:25 pm

“exhibits a BB like thermal spectrum signature” The answer is no. As I already said, there is only partial thermalization of radiation in atmosphere. A black body requires full thermalization. And indeed the spectrum is quite different than that of a black body. One needs a great deal of pareidolia to find a match. Treat it like a black body, and you can prove anything, applying the principle of explosion (ex falso, quodlibet, one of the many pseudo-scientific methods of the ‘science’).

Somebody
May 9, 2014 2:48 pm

“The issue is one of how high you have to go in the atmosphere until it is optically transparent enough that radiation emitted can escape to space without being absorbed again. This height rises as you increase CO2 levels”
There is no such ‘height’ (as in a single one).
For different frequencies, there are different such heights. One can figure that out when he learns the distribution of water in atmosphere, compared with that of CO2.
Now, think about that partial thermalization of radiation, and about the fact that there isn’t a law of conservation of a particular form of energy (in this case, energy of radiation at a particular frequency). Energy can continue to go into radiation at the same frequency, or change frequency by means of thermalization. To help you, think about what happens with a cloudy sky. Do the CO2 bands continue upwards as in the case of a clear sky? By the way, you can check the spectrum, too.

joeldshore
May 9, 2014 8:22 pm

Somebody says:

For different frequencies, there are different such heights. One can figure that out when he learns the distribution of water in atmosphere, compared with that of CO2.

I agree that what I presented is a simplified picture. As you imply, to do the problem right, you have to do a full radiative transfer calculation. I was just giving the intuitive reason why the saturation argument is a red herring.
But, the point is that these radiative calculations have been done and there is no serious debate about the answer: a doubling of CO2 produces about 4 W/m^2 of top-of-the-atmosphere radiative forcing. The IPCC says that, and scientists with opinions as diverse as Richard Lindzen and Roy Spencer agree.

george e. smith
May 9, 2014 10:53 pm

“””””…..Somebody says:
May 9, 2014 at 2:25 pm
“exhibits a BB like thermal spectrum signature” The answer is no. As I already said, there is only partial thermalization of radiation in atmosphere. A black body requires full thermalization. And indeed the spectrum is quite different than that of a black body. One needs a great deal of pareidolia to find a match. Treat it like a black body, and you can prove anything, applying the principle of explosion (ex falso, quodlibet, one of the many pseudo-scientific methods of the ‘science’)……”””””
Well, you don’t need to teach me about black body radiation. It of course is a complete fiction, because no such material or contrivance exists, or can exist, so nobody has ever observed black body radiation (from zero to infinite frequency; sans the two end points).
Nevertheless, practical objects can track (in shape) the predicted Planck spectral radiance quite well, so I call those “black body like”, meaning the spectrum shape is close to the Planck shape for some temperature. All that matters is the range from half the BB peak wavelength to 8 times the peak wavelength, which carries 98% of the total energy.
The Nimbus-4 observation graph, that Ira Glickstein posted in fact had an envelope tracking the Planck shape for the surface temperature, and I mentioned that it had the CO2, O3, and H2O “holes” that are well known, and whose frequencies aren’t temperature dependent, like the BB like part is.
I don’t have ANY disagreement with the general notion of the CO2 effect changing with altitude, which, as you say, Lindzen and Spencer agree with. I do too.
I just don’t think it matters a hoot. So it affects atmospheric warming; I don’t have a problem with that.
I just believe cloud feedback washes out all those minor irritations. If it can wash out the 0.1% TSI solar cycling, and the bigger annual TSI change, and, as Willis showed, a sizable MEASURED DROP in surface insolation due to Pinatubo-like events, and leave no measurable Temperature difference, then the upper atmosphere warming is not going to bring the sky down on our heads.
As for “forcings”, I have the same belief in that concept as I have in astrology. Sorry, but it just isn’t my idea of science. Nor is the concept of “climate sensitivity” which even the climate itself pays no attention to. I know what the mathematical logarithm function is; and nothing in the climate is logarithmic.

Monckton of Brenchley
May 10, 2014 6:28 am

It is not at all certain how much radiative forcing a doubling of atmospheric CO2 concentration will cause, even if one leaves feedback forcings out of account. James Hansen said the forcing was 4.4 Watts per square meter; then the IPCC, in 1990 and 1995, agreed; then Myhre et al (1998) reduced the coefficient in the forcing function by 15% to produce the current 3.7 Watts per square meter; but most skeptical scientists will admit, as Chris Essex told me when I asked him what his answer was, since he had actually done the line-by-line transfer calculations, that although the current estimate is probably in the right ballpark he is still unable to warrant that the coefficient is not on the high side.
Also, Will Happer has recently pointed out that the logarithmic form of the CO2 forcing function is only an approximation. Given the long period of what the imperial Chinese would have called “no-warming”, it may be that the CO2 forcing function is not as well understood as Mr Shore suggests. My own approach is to use the IPCC’s current estimates, because then they can’t argue, but to bear in mind the possibility that in a complex, non-linear, chaotic object in which measurement error is very large we do not understand matters as well as the champions of Thermageddon would have us believe.
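
[For reference, the two forcing estimates Monckton cites can be expressed as coefficients in the α·ln(C/C0) form: roughly 4.4 W/m² per doubling corresponds to α ≈ 6.3, and 3.7 W/m² to α = 5.35, about a 15% reduction. The sketch below just does that arithmetic, and applies the paper's central TCR of 1.3 °C per doubling purely for illustration.]

```python
# The coefficient change described above: earlier estimates implied about
# 4.4 W/m^2 per doubling of CO2, Myhre et al. (1998) about 3.7 W/m^2. With a
# forcing law dF = alpha * ln(C/C0), the per-doubling forcing is alpha*ln(2),
# so each alpha below is simply that forcing divided by ln(2). The TCR used
# to translate forcing into warming (1.3 degC per doubling, the paper's best
# estimate) is applied here only for illustration.
import math

per_doubling = {"older (Hansen / IPCC 1990-95)": 4.4, "Myhre et al. 1998": 3.7}
tcr = 1.3  # degC per doubling, the paper's central estimate

for label, f2x in per_doubling.items():
    alpha = f2x / math.log(2)        # implied coefficient in alpha*ln(C/C0)
    sensitivity = tcr / f2x          # transient warming per W/m^2 at this forcing estimate
    print(f"{label:30s} alpha = {alpha:4.2f}, {sensitivity:.2f} degC per W/m^2")
```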

John Eggert
May 10, 2014 8:55 am

Joel D. Shore and Dr. Strangelove have asked questions regarding my post. First, the wavelengths that Leckner considered. For higher temperatures (in the thousands of degrees Celsius) the number of lines that must be solved is higher than at cooler temperatures, so for combustion problems one must consider more wavelengths than for cooler thermodynamic systems. So the short answer is “all of them”. The database for combustion is HITEMP. I believe (could be wrong) it has more lines than HITRAN.
Joel: Dr Spencer’s argument is correct and I fully understand it, except his bit about the upper atmosphere being cooler affecting radiant energy transfer. The cool side is always cooler. The point of my question is: In combustion work there IS a point of saturation. This is at 200 to 500 bar.cm, depending on temperature. 200 at 0 Celsius and 500 at 4,000 Celsius. This is a real thing. If you use the combustion curves to generate a forcing curve it is nearly identical to the climate curve (roughly modeled by 5.35ln[CO2]), except that at very high levels of CO2, such as exists in the Earth’s atmospheric system, there is a divergence. The combustion curves show 0 forcing while the line by line solutions for the climate system continue a logarithmically modeled growth. What is the point at which the divergence occurs? Is there a distance? A difference in pressure? A particular temperature?

Somebody
May 10, 2014 2:13 pm

“you have to do a full radiative transfer calculation”
To do a full radiative transfer calculation correctly, one should go down to quantum field theory, that is, ab initio without all sorts of approximations leading to huge errors. Nobody is able to do that.
“But, the point is that these radiative calculation have been done”
Yes, it has been done. Wrong. Ex falso, quodlibet. Besides, it was never ever verified experimentally. You know, that tiny key of science.
” and there is no serious debate about the answer:”
Among pseudo-scientists. Anyway, the claim that ‘there is no serious debate’ has no place in science.
“A doubling of CO2 produces about 4 W/m^2 of top-of-the-atmosphere radiative forcing. The IPCC says that, scientists with opinions as diverse as Richard Lindzen and Roy Spencer agree.”
Cool. If there is a bandwagon agreeing on it, let’s ask them to vote to make the Earth flat.

george e. smith
May 11, 2014 7:35 am

Related to this GHG absorption issue, and how, CO2 and H2O bands overlap, so “water is swamping those bands anyhow”
Well, except they aren’t “bands”, but large numbers of spectral lines, each intrinsically narrow, so they tend not to overlap at all.
So H2O and CO2 are really additive, even when they appear to be competing. Phil often makes sure we don’t forget that.
I actually have high resolution plots of sea level atmospheric transmission, over some kilometers, showing hundreds of narrow absorption lines due to this and that.
It’s actually not unlike the night sky, except in reverse.
The night sky has a gazillion bright stars, which are akin to the molecular absorption lines. But the stars are so small, that the sky is still black, rather than white.
So if the various and sundry GHG molecular spectral lines don’t all overlap each other, why is the atmosphere opaque to IR, instead of mostly transparent, and full of non-overlapping transmission BANDS? I call them bands in the same sense as the wide gaps of bright sunlight that shine through in between all those Fraunhofer dark lines in the solar spectrum.
According to Trenberth, only 40 W/m^2 of surface emitted LWIR radiation is contained in those wide bands of transmission, in between the non-overlapping narrow GHG molecular spectral lines.
‘splain that to me ?? !!

Phil.
May 11, 2014 9:49 am

Monckton of Brenchley says:
May 10, 2014 at 6:28 am
Also, Will Happer has recently pointed out that the logarithmic form of the CO2 forcing function is only an approximation. Given the long period of what the imperial Chinese would have called “no-warming”, it may be that the CO2 forcing function is not as well understood as Mr Shore suggests. My own approach is to use the IPCC’s current estimates, because then they can’t argue, but to bear in mind the possibility that in a complex, non-linear, chaotic object in which measurement error is very large we do not understand matters as well as the champions of Thermageddon would have us believe.

I’ve been making that point on here for several years, probably for longer than Will has. It follows from the ‘curve of growth’, where the present-day concentration of CO2 lies in the approximately logarithmic region between the ‘weak’ linear region and the ‘strong’ square root region. Just because something doesn’t exactly fit to a mathematical function doesn’t mean it’s not understood, it’s a good approximation (not ‘only’ an approximation).
Fig 10.2 gives an excellent illustration:
http://www.ast.cam.ac.uk/~pettini/Physical%20Cosmology/lecture10.pdfwa
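
[The ‘curve of growth’ Phil. refers to can be illustrated numerically: integrate the absorption of a single line over frequency and watch the equivalent width grow linearly while the line is optically thin, then much more slowly once the core saturates. The sketch below uses a pure Doppler (Gaussian) line profile for simplicity, so it shows the weak and flat (roughly logarithmic) regimes; the strong square-root regime needs the Lorentzian damping wings, which this toy profile omits.]

```python
# Numerical illustration of the 'curve of growth': the equivalent width of a
# spectral line grows linearly with absorber amount while the line is
# optically thin, then only very slowly once the core saturates. A pure
# Doppler (Gaussian) profile is used here for simplicity; the strong
# square-root regime requires Lorentzian damping wings, omitted here.
import numpy as np

nu = np.linspace(-50.0, 50.0, 20001)            # frequency offset in Doppler widths
dnu = nu[1] - nu[0]
phi = np.exp(-nu**2) / np.sqrt(np.pi)           # normalized Gaussian line shape

def equivalent_width(tau0):
    """Equivalent width for peak optical depth tau0 (arbitrary units)."""
    tau = tau0 * np.sqrt(np.pi) * phi           # optical depth profile, peak value tau0
    return np.sum(1.0 - np.exp(-tau)) * dnu

for tau0 in (0.01, 0.1, 1.0, 10.0, 100.0, 1000.0):
    print(f"tau0 = {tau0:7.2f}  W = {equivalent_width(tau0):8.4f}")
# W rises roughly in proportion to tau0 while tau0 << 1, but only very slowly
# (roughly like sqrt(ln tau0)) once the line core is saturated.
```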

joeldshore
May 11, 2014 6:41 pm

Just to correct Phil.’s link; it’s http://www.ast.cam.ac.uk/~pettini/Physical%20Cosmology/lecture10.pdf
Somebody: I really don’t know how to respond to a tirade like that. You are merely illustrating how someone will demand impossible levels of evidence to believe something that they are ideologically opposed to. It has nothing to do with science and everything to do with what you want to believe…and so it is useless for me or anybody else to try to convince you.

May 11, 2014 7:02 pm

joelshore says:
Somebody: I really don’t know how to respond to a tirade like that.
That was not a “tirade”. joelshore’s comment was because he has no better response to ‘Somebody’.

george e. smith
May 11, 2014 8:18 pm

“””””…..Phil. says:
May 11, 2014 at 9:49 am
Monckton of Brenchley says:
May 10, 2014 at 6:28 am
Also, Will Happer has recently pointed out that the logarithmic form of the CO2 forcing function is only an approximation. Given the long period of what the imperial Chinese would have called “no-warming”, it may be that the CO2 forcing function is not as well understood as Mr Shore suggests. My own approach is to use the IPCC’s current estimates, because then they can’t argue, but to bear in mind the possibility that in a complex, non-linear, chaotic object in which measurement error is very large we do not understand matters as well as the champions of Thermageddon would have us believe.
I’ve been making that point on here for several years, probably for longer than Will has. It follows from the ‘curve of growth’, where the present-day concentration of CO2 lies in the approximately logarithmic region between the ‘weak’ linear region and the ‘strong’ square root region. …..”””””
I’m quite familiar with real world situations where events are combinations of two or more different physical processes, each of which has been explained on a well-founded theoretical basis governed by closed-form mathematical equations, but where the total effect changes from one regime to another.
An example is the forward current in a semiconductor diode, as a function of forward Voltage; which can be found derived in detail, in Andy Grove’s book. There are at least two components of that current; one due to recombination, and one due to diffusion of carriers. Both components theoretically obey exponential equations, but they have a 2:1 slope ratio on a log current plot.
But those two behaviors, can persist for 3-5 orders of magnitude in current; below or above some crossover point, where one swamps the other. At much higher current densities, the diode characteristic may be dominated by built in series resistance, rather than semiconductor processes.
But each region can be explained by actual physics processes.
So if global surface / lower troposphere / upper atmosphere / whatever Temperature; or the corresponding W/m^2 “forcing” due to CO2 / O3 / H2O / whatever is linearly / logarithmically / square rootedly / whatever, to the atmospheric CO2 volume / mass / molecular relative abundance / whatever; just what is the actual physical process that theoretically would produce that particular mathematical form for that particular regime of CO2 relative abundance.
So far, in the Mauna Loa data record, we have about 25 % of one single octave of CO2 increase.
So for that experimental 25 % range, how much of that range is theoretically shown to be linear; how much of that range has been shown to be theoretically logarithmic, and how much of that 25 % range has been shown to be theoretically square root. ??
So what is wrong with simply saying the Temp / forcing / CO2, in whatever combination you want to relate them, is just …..””” Non- Linear “”” ….. With no known theoretical physical reason.
The ML data and any of the GISS / RSS / HADCRU / UAH fit a perfectly linear graph, at least as well as almost any other mathematical function. You can even reverse the two variables, and jigger an equally good fit to the same equation.
You can even fit it to the form:- y = exp (-1/x^2) with suitable parameters.
So far, nobody has seen a doubling of CO2 in the earth’s atmosphere, and measured either the Temperature, anywhere they want to designate, from the ground on up, or the “forcing”, in whatever language they want to put it.
No laboratory environment, can duplicate the real earth atmosphere environment.

joeldshore
May 12, 2014 4:21 pm

George e. smith says:

So far, in the Mauna Loa data record, we have about 25 % of one single octave of CO2 increase.
So for that experimental 25 % range, how much of that range is theoretically shown to be linear; how much of that range has been shown to be theoretically logarithmic, and how much of that 25 % range has been shown to be theoretically square root. ??
So what is wrong with simply saying the Temp / forcing / CO2, in whatever combination you want to relate them, is just …..””” Non- Linear “”” ….. With no known theoretical physical reason.

So far, nobody has seen a doubling of CO2 in the earth’s atmosphere, and measured either the Temperature, anywhere they want to designate, from the ground on up, or the “forcing”, in whatever language they want to put it.
No laboratory environment, can duplicate the real earth atmosphere environment.

You don’t need a laboratory experiment of the full earth atmosphere in order to gain confidence in the sort of radiative transfer calculations that are done to determine the forcing. Radiative transfer in the atmosphere (and radiative transfer in general) is tested in a variety of ways and underlies an entire field of technology (remote sensing).
Granted, the translation of this into a resulting temperature increase involves considerably more uncertainty, but that is not what we are talking about here. Part of doing science is distinguishing what we understand better from what we understand less well. Just throwing one’s hands up and saying, “Woe is us. It is so complex. How can we understand anything” may serve the political purposes of those of you who are so strongly ideologically opposed to taking action, but it does not really serve the cause of advancing knowledge, or of distinguishing the true sources of uncertainty from those aspects that are actually very well understood.