By Christopher Monckton of Brenchley
Joel Shore, who has been questioning my climate-sensitivity calculations, just as a good skeptic should, has kindly provided at my request a reference to a paper by Dr. Andrew Lacis and others at the Goddard Institute of Space Studies to support his assertion that CO2 exercises about 75% of the radiative forcings from all greenhouse gases. The reasoning is that water vapor, the most significant greenhouse gas owing to its high concentration in the atmosphere, condenses out rapidly, while the non-condensing gases, such as CO2, linger for years.
Dr. Lacis writes in a commentary on his paper: “While the non-condensing greenhouse gases account for only 25% of the total greenhouse effect, it is these non-condensing GHGs that actually control the strength of the terrestrial greenhouse effect, since the water vapor and cloud feedback contributions are not self-sustaining and, as such, only provide amplification.”
Dr. Lacis’ argument, then, is that the radiative forcing from water vapor should be treated as a feedback, because if all greenhouse gases were removed from the atmosphere most of the water vapor now in the atmosphere would condense or precipitate out within ten years, and within 50 years global temperatures would be some 21 K colder than the present.
I have many concerns about this paper, which – for instance – takes no account of the fact that evaporation from the surface occurs at thrice the rate imagined by computer models (Wentz et al., 2007). So there would be a good deal more water vapor in the atmosphere even without greenhouse gases than the models assume.
The paper also says the atmospheric residence time of CO2 is “measured in thousands of years”. Even the IPCC, prone to exaggeration as it is, puts the residence time at 50-200 years. On notice I can cite three dozen papers dating back to Revelle in the 1950s that find the CO2 residence time to be just seven years, though Professor Lindzen says that for various reasons 40 years is a good central estimate.
Furthermore, it is questionable whether the nakedly political paragraph with which the paper ends should have been included in what is supposed to be an impartial scientific analysis. To assert without evidence that beyond 300-350 ppmv CO2 concentration “dangerous anthropogenic interference in the climate system would exceed the 25% risk tolerance for impending degradation of land and ocean ecosystems, sea-level rise [at just 2 inches per century over the past eight years, according to Envisat], and inevitable disruption of socioeconomic and food-producing infrastructure” is not merely unsupported and accordingly unscientific: it is rankly political.
One realizes that many of the scientists at GISS belong to a particular political faction, and that at least one of them used to make regular and substantial donations to Al Gore’s re-election campaigns, but learned journals are not the place for über-Left politics.
My chief concern, though, is that the central argument in the paper is in effect a petitio principii – a circular and accordingly invalid argument in which one of the premises – that feedbacks are strongly net-positive, greatly amplifying the warming triggered by a radiative forcing – is also the conclusion.
The paper turns out to be based not on measurement, observation and the application of established theory to the results but – you guessed it – on playing with a notorious computer model of the climate: Giss ModelE. The model, in effect, assumes very large net-positive feedbacks for which there is precious little reliable empirical or theoretical evidence.
At the time when Dr. Lacis’ paper was written, ModelE contained “flux adjustments” (in plain English, fudge-factors) amounting to some 50 Watts per square meter, many times the magnitude of the rather small forcing that we are capable of exerting on the climate.
Dr. Lacis says ModelE is rooted in well-understood physical processes. If that were so, one would not expect such large fudge-factors (mentioned and quantified in the model’s operating manual) to be necessary.
Also, one would expect the predictive capacity of this and other models to be a great deal more successful than it has proven to be. As the formidable Dr. John Christy of NASA has written recently, in the satellite era (most of which in any event coincides with the natural warming phase of the Pacific Decadal Oscillation) temperatures have been rising at a rate between a quarter and a half of the rate that models such as ModelE have been predicting.
It will be helpful to introduce a little elementary climatological physics at this point – nothing too difficult (otherwise I wouldn’t understand it). I propose to apply the IPCC/GISS central estimates of forcing, feedbacks, and warming to what has actually been observed or inferred in the period since 1750.
Let us start with the forcings. Dr. Blasing and his colleagues at the Carbon Dioxide Information and Analysis Center have recently determined that total greenhouse-gas forcings since 1750 are 3.1 Watts per square meter.
From this value, using the IPCC’s table of forcings, we must deduct 35%, or 1.1 Watts per square meter, to allow for negative anthropogenic forcings, notably the particles of soot that act as tiny parasols sheltering us from the Sun. Net anthropogenic forcings since 1750, therefore, are 2 Watts per square meter.
We multiply 2 Watts per square meter by the pre-feedback climate-sensitivity parameter 0.313 Kelvin per Watt per square meter, so as to obtain warming of 0.6 K before any feedbacks have operated.
Next, we apply the IPCC’s implicit centennial-scale feedback factor 1.6 (not the equilibrium factor 2.8, because equilibrium is thousands of years off: Solomon et al., 2009).
Accordingly, after all feedbacks over the period have operated, a central estimate of the warming predicted by ModelE and other models favored by the IPCC is 1.0 K.
We verify that the centennial-scale feedback factor 1.6, implicit rather than explicit (like so much else) in the IPCC’s reports, is appropriate by noting that 1 K of warming divided by 2 Watts per square meter of original forcing is 0.5 Kelvin per Watt per square meter, which is indeed the transient-sensitivity parameter for centennial-scale analyses that is implicit (again, not explicit: it’s almost as though They don’t want us to check stuff) in each of the IPCC’s six CO2 emissions scenarios and also in their mean.
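For readers who want to check the chain of figures above, here it is as a short Python sketch. The round numbers are those quoted in the text (Blasing's 3.1 Watts per square meter, the 35% deduction, the 0.313 pre-feedback parameter, the 1.6 feedback factor); treat them as illustrative, not official values:

```python
import math

# Forcing-to-warming chain since 1750, using the figures quoted in the text.
total_ghg_forcing = 3.1                            # W/m^2 (Blasing et al., per the text)
negative_forcings = 0.35 * total_ghg_forcing       # ~1.1 W/m^2 of soot etc.
net_forcing = total_ghg_forcing - negative_forcings  # ~2.0 W/m^2

planck_param = 0.313                               # K per W/m^2, pre-feedback parameter
pre_feedback_warming = net_forcing * planck_param  # ~0.6 K before feedbacks

feedback_factor = 1.6                              # centennial-scale, per the text
post_feedback_warming = pre_feedback_warming * feedback_factor  # ~1.0 K

# Cross-check: warming divided by forcing recovers the ~0.5 K per W/m^2
# transient-sensitivity parameter the text says is implicit in the scenarios.
transient_param = post_feedback_warming / net_forcing
```

Dividing the final warming by the original forcing recovers 0.5 Kelvin per Watt per square meter, which is the cross-check made in the paragraph above.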
Dr. Lacis’ paper is saying, in effect, that 80% of the forcing from all greenhouse gases is attributable to CO2. The IPCC’s current implicit central estimate, again in all six scenarios and in their mean, is in the same ballpark, at 70%.
However, the IPCC’s own forcing function for CO2 – 5.35 times the natural logarithm of the ratio of 390 ppmv to 280 ppmv, respectively the perturbed and unperturbed concentrations of CO2 over the period of study – gives 1.8 Watts per square meter.
Multiply this by the IPCC’s transient-sensitivity factor 0.5 and one gets 0.9 K – which, however, is the whole of the actual warming that has occurred since 1750. What about the 20-30% of warming contributed by the other greenhouse gases? That is an indication that the CO2 forcing may have been somewhat exaggerated.
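The CO2-only check runs the same way (again, a sketch using the figures quoted above):

```python
import math

# CO2-only forcing from 1750 to the present, via the IPCC's logarithmic
# forcing expression, then converted to warming with the transient parameter.
c_now, c_1750 = 390.0, 280.0                    # ppmv, per the text
co2_forcing = 5.35 * math.log(c_now / c_1750)   # ~1.8 W/m^2
transient_param = 0.5                           # K per W/m^2, centennial scale
co2_warming = co2_forcing * transient_param     # ~0.9 K from CO2 alone
```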
The IPCC, in its 2007 report, says no more than that between half and all of the warming observed since 1950 (and, in effect, since 1750) is attributable to us. Therefore, 0.45-0.9 K of observed warming is attributable to us. Even taking the higher value, if we use the IPCC/GISS parameter values and methods CO2 accounts not for 70-80% of observed warming over the period but for all of it.
In response to points like this, the usual, tired deus ex machina winched creakingly onstage by the IPCC’s perhaps too-unquestioning adherents is that the missing warming is playing hide-and-seek with us, lurking furtively at the bottom of the oceans waiting to pounce. However, elementary thermodynamic considerations indicate that such notions must be nonsense.
None of this tells us how big feedbacks really are – merely what the IPCC imagines them to be. Unless one posits very high net-positive feedbacks, one cannot create a climate problem. Indeed, even with the unrealistically high feedbacks imagined by the IPCC, there is not a climate problem at all, as I shall now demonstrate.
Though the IPCC at last makes explicit its estimate of the equilibrium climate sensitivity parameter (albeit that it is in a confused footnote on page 631 of the 2007 report), it is not explicit about the transient-sensitivity parameter – and it is the latter, not the former, that will be policy-relevant over the next few centuries.
So, even though we have reason to suspect there is a not insignificant exaggeration of predicted warming inherent in the IPCC’s predictions (or “projections”, as it coyly calls them), and a still greater exaggeration in Giss ModelE, let us apply their central estimates – without argument at this stage – to what is foreseeable this century.
The IPCC tells us that each of the six emissions scenarios is of equal validity. That means we may legitimately average them. Let us do so. Then the CO2 concentration in 2100 will be 712 ppmv compared with 392 ppmv today. So the CO2 forcing will be 5.35 ln(712/392), or 3.2 Watts per square meter, which we divide by 0.75 (the average of the GISS and IPCC estimates of the proportion of total greenhouse forcings represented by CO2) to allow for the other greenhouse gases, making 4.25 Watts per square meter.
We reduce this value by about 35% to allow for negative forcings from our soot-parasols etc., giving 2.75 Watts per square meter of net anthropogenic forcings between now and 2100.
Next, multiply by the centennial-scale transient-sensitivity parameter 0.5 Kelvin per Watt per square meter. This gives us a reasonable central estimate of the warming to be expected by 2100 if we follow the IPCC’s and GISS’ methods and values every step of the way. And the warming we should expect this century if we do things their way? Well, it’s not quite 1.4 K.
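The whole 2100 projection, step by step, as a Python sketch (all parameter values are the ones quoted in the text):

```python
import math

# Projected warming to 2100 using the scenario-average concentrations
# and the parameter values quoted in the text.
co2_2100, co2_now = 712.0, 392.0                    # ppmv
co2_forcing = 5.35 * math.log(co2_2100 / co2_now)   # ~3.2 W/m^2
all_ghg_forcing = co2_forcing / 0.75                # ~4.25 W/m^2, CO2 taken as 75% of GHG forcing
net_forcing = all_ghg_forcing * (1 - 0.35)          # ~2.75 W/m^2 after negative forcings
warming_2100 = net_forcing * 0.5                    # ~1.4 K at 0.5 K per W/m^2
```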
Now we go back to that discrepancy we noted before. The IPCC says that between half and all of the warming since 1950 was our fault, and its methods and parameter values seem to give an exaggeration of some 20-30% even if we assume that all of the warming since 1950 was down to us, and a very much greater exaggeration if only half of the warming was ours.
Allowing for this exaggeration knocks back this century’s anthropogenic warming to not much more than 1 K – about a third of the 3-4 K that we normally hear so much about.
Note how artfully this tripling of the true rate of warming has been achieved, by a series of little exaggerations which, when taken together, amount to a whopper. And it is quite difficult to spot the exaggerations, not only because most of them are not all that great but also because so few of the necessary parameter values to allow anyone to spot what is going on are explicitly stated in the IPCC’s reports.
The Stern Report in 2006 took the IPCC’s central estimate of 3 K warming over the 20th century and said that the cost of not preventing that warming would be 3% of 21st-century GDP. But GDP tends to grow at 3% a year, so, even if the IPCC were right about 3 K of warming, all we’d lose over the whole century, even on Stern’s much-exaggerated costings (he has been roundly criticized for them even in the journal of which he is an editor, World Economics), would be the equivalent of the GDP growth that might be expected to occur in the year 2100 alone. That is all.
To make matters worse, Stern used an artificially low discount rate for inter-generational cost comparison which his office told me at the time was 0.1%. When he was taken apart in the peer-reviewed economic journals for using so low a discount rate, he said the economists who had criticized him were “confused”, and that he had really used 1.4%. William Nordhaus, who has written many reviewed articles critical of Stern, says that it is quite impossible to verify or to replicate any of Stern’s work because so little of the methodology is explicit and available. And how often have we heard that before? It is almost as if They don’t want us to check stuff.
The absolute minimum commercially-appropriate discount rate is equivalent to the minimum real rate of return on capital – i.e. 5%. Let us oblige Stern by assuming that he had used a 1.4% discount rate and not the 0.1% that his office told me of.
Even if the IPCC is right to try to maintain – contrary to the analysis above, indicating 1 K manmade warming this century – that we shall see 3 K warming by 2100 (progress in the first one-ninth of the century: 0 K), the cost of doing nothing about it, discounted at 5% rather than 1.4%, comes down from Stern’s 3% to just 0.5% of global 21st-century GDP.
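To see how much work the discount rate does, here is an illustrative present-value calculation. The flat annual damage stream and the 100-year horizon are assumptions for illustration only; this is not Stern’s actual model, merely a demonstration of the rate’s leverage:

```python
# Present value of 1 unit of damage per year for a century, at two rates.
# Illustrative only: a constant damage stream is an assumption, not Stern's model.
def pv_annuity(rate, years=100):
    """Present value of 1 unit per year for `years` years at discount `rate`."""
    return sum(1.0 / (1.0 + rate) ** t for t in range(1, years + 1))

pv_low = pv_annuity(0.014)   # the 1.4% rate Stern said he used
pv_high = pv_annuity(0.05)   # a 5% commercial rate of return
ratio = pv_low / pv_high     # the low rate multiplies present costs severalfold
```

Even this crude sketch shows the low rate inflating the present cost of future damage by a factor of well over two; with Stern's related assumptions compounded, the gap widens further.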
No surprise, then, that the cost of forestalling 3 K of warming would be at least an order of magnitude greater than the cost of the climate-related damage that might arise if we just did nothing and adapted, as our species does so well.
But if the warming we cause turns out to be just 1 K by 2100, then on most analyses that gentle warming will be not merely harmless but also beneficial. There will be no net cost at all. Far from it: there will be a net economic benefit.
And that, in a nutshell, is why governments should shut down the UNFCCC and the IPCC, cut climate funding by at least nine-tenths, de-fund all but two or three computer models of the climate, and get back to addressing the real problems of the world – such as the impending energy shortage in Britain and the US because the climate-extremists and their artful nonsense have fatally delayed the building of new coal-fired and nuclear-fired power stations that are now urgently needed.
Time to get back down to Earth and use our fossil fuels, shale gas and all, to give electricity to the billions that don’t have it: for that is the fastest way to lift them out of poverty and, in so doing, painlessly to stabilize the world’s population. That would bring real environmental benefits.
And now you know why building many more power stations won’t hurt the climate, and why – even if there was a real risk of 3 K warming this century – it would be many times more cost-effective to adapt to it than to try to stop it.
As they say at Lloyds of London, “If the cost of the premium exceeds the cost of the risk, don’t insure.” And even that apophthegm presupposes that there is a risk – which in this instance there isn’t.
The Viscount Monckton of Brenchley
===========================================================
Part 1 of Sense and Sensitivity can be found here
“JohnWho says:
January 16, 2012 at 5:54 am
Moreover, if CO2 has increased 100 ppm over the last century or so, what other atmospheric gas or gasses decreased that 100 ppm? After all, there are only 1 million parts per million.”
Good question! Oxygen would be what would have decreased. If no CO2 went into the ocean, an increase of 0.01% in CO2 would have caused the O2 to go down from 20.96% to 20.95%. (C + O2 –> CO2) When factoring in the CO2 that gets dissolved in the ocean, it gets more complicated, but in the end, it is irrelevant.
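The bookkeeping in that reply is simple enough to check directly (ocean uptake ignored, as the comment notes):

```python
# If combustion converts O2 to CO2 one-for-one (C + O2 -> CO2), a 100 ppm
# rise in CO2 implies a 100 ppm (0.01 percentage-point) fall in O2.
co2_rise_ppm = 100
o2_before = 20.96                           # percent of the atmosphere
o2_after = o2_before - co2_rise_ppm / 1e4   # 100 ppm = 0.01 of a percent
```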
>> davidmhoffer says:
January 16, 2012 at 6:53 am
Markus;
Moreover, Davidmhoffer, those 10cm’s of aquifers are full in Australasia, the sub continent and large parts of Africa.
Markus, please do learn to read.
The contention was that water from deep aquifers that are NOT replenished are being used for irrigation and that this water, that previously was NOT part of the water cycle, will raise the levels of the oceans by 10 cm. <<
The contention by JFD as I read it was that the extraction and use of 'fossil water' is raising the oceans 2.6 mm per year. That would result in a change of 10 inches (not cm) in a century, but I don't think JFD expects the 'fossil water' to last that long.
A quick calc gives 2.6 mm of ocean = a volume of 928 km3. I don't know how much water is extracted, but that amount per year doesn't sound unreasonable.
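That quick calc is easy to reproduce; the ocean surface area below is an assumed round figure (~3.6 × 10^8 km²):

```python
# Volume of water needed for 2.6 mm of sea-level rise, and the century total.
ocean_area_km2 = 3.6e8                    # assumed round figure for ocean area
rise_m = 2.6e-3                           # 2.6 mm in meters
volume_km3 = ocean_area_km2 * rise_m / 1000.0   # km^2 * m -> km^3 needs /1000
century_rise_inches = 2.6 * 100 / 25.4    # 2.6 mm/yr over 100 years, ~10 inches
```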
Pamela Gray says, January 16, 2012 at 7:38 am:
I have to laugh at the total disregard of the elephant poop in the room (aka oceans and atmospheric drivers of land temps intrinsic to Earth’s natural systems) while looking for mouse turds.
Ah, but then elephant poop is so very obvious, therefore anybody can see it.
Mouse turds, however … well, finding mouse turds is very difficult and time consuming, therefore it needs specialists which have to be well paid.
And then they have to distinguish between indoor mouse turds and outdoor mouse turds, which takes years of study and very fine brains.
Elephant poop, well, it’s big and it’s there, that’s all we need to know …
😉
Monckton of Brenchley says:
I think it is a little confusing to say that “CO2 exercises about 75% of the radiative forcings from all greenhouse gases”. It is probably better to say what the direct radiative effect of the CO2 in the atmosphere is but then that since CO2 levels control water vapor levels (through their control of the temperature), removing CO2 reduces the radiative effects by a lot more than is calculated by considering just the radiative effect of the reduction in CO2 levels alone.
Exactly…and before we go further and discuss whether Dr. Lacis is correct, we must now take stock of the fact that your previous argument for determining the climate sensitivity was a circular argument: You assumed that a picture such as Dr. Lacis’s is incorrect in order to conclude that such a picture is incorrect. I.e., you did the calculation for the sensitivity under the assumption that all the water vapor in the atmosphere had to be added in “by hand” and none was put into the atmosphere because of the temperature rise when CO2 is added to the atmosphere.
Independent of whether Dr. Lacis’s picture is correct, we can now conclude that your calculation was definitely wrong. It did not calculate a climate sensitivity that included water vapor (or ice albedo or clouds) as a feedback but rather as forcings. As such, it was actually just equivalent to calculating the no-feedback value of the climate sensitivity.
This was a fatal error in your original calculation. We are now forced to conclude what I have been saying all along: That your calculation provided exactly zero evidence of what the climate sensitivity is in the presence of feedbacks.
This is such an important point that I will let this stand on its own before I address your criticisms of Lacis et al.
higley7 says:
January 16, 2012 at 7:56 am
now there’s the most interesting theory of the greenhouse effect that I’ve heard in a long time. Could you give us references to further reading on this?
The temperature rise from pre-industrial to 2100 was going to be 3-4 C. To see how this comes about, use ln(712/280)/ln(2)=1.34 and multiply by the sensitivity, which is 2-3 degrees per doubling. This is the CO2 effect alone. Using the current effect of aerosols and assuming it applies proportionately to 2100 is a faulty assumption, because it assumes aerosol cooling will increase when the evidence says it is now steady or decreasing from its maximum, while the CO2 growth rate more than doubles from the current value; this is what makes it worse than he thought. In fact it currently looks like other GHGs like methane and N2O are cancelling the negative effect of aerosols (AR4 forcing attribution), so the current warming is in line with the expected forcing effect of CO2 alone.
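The commenter’s arithmetic checks out in a couple of lines (the 2-3 K per doubling is the sensitivity range quoted in the comment itself):

```python
import math

# Number of CO2 doublings from pre-industrial 280 ppmv to 712 ppmv in 2100,
# times an assumed 2-3 K per doubling, gives the quoted 3-4 C range.
doublings = math.log(712 / 280) / math.log(2)   # ~1.34
low = doublings * 2.0                           # ~2.7 K
high = doublings * 3.0                          # ~4.0 K
```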
Monckton of Brenchley says:
This statement is incorrect. What they showed is that the observed INCREASE in evaporation with increasing temperatures has been about three times what the models predict. In particular (as I recall), the models predicted evaporation to go up by ~2.3% whereas measurements suggest that it went up by about 7%. So, in other words, if evaporation was 100 in some units, it has gone up to 107 rather than just up to 102.3. Note that 107 is nowhere near three times as large as 102.3, as your statement would imply.
And, while Wentz et al.’s observation represents a bit of a puzzle, there are many other things that could account for some or all of the discrepancy other than errors in the models, including variability in evaporation not due to changes in temperature, errors in measurements, and so forth.
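The distinction that comment draws between a ratio of increases and a ratio of levels is worth making concrete (the 7% and 2.3% figures are as recalled in the comment):

```python
# Ratio of the *increases* in evaporation vs ratio of the *levels*.
baseline = 100.0
observed = baseline * 1.07     # ~7% observed rise (Wentz et al., as recalled)
modeled = baseline * 1.023     # ~2.3% modeled rise
increase_ratio = (observed - baseline) / (modeled - baseline)  # ~3: the real claim
level_ratio = observed / modeled                               # ~1.05: not 3
```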
(1) The value of seven years is not a number that is at all relevant in determining how long a perturbation in CO2 levels will remain in the atmosphere. It is instead a value that reflects the fact that CO2 molecules are rapidly passed back and forth between the atmosphere, biosphere, and ocean mixed layer but that for a perturbation of CO2 levels to decay, the CO2 must be transferred to some other reservoir like the deep ocean and this is a much slower process.
(2) The main reason for the discrepancy in estimates of a decay time is because the decay of a perturbation of CO2 levels is a highly non-exponential process. Exponential processes are characterized by a single decay time, e.g., if the characteristic decay time is 50 years then only 1/e (about 37%) remains after 50 years and then 1/e^2 (about 13.5%) after 100 years, 1/e^3 (about 5.0%) after 150 years, and so on. However, for CO2, there is no one decay time, so while more than half might disappear after 100 years or so, it will take some non-negligible fraction (something like 25% as I recall) thousands of years to disappear. Lacis et al. could probably have stated this a little more clearly.
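The contrast that comment describes can be sketched numerically. The fractions and time constants in the multi-exponential below are made-up illustrative values, not the parameters of any actual carbon-cycle model:

```python
import math

# Single-exponential decay vs a crude multi-exponential impulse response.
def single_exp(t, tau=50.0):
    """Fraction remaining at time t for a pure exponential with decay time tau."""
    return math.exp(-t / tau)

def multi_exp(t):
    """Illustrative mix: part decays fast, part slowly, part effectively never."""
    return 0.3 * math.exp(-t / 10.0) + 0.45 * math.exp(-t / 150.0) + 0.25

frac_exp_100 = single_exp(100)     # 1/e^2 ~ 13.5% left after two time constants
frac_multi_100 = multi_exp(100)    # over half gone after ~100 years...
frac_multi_2000 = multi_exp(2000)  # ...but ~25% still lingering millennia later
```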
The full statement made is actually:
And, two references are provided to back up this statement that such concern indeed exists.
And, you and Heartland and CEI and Western Fuels Association and Senator Inhofe, and … don’t belong to any particular political faction?
No…The argument is not at all circular. It is based on a study using a climate model, which incorporates the known laws of physics governing our climate. Of course, all models are only approximations to reality and one can rightfully ask how robust this result is to changes in various things that go into the model. However, that does not make the argument circular. (By contrast, your previous argument really was circular in that you assumed that none of the water vapor present in our current atmosphere is there as a result of a feedback on an increase in temperature due to the addition of the non-condensable greenhouse gases and used this to basically conclude that there was no such feedback…Or, at least that the net feedbacks were essentially zero.)
Furthermore, the basic idea of the water vapor feedback follows from quite general physics principles (the saturation vapor pressure for water is a rapidly increasing function of temperature and, as a result, the rate of evaporation increases strongly with temperature). And, the water vapor feedback also now has considerable empirical support for both its existence and approximate magnitude. (See, for example, http://www.sciencemag.org/content/323/5917/1020.summary )
Jim D says:
“…the current warming is in line with the expected forcing effect of CO2 alone.”
What “current warming”?
http://members.shaw.ca/sch25/FOS/GlobalTroposphereTemperaturesAverage.jpg
Smokey, two words, natural variation. It goes downwards too and can cancel the warming on decadal scales, but not multidecadal ones because its amplitude is less than two tenths of a degree.
Dear Lord Monckton,
The short and long time constants in the CO2 cycle are grounds for a lot of confusion among skeptics. Let me explain the two constants with examples from hydrology and economics.
Imagine an aquarium with a pump and a leak. The leaking speed depends only on the average water level: a higher level gives a faster leak, a lower level a slower leak. The leak has an exponential decay with a long time constant. The water pump gives the refreshment rate; this usually has a shorter time constant than the leak, and it is not exponential but simply the total water volume divided by the pump’s flow. The nice thing about flow calculations is that the two systems are independent. The leak speed does not depend on pump volume or frequency, and the pump refreshment rate depends only on the total water volume and the pump’s flow.
The economic example is a savings account. The amount of annual interest depends only on the balance, not on the actual cash flow: earning a lot and also spending a lot has the same effect as earning little and spending little.
Back to the CO2 cycle. The CO2 sink speed depends only on the CO2 level in the atmosphere, not on the CO2 flux. It has an exponential decay time constant of approximately 55 years (see the studies of Peter Dietze http://www.john-daly.com/forcing/moderr.htm ). The atmospheric CO2 refreshment rate, on the other hand, is simply the total atmospheric CO2 volume divided by the annual biosphere metabolism volume, which is roughly five, giving the well-known 5-year time constant. So the constants don’t disagree; they are simply manifestations of two totally independent physical phenomena: mixing and end-storage.
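The two constants in that comment come from two different divisions. The stock and flux figures below are assumed round numbers (~800 GtC in the atmosphere, ~160 GtC/yr of gross exchange), used only to illustrate the point:

```python
# Residence (mixing) time vs adjustment (end-storage) time: two different
# questions, two different numbers. Stock and flux are assumed round figures.
atmosphere_gtc = 800.0    # carbon stock in the atmosphere, GtC (assumed)
annual_flux_gtc = 160.0   # gross annual exchange with biosphere/ocean (assumed)
residence_time = atmosphere_gtc / annual_flux_gtc   # ~5 years: how fast molecules swap

adjustment_time = 55.0    # e-folding decay of a *perturbation* (Dietze, per the comment)
# A 5-year residence time and a ~55-year adjustment time do not contradict
# each other: one measures mixing, the other removal to long-term storage.
```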
R. Gates says:
January 16, 2012 at 8:23 am
“For those wanting one of the best summaries available anywhere on the web of the true science behind the effects of CO2 in the atmosphere, I highly recommend reading …” [… some link to a Science of Doom website].
The problem, Mr. Gates, is that most people here do not want to understand. Their minds are closed. They don’t want to believe in a greenhouse effect for ideological reasons, and science is not going to change that. So they resort instead to pseudo-science. There are ample sources of disinformation on most sites, including this one. I think that Andrew Lacis once said
“Actually, the Gerlich and Tscheuschner, Claes Johnson, and Miskolczi papers are a good test to evaluate one’s understanding of radiative transfer. If you looked through these papers and did not immediately realize that they were nonsense, then it is very likely that you are simply not up to speed on radiative transfer”
…..Andy Lacis
Unfortunately, most people only read what they already agree with, that which reinforces their own prejudices. We all have a tendency to do that, for example, what newspaper do you buy?
Apart from that, the link you provide is probably one of the best and most informative for the minority here who may just possibly wish to learn.
In this, his latest published article, Lord Monckton repeats mistakes that he persistently makes. In making these mistakes, Monckton inadvertently muddies the waters that surround the IPCC’s claim of CAGW.
In the first of two mistakes, Monckton adopts as a premise to his argument the existence of the equilibrium climate sensitivity (TECS) as a scientifically legitimate concept. It is not legitimate.
TECS is based upon the proposition that the logarithm of an increase in the CO2 concentration maps to an increase in the equilibrium temperature. But as the increase in the equilibrium temperature is not an observable, speculations regarding the magnitude of this increase are non-falsifiable, and thus lie outside science.
In the second of his mistakes, Monckton asserts that Giss Model E makes “predictions.” Actually, it makes “projections.” In judging the scientific validity of the IPCC’s claim, it is necessary for the distinction between predictions and projections to be maintained, for while predictions are falsifiable, projections do not have this property.
Werner Brozek says:
“January 16, 2012 at 9:36 am
Thank you (“Coldish says:
January 16, 2012 at 8:31 am”) for making the important distinction between residence time and ‘adjustment time’
Regarding ‘adjustment time’, see:
http://www.john-daly.com/carbon.htm”
Thanks, Werner. John Daly and his colleagues seem to have sorted out so much about climate change years before many of us (including myself) had even heard of the issue. He was a giant of scientific communication.
As a critique of the GISS model the preceding makes very little physical sense.
While this explains the conspicuous absence of the proprietor of Minckton’s Shirts of Pitlocherty from peer reviewed publication, it compounds the mystery of his career change. While his service as a Number 10 factotum failed to gain him political office in the 1980’s, his wares gave perfect satisfaction during his tenure as my London neighborhood shirtmaker in the nineties, and remain wearable today.
[snip]
MODTRAN, an Air Force atmospheric radiation estimation program, seems to indicate that CO2 has a minimal ‘raw’ temperature forcing capability. (That is, before applying any hypothetical climatic positive or negative feedback factors.) In the following link it is almost impossible to see the green curve for radiation at 300 PPM CO2 concentration that is hidden behind the almost identical blue curve for 600 PPM.
This is a complete doubling of the CO2 concentration with almost no effect on the calculated radiation leaving the Earth over the wavenumber frequency range of 100 to 1500 cycles per centimeter (kaysers). Note: A wavenumber plot has the advantage of uniform radiant energy density per unit of measure.
http://en.wikipedia.org/wiki/File:ModtranRadiativeForcingDoubleCO2.png
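A rough sense of why the two curves nearly overlap: the simplified forcing from a doubling is small next to the total outgoing longwave flux. The 5.35 ln(2) expression is the common simplified formula already used elsewhere in this thread, and 240 W/m² is a typical assumed global-mean outgoing value:

```python
import math

# Forcing from one CO2 doubling as a fraction of outgoing longwave radiation.
doubling_forcing = 5.35 * math.log(2)       # ~3.7 W/m^2 (simplified expression)
outgoing_lw = 240.0                         # W/m^2, typical global mean (assumed)
fraction = doubling_forcing / outgoing_lw   # ~1.5%: hard to see on a full-scale plot
```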
R. Gates says:
January 16, 2012 at 8:06 am
“Take away CO2, and the Earth goes back to an snowball planet in a fairly short order (less than a century). Once cooling starts the cycle of condensing more and more water vapor, more cooling, and glacial growth begins. CO2 does not react this way of course and provides a buffer to the creation of a snowball planet. So, even though the actual contribution of CO2 as measured may be only 25% of the atmospheric warming, it is a critical 25%, such that without it, you could lose quite a bit of the remaining warming as the amount of water vapor in the atmosphere would fall to very low levels overall, such as we find over Antarctica and would once more in a snowball planet situation.”
The whole “snowball Earth” conjecture is complete BS (bad science or bovine excrement, you pick ’em), as the paper I linked in a comment above shows.
Dave Wendt says:
January 15, 2012 at 11:21 pm
Even in Antarctica in the dead of winter, when temps are routinely at 200K or below, H2O doesn’t disappear from the atmosphere, and even at the admittedly greatly diminished levels of humidity there, H2O still dominates the radiative activity of CO2 by a 2 to 1 margin. At present, temperatures in tropical latitudes routinely hit 300K to 330K. Even if we stipulate to the rather dubious notion that CO2 is responsible for 25% of the GHE, we’re talking about 8-9K, and of course in the real world there is absolutely no chance of CO2 going away anytime soon or ever, absent some very large extraterrestrial body coming in at very high velocity, at which point the whole discussion will be entirely moot anyway.
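Where the comment’s “8-9K” comes from: a quarter of the total greenhouse effect. The ~33 K total is the standard textbook figure, assumed here for the check:

```python
# A quarter of the conventional ~33 K total greenhouse effect.
total_greenhouse_k = 33.0   # standard textbook value (assumed)
co2_share = 0.25            # the stipulated CO2 share of the GHE
co2_contribution = total_greenhouse_k * co2_share   # 8.25 K, i.e. the "8-9K"
```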
Another indication that the feedbacks are extreme is Dr. Lacis’ mention that in model runs if all the CO2 is removed global average temperature drops something like 10C the first year and the oceans freeze to the equator in a decade.
That’s a lot of pop for 400ppm.
An interesting and broad ranging review. One question, though: Lord Monckton noted that the Stern Report stood for the proposition that:
“the cost of not preventing that warming would be 3% of 21st-century GDP.”
Based on his reference to Nordhaus’s critical review(s), I read Nordhaus’s “The Stern Review on the Economics of Climate Change” (May 2007): http://nordhaus.econ.yale.edu/stern_050307.pdf
It is an excellent review, and roundly criticizes the absurd discount rate and related assumptions in the Stern Review. Nordhaus, however, quotes the Stern review for the proposition that:
“…if we don’t act, the overall costs and risk of climate change will be equivalent to losing at least 5% of global GDP each year, now and forever.”
I would never suggest that this is any way, shape or form correct (indeed, it seems fundamentally absurd), but it seems a different proposition than the one cited by Lord Monckton.
Hsien-Wang Ou thinks it’s water and cloud albedo feedback that stabilizes our climate.
http://journals.ametsoc.org/doi/abs/10.1175/1520-0442%282001%29014%3C2976%3APBOTES%3E2.0.CO%3B2
“…From the model derivation, it is found that the surface temperature is narrowly bounded below by the onset of the greenhouse effect and above by the rapid increase of the saturation vapor pressure. Because both are largely intrinsic properties of water, the resulting surface temperature is mostly insensitive to detailed balances or changing external conditions. Even with a 50% change of the solar constant from its present-day value, the model temperature has varied by only about 10 K. The reason that the heat balances can be maintained is an internal adjustment of the low cloud cover, which offsets the solar effect. The model offers a plausible explanation of an equable climate in the geological past so long as there is a substantial ocean.”
In the past, CO2 followed rather than controlled climate
http://www.usc.edu/uscnews/stories/14288.html
“…Deep-sea temperatures warmed about 1,300 years before the tropical surface ocean and well before the rise in atmospheric CO2, the study found. The finding suggests the rise in greenhouse gas was likely a result of warming – but not its main cause.”
Tom Segalstad describes the CO2 cycle:
http://www.geocraft.com/WVFossils/Reference_Docs/Carbon_cycle_update_Segalstad.pdf
http://folk.uio.no/tomvs/
Given the above, how can any rational human being jump to the conclusion that CO2 has more than a minor effect on climate? Yet R. Gates states,
“…Take away CO2, and the Earth goes back to an snowball planet in a fairly short order (less than a century). Once cooling starts the cycle of condensing more and more water vapor, more cooling, and glacial growth begins.”
Nutty statements like that ignore all of Earth’s natural history.
Joel Shore says:
January 16, 2012 at 10:33 am
“I think it is a little confusing to say that “CO2 exercises about 75% of the radiative forcings from all greenhouse gases”. It is probably better to say what the direct radiative effect of the CO2 in the atmosphere…”
If we are going to say “what the direct radiative effect of the CO2” is in the atmosphere, then please answer a question I have asked you on a couple of previous occasions and received no answer to.
What emissivity do you assign CO2 at 1 atm and 288 K?
If radiative heat transfer is going to be involved then that is an important item to know.
mkelly, use of a radiative transfer program such as MODTRAN shows that if you remove all the CO2 from the atmosphere keeping all else constant, its emission to space reduces by 10%. This is like an atmospheric emissivity change of 0.1 out of a total emissivity of 0.6 for clear sky. So, by this definition, the total emissivity of the CO2 in the atmospheric column is about 0.1.
…I should, of course, have said the emission to space _increases_ when you remove CO2 because more radiation escapes. It reduces when you double CO2, trapping more heat.
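[Jim D’s figures (a CO2 column emissivity of ~0.1 out of a clear-sky total of ~0.6) can be sanity-checked against the textbook one-slab grey-atmosphere model. This is only a sketch under that idealized model, not MODTRAN, and the 0.6/0.5 emissivities are his numbers, not mine:

```python
# One-slab grey-atmosphere sketch: the slab absorbs a fraction eps of
# surface radiation and re-emits half upward, half downward, so
# OLR = (1 - eps/2) * sigma * T_s^4.  At fixed surface temperature only
# the (1 - eps/2) factor matters when comparing the two cases.
eps_with_co2 = 0.6   # clear-sky column emissivity (Jim D's figure)
eps_no_co2 = 0.5     # same minus the ~0.1 attributed to CO2

def olr_factor(eps):
    return 1.0 - eps / 2.0

increase_pct = 100.0 * (olr_factor(eps_no_co2) / olr_factor(eps_with_co2) - 1.0)
print(f"OLR increase on removing CO2: {increase_pct:.1f}%")
# -> OLR increase on removing CO2: 7.1%
```

The idealized slab gives ~7% rather than 10%, which is about as close as a one-parameter toy model can be expected to come to a line-by-line radiative transfer code.]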
Spector says:
January 16, 2012 at 12:41 pm
“This is a complete doubling of the CO2 concentration with almost no effect on the calculated radiation leaving the Earth”
Spector, what exactly do you expect to see here with your model? The amount of radiation leaving the Earth will always equal the amount being received from the Sun. Doubling the level of CO2, or anything else for that matter, will not change that. This is called the radiation balance, which applies at the top of the atmosphere. Energy in must equal energy out. So what point are you trying to make? (Jim D, take note.)
However, we do not live at the top of the atmosphere, we live on the planet’s surface. The greenhouse gases cause the surface to be warmer than it otherwise would be without them.
BtC, I agree, and I was addressing the instantaneous change if you doubled CO2 or removed it. Doubling it would subtract 1% of outgoing radiation, so the earth has to warm to restore the balance. Removing it all, adds 10%, so the earth has to cool to restore it. Much of the feedback in the Lacis paper was actually ice albedo feedback, due to the growing extent of the polar ice caps, but this is also reinforced by water vapor as seen from the Ice Ages. If you don’t let the ice caps grow, the CO2 removal would only have caused about a 10 C cooling.
Lord Monckton, that’s quite an impressive string of investigations, and the time you spent uncovering this is appreciated. Those IPCC reports are by no means what you would call thin and concise (all the better for obfuscation). Will read this in depth as time allows.