Guest Post by Willis Eschenbach
When I’m analyzing a system, I divide the variables into three categories—first-, second-, and third-order variables.
First-order variables are those variables that can change the system by more than 10%. Obviously, these must be included in any analysis of the system.
Second-order are those that can change the system by 1% to 10%. These are smaller, but still too large to overlook.
Finally, third-order variables are those that can change the system by less than 1%. These are small enough that they can be ignored in all but the most detailed analyses. To give you an idea of why we can neglect the third-order variables, here’s how variables of those three orders would look on a graph, for an imaginary signal of, say, 500 W/m2.
Figure 1. Showing the relative sizes of first-, second-, and third-order variables.
Note that the series containing the third-order variable is almost invisibly different from the series where the third-order variable is left out, which is why third-order variables can be safely ignored except when you need extreme precision. So … what does this have to do with climate science?
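For anyone who wants to play with the thresholds, here is a minimal Python sketch using the same imaginary 500 W/m2 signal (illustrative only; the 3.7 W/m2 entry anticipates the CO2 figure discussed below):

```python
# Toy classifier for first-, second-, and third-order variables,
# applied to an imaginary 500 W/m2 base signal (illustrative only).
def order_of_variable(change_w_m2, base_w_m2=500.0):
    frac = abs(change_w_m2) / base_w_m2
    if frac > 0.10:
        return "first-order (more than 10%)"
    if frac >= 0.01:
        return "second-order (1% to 10%)"
    return "third-order (less than 1%)"

for change in (100.0, 20.0, 3.7):
    print(f"{change:5.1f} W/m2 -> {order_of_variable(change)}")
```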
Let’s do the same kind of analysis on the forcings of the climate system. At the TOA, the “top of atmosphere”, there is downwelling radiation from two sources: the sun, and the longwave “greenhouse” radiation from clouds and “greenhouse” gases (GHGs). The globally-averaged amount of downwelling solar radiation at the earth’s TOA (which is total incoming solar radiation less a small amount absorbed in the stratosphere) is on the order of 330 watts per square metre (W/m2). The amount of downwelling longwave radiation at the TOA, on the other hand, is about 150 W/m2.
Finally, if CO2 doubles it is supposed to change the downwelling radiation at the TOA by 3.7 W/m2 … here’s how that works out:
Figure 2. Sources of downwelling radiation at the top of the atmosphere (TOA), defined as the tropopause by the IPCC.
By that measure, a CO2 doubling is clearly a third-order forcing, one that we could safely ignore while we figure out what actually makes the climate run.
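A back-of-envelope check of that, using the same round numbers as above (a sketch only):

```python
# Where does a CO2 doubling fall, relative to total downwelling radiation?
solar_down = 330.0    # W/m2, downwelling solar at the TOA (approximate)
lw_down = 150.0       # W/m2, downwelling longwave at the TOA (approximate)
co2_doubling = 3.7    # W/m2, canonical forcing for a doubling of CO2

share = co2_doubling / (solar_down + lw_down) * 100
print(f"A CO2 doubling is {share:.2f}% of total downwelling radiation")
# -> roughly 0.8%, i.e. below the 1% third-order threshold
```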
Or we could look at it another way. How much of the earth’s temperature is due to the sun, and how much is due to the earth’s atmosphere?
If there were no atmosphere and the earth had its current albedo (about 30%), the surface temperature would be about 33°C cooler than it currently is (see here for the calculations). Obviously, downwelling longwave radiation from the greenhouse gases is responsible for some of that warming, with DLR from clouds responsible for the rest. Cloud DLR globally averages about 30 W/m2 (see here for a discussion). So the 150 W/m2 forcing from the GHGs is responsible for on the order of 80% of the 33° temperature rise, or about 25°C.
But if 150 W/m2 of GHG forcing only warms the surface by 25°C, then the so-called “climate sensitivity” is only about 25°C of warming for 150 W/m2 of TOA forcing, or a maximum of about six tenths of a degree per doubling of CO2, which is about 0.2% of the earth’s absolute temperature … again, it is a third-order forcing.
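Spelling the arithmetic out as a short sketch (the 288 K mean surface temperature is an assumed round figure, added here only to express the result as a percentage):

```python
# Sensitivity arithmetic from the paragraph above (all figures approximate):
ghg_warming = 25.0    # degrees C, the share of the 33 C attributed to GHGs (~80%)
ghg_forcing = 150.0   # W/m2, downwelling longwave from GHGs at the TOA
co2_doubling = 3.7    # W/m2 per doubling of CO2
t_surface_k = 288.0   # K, assumed mean surface temperature

per_watt = ghg_warming / ghg_forcing             # degrees C per W/m2
per_doubling = per_watt * co2_doubling           # degrees C per CO2 doubling
print(f"{per_doubling:.2f} C per doubling")      # ~0.6 C
print(f"{per_doubling / t_surface_k * 100:.2f}% of the absolute temperature")  # ~0.2%
```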
Now, if someone wants to claim that a change in the forcings of less than 1% is going to cause catastrophes, I have to ask … why hasn’t it done so in the past? Surely no-one thinks that the forcings have been stable to within 1% in the past hundred years … so where are the catastrophes?
Finally, most of the measurements that we can make of the climate system are imprecise, with uncertainties of up to 10% being common. Given that … how successful are we likely to be at this point in history in looking for a third-order signal that is less than 1% of the total?
w.
PS – In any natural heat engine of this type, which is running as fast as the circumstances permit, losses rise faster than the temperature. So in fact, the analyses above underestimate how small the CO2 effect really is. This is because at equilibrium, losses eat up much of any increase in forcing. So the effect of the CO2 at general climate equilibrium is less than the effect it would have at colder planetary temperatures. In other words, climate sensitivity is an inverse function of temperature.
PPS – Please don’t point out that my numbers are approximations. I know that, and they may be off a bit … but they’re not off enough to turn CO2 into a second-order forcing, much less a first-order forcing.
PPPS – What is a first-order climate variable? Clouds, clouds, clouds …
Septic Matthew says:
October 4, 2011 at 7:14 pm
Far from being empty phraseology, it is a mathematically derivable result from the Constructal Law. You should know by now that I don’t do “empty” very much.
w.
R. Gates says:
October 4, 2011 at 9:00 pm
Another very interesting post Willis. And yet, of course, this little “3rd order” variable, this tiny trace gas, a mere tiny fraction of our Earth’s atmosphere, is absolutely critical in preventing the Earth from becoming a snow ball planet.
Right, Gates, so let’s do as much as we can to keep CO2 concentrations as low as they’ve ever been over the last 600 million years or so. Hey, maybe we can get the plants to starve and get rid of the teeming hordes to boot! Conveniently leaving the chosen to graze as “tundra trash”?
And not that I agree with your phobic snowball earth scarecrow to begin with, but it would be nice to at least be able to ward off the next glaciation, right?
Tim Folkerts says:
October 4, 2011 at 7:34 pm
I took the Trenberth diagram, which didn’t balance up and down radiation. I realized it would take two layers to do it. So I wrote a simple radiation/latent heat/sensible heat climate model, and played with it until it reproduced all of the major flows of the Trenberth diagram.
It’s actually an interesting exercise. There’s not a whole lot of wiggle room in the figures if you want to reproduce the Trenberth energy budget with good fidelity. For instance, if there is much loss (in the form of the transport of sensible and latent heat) between the two layers, the efficiency drops like a stone and the surface is nowhere near warm enough. That is why I proposed that one layer is the average tropopause layer and the upper layer is at/just beyond the tropopause.
w.
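For anyone who wants to try something similar, here is a bare-bones two-layer sketch in Python. To be clear, this is not the model described above; it assumes two fully opaque longwave layers plus one adjustable non-radiative loss term, just to show how such losses pull the surface temperature down from the pure-radiation value:

```python
# Toy two-layer "gray" atmosphere (illustrative only, not the model discussed above).
# Both layers are assumed fully opaque to longwave; F is a non-radiative
# (latent + sensible) flux from the surface into the lower layer.
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W/m2/K^4
S = 240.0            # W/m2, absorbed solar radiation (assumed round number)
F = 100.0            # W/m2, assumed non-radiative surface loss

# The energy balances give: sigma*T2^4 = S, sigma*T1^4 = 2S, sigma*Ts^4 = 3S - F
T2 = (S / SIGMA) ** 0.25                 # upper layer
T1 = (2 * S / SIGMA) ** 0.25             # lower layer
Ts = ((3 * S - F) / SIGMA) ** 0.25       # surface

print(f"Upper layer {T2:.0f} K, lower layer {T1:.0f} K, surface {Ts:.0f} K")
# With F = 0 the surface comes out near 336 K; every watt of non-radiative
# loss cools it, which is why there is little wiggle room when fitting real flows.
```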
ferd berple says:
October 4, 2011 at 9:23 pm
You could … but remember, to get work you need a sink that is cooler than the heat source. It is the difference between this sink and the source that is exploited to create the work.
Downwelling DLR, however, is at about 320 W/m2, which equates to a blackbody temperature of about freezing (0°C). So to get work out of it, you’ll need to have a still lower temperature sink to which to reject the heat … what do you plan to use for that?
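As a quick check on that blackbody equivalence, a short sketch using the standard Stefan-Boltzmann relation (the 320 W/m2 is the figure quoted above):

```python
# Equivalent blackbody temperature of 320 W/m2 of downwelling longwave (sketch):
SIGMA = 5.67e-8                   # Stefan-Boltzmann constant, W/m2/K^4
dlr = 320.0                       # W/m2, downwelling longwave
t = (dlr / SIGMA) ** 0.25
print(f"{t:.0f} K, i.e. {t - 273.15:.1f} C")   # ~274 K, about freezing
```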
You are confusing heat with energy, I think. Heat only flows in one direction, from warm to cold.
Radiant energy, on the other hand, doesn’t care about the temperature of whatever it strikes. It is absorbed by warmer and cooler objects alike.
w.
Willis: it is a mathematically derivable result from the Constructal Law. You should know by now that I don’t do “empty” very much.
“At the edge of turbulence” is like “almost equilibrium”.
It’s easier to believe AGW than it is to believe the “Constructal Law”. Unless the result of the derivation has been severely tested, which it has not been, it is likely that the approximations made in the assumptions are not accurate enough to substantiate the accuracy of the result.
What I love about this piece: here’s a guy who can put things in perspective. Perspective’s what it’s all about, and for all our education, our losing touch with the natural world is making us easy prey to Alarmists. Alarmism thrives on getting things out of perspective. Observers who can bring back perspective (or should I say significance) are a most valuable asset.
Willis said:
If there were no atmosphere and the earth had its current albedo (about 30%), the surface temperature would be about 33°C cooler than it currently is (see here for the calculations).
Problem with that is that most of that 30% is due to clouds, which would necessarily be absent with no atmosphere. The correct calculation would be with an atmosphere but no greenhouse effect, and due to the different surface temperature, different cloud cover, and therefore overall albedo.
A water-cooled engine is cooled with a flow of 600 liters/minute. The inlet temperature is 318 K and the outlet is 320.5 K. Now, this is a delta T of only 2.5°C, and since this is only 2.5/318 K × 100% = 0.8% of the inlet temperature, it is a third-order forcing. So the engine is not really cooled at all, and we can turn off the cooling system and nothing will happen… Well, a closer look shows that the heat removed from the engine is 105 kW, and if the cooling system is stopped the engine will fail within a very short time.
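The arithmetic behind that 105 kW, as a short sketch (standard properties of water assumed):

```python
# Heat removed by the cooling water in the engine analogy above (sketch):
flow_l_per_min = 600.0            # cooling-water flow
delta_t = 320.5 - 318.0           # K, outlet minus inlet temperature
cp_water = 4186.0                 # J/(kg*K), specific heat of water
rho_water = 1.0                   # kg/L, approximate density of water

mass_flow = flow_l_per_min * rho_water / 60.0     # kg/s
heat_kw = mass_flow * cp_water * delta_t / 1000.0
print(f"Heat removed: {heat_kw:.0f} kW")          # ~105 kW
```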
Willis, Trenberth has the IR “window” wrong (40 instead of the actual 66 W/m2) and knows it; see slide 26 of the following: http://climategate.nl/wp-content/uploads/2010/09/KNMI_voordracht_VanAndel.pdf. One has to ask: is he committing scientific fraud by not correcting or withdrawing his papers? As I said on another blog, I believe that he has no understanding of heat and mass transfer (which is a chemical engineering discipline based on empirical evidence, measurements and engineering research). Trenberth can find no reasonable explanation, so he continues his fantasy of being a leader of the “team”. His cartoon of global heat balance is just that, a cartoon serious people should laugh at. Your B_ll__it nose has failed you.
Everyone gone. Never mind – you are all so clever I dare not say anything in prime time.
gbaike says that the dinosaurs were probably killed by a space rock. When I looked at the very clear chart provided by Smokey (at 6.01pm) on the level of CO2 over the past 500 million years, I noticed that the dinosaurs flourished when CO2 was between 1,200 and 2,000 parts per million.
Perhaps they died because CO2 fell below the high levels required to grow the huge plants that dinosaurs lived on? Perhaps they starved to death, as we would if CO2 levels were to drop. Thank heavens they have risen. Perhaps my grandchildren will live to hear the cry ‘Burn fossil fuels! Burn fossil fuels to increase CO2 levels!!’ They will think that our generation was nuts. Apart from WUWT and friends, of course.
Willis Eschenbach says:
October 4, 2011 at 11:51 pm
Downwelling DLR, however, is at about 320 W/m2, which equates to a blackbody temperature of about freezing (0°C). So to get work out of it, you’ll need to have a still lower temperature sink to which to reject the heat … what do you plan to use for that?
DLR shows an emission spectrum for all the GHGs – it’s very “spiky” and thus very far from a smooth Planck curve. Any such temperature calculation is thus invalid. An examination of the Planck “envelope” (upper bound of the spiky spectrum) indicates a much higher temperature than 0°C.
A similar error is frequently made when considering outgoing radiation to space. An “effective” blackbody temperature can be calculated, and has some use, but has no real physical meaning – OLR comprises GHG emission bands, and surface radiation transmitted through the atmospheric “windows”, again not a smooth Planck curve, and with radiation from two main sources at different temperatures.
BTW, DLR is considered to come from a height of around 5 km. OLR is considered to come from TOA, which is logical, as 90-odd percent of the atmosphere is below TOA.
If incoming solar is 500 W/m2 and CO2 absorbs, say, 100 W/m2, then the ongoing solar is 400 W/m2. So-called greenhouse gases cannot add to the incoming, since the second law must apply.
It is assumed that the LWIR is that re-radiated by GHGs. It is more possibly incoming SWIR that has lost energy and changed frequency due to this energy loss.
richard telford says:
October 4, 2011 at 2:43 pm
Too often folk here get over excited by possibly third-order effects – there was one last week on atmospheric CO2 concentrations not being perfectly mixed. But there is a difference between a third-order effect that varies rapidly and one that steadily increases…..
_________________________________________________________________________
What I get upset about is cherry picking the data. The fallacy of CO2 being mixed is the excuse used to throw out truckloads of data that does not fit the predetermined curve and an excuse for using only a small portion of the actual data. The general public and most scientists are not even aware of the “other data” and if it is mentioned by a scientist, he is mobbed and buried like a rugby player who has the ball.
The actual wet chemistry data shows that CO2 is NOT well mixed and varies from 250 to over 550 ppm. Studies of CO2 in ice cores carried out BEFORE CAGW (that is, before 1985) show values up to 2450 ppm, and generally above the data now being taken at Mauna Loa. Neftel et al. (1982), in 150 year old ice, shows a DECREASING trend and a range of 300 ppm to 2350 ppm.
The well mixed crap is an artifact of selective evidence and nothing more!
“Callendar (1940, 1958) selected atmospheric CO2 data from the 19th and 20th centuries. Fonselius et al. (1956) showed that the raw data ranged randomly between about 250 and 550 ppmv (parts per million by volume) during this time period, but by selecting the data carefully Callendar was able to present a steadily rising trend from about 290 ppmv for the period 1866 – 1900, to 325 ppmv in 1956.
Callendar was strongly criticized by Slocum (1955), who pointed out a strong bias in Callendar’s data selection method. Slocum pointed out that it was statistically impossible to find a trend in the raw data set, and that the total data set showed a constant average of about 335 ppmv over this period from the 19th to the 20th century. Bray (1959) also criticized the selection method of Callendar, who rejected values 10% or more different from the “general average”, and even more so when Callendar’s “general average” was neither defined nor given.
……North-European stations measured atmospheric CO2 over a 5 year period from 1955 to 1959. Measuring with a wet-chemical technique the atmospheric CO2 level was found to vary between approximately 270 and 380 ppmv, with annual means of 315 – 331 ppmv, and there was no tendency of rising or falling atmospheric CO2 level at any of the 19 stations during this 5 year period (Bischof, 1960). The data are particularly important because they are unselected and therefore free of potential biases from selection procedures, unlike the CO2 measurements based on the procedures at Mauna Loa…… At the Mauna Loa Observatory the measurements were taken with a new infra-red (IR) absorbing instrumental method, never validated versus the accurate wet chemical techniques. Critique has also been directed to the analytical methodology and sampling error problems (Jaworowski et al., 1992 a; and Segalstad, 1996, for further references), and the fact that the results of the measurements were “edited” (Bacastow et al., 1985); large portions of raw data were rejected, leaving just a small fraction of the raw data subjected to averaging techniques (Pales & Keeling, 1965)
…..the CO2 content of air inclusions in cores from ice sheets should reveal paleoatmospheric CO2 levels. Jaworowski et al. (1992 b) compiled all such CO2 data available, finding that CO2 levels ranged from 140 to 7,400 ppmv. However, such paleoatmospheric CO2 levels published after 1985 were never reported to be higher than 330 ppmv. Analyses reported in 1982 (Neftel at al., 1982) from the more than 2,000 m deep Byrd ice core (Antarctica), showing unsystematic values from about 190 to 420 ppmv, were falsely “filtered” when the alleged same data showed a rising trend from about 190 ppmv at 35,000 years ago to about 290 ppmv (Callendar’s pre-industrial baseline) at 4,000 years ago when re-reported in 1988 (Neftel et al., 1988); shown by Jaworowski et al. (1992 b) in their Fig. 5…… http://www.co2web.info/stoten92.pdf
Technical analysis of CO2 data: http://www.co2web.info/np-m-119.pdf
“To capture the public imagination … we have to offer up some scary scenarios, make simplified dramatic statements and little mention of any doubts one might have” – Stephen Schneider, a man who calls himself a climatologist
With statements like that from the gate keepers of the data, it becomes obvious that CAGW is all about politics and propaganda and has nothing to do with science.
John Eggert says:
October 4, 2011 at 4:00 pm
“In performing heat balances, the effects of each separate mechanism are cumulative. That is, the net heat transfer is the sum of convective, conductive and radiative. The climate model energy balances account for this.”
I was referring to Willis’ pie chart, not to some climate model. And it is titled “Downwelling radiation…”
Also, show me a climate model that correctly simulates convective fronts. I guess you will have to get one from the future, as none exists today.
First of all, Willis, brilliant piece. It’s the simplicity that will shake the known suspects to the core. Joel Shore said: “(2) Climate scientists, even skeptical ones like Roy Spencer or Richard Lindzen are not able to do basic math, since they haven’t made this obvious point that, if this were really all there is to it, would fatally undermine AGW.”
Preoccupation? The fact is there is a hidden number that nobody talks about, therefore nobody considers using it in a similar way to what Willis did with his figures. Perhaps from a desire to make things more complex than necessary? Perhaps to keep the funding flowing and the alarmism high? Nah. I’d say the concentration of focus on CO2 and CO2 alone has caused eyes to look in one direction only and to miss what is standing ‘over there’. We don’t even have to look at those silly watts per square meter. That’s just to make us regular folks’ eyes glaze over and make work for thousands of others. (I’m joking. The watts stuff is fun.)
Perhaps nobody has looked up what the actual PPM of water vapor is. It took me a while since my google fu is next to non-existent but I did track down the number a while back. Then off and on I hunted for a number that Willis actually provided in his brilliant little piece here.
So if we take those two magic numbers we can use the back of a really tiny little envelope and discover that doubling CO2 may only add, minimum, 0.68C to our global temperature. PPM of water vapor is 14,400. Let’s start with a PPM of 300 for CO2. Just cause our envelope is so tiny and doubling it to 600 is easier on the pencil strokes.
14,400 PPM water vapor + 300 PPM of CO2 = 14,700 PPM of greenhouse goodies that delivers 33C of mercury on that non-mercury thermometer.
14,700/33 = 445 PPM per degree C
In envelope-speak that equals 450.
So adding another 300 PPM to double our CO2 will add 2/3 of a degree or 0.68C.
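The same envelope arithmetic in a short sketch (a crude linear scaling using the assumed round numbers above, not a radiative calculation):

```python
# Back-of-envelope scaling of greenhouse warming by ppm (illustrative only):
h2o_ppm = 14_400.0        # assumed ppm of water vapor
co2_ppm = 300.0           # assumed starting ppm of CO2
ghg_warming = 33.0        # degrees C attributed to all greenhouse gases

ppm_per_degree = (h2o_ppm + co2_ppm) / ghg_warming        # ~445 ppm per degree
added = co2_ppm / ppm_per_degree                          # warming from +300 ppm CO2
share = co2_ppm / (h2o_ppm + co2_ppm) * 100               # CO2 share of the total ppm

print(f"{ppm_per_degree:.0f} ppm per degree C")           # ~445
print(f"Doubling CO2 adds ~{added:.2f} C")                # ~0.67 C
print(f"CO2 is ~{share:.1f}% of the counted GHG ppm")     # ~2%
```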
Yeah, yeah, I know. Like what was the exact PPM of water vapor when CO2 was exactly 300 PPM, etc. And some really trace gasses missing here. Water vapor is apples and CO2 is oranges, but they’re both fruits and what CO2 giveth H2O taketh away, so they’re not as far apart as you may think in the gestalt of things. That number should go up a bit but I don’t think very far.
First note that the amount of CO2 was only 2.06% of the total greenhouse gases. Definitely in Willis Third Order category. And if you want to figure out how ‘potent’ CO2 is as a greenhouse gas compared to H2O be my guest but keep that 2.06% in mind.
Note this too: Water vapor is mixed, but not well-mixed, and certainly not even close to CO2’s territory as far as mixing is concerned. Yeah, we all know temps haven’t risen nearer the equator nearly as much (relatively speaking) as in the higher latitudes. But they haven’t been rising as much as expected there either. Now if only these brilliant AGW folks could find a way to channel all the CO2 into those gaps we see in satellite water vapor images, perhaps temps might rise a bit closer to what the models predict.
Well, that’s my 2 cents. Thanks for the inspiration, Willis.
Now, everyone scoff. I don’t mind. Find an envelope, brush up on your google fu, and have some fun.
[ps no preview. gulp. I’m heading to sleep]
Anything is possible says:
Every time I see this line of reasoning, it troubles me because it misses out a step. So here’s a serious question for you Willis….
What would the Earth’s surface temperature be if you removed the greenhouse gases, but retained all the nitrogen, oxygen and argon which comprises 99+% of its thickness? Surely that would have to be your GHG “starting point”…..
____________________________________________________________________________
The other part of that is the 70% of the surface, the ocean, a huge heat sink that stores the sun’s energy. Clouds are important but it is water, in all three phases, solid liquid and vapor, that modifies the climate with the help of the atmospheric blanket of gas.
Everyone gets so caught up in the so called “Green house gases” that they forget the oceans and ice. They also forget that nitrogen, oxygen and helium absorb in the visible range so they are not exactly “unaffected” by sunlight (see http://hyperphysics.phy-astr.gsu.edu/hbase/quantum/atspect.html and http://astro.u-strasbg.fr/~koppen/discharge/oxygen.html)
Even in the desert, where water vapor is not present in large amounts, the earth does not cool to the temperature of the moon. The atmosphere is what keeps the earth from re-radiating all of the heat captured during the day. This allows the earth itself to act as a heat sink.
>>
KevinK says:
October 4, 2011 at 7:11 pm
This is easily confused with the “orders” of a polynomial equation where the “first order” is an f(x), the “second order” is an f(x*x), “third order” is an f(x*x*x) etc.
<<
I guess I’m confused. When I learned math terms many years ago, the highest power in a polynomial was its degree. Order was a term reserved for differential equations. You could have a third-order, second-degree differential equation which would probably be a doozie to solve. A third degree polynomial or cubic wouldn’t be nearly as difficult.
Jim
charles nelson says:
October 4, 2011 at 6:10 pm
Could one refer to the likes of Mann, Briffa, Hansen etc as scientists of the ‘third order’ ?
___________________________________________________________________
I LOVE it.
However I think a better title is propagandists, since I doubt they have even an inkling of what true science is.
Jeff says:
October 4, 2011 at 6:17 pm
I appreciate posts like these. When I was young math came easy, now not so much 🙂 I guess I don’t really look at things normally. There is a deafening scream that there is too much CO2, but just recently one scientist stated that we are in a CO2 famine. I have seen more charts, graphs, and models of CO2 than I really wanted to. But I have yet to see what is considered normal…..
___________________________________________________________________________
Jeff, check out the pdfs at http://www.co2web.info/; it gives the other side of the CO2 story. A real eye opener.
I also agree we are in a CO2 famine. One of those pdfs mentions 250 ppm as the lower limit for some types of plants; I have also seen 220 ppm and 200 ppm mentioned. This is why the “cherry picked” ice core data with values at or below 180 ppm of CO2 “smell real funny”.
From farmers/biologists about CO2 in greenhouses:
“…….Below 200 PPM, plants do not have enough CO2 to carry on the photosynthesis process and essentially stop growing. Because 300 PPM is the atmospheric CO2 content, this amount is chosen as the 100% growth point. You can see from the chart that increased CO2 can double or more the growth rate on most normal plants. Above 2,000 PPM, CO2 starts to become toxic to plants and above 4,000 PPM it becomes toxic to people…..” http://www.hydrofarm.com/articles/co2_enrichment.php
One scientist saw “… the depletion occurs rapidly within a few hours after daylight. I was surprised when I observed a rapid 50 ppm drop in CO2 content within a tomato plant canopy just a few minutes after direct sunlight at dawn entered a greenhouse (Harper et al., 1979)” – Tomato Plant Culture: In the Field, Greenhouse, and Home Garden by J. Benton Jones
I agree, this makes perfect sense; however, sense does not appear to be the rationale behind what is happening to us poor humans, who are being strong-armed into paying for totally inefficient modes of energy production.
Such mathematics are INSANE (sarc): you cannot TAX them.
Keep up the good work. I find that figures are very hard to argue against; look at the state of the economies.
Perhaps they see the writing on the wall. Anyone heard of the FAT tax? “Gawd save us from fools”.
R. Gates says:
October 4, 2011 at 9:00 pm
Another very interesting post Willis. And yet, of course, this little “3rd order” variable, this tiny trace gas, a mere tiny fraction of our Earth’s atmosphere, is absolutely critical in preventing the Earth from becoming a snow ball planet…..
_______________________________________________________________
JPeden says:
October 4, 2011 at 11:13 pm
Right, Gates, so let’s do as much as we can to keep CO2 concentrations as low as they’ve ever been over the last 600 million years or so…..
_______________________________________________________________
How come no one ever mentions adding or subtracting a mere tiny fraction of water from our Earth’s atmosphere to adjust for the amount of CO2 that is released? After all, water is a much more powerful GHG. Or better yet, how about draining all those methane-producing swamps???
As another commenter stated take a look at this graph: http://www.theresilientearth.com/files/images/Greenhouse_Gas_Absorption-dlh-500.png Additional CO2 is really a non starter as “the monster under the bed” compared to the highly variable amount of water vapor in our Earth’s atmosphere.
Willis,
I am a skeptic on the bad effects of AGW, and agree the effect of CO2 on temperature will be small. However, your analysis here is misleading for several reasons. A quick analogy is that a small amount of many things can have a large effect (e.g., poisons, toxic gases, etc.), so just showing something is small is not enough. Also note that a rise of 3 C is only 1% of the absolute temperature, and an increase of 3 C could have a significant effect (but I do not expect an increase of 3 C). Also, the back radiation is not the direct cause of the temperature rise. If there were a much better optical absorber at sea level (something that filled the so-called direct window to space for CO2 and water vapor), back radiation would nearly equal upward radiation, but the surface energy would still be lifted up to the TOA by evapotranspiration, convection and conduction. The only factors that set the surface temperature are the effective average location of the outgoing radiation and the lapse rate, and the lapse rate is only a function of the Cp of the air (as modified by water condensation effects). Thus it is the raising of the altitude of outgoing radiation, not back radiation, that is the cause of the “greenhouse gas effect”. Back radiation is the result, not the cause.
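That “emission height plus lapse rate” picture can be sketched numerically; the figures below (255 K effective temperature, 6.5 K/km lapse rate, 5 km emission altitude, 0.15 km of lift per doubling) are assumed, illustrative round numbers, not values from the comment:

```python
# Sketch of the emission-height / lapse-rate view of the greenhouse effect.
# All numbers are assumed round values for illustration.
t_effective = 255.0     # K, effective radiating temperature (~240 W/m2 of OLR)
lapse_rate = 6.5        # K/km, typical environmental lapse rate
emission_km = 5.0       # km, rough effective emission altitude

t_surface = t_effective + lapse_rate * emission_km
print(f"Implied surface temperature: {t_surface:.0f} K")     # ~288 K

# If a CO2 doubling lifted the emission level by an assumed 0.15 km,
# the no-feedback surface warming would be roughly:
print(f"Warming per doubling: {lapse_rate * 0.15:.1f} K")    # ~1 K
```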
@Willis
“Climate sensitivity is an inverse function of temperature”
This point needs to be emphasised. Apologies if you have used this analogy before, but I would liken it to a coiled spring with the bottom coil bolted to the floor and the top coil attached to the temperature line.
As the temperature goes up, the spring expands and increases resistance, to a point where temps have stretched it to its limit. At this point, a very slight “coldening effect” (Milankovitch cycles?) is enough to release the tension and the temps go down quickly, similar to what we see when an interglacial comes to an end. It carries on like that for a while, to the point where the climate sensitivity spring is compressed, and temps shoot up at the slightest perturbation, like when we come to the start of an interglacial.
At the points in between, both remain more or less in balance, but hard up against one another. So a warmening influence such as El Nino or the ocean cycles pushes things in favour of the temps a bit, but the climate sensitivity resists more. Eventually, the cycle turns and things come more or less back into balance at the extreme.
Tim Folkerts says:
October 4, 2011 at 7:56 pm
If you think “it must be possible” to warm something with 300 W/m^2 of IR from a ~300 K object the same way you can warm something with 300 W/m^2 of sunlight from a ~5800 K sun, then you think the 2nd law does not hold.
Why not equate 300 W/m^2 of IR with 300 W/m^2 from the sun, when the K&T energy balance chart does just that? Only they say 237 W/m^2: incoming sun, outgoing IR.
Even Ira said some time ago, “A watt is a watt.” If this were true, the foot of a 50,000 watt transmitting antenna for a radio station would be the hottest place on earth. But it is not.