# Negative Climate Feedbacks are Real and Large

Guest essay by Leland Park

Global warming theories propose positive feedbacks to explain magnified greenhouse effects that might trigger catastrophic warming. Naturally, any clues in temperature observations that might indicate feedback would be of great interest to climate science.

It turns out that climate feedback is very real, large and negative.

Climate Cause and Effect

The concept of feedbacks presupposes a dynamic system in which cause and effect are linked through a consistent timing relationship. Thus, evidence of climate operating as a system, if it exists, should be found in comparing a cause of climate behavior with its effects. It is well known that solar energy is a dominant factor in the climate. Thus, the timing relationship between solar levels and temperatures might provide insight into climate cause and effect timing.

Figure 1 is a seasonal timeline of US average daily highs (Tmax) for US HCN stations at 36 degrees North Latitude. On that graph is an overlay of the daylight hours for the same latitude, where the amount of daylight serves as a rough proxy for the pattern of solar level changes. (The two vertical scales are not adjusted for proportionality.)

Figure 1. Patterns of Daylight and Tmax for US HCN Stations at 36 Degrees North Latitude.
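The daylight-hours curve overlaid in Figure 1 can be reproduced without any station data. The following is a minimal, illustrative sketch (not the author's code), using the standard sunrise equation with Cooper's approximation for solar declination; the day-of-year values chosen below are approximate solstice and equinox dates:

```python
import math

def daylight_hours(day_of_year, latitude_deg):
    """Approximate daylight duration (hours) via the standard sunrise equation."""
    # Solar declination, Cooper's approximation (radians)
    decl = math.radians(23.44) * math.sin(2 * math.pi * (284 + day_of_year) / 365.0)
    lat = math.radians(latitude_deg)
    # Hour angle at sunrise/sunset; clamped for polar day or night
    cos_h = max(-1.0, min(1.0, -math.tan(lat) * math.tan(decl)))
    half_day_deg = math.degrees(math.acos(cos_h))
    return 2.0 * half_day_deg / 15.0  # the sun moves 15 degrees of hour angle per hour

# Daylight at 36 N near the solstices and an equinox
for day, label in [(172, "June solstice"), (355, "December solstice"), (80, "March equinox")]:
    print(f"{label}: {daylight_hours(day, 36.0):.1f} h")
```

At 36 N this yields roughly 14.5 hours at the June solstice and 9.5 hours at the December solstice, tracing the sinusoidal envelope shown in Figure 1.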

Several interesting observations can be made from this figure:

- the patterns of daylight hours and temperatures are both sinusoidal,
- temperatures lag solar levels by about a month throughout the year,
- winter-to-summer temperature changes in excess of 40 deg F are entirely normal,
- this seasonal pattern is a consistent, recurring feature of climate behavior.

Though climate is often thought of as chaotic, the historical, repetitive pattern of cause (solar level) and effect (temperature change) shows this pattern is not accidental. In fact, the cause-and-effect link between the two functions is a systematic behavior known as stimulus-response in control systems. Since this seasonal pattern repeats every year, the large lag is itself a characteristic of climate behavior. In systems terms, this lag, from cause to effect, constitutes negative feedback.
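The stimulus-response reading offered above can be illustrated with the simplest possible dynamic model: a first-order lag dT/dt = (S − T)/τ driven by a sinusoid. This is a toy sketch under assumed values, not a climate model; the time constant of 1.1 months is chosen only so that the phase lag comes out near the one month seen in the data:

```python
import math

OMEGA = 2 * math.pi / 12.0  # annual forcing frequency, with time measured in months

def phase_lag_months(tau):
    """Analytic phase lag of dT/dt = (S - T)/tau under S = sin(OMEGA*t)."""
    return math.atan(OMEGA * tau) / OMEGA

def measured_lag_months(tau, dt=0.001):
    """Integrate the same system and locate the response peak in the final cycle."""
    t, T = 0.0, 0.0
    peak_t, peak_T = 0.0, float("-inf")
    while t < 120.0:  # ten forcing cycles; transients decay long before the end
        T += dt * (math.sin(OMEGA * t) - T) / tau
        t += dt
        if t > 108.0 and T > peak_T:
            peak_t, peak_T = t, T
    # the forcing itself peaks at t = 111 months within the final cycle
    return peak_t - 111.0

print(f"analytic lag: {phase_lag_months(1.1):.2f} months")
print(f"simulated lag: {measured_lag_months(1.1):.2f} months")
```

With τ ≈ 1.1 months, both the analytic formula atan(ωτ)/ω and the simulation give a lag of about one month, and the response remains a clean sinusoid, which is the behavior described for Figure 1.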

Stimulus-Response Precision

As a systematic behavior, we are interested in the consistency of the stimulus-response relationship. Since most aspects of climate behavior exhibit variation over time, it is of interest to examine the statistical behavior associated with the seasonal patterns. Unfortunately, the ideal solar function used in Figure 1 cannot be matched against short-term temperature variations. However, the statistical pattern of temperatures through the seasonal changes may provide insight into climate operation under change.

Figure 2 is a composite of the histograms for four selected months representing the solstice and equinox periods. Because of the climate lag, these periods fall in the months of January, April, July and October. Variations in Tmax were calculated for individual stations by reference to station historical averages before inclusion in the histogram counts.

The statistical patterns for each month are nominally symmetrical about those historical averages, so they correlate with the Tmax function in Figure 1. The results of these calculations are presented in Figure 2, where the mean of the distributions (and the monthly Tmax average) is represented by the 0 axis. While we do not have measured solar levels for comparison, it is clear that the climate is following the solar pattern with remarkable precision.

Figure 2 Tmax Variation Statistics for Selected Months
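The per-station anomaly and histogram procedure described for Figure 2 can be sketched in a few lines. The station readings, the 50 deg F historical mean, and the 2 deg F bin width below are all hypothetical values used purely for illustration:

```python
from collections import Counter

def monthly_anomalies(station_tmax, station_mean):
    """Deviations of each daily Tmax from the station's historical monthly mean."""
    return [t - station_mean for t in station_tmax]

def histogram(anomalies, bin_width=2.0):
    """Count anomalies into bins centered on zero (the historical mean)."""
    counts = Counter()
    for a in anomalies:
        counts[round(a / bin_width) * bin_width] += 1
    return dict(sorted(counts.items()))

# Hypothetical January Tmax readings (deg F) for one station with a 50.0 F mean
readings = [48.0, 51.5, 50.0, 46.5, 53.0, 49.5, 50.5]
print(histogram(monthly_anomalies(readings, 50.0)))
```

Pooling such anomaly counts across stations and years, month by month, would produce the zero-centered distributions the essay describes for Figure 2.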

There are many factors that can produce variations in the seasonal Tmax. Among these are variations (by time and location) in albedo, cloud cover, water vapor and ocean and atmospheric circulations. Despite the many reasons for variation, the climate follows the seasonal pattern closely even though the solar level is undergoing continuous change. In fact, there is at least a 90 deg F round trip from winter to summer that is entirely normal at this latitude.

Figure 2 is, thus, an excellent illustration of the dynamic stability in the climate system. A passive system could not deliver the demonstrated seasonal tracking precision in following the solar stimulus. In short, the climate is behaving as if it is an active control system. More precisely, Figures 1 and 2 illustrate behavior that is consistent with that of a linear control system subjected to a sinusoidal stimulus.

Naturally, the temperature data alone is inadequate for explaining all of the effects. It should be sufficient, however, to establish that the dynamics are not “out of control”.

The Lag is Produced by Retarding Heat Changes

During the January to July phase, the effect of the negative feedback is to retard temperature increases despite the increasing solar levels (Figure 1). Conversely, from July on, the negative feedback retards the loss in temperatures despite the waning solar levels. So the effect of the negative feedback is to retard, or delay, the effects of the solar level changes, whether increasing or decreasing.

So the cause of the temperature lag is something in the climate that can retard both heat gains and losses across the entire continent. Although atmospheric lag contributes, the daily lag of about 4 hours is far too small to account for the seasonal lag.

It is possible to surmise the major characteristics of the cause of the lag. These are: 1) high relative heat capacity, 2) very large total heat capacity and 3) global impact. Those characteristics can only be met by the water in the oceans. In effect, the oceans contain such enormous amounts of water (high relative heat capacity) that they represent a vast thermal reservoir for heat absorption and subsequent release as the seasonal solar changes require.
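The reasoning above, that a large thermal reservoir retards temperature changes, is exactly the behavior of a first-order energy balance C·dT/dt = S − λT, whose phase lag under sinusoidal forcing is atan(ωC/λ)/ω. The heat capacities and unit restoring coefficient below are made-up numbers, chosen only to show that the lag grows with heat capacity:

```python
import math

OMEGA = 2 * math.pi / 365.0  # annual cycle, with time measured in days

def seasonal_lag_days(heat_capacity, restoring):
    """Phase lag of C*dT/dt = S - restoring*T under sinusoidal forcing."""
    tau = heat_capacity / restoring  # system time constant, days
    return math.atan(OMEGA * tau) / OMEGA

# Illustrative, hypothetical capacities: shallow land surface vs deep ocean mixed layer
land_lag = seasonal_lag_days(heat_capacity=5.0, restoring=1.0)
ocean_lag = seasonal_lag_days(heat_capacity=100.0, restoring=1.0)
print(f"land-like lag: {land_lag:.0f} days, ocean-like lag: {ocean_lag:.0f} days")
```

Note that the lag saturates: however large the reservoir, a linear first-order system can lag a sinusoid by at most a quarter cycle, about 91 days for the annual cycle.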

In addition to its high heat capacity, water has special thermal properties that provide a critical link between the oceans and global climate. For example, as the sun heats the ocean, a portion of the surface water evaporates, carrying a large amount of heat energy into the atmosphere to be globally circulated. The relationship between water and the climate is far too complex for elaboration here, but it is clearly critical to the global climate.

Implications for Climate When Feedback is Negative

A system with a large negative feedback is inherently stable in its operation. In this case, climate behavior is operating as if it is a linear control system where the stimulus is sinusoidal. That is, the climate will dutifully follow the solar stimulus but with a persistent delay (lag). Not only is it responding to the solar stimulus, it is responding with great precision, despite the approximately 90 deg F seasonal round trip in ambient temperatures over the year.

In fact, a system with this much lag would be quite stable with respect to minor perturbations. Inducing a permanent change in such a system's equilibrium conditions would not cause catastrophic instability. Instead, the system would slowly seek a new equilibrium operating condition. A major change in equilibrium conditions, however, would require a correspondingly large change in the system fundamentals, as well as considerable time for the change to take effect. Suffice it to say that minor changes in atmospheric trace gases would not be likely to force an equilibrium change in the system.

Implications for Climate Science

US HCN data was mined for this analysis and the illustrative graphs. However, awareness of the climate lag and its approximate size could have been gleaned from an ordinary calendar. The seasonal pattern of solar level changes is well known, and the solstice and equinox points are often marked on calendars. Furthermore, calendars and almanacs have long noted that the warmest and coldest temperatures lag the solstice points.

Climate science should have begun to understand that the climate is stable when they found it necessary to continuously adjust temperature measurements to maintain the fiction that the earth is warming. Whatever the excuse for “adjusting” field measurements, making continuing adjustments demonstrates that climate science has knowingly perpetrated a fraud.

The negative feedback between solar levels and temperatures has always existed – but it has never been officially noticed. I, for one, will be interested to learn how quickly climate science can adapt CO2 theory to explain away its implications.

## 203 thoughts on “Negative Climate Feedbacks are Real and Large”

1. Steven Mosher says:

“Climate science should have begun to understand that the climate is stable when they found it necessary to continuously adjust temperature measurements to maintain the fiction that the earth is warming. If ever there was an excuse for “adjusting” field measurements, making continuing adjustments demonstrates that climate science has knowingly perpetrated a fraud.”
https://bobtisdale.files.wordpress.com/2016/05/figure-1.png
Some fraud, we adjusted the temperatures cooler.

• The slope from ≈1900 to ≈1945 looks just like the recent warming event.
Since CO2 has been much higher recently, there must be another explanation for recent global warming.

• David A says:

When an alarmist proclaims how small the adjustments are, but provides no dates for which adjustments they are referring to, when they were made, or the reasons, you can be certain they are only talking about a small number of changes to data sets already manipulated to the point of FUBAR. Some history from Steven Goddard… (Links to the data available here. http://realclimatescience.com/2016/05/alterations-to-surface-temperatures-since-1974/ )
The graph below shows how NASA has been steadily erasing the 1940’s blip, and subsequent 1940 to 1970 global cooling.
Just since 2001, NASA has increased 1880 to 2000 warming by 0.5 degrees by altering the data.
The next graph superimposes NCAR 1974 at the same scale, and shows how Hansen was already erasing the 1940-1970 cooling in his 1981 version.
Note above how all of the NASA graphs show renewed warming starting around 1967, yet that warming does not appear in the NCAR graph. Similarly, in 1978 NOAA did not show any surface or troposphere warming through 1977.
In 1978, it was reported by an international team of specialists that there is “no end in sight to the cooling trend of the last 30 years.” They also reported that southern hemisphere data was too meager to be reliable, and that the Arctic ice cap was growing. They blamed the expanded polar vortex on global cooling.
TimesMachine: January 5, 1978 – NYTimes.com
In Climategate E-mails, the team made clear their desire to manipulate the temperature record and remove the post-1940 cooling.
From: Tom Wigley
To: Phil Jones
Subject: 1940s
Date: Sun, 27 Sep 2009 23:25:38 -0600
Cc: Ben Santer
So, if we could reduce the ocean blip by, say, 0.15 degC,
then this would be significant for the global mean — but
we’d still have to explain the land blip.
It would be good to remove at least part of the 1940s blip, but we are still left with “why the blip”.
di2.nu/foia/1254108338.txt
In another Climategate E-mail, Phil Jones said that much of the southern hemisphere data was “mostly made up.”
date: Wed Apr 15 14:29:03 2009
from: Phil Jones subject: Re: Fwd: Re: contribution to RealClimate.org
to: Thomas Crowley
Tom,
The issue Ray alludes to is that in addition to the issue
of many more drifters providing measurements over the last
5-10 years, the measurements are coming in from places where
we didn’t have much ship data in the past. For much of the SH between 40 and 60S the normals are mostly made up as there is very little ship data there.
Cheers
Phil
di2.nu/foia/foia2011/mail/2729.txt
Yes, they have steadily cooled the past and increased the trend.
The BEST graphics produce an absolute temperature that is about 1.5 degrees different from NASA-GISS.
The models produce an absolute T range of 3 degrees C for any time period.

• Mark says:

Mosher you are funny, poor estimations are not a reason for policy. Evidence is a reason for policy.
You obviously do not understand risk management. If I tried to get my company to spend 2 billion based on the kind of evidence that supports AGW, I’d be marched from the building.
You have no idea what you are talking about

• Greg says:

Very funny Mosh’ . Link to an article by Nuttercelli flying under the banner “Climate Consensus – the 97%”
which repeatedly propagates the lie that the 97% refers to humans causing the majority of “climate change”.
I’d certainly be interested in reading about that court case but won’t even bother reading the stupid lies and spin that Nuttercelli manages to print in the scientifically illiterate Guardian coverage of climate.
Your point about adjustments making the rise cooler is also misleading since, as you point out, the WWI adjustment makes the long term rise cooler while all the recent “corrections” increase the later part of the record. The combined effect is to increase the appearance of correlation with CO2 increase.

• billw1984 says:

Mosher is not an alarmist. He is a lukewarmer. Although I believe the slopes might be better reported as being 0.1C/decade +/- 0.05 🙂

• Steven Mosher replied to my argument:
“tell it to the judge”
Quoting a leftist judge, and the even farther left Guardian to make a scientific argument?
I simply pointed out that past warming rises and the current rise look the same. That’s at least a somewhat scientific observation, no?
But the Guardian ??

• Joel Snider says:

billw1984: ‘Mosher is not an alarmist. He is a lukewarmer.’
Well, not exactly. Judith Curry is a lukewarmer. Mosher’s an alarmist who has kicked the can down the road a century or so, well beyond any empirical confirmation – not in terms of CO2 forcing so much as CONSEQUENCES.
Mosher is not a bad guy (although I do think he works at a lot of damage control), I actually believe him to be honest, and mostly sincere – as well as being a valuable asset to this board, in particular, as a knowledgeable counterpoint – but he does come from a perspective that is separate from just the specific science.
Because, again, what is mostly at issue at this level is consequences of effect, not necessarily the effect itself – some small degree of human-induced forcing, I think is a given – but alarmist consequences are alarmist consequences, and so much is based on the presumption that ANY effect at all by humans is, by nature, alien, unnatural and destructive, and that human activity trumps and overrides natural geologic forces – something that I have a hard time accepting.
The idea that we can’t leave a ‘footprint’ on the Earth is ridiculous. No one is advocating for waste or destruction, but we get to live, we get to BE here, and if simple acts of feeding ourselves, heating and cooling our homes, and transportation leave a mark, then so be it. It’s not like we have regulatory power over any of this – try to move a rock with a lever that provides 3% torque versus the rock’s weight.
And all this argument over hundredths of a degree simply enables those who would push horror stories of Armageddon to enact statist political agendas – or worse, simply preserve their ‘warm-fuzzy’ self-image under the self-delusion that anything they do in the name of the environment is by definition a positive thing, when very often these actions are the most destructive of all – and we don’t have to wait a century because the consequences are immediate and short term – that’s how we got the DDT ban.

• JEyon,
The article was nonsense. They editorialized instead of reporting. And YOU believed them.

• dbstealey,
please read my post again – this time try to figure out what i’m replying to – good luck

• JEyon,
You were commenting on the Guardian article. I was commenting on your comment.
If I was confusing in my response, my apologies.

• dbstealey,
LOL!!! – you reread my post – right? – and you came to that determination
if you can misinterpret such an easy post as mine – how trustworthy are you in judging something like the main article – i guess i’m going to have to read it with more sympathy now – LOL!!!

• TimTheToolMan says:

Mosher writes

What matters for policy is ECS.
ECS is determined by the LONG trend.

ECS only matters if it is greater than TCS.
And it only matters for policy if it’s considerably bigger and reachable in maybe a couple of hundred years or less.
And that a warmer world is a worse world.

• charles nelson says:

How many thermometers were there in the Arctic and Antarctic in 1880…ah that’s right. None.
Even the graphs you use demonstrate that Global Warming is a laughable artefact.
The fact that it is being doctored and manipulated on a daily basis makes it absolutely worthless.

• indefatigablefrog says:

Not just the Arctic. Most of the planet did not have a temperature record in 1880.
Have a look at this depiction of the global coverage over the last 150 years.
Only locations with BROWN dots are stations that existed over 130 years ago. i.e. before 1886. There are plenty of brown dots in America, and the colonies of the British Empire. Not very many anywhere else it seems.
And the only other source of data was the activities of sailors with their buckets of sea-water.
This hardly appears to justify the pretense of 0.1 degree precision:

• Peter Miller says:

I think most of us agree there has been some warming over the past century, the question is how much. The raw data in Mosher’s chart has been manipulated to exaggerate the amount of actual warming in part by ‘homogenisation’ and understating the UHI effect.
However, the point here is whether climate feedback is negative, as believed by sceptics, or positive which is the bedrock of CAGW theory.
A moment’s thought demonstrates that this feedback has to be somewhere between zero and significantly negative, or we would not be here today. The climate has to have an inbuilt natural regulator, or thermostat.

• Mark says:

“tell it to the judge
There is a Consequence to relying on weak arguments for skepticism.”

If coal is losing, isn’t Exxon winning? Gas will replace coal. Exxon are in the gas business; killing coal helps Exxon no end. So any claim that Exxon and co are trying to prevent action on climate change is illogical nonsense. Especially when banks own huge chunks of the oil companies – the banks that stand to make fortunes from “climate change”.
As if we are going to stop using oil before it runs out anyway, only someone away with the fairies would think that.
Your warmist story doesn’t add up; it’s illogical and, as a result, bunk.
and Guardian links lol, Their article on Willie Soon is something to behold. They are hardly distinguishable from Sks

• graphicconception says:

“They [The Guardian newspaper] are hardly distinguishable from Sks”. Could there be any connection with the fact that Dana Nuccitelli, who is a leading light at SkS, also writes about the climate at the Guardian and is also a member of the Cook et al team that produced the 97% paper that Obama likes?
Maybe 97% of scientists all agree or perhaps it is more of a one man hit squad?

• MarkW says:

Everyone knows that judges are the ultimate arbiters of scientific questions.

• ossqss says:

Is there a repository for the information associated with just the GISS (as an example) data and changes? One that documents all of this publicly funded research, including the process for adjustments, Raw data, etc. for review?
I would think BEST would have asked for the same thing and documented that as part of the research you participated in.
Thanks for your prior reply to a previous question relating to the same subject. I do find it strange to use Bob’s graph. Are there no others showing the same comparison?

• Steven Mosher says:

Its uninteresting.
Skeptics asked us to use all the data we could find. We did
They asked us to use known methods, kriging. we did
They suggested we slice records, we did..
They suggested double blind tests for adjustments.. we did them
Bottomline.
The LIA was real
The earth is warming.
Think otherwise?
tell it to the judge
There is a Consequence to relying on weak arguments for skepticism.

• David A says:

Reduce to the 97 percent quoting alarmist Guardian Mosher? It took me three quotes from peer reviewed articles to get banned by the Guardian. That you quote a site afraid of open debate, and the 97 percent garbage, is nothing but a discredit to you.

• Mark says:

“Its uninteresting.
Skeptics asked us to use all the data we could find. We did (No you didn’t, and you changed the data you used)
They asked us to use known methods, kriging. we did
They suggested we slice records, we did..
They suggested double blind tests for adjustments.. we did them
(All nonsense)
Bottomline.
The LIA was real (we know; so was the MWP, and the US hottest year was in the 1930s, as shown by heatwaves and the number of 90 F days)
The earth is warming. (we know it always warms and cools)
Think otherwise? (you know we dont, it’s called being disingenuous on your part)
tell it to the judge (repeat the same junk)
There is a Consequence to relying on weak arguments for skepticism.
(again you are quite the troll cherry picker)
You have no integrity, obviously, because instead of working to stop scientific misrepresentation on a grand scale in the media, you come here trolling (when you are not rubbing ATTP’s butt)

• Steven Mosher says:
The LIA was real
The earth is warming.

Who are you arguing with? That’s exactly what skeptics have been saying.
Let me guess what you’re leaving out:
…and human CO2 emissions are the cause of the warming.
If I’m wrong…
…tell it to the judge. ☺

• Menicholas says:

Mr. Mosher.
Making the past cooler makes it seem as though there has been more warming than there has actually been, and removes inconvenient blips that do not reconcile well with the CO2-as-thermostat meme.
Fraud.
Do you really think that people do not understand that making the past cooler creates the appearance of warming, and that this is not fraud?

• chilemike says:

Yeah right. From what I’ve seen the past was cooled to make the graph slope higher. These adjusted records are a joke. Might as well make all of it up and just tell everyone it’s five degrees hotter on average, what the hell is the difference anymore? GISS is made up of actual temperature measurements, right? Or is it an algorithm to ‘fill in the blanks’?

GIStemp takes in monthly averages of daily MIN/MAX temperatures, a statistic that is not a temperature, then homogenizes and fills in fictional derived values in time and space globally. It then makes anomalies between two sets of these fictions and claims it means something about trends in temperatures.
IMHO, it is nothing but statistical rubbish. Last I looked, they used 16,000 “grid boxes” but only had about 1,700 data point locations currently active in GHCN. Of what use are the 14,300 fictional, really empty grid boxes? Good question…

• David A says:

It is true that percentage of “made up” data has steadily increased.

• Mike Jonas says:

The effect of man-made CO2 did not cut in seriously, it is generally agreed I think, until around 1950-ish. I have examined closely all the linear trends in your chart, and I cannot see any hint of acceleration in any of them around 1950, or at any other date.

• JohnB says:

That Mike, is because just as the CO2 was heating things up, the natural drivers (which nobody actually identifies, but yesiree, they’re there) were cooling off. Hence the three warming periods all have the same slope even though they all (supposedly) have differing forcings causing them.
Isn’t coincidence wonderful?

• MarkW says:

Apparently teleconnections can operate across time as well as space.

• Reality Observer says:

Ah, but “cooler” in the past – to make it “warmer” now. Sorry, still fraud.
Of course, you do not address the blatant fraud of positive feedback that is brought up by the author. That fraud is absolutely required for the climate con artists to accumulate money and power.

• CNC says:

Assuming this is all correct, and I assume it is, is it worth spending trillions for mitigation, or would it be much better to spend billions for adaptation?
A temperature rise of 0.78C in the next 100 years does not seem like much of a problem.

• Menicholas says:

We will adapt whether we like it or not, or whether we want to or not…as has always been done as the climate regimes of the Earth have warmed and cooled naturally since forever.

• John Harmsworth says:

Interestingly, the Chinese do not seem very concerned about AGW. This might seem surprising since China has over 200 million people living in coastal areas. The Chinese sign on to many nonbinding climate agreements and they’re happy to see the West commit economic hara-kiri, but meanwhile they are busy building dozens of coal-fired power plants. Do they plan to relocate all those people and watch cities that are still being built sink beneath rising seas, or is it more likely that they understand how laughably deficient this “science” is and how they can profit from it? We are taking ourselves for fools and selling out Western civilization and Science for a pocket full of sunshine.

• Not only is China taking away jobs and subverting intellectual property laws, they will be the climate science leaders once the West realizes the horrible mistake we are making and the body of work summarized by the IPCC is relegated to the trash heap. This is because Chinese science is not influenced by far left politics and they can actually address the science without bias.

• b fagan says:

I don’t see Joseph Fourier back in the 1820s as being a leftist. Nor Tyndall and Arrhenius later that century. Or Dr. Richard Alley at Penn State, who is a libertarian Republican.
I have my uncle’s book “Infrared Physics and Engineering” from McGraw-Hill in 1963. I don’t think the chapter on “Targets” was written by leftist liberal designers of heat-seeking missiles. It was helpful for his job as an engineer, doing his bit as part of the industrial/military complex.

• B fagan,
The political bias originates at the IPCC, which needs CAGW to justify its far left agenda of redistributive economics under the guise of climate reparations. Certainly, none of the work of Arrhenius or Fourier or others at the time was influenced by the IPCC, but none of their work provides support for the high sensitivity claimed. The idea of a high sensitivity originated in AR1, based on unwarranted assumptions layered on prior art and not subject to adequate peer review. These assumptions were accepted only because they provided the wiggle room to support an otherwise impossibly high sensitivity.

• b fagan says:

The IPCC reports are signed off on by all the nations, including the fossil-producing nations (oil, gas, coal) and also by all of the wealthy nations. I find it very unlikely that every nation in the world, especially the ones who would supposedly be giving money away, would set up an elaborate plan of tens of thousands of peer-reviewed papers simply to move money around. Believe if you wish, but it’s far simpler to look at the real greenhouse effect and the measured increases in GHGs and conclude that more greenhouse gas = more greenhouse effect.

• B Fagan,
You said,
“I find it very unlikely that every nation in the world, especially the ones who would supposedly be giving money away, would set up an elaborate plan of tens of thousands of peer-reviewed papers simply to move money around.”
Yes, it’s hard to believe that so many ostensibly intelligent scientists could be so incredibly wrong about something so important. But this is what happens when politics interferes. The first casualty is objectivity.
Can’t you see the conflict of interest where the IPCC, which requires CAGW to justify its existence, became the authority to which those who believe in CAGW defer? This is the primary positive feedback at work, and it’s acting on climate science funding, peer review and publishing, not the climate itself. It has driven the science far off track because of the accumulated effect of this bias over decades.
Like most of the CAGW crowd, you don’t want to understand dissenting views and disregard them with prejudice and insist that your point of view is the only valid one. This is because you deny the obvious truth that climate science is the most controversial science of the modern age. Do you think that this kind of denial is conducive to the advancement of science? Do you recognize this behavior as the mechanism of subjective political dissent?
You seem to misunderstand the end to end effects of water vapor. The more water that evaporates, the more water falls as rain which is an intrinsically cooling influence on the surface. This is because a finite fraction of the latent heat of evaporation is converted into the work of weather. When the water condenses to fall as rain, even as the latent heat is returned to the water increasing its temperature, it’s somewhat cooler than the water it evaporated from. This is evidenced by the trail of cold water hurricanes leave in their wake and a basic requirement of the Second Law, which tells us that a heat engine, like the one driving weather, can not warm its source of heat.
Downwelling radiation is nearly impossible to measure, as there’s significant radiation in all directions and even the best sensors are not directional enough to discern what is actually being returned to the surface. In any event a 0.2 W/m^2 increase relative to the 150 W/m^2 or so of net downwelling radiation (latent heat, thermals, etc are not returned to the surface as ‘radiation’) implies an accuracy far greater than currently available.

• b fagan says:

Regarding water – while the water vapor is in the atmosphere it is doing its thing as a powerful greenhouse gas. When it “cools the surface” that heat isn’t vanishing, either. It’s just moving around. And with increased greenhouse gases, moving out to space is more difficult, so it raises the heat that little bit more.
Did you read the study about the measured increase in downwelling IR over a decade? You make a claim about accuracy of measurements, but did you read the study?
Regarding politicization of the issue, what is your opinion about the scientists under the pay of Exxon when they were also concluding back in the 1980s that more greenhouse gas increases greenhouse warming? Both sides of the political heap found the same results. John McCain was trying to get an emissions bill passed as recently as 2007. Nixon formed the EPA and signed other important pollution bills.
One thing you also don’t appear to consider when you talk about all this transfer of wealth – consider where a lot of money flows when developing nations need to buy, install and run complex energy systems – the money flows back to the developed nations who have the manufacturing skills, the financial skills, the project management skills. I’d like for American companies to be able to profit worldwide while doing some good for nations that need advanced energy systems. There are more jobs in renewable energy now than in the coal industry, why aren’t we training even more skilled engineers and project managers to deploy out in the developing world?

• B Fagan,
Regarding water, yes, it’s a powerful GHG, but you can’t isolate the GHG effects from the rest of the hydro cycle. Only the end-to-end effect matters, and the fact that water vapor is a GHG is a small part, although it is the only part of the cycle that you can point to as a warming influence. The end-to-end effect is clearly cooling. How can you say otherwise? Don’t you see how the Second Law precludes the hydro cycle, specifically the heat engine manifesting weather and driven by latent heat, from warming its source of heat, which by and large is the surface?
Politicians always hedge their bets, and that is the reason some on the right buy into the lies, especially given the aggressive fear tactics used to promote them. Fear is a powerful motivator, but more importantly, the IPCC relies on the intrinsic division of partisan politics. Most people tend to be in the center, and whether they lean left or right depends on how they resolve conflicts between social responsibility and fiscal responsibility. If you understand that social responsibility without fiscal responsibility is unsustainable, you lean right. If you believe that cost is no object to being socially responsible, you lean left. The IPCC positions their solutions as the socially responsible thing to do, which puts about half the population on their side regardless of the science, especially the most gullible.
The simple fact is that anyone who buys into the CAGW meme doesn’t understand the science well enough to make a sound judgement. And BTW, there are also liberals on the side of the skeptics.
It’s not the developed world’s responsibility to give the developing world everything they need that we have worked so hard to achieve. We can and do provide significant assistance already, and do so by choice. Being forced to do more under threat of retaliation is called extortion.
Yes, I did read the report and was not impressed. The uncertainties far outweigh the trend they claim to have observed.
Now, will you answer my questions, or are you going to continue to filibuster? Specifically, how can you violate COE, SB and the Second Law in order to support a high sensitivity, and why don’t you recognize the COI at the IPCC and how it resulted in a self-serving consensus?

• b fagan says:

Your funny comments about the Second Law, and your use of a term (CAGW) that’s only used by those who buy into or promote a specific “not our fault” meme, mean it is pointless to continue this.
At a few places in this overall comment section I presented some information from two science textbooks and from the history of several hundred years of climate science. People who might be looking for information, rather than just being told there’s nothing to worry about, might look to those sources.
And for those people, Spencer Weart’s “The Discovery of Global Warming – A History” makes it quite easy to follow a 200-year timeline that belies the meme that climate science and its conclusions are a recent, liberal invention.
The history is here: https://www.aip.org/history/climate/index.htm
Here’s the Wikipedia history of Exxon’s internal consensus about the warming effects of fossil fuels – it includes 119 footnoted references:
https://en.wikipedia.org/wiki/ExxonMobil_climate_change_controversy
It includes this in the “Early Research” section: “In 1966, Esso scientist James Black and the National Academies of Science published a report that the rate of build-up of carbon dioxide (CO2), the main contributor to climate change, in the atmosphere corresponded with the rate of production of carbon dioxide by human consumption of fossil fuels. In July 1977, Black, then a senior scientist in Exxon’s Research & Engineering division, warned company executives of the danger of atmospheric carbon dioxide increases from the burning of fossil fuels. Black reported that there was general scientific agreement at that time that the burning of fossil fuels was the most likely manner in which mankind was influencing global climate change.”
I really wish the true conservatives would get back to trying to protect systems that provide us with our air, drinking water and the ability to grow food in a fairly predictable manner. Google “Climate Stewardship Acts” where John McCain and Joe Lieberman tried repeatedly to pass legislation to rein in emissions. I’d like conservatives to be working on plans rather than leave it all to liberals and middle-of-the roaders like myself. We need different perspectives on it, since it’s such a complex issue.
The Republican Party was more rational about it, less than ten years ago.
Even Sarah Palin, when she was actually working for her state as Governor. Here’s a link to her press release about creating a sub-Cabinet group to come up with a state-wide action plan to combat the effects of warming.
Of course, her replacement, a former oil executive, killed that in a hurry.
http://www.themudflats.net/archives/35683
I’m hanging on to hope that the conservatives on the Great Plains will soon realize that a wind turbine lease won’t leak onto your property, that the states can sell power to the liberals in the cities, and that turbines don’t produce the volumes of saline wastewater that have given Oklahoma more earthquakes a year than California.

• b fagan,
If you believe this…
…carbon dioxide (CO2), the main contributor to climate change…
Then you’ve been fed a lot of misinformation.
There is no empirical evidence to support that statement. There are no measurements to support it, either. Therefore, it’s just a belief.
But there is plenty of evidence that CO2 is not the main contributor to ‘climate change’ (whatever they mean by that vague phrase).

• b fagan says:

dbstealey, you made an assertion without providing any evidence, and like “co2isnotevil”, you seem very certain of the outcome of our current experiment with the planet.
I’ll qualify my CO2 statement. “Due to the sudden and ongoing rise in CO2 emissions, atmospheric CO2 appears to be the principal driver of changes in the climate system, particularly over the last several decades, and judging by past times when CO2 was higher and solar insolation was weaker, it appears that the planet will warm an average of several degrees Celsius. Due to the nature of heat retention and ocean circulation, any warming from current and future emissions of all greenhouse gases will likely keep the planet at temperatures above any since at least the last interglacial. Sea level will rise, and declining ocean pH and dissolved O2 content in warming waters will have unknown impacts on life in the oceans”.
And for observations of CO2 having an influence, I’ll post the measurement again:
First Direct Observation of Carbon Dioxide’s Increasing Greenhouse Effect at the Earth’s Surface
Berkeley Lab researchers link rising CO2 levels from fossil fuels to an upward trend in radiative forcing at two locations
News Release • FEBRUARY 25, 2015
http://newscenter.lbl.gov/2015/02/25/co2-greenhouse-effect-increase/
You and CO2IsNotEvil are very certain. I just read a lot and notice that there are events happening, like the recent trend of “ice patch archaeology”, which is growing around the world, anywhere elevations have typically been ice-covered over the last several thousand years. Otzi the iceman is one example, but there are discoveries in North America, Europe and elsewhere of artifacts that are thousands of years old, exposed for the first time to the elements.
Look up “ice patch archaeology”. That’s evidence without a thermometer. Materials hundreds or thousands of years old, that would be scavenged or decay quickly after removal from freezing, are being discovered. Anyone looking for a rewarding, low-paying job as a scientist might want to get involved in collecting as many of these artifacts as possible in the short window we’ll have.
For those who actually want to learn more about climate, I’ve found that the Princeton Primers on Climate have been quite helpful. List of currently published and future titles here:
http://press.princeton.edu/catalogs/series/title/princeton-primers-in-climate.html#forth
I haven’t finished “The Sun’s Influence on Climate” yet, but the others are good.
And co2isnotevil? I’ll not bother with your equations. The temperatures on Earth are driven by the Sun, and we’re not going to stop warming until outgoing radiation matches incoming radiation – that’s the imbalance. If I want to see equations I’ll look them up in my textbooks or read some of the peer-reviewed papers.
We don’t know what the overall or the regional impacts will be, all the experts give us a range of possibilities. I do know something about risk management, and saying “maybe it will all work out” or “maybe the chemists/solar physicists/oceanographers/cryophysicists/biologists/paleontologists/meteorological modelers/astrophysicists are wrong about 9 things at once” is not effective risk management.

• B Fagan,
I’m certain of what I have said about the magnitude of the effect incremental CO2 has on the temperature because I believe in the scientific method and that the most important part is testing hypotheses. Every test I apply to the hypothesis that the sensitivity is 0.8 +/- 0.4C per W/m^2 falsifies it, while every test of my model that predicts sensitivity limits of between 0.2 and 0.3 C per W/m^2 confirms it. What part of the scientific method do you find so objectionable? It seems to me like you object because it doesn’t support what you want to believe.
Here’s another test of sorts. If the surface temperature increases by 3C, which is the nominal amount claimed to arise from the 3.7 W/m^2 of forcing from doubling CO2, surface emissions will increase by more than 16 W/m^2. What physical process can amplify 3.7 W/m^2 into more than 16 W/m^2 without requiring new input power from a hidden source?
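That arithmetic can be checked directly from the Stefan-Boltzmann law (a minimal sketch; the 287.5 K baseline is the mean surface temperature quoted in this thread, and unit emissivity is assumed):

```python
# Check: how much do surface emissions rise for a 3 C surface warming,
# per the Stefan-Boltzmann law P = sigma * T^4, with unit emissivity?
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def emission(t_kelvin):
    """Black-body surface emission in W/m^2 at temperature t_kelvin."""
    return SIGMA * t_kelvin ** 4

t0 = 287.5  # nominal mean surface temperature, K
delta = emission(t0 + 3.0) - emission(t0)
print(round(delta, 1))  # ~16.4 W/m^2, versus the 3.7 W/m^2 of forcing
```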
If you think it’s the feedback, then you don’t understand what feedback means for a passive system like the Earth. The extra power must have an origin; it’s not just power recirculating back to the surface. This is what the original 3.7 W/m^2 represents, and amplifying beyond 3.7 W/m^2 requires powered gain, which is not a characteristic of the climate system. It’s also not from melting ice, since if all the ice on the planet was gone, it still wouldn’t be enough to increase the global average input from the Sun by 3.7 W/m^2. About 2/3 of the surface is covered by clouds, where the reflectivity difference is zero, and the part of the planet covered in ice receives a smaller fraction of solar input to reflect.
I fully understand how hard it is to accept that so many smart people can be so incredibly wrong about something so important, so I expanded my theoretical research into the sociological and political aspects of climate science to try and understand how it got to be so incredibly misguided, and I now understand quite well how the conflict of interest at the IPCC is the genesis of its self-serving consensus responsible for this devastating scientific failure. I can also point to specific papers where assumptions, never subject to peer review, were made inappropriately, for example with regard to Hansen’s and Schlesinger’s application of Bode’s control theory to the climate. I’ve identified errors in Trenberth’s energy balance papers and in every paper I’ve found that has tried to lay out a theoretical foundation for a high sensitivity. How so much garbage got past peer review is astounding, but then again, funding, peer review and publishing have been seriously degraded by requiring research to conform to the demonstrably false CAGW narrative.
The only support your side has is from speculatively interpreted, highly adjusted, relatively sparse data sets from which trends are divined and reported along with a significant understatement of uncertainties. The skeptics have the laws of physics, full-coverage satellite data and even unadjusted sparse data supporting them, and I’ve replicated the results of many others as well as produced unique supporting results of my own. Which would you be more certain of?

• b fagan says:

co2isnotevil: You are extremely certain that, single-handedly, you have “expanded your theoretical research” beyond mastering the Earth climate system to also master sociology and global politics, got it all right, and have shown up the thousands of educated, working professionals who have all gotten it wrong.
Got it. That speaks volumes.

• b fagan,
You cite as your authority “thousands of educated, working professionals”.
I challenge you to name those “thousands of educated, working professionals”.
Because I can name tens of thousands of professionals, all educated in the hard sciences (including more than 9,000 PhD’s), who have stated in writing that CO2 is harmless, and beneficial to the biosphere.
Next, you wrote:
…you made an assertion without providing any evidence
What I did was to question your assertion, which remains baseless. I am a skeptic; I have nothing to prove.
But you unequivocally stated that carbon dioxide (CO2) is “the main contributor to climate change”, therefore you must either back it up, or you lose the argument.
You made the statement. All I did was question it. So the ball is in your court. The onus is on you.

• B Fagan,
I’ve only presented the tiny slice of politics and sociology related to climate science, which you fail to acknowledge despite how obviously they have affected the scientific process. Of course, this is why you stubbornly hold on to the wrong side, even as many tests applied to your favored hypothesis falsify it via many avenues of attack (COE, SB, Second Law, broken feedback, bad assumptions, etc.), while you have not made a single attempt to falsify anything I’ve said. Clearly, real science doesn’t influence your position, so I can only conclude that politics alone is driving your beliefs.

• B Fagan,
Here’s an example of a test.
Owing to the Stefan-Boltzmann Law, surface emissions are proportional to T^4 (independent of emissivity), thus the incremental sensitivity must be less than the average sensitivity for all accumulated forcing, since each additional degree of surface temperature requires progressively more power to sustain than the last. Consider that the first W/m^2 increases the temperature from 0K to 64.8K, for a sensitivity of 64.8C per W/m^2, and the second W/m^2 increases the temperature to 77K, for a sensitivity of 12.2 C per W/m^2. The test is to identify a sensitivity function of temperature that, when integrated across all 239 W/m^2 of accumulated solar forcing, ends up at a sensitivity of 0.8C per W/m^2 at a final temperature of 287.5K.
I will bet any amount of money that you will be unable to identify this function. However, I can identify this function precisely only when it ends at a sensitivity between about 0.2C and 0.3C per W/m^2, and it’s simply the slope of the SB function corresponding to the current average temperature for an emissivity between 1.0 and 0.62; moreover, the data confirms this exactly.
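The figures in that test can be sketched in a few lines (assuming grey-body emission, T = (P/(εσ))^(1/4), so the incremental sensitivity is dT/dP = T/(4P); the emissivity values are the ones quoted above):

```python
# Sketch: incremental sensitivity implied by Stefan-Boltzmann emission.
# T = (P / (eps * sigma))**0.25, so dT/dP = T / (4 * P).
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def temperature(p, eps=1.0):
    """Equilibrium temperature (K) sustaining emission p (W/m^2)."""
    return (p / (eps * SIGMA)) ** 0.25

def sensitivity(p, eps=1.0):
    """Incremental sensitivity dT/dP in C per W/m^2."""
    return temperature(p, eps) / (4.0 * p)

print(round(temperature(1.0), 1))          # ~64.8 K after the first W/m^2
print(round(temperature(2.0), 1))          # ~77.1 K after the second
print(round(sensitivity(239.0), 2))        # ~0.27 C per W/m^2 at eps = 1.0
print(round(sensitivity(239.0, 0.62), 2))  # ~0.30 C per W/m^2 at eps = 0.62
```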

• b fagan
First, the sensitivity is way overstated.
Second, the relationship between forcing and warming is logarithmic, not linear.
Third, there is no way you can explain the failure of the climate models without adjusting temperatures.
And fourth, admitting to the LIA in the absence of any coherent explanation of what caused the drop in temperatures, or of the amount of warming during the MWP, is not going to magically make this argument go away. Obviously CO2 was not a factor in either the LIA or the MWP. What was? The burden of proving what was does not reside with a skeptic; it’s your theory, not mine. And yeah, I already know the argument that CAGW is putting out there that the MWP wasn’t that warm. The evidence is overwhelming in the geo field, written history, and in the isotopic data in long-lived trees that it was warmer than the current warm period.
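The logarithmic point above is usually written with the standard simplified CO2-forcing fit (a sketch; the 5.35 coefficient is the commonly quoted Myhre et al. value and the 280 ppm baseline is an assumption, neither stated in the comment):

```python
import math

# Standard simplified CO2 forcing fit: dF = 5.35 * ln(C / C0) W/m^2.
# Note the log: each doubling adds the same forcing, not each ppm.
def co2_forcing(c_ppm, c0_ppm=280.0):
    return 5.35 * math.log(c_ppm / c0_ppm)

print(round(co2_forcing(560.0), 2))  # doubling: ~3.71 W/m^2
print(round(co2_forcing(400.0), 2))  # 280 -> 400 ppm: ~1.91 W/m^2
```

The doubling value matches the 3.7 W/m^2 figure cited elsewhere in this thread.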
In a fair court, I or anyone else can prove any of that. A kangaroo court, not so much.

• Looks like fagan has skedaddled.

• b fagan says:

I have a job so can’t hover waiting for comments.
I’m posting this and I’m done.
“co2isnotevil” – I’ll not bother with your math, the books I read provide enough, and it’s been looked at by other experts so I’m more inclined to trust it. I’ve put a list of them below.
dbstealey – You present the Oregon Petition, a lobbying effort in deceptive packaging, which came packaged with an unreviewed opinion piece disguised as a National Academies peer-reviewed paper (“another deliverable”) co-authored by Willie Soon, and a WSJ op-ed by two members of the non-climate research-related “Oregon Institute”. Your “proof” was a petition to try to block legislation, so was not eliciting studied responses about the science, it was another attempt to forestall climate regulations.
Why not mention the Cornwall Alliance? They claim God made the world to be a great place for us, and that somehow God will prevent us bringing ourselves to harm. Seems these supposedly pious people forgot about the Ark – didn’t God wipe almost all of humanity off the planet once already for our misdeeds? They are a creationist organization, but they seem selective in assigning nothing but kind-hearted motives to a God that has taken harsh steps against us in the past.
Anyway, here are sources that provide the basis for my understanding that the greenhouse effect is real, and that the impacts of our rapid emissions will make the near future (decades to centuries) unpredictable, very inconvenient for people with coastal infrastructure, and more complicated than we need it to be.
“The Discovery of Global Warming – A History” Spencer Weart
Here’s the summary page https://www.aip.org/history/climate/summary.htm
Here’s the bibliography – over 2000 references by date https://www.aip.org/history/climate/bibdate.htm.
It starts with Benjamin Franklin but more realistically starts with Joseph Fourier in 1824 and 1827, explaining his observation that the Earth should be much cooler at this distance from the sun than it is, and proposing (but not naming) the atmospheric greenhouse effect. Nearly 200 years later, the effect still hasn’t been disproved.
“The Warming Papers: The Scientific Foundation for the Climate Change Forecast” David Archer, Raymond Pierrehumbert editors
Includes English translations of the Fourier papers.
From the back cover blurb: “The Warming Papers is a compendium of the classic scientific papers that constitute the foundation of the global warming forecast. The paper trail ranges from Fourier and Arrhenius in the 19th Century to Manabe and Hansen in modern times. Archer and Pierrehumbert provide introductions and commentary which places the papers in their context and provide students with tools to develop and extend their understanding of the subject.”
“A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming” Paul Edwards
Lays out the development of the global weather data collection/forecasting/observation system as well as its application to climate science. As a computer guy, the sheer bulk of punch cards that used to make up the observational “database” was mind-boggling. Data sharing’s much easier now.
“Principles of Planetary Climate” Raymond Pierrehumbert.
THE book on figuring out what happens on bodies circling the Sun (or any star) when those bodies have gases that might or might not trap different wavelengths of light. Recommended to “co2isnotevil”.
Princeton Primers in Climate – Princeton University Press – I haven’t finished the last one, but the rest were quite good.
“Atmosphere, Clouds, and Climate” David Randall
“Climate and Ecosystems” David Schimel
“Climate and the Oceans” Geoffrey K. Vallis
“The Cryosphere” Shawn J. Marshall
“The Global Carbon Cycle” David Archer
“Paleoclimate” Michael L. Bender
“Planetary Climates” Andrew Ingersoll
“The Sun’s Influence on Climate” Joanna D. Haigh, Peter Cargill
“Infrared Physics and Engineering” McGraw-Hill 1963
More for design of infrared systems than about climate, but the Air Force was one of the drivers of theory on atmospheric infrared absorption when they started caring about sniperscopes and heat-seeking missiles.
Science, American Scientist, Scientific American, Science News, Technology Review, IEEE Power and Electrification Magazine and others keep me up to date on a lot of the science and the technological developments.
And of course, the IPCC reports – all here: http://www.ipcc.ch/publications_and_data/publications_and_data_reports.shtml
And their bibliographies of the tens of thousands of publications they’ve reviewed.
Collectively they show that the greenhouse effect is quite real, that ocean chemistry is changing, that the atmospheric blocking of IR is increasing and has been measured, and that the heat content of the planet’s combined surface/ocean/atmosphere will continue increasing until persistent GHG concentrations stabilize and begin to decline.
The books I read, that I’ve listed above aren’t for scaring me, they’re for informing me. They are not political – I never looked at Al Gore’s movie and never will. This isn’t alarmism. This isn’t “C.A.G.W.”
This is nature acting naturally on a change in greenhouse gas concentrations, regardless of the source.
Have a nice weekend.

• b fagan,
Based on your feckless appeals to authority, your understanding of the hard sciences is clearly limited. Rather than try to understand why even the low end of the IPCC’s sensitivity estimate is precluded by the known laws of physics, your sociological bias drives you to deny a truth with consequences you can’t bring yourself to accept: that it will completely undermine the Green religion and the progressive politics that embraces it, yielding political power to the opposition for decades. This is the danger of mixing politics, religion and/or science, and they brought it on themselves.
You also don’t understand that the controversy is not about any real science, not about whether CO2 is a GHG, whether man is emitting CO2, or any of the other emotional distractions, but about the magnitude of the climate sensitivity to incremental atmospheric CO2 and nothing more. It’s unfortunate that this has been wildly over-estimated and canonized since AR1, which has driven so much bad climate science that depends on this unwarranted assumption; it’s an embarrassment to any legitimate scientist.

• b fagan,
First, you completely avoided my challenge. You cannot name your “thousands of professionals”, because they don’t exist.
Instead, you cut and pasted the same talking points from skepticalscience that have been posted here endlessly by others. Do you think you’re the first one to post exactly the same discredited talking points? They are simply a deflection from the undeniable fact that those 31,000+ named professionals, all with degrees in the hard sciences, had to download the OISM petition, sign it, affix postage, and mail it in (no emails accepted).
That took more effort than most people would expend, therefore they felt strongly enough about that statement that they were willing to go to that much trouble. They co-signed the straightforward OISM statement saying that CO2 is harmless, and beneficial to the biosphere.
Nowhere did you address any of that. You simply engaged in the usual ad hominem attacks, attempting to denigrate every co-signer. And as I pointed out, you failed my challenge.
Next, you deflected to the Cornwall Alliance, whatever that is. But it’s just more proof that you cannot back up your baseless assertion, that “carbon dioxide (CO2), (is) the main contributor to climate change…”
Nonsense. That is no more than a belief, unsupported by a single measurement. FYI: this is the internet’s “Best Science” site, with more traffic than all alarmist blogs combined. As such, we require at least a minimum of verifiable, credible evidence to support beliefs like yours. But you have none. All you have is your opinion; your eco-religious belief that a tiny trace gas is responsible for “climate change”. That belief is arrant silliness, unsupported by any credible scientific data.
Next, we do not need your history lesson cut and pasted here, either. Let’s just cut to the chase: EVERY alarming prediction made by your climate alarmist cult has turned out to be flat wrong. None of those scary predictions ever come true; no exception.
As Prof Richard Feynman pointed out: if your ‘theory’ is contradicted by observations, then your ‘theory’ is wrong. “That’s all there is to it,” said Feynman. And other scientists like Popper, Einstein, and Langmuir said the same thing. When real world observations contradict a hypothesis, then that hypothesis has been falsified.
The CO2=cAGW conjecture has been repeatedly falsified. But you avoided any discussion of those failures. Instead, you attacked Dr. Soon, monkey-piling on the reprehensible smears by others of your ilk.
You cannot credibly support your absurd belief that a tiny trace gas is the ‘control knob’ of the planet’s climate and temperature, so instead you engaged in despicable ad hominem aspersions. And since exactly none of your assertions are supported by real world observations, or measurements, or corroborating data, you lost the science debate.
And you actually believe that a tiny trace gas, CO2 — which has risen by only one part in ten thousand over the past century — will cause runaway global warming and climate catastrophe? Why would you believe something so outlandish? You would not even be aware of such a minuscule change, if very sensitive instruments had not made you aware of it.
Instead, you credulously believe those self-serving propagandists who demonize “carbon”. You also disregard the fact that CO2 is as essential to all life on earth as H2O.
And you disregard the fact that inter-annual CO2 fluctuations are far greater than one part in ten thousand; another disconfirmation of your belief system.
Next, you failed to understand that there is no correlation between changes in CO2 and subsequent changes in global temperature; over the past century, nearly half the decades showed flat to cooling temperatures, while harmless, beneficial CO2 continued its steady rise.
And you cannot understand that a basic requirement of all conjectures, hypotheses, and theories is that they must be capable of making repeated, accurate predictions.
But none of the scary and alarming predictions made by the climate alarmist cult have ever come true. Not a single one. They were all wrong; no exceptions. When one side of a debate has made numerous predictions, but all of them have been flat wrong, then their argument has been falsified.
Planet Earth — the ultimate Authority — is busy debunking the ‘carbon’ false alarm. You could either accept the basic facts, or cast aspersions on the messengers. You made the decision to play the man, and not the ball when you posted your ad hominem attacks.
You had the choice to either limit your argument to scientific facts, or to deflect, and engage in character assassination because you have no credible facts. The choice was yours.
I think you made the wrong choice.

• David A says:

When an alarmist proclaims how small the adjustments are, but provides no dates for the adjustments they are referring to, when they were made, or the reasons, you can be certain they are only talking about a small number of changes to data sets already manipulated to the point of FUBAR. Some history from Steven Goddard…
The graph below shows how NASA has been steadily erasing the 1940’s blip, and subsequent 1940 to 1970 global cooling.
Just since 2001, NASA has increased 1880 to 2000 warming by 0.5 degrees by altering the data.
The next graph superimposes NCAR 1974 at the same scale, and shows how Hansen was already erasing the 1940-1970 cooling in his 1981 version.
Note above how all of the NASA graphs show renewed warming starting around 1967, yet that warming does not appear in the NCAR graph. Similarly, in 1978 NOAA did not show any surface or troposphere warming through 1977.
In 1978, it was reported by an international team of specialists that there is “no end in sight to the cooling trend of the last 30 years” They also reported southern hemisphere data was too meager to be reliable, and that the Arctic ice cap was growing. They blamed the expanded polar vortex on global cooling.
TimesMachine: January 5, 1978 – NYTimes.com
In Climategate E-mails, the team made clear their desire to manipulate the temperature record and remove the post-1940 cooling.
From: Tom Wigley
To: Phil Jones
Subject: 1940s
Date: Sun, 27 Sep 2009 23:25:38 -0600
Cc: Ben Santer
So, if we could reduce the ocean blip by, say, 0.15 degC,
then this would be significant for the global mean — but
we’d still have to explain the land blip.
It would be good to remove at least part of the 1940s blip, but we are still left with “why the blip”.
di2.nu/foia/1254108338.txt
In another Climategate E-mail, Phil Jones said that much of the southern hemisphere data was “mostly made up.”
date: Wed Apr 15 14:29:03 2009
from: Phil Jones subject: Re: Fwd: Re: contribution to RealClimate.org
to: Thomas Crowley
Tom,
The issue Ray alludes to is that in addition to the issue
of many more drifters providing measurements over the last
5-10 years, the measurements are coming in from places where
we didn’t have much ship data in the past. For much of the SH between 40 and 60S the normals are mostly made up as there is very little ship data there.
Cheers
Phil
di2.nu/foia/foia2011/mail/2729.txt
In absolute temperatures, BEST and NASA are 1.5 C apart throughout the record. The climate models are all over the map, with a consistent 3 to 3.5 degree variance for any date since 1880.

• Mark says:

Mosher has no answer for this. It is clearly a problem, because those adjustments all carry uncertainties too. If you have a measurement of 20C and you adjust it to 20.6C, you have to combine the uncertainty of the measurement with the uncertainty of the adjustment to get the total uncertainty, so the result should be 20.6C +/- 0.61C at least.
This is the cheat: NCEI, GISS and HadCRUT are all massively uncertain. No wonder the CRU destroyed the raw data.
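One way to make the uncertainty point above concrete (a sketch; it assumes the measurement error and the adjustment error are independent, in which case they combine in quadrature, and the two sigma values are hypothetical, chosen for illustration):

```python
import math

# Combine independent uncertainty sources in quadrature (root-sum-square),
# the conventional rule for independent errors; they do not add linearly.
def combined_uncertainty(*sigmas):
    return math.sqrt(sum(s * s for s in sigmas))

raw_sigma = 0.5          # hypothetical raw-reading uncertainty, C
adjustment_sigma = 0.35  # hypothetical adjustment uncertainty, C
total = combined_uncertainty(raw_sigma, adjustment_sigma)
print(round(total, 2))  # ~0.61 C: larger than either source alone
```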

• For that willful destruction of data to which my money had contributed, those criminal frauds ought to be detained for some considerable time at Her Majesty’s pleasure. That they are not is a travesty which only highlights the full extent of the systemic corruption.

• AGW is not Science says:

And even IF this non-“data” was real it would make no difference, because absolutely no evidence of CAUSATION exists. The Eco-Nazi camp has nothing but hypothetical BS, an argument that can be summed up (as PC has done) as “If it’s not a dog then it must be a cat.”

• David A says:

You are correct, Mark, the adjustments are outside the error bars for what was published at the time. Pick a base period, say 1951 to 1980. You can get a lot of disparate trends depending on WHEN you choose that period, but these major changes make a mockery of claiming the adjustments lower the trend, which, as Bob Tisdale demonstrated (for the adjustments Mosher is referring to), was simply not true post-1950 (when CO2 is deemed to affect the trend), as after that date almost every adjustment increases the trend.

• Menicholas says:

Mods.
I seem to have a comment in moderation limbo for some time now.
Could someone please check for me?
Thanks.
Nick M
[Nothing in the queues at this time. .mod]

Bob’s graph correctly states that the raw data is indeed ‘data’. The moment it’s adjusted, it is no longer ‘data’. This is not a well-known fact, and yet the word is often attached to the adjusted graph. I know there will be those who find this hard to believe, and to them I say: go look up the word’s definition. Whenever you see a graph of adjusted temps, if it says ‘data’, please contact the providers of the graph to let them know that they are using incorrect terminology.

There is also another problem, Mosher. Assuming your graph is accurate, you are implying that CO2 is the cause. That may not be the case. The full extent of warming attributed to CO2 should be reduced by at least 33%. The math that gives that figure was based on 1368 W/m^2. We know that figure to be in error. The real number is 1360 W/m^2, which reduces the original number of 1.2 K by 0.4 K.
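For scale, the effect of that solar-constant revision can be sketched with the standard effective-temperature formula (a sketch; the 0.3 albedo is an assumption not stated in the comment, and what it gives is the shift in planetary effective temperature, which comes out roughly the size of the 0.4 K correction quoted above):

```python
# Shift in planetary effective temperature when total solar irradiance
# is revised from 1368 down to 1360 W/m^2, assuming albedo 0.3.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def effective_temp(tsi, albedo=0.3):
    """Effective temperature (K) for total solar irradiance tsi (W/m^2)."""
    return (tsi * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

shift = effective_temp(1368.0) - effective_temp(1360.0)
print(round(shift, 2))  # ~0.37 K between the two TSI values
```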
Statistically, the data has been adjusted with the past being cooler and the present being warmer. There should be no argument on that. Additionally, the original data was never verified; it was also adjusted, and nobody knows except the people who adjusted it. Who throws away original data? And then when we ask, can we see it? Oh, it’s in a landfill in Denmark. It was beyond recovery.
How much of that graph do you think you should believe that has an implied vector from co2?
It is obvious that there is another variable at work. Very few would be a skeptic if temperatures had tracked with the amount of co2 released in the last 10 years.

• Joel Snider says:

‘Tell it to the judge’?

• whiten says:

Steven Mosher
May 12, 2016 at 4:13 pm.
Hello Mosh.
You seem to be the first to comment on this particular interesting and very well thought-out post.
I do not see what connection the arguments you provide in your comments have with the post above.
It seems the only purpose of your eagerness to comment is to hijack or troll the thread.
If I am wrong in this conclusion, please let me know by enlightening me about the actual connection of your arguments with the substance of the post above, which I totally and completely fail to identify or “see” as yet.
Perhaps you are just commenting on the wrong post. Is that it?!
cheers

• edcaryl says:

In the past, to make the amount of warming greater.

2. Park has a nice, simple model. I do not have enough math to make any criticism, but it looks very good.

3. Germinio says:

This is nonsense. The author has confused a time delay with negative feedback. The two are completely different since you can have a delayed positive feedback or a delayed negative feedback.

• Yes, it is nonsense. The statement
“In systems terms, this lag, from cause to effect, constitutes negative feedback.”
is just untrue. In fact a practical source of instability is when you think you are applying negative feedback, but there is a frequency where a 180° phase shift (lag) makes it positive.

• Paul Penrose says:

Whatever. There’s a better (simpler) way to look at it. Systems dominated by positive feedback are inherently unstable. Stability comes from negative feedback, and the more there is, the more stable the system becomes. For the Earth’s climate to be as stable as it is, it must be dominated by strong negative feedbacks. A strong positive water vapor feedback is ruled out.

• Paul Penrose:

Systems dominated by positive feedback are inherently unstable.

Not necessarily. It’s a loop gain greater than unity that makes a system unstable. Whether positive feedback results in such a loop gain depends on whether the open-loop gain exceeds the reciprocal of the feedback coefficient.
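The loop-gain condition stated here can be checked with a few lines of Python; the gain and feedback values below are illustrative, not taken from any climate model:

```python
# Steady state of y = A * (x + b * y): positive feedback with open-loop
# gain A and feedback coefficient b. The closed loop stays finite as
# long as the loop gain A*b is below unity, as the comment says.

def closed_loop_gain(A, b):
    """Closed-loop gain A / (1 - A*b); diverges as A*b approaches 1."""
    loop_gain = A * b
    if loop_gain >= 1:
        raise ValueError("loop gain >= 1: system is unstable")
    return A / (1 - loop_gain)

# Positive feedback, but loop gain only 0.5: output settles at 2x the
# open-loop gain instead of running away.
g = closed_loop_gain(A=10.0, b=0.05)
```

With `A=10` and `b=0.05` the loop gain is 0.5 and the closed-loop gain settles at 20; raising `b` to 0.1 or beyond pushes the loop gain to unity and the formula (and the physical system) blows up.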

• Frederick Michael says:

Agreed. This has NOTHING to do with whether H2O (+ clouds) in the atmosphere contributes a positive or negative feedback to the CO2 based warming. I suspect it’s negative but this paper doesn’t address that AT ALL.

• Mike Jonas says:

Nick – I agree. The article is simply seeing a lag. It has no bearing on whether there is or isn’t a negative or positive feedback.

• Menicholas says:

I was thinking the same thing when I read the article…interpreting the lag as a feedback did not make sense.
I am glad I am not the only one to have that thought.

• Mark says:

To make it simple, dear Nick: heat going into a system warms it; it is then released later on, mostly by the hydrological cycle, which cools the surface and creates clouds that block sunlight.
The atmosphere is one massive negative feedback, after all, while being a far smaller positive feedback as well. If the hydrological cycle stopped, the earth would obviously be cooked in no time; temperature would shoot up 30 degrees, which shows the atmosphere prevents far more warming than it adds.

• Indeed:
The head post is like one of those nonsense papers that are submitted to journals to determine whether any of the editors understood anything.
The answer appears to be no.

• I agree that this article is nonsense. The delay in temperature rise is due to the large heat capacity of the climate system and the energy required to melt seasonal snow and ice. This has nothing to do with feedbacks, and the graphs tell us nothing about feedbacks.
Even if climate sensitivity were 6 °C, implying strong positive water vapor and cloud feedbacks, the climate system in total would still be dominated by negative feedback and would be stable, due to the large Planck negative feedback of -3.2 W/m^2/°C.
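As a numeric sketch of that stability claim (the 3.7 W/m^2 doubling forcing and the 3.2 W/m^2/K Planck response magnitude are the figures quoted in the thread; the helper name `ecs` is ours):

```python
# Net feedback parameter: lambda_net = Planck response minus the sum of
# amplifying feedbacks. Equilibrium sensitivity is ECS = F2x / lambda_net,
# and stability requires lambda_net > 0.

F2X = 3.7      # W/m^2, quoted forcing for doubled CO2
PLANCK = 3.2   # W/m^2/K, magnitude of the Planck restoring response

def ecs(other_feedbacks_wm2k):
    lam = PLANCK - other_feedbacks_wm2k
    if lam <= 0:
        raise ValueError("net feedback no longer restoring: runaway")
    return F2X / lam

no_feedback = ecs(0.0)            # ~1.16 K with the Planck response alone
# Even a 6 K sensitivity still leaves a net restoring balance:
lam_for_6K = PLANCK - F2X / 6.0   # amplifying feedbacks implied by ECS = 6 K
```

The point the comment makes falls out of the arithmetic: an ECS of 6 K implies roughly 2.6 W/m^2/K of amplifying feedbacks, still less than the 3.2 W/m^2/K Planck term, so the net response remains negative (restoring).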

• Yes, time lags in a control system have nothing to do with negative or positive feedback. They are independent phenomena.

• Yes, thermal mass is not negative feedback. Capacitance does not counteract.

4. charles nelson says:

Anyone with a pair of eyeballs and enough intelligence to understand the term ‘enthalpy’ can simply log on to a site showing infra red satellite imagery of the equatorial oceans and see a rather LARGE negative feedback happening in real time!

5. Evan Jones says:

From a feedback perspective, the lag is caused by the effects of re-radiating heat from the surface, which acts as a heat sink — a sink that gets hotter than the surrounding air, and continues to get hotter faster as solar radiation increases. This absorbed heat (in excess of the air around the sensor) is still being released. This causes Tmax to lag SolMax.
It therefore follows that if the surface area around the sensor is paved or built up, that amplifies the natural effects that are already occurring (both warming and cooling). This will not only be manifested in an amplified offset, it will result in an amplified trend as well, because the delta between the sink and the surrounding air increases as it warms.
The sensor is still doing its job accurately, but the site is no longer representative of the natural surrounding terrain. The result is that a poorly sited sensor will show a spuriously large trend, be it cooling or warming.
Conclusion: Recent warming trends are real, but have been exaggerated by poor siting. And one will have to suck it in and either adjust for that or else drop the badly sited stations (the latter of which USHCN can afford, but GHCN, not so much).

• Kaiser Derden says:

sink that gets hotter than the surrounding air … ??? really … no second law in your world then ?

• Evan Jones says:

sink that gets hotter than the surrounding air … ???
Much hotter. Burn-the-bottom-of-your-feet-off hotter.
really … no second law in your world then ?
Not walking on tarmac barefoot in July at 2 PM under a blazing sun is not just a good idea. It’s the law.

• Menicholas says:

I think he is referring to the fact that an asphalt (or some other sort of) surface will have a measured temp much hotter than the air at some distance from that surface. While the sun is shining directly, surfaces are much hotter than the air even a few inches above the ground.

• Evan Jones says:

That would be it.

6. charles nelson says:
7. John Manville says:

CO2 is a triatomic molecule and as such possesses no internal heat source; it is thus incapable of heating anything.
It is time that WUWT stops pandering to those who ‘BELIEVE’ that CO2 is anything but good for all life on this planet. It is time to cut the cord. The greenhouse effect was coined from ignorance. There is no greenhouse effect! All atmospheric gases act similarly in that they are all capable of cooling matter that is warmer and warming matter that is cooler. The second LAW of thermodynamics is alive and well.
Heat from the sun and gravity are responsible for a positive temperature gradient, in concert with increased air density that de facto requires air molecules closer to the earth to move faster. The increased momentum of these air molecules is measured as temperature, as more energy impinges on the instrument used to measure it. That is real-world temperature: measurement of useful heat.
There is not, and there never will be, an example of a runaway greenhouse effect.
Life is good. CO2 is good.
It is time to move on. There are glass houses. There are no greenhouse gases.

• WBWilson says:

Too true, John.
The laws of thermodynamics rule. Arrhenius was a 19th-century scientist who made errors based on 18th-century concepts, such as the “aether,” and attributed to the CO2 molecule the power to add energy to the system. The whole concept of Catastrophic Anthropogenic Global Warming is based on this error. The rise of CO2, however, is anthropogenic, an unwitting byproduct of our destruction of vast quantities of biomass, which formerly acted as a CO2 sink. This will ultimately be viewed as positive when CO2 is accepted for its beneficial effects on the biosphere.
Throughout the geologic history of life on earth, warmer times are preferred for the proliferation of life. Life is happiest on earth at an average temp above 19C. Today’s 15C is Ice Age Interglacial COLD, and according to orbital dynamics, about to get colder again. We need all the CO2 we can get. Winter is coming.
http://greenhouse.geologist-1011.net

• KevinK says:

“There are glass houses. There are no green-house gasses.”
Well, some folks simply cannot “move on”; they have to stick with the hypothesis (correction: conjecture) that gases in the atmosphere can strongly affect the “average temperature”.
It is sad in a way that folks have their eyes, ears and minds so shut that they cannot accept the demonstrably real possibility that Arrhenius (and others following his lead) were incorrect when they postulated that the mere presence of “greenhouse gases” somehow magically controls the temperature of the Earth.
But if a man is paid to “believe” something, no amount of reasoning will lead him away from his paycheck.
For the record, I have never received a paycheck from “Big Oil”, unless you want to count those free water glasses given out with each “fill-up” at gas stations back in the ’60s. OK, I admit it: my parents got some of those back then (and they were top-notch stuff, made in the USA).
Cheers, KevinK.

• WBWilson says:

We got some steak knives back in the sixties.

• JustAnOldGuy says:

We inherited some “Tiger In Your Tank” mugs and will probably pass them on with an injunction that they remain sequestered as a ‘family secret’. Who’da thunk that a tiger in your tank would become a skeleton in your closet.

• John, you are talking nonsense. The AGW theory doesn’t imply that a greenhouse gas is a source of energy.
Greenhouse gases act like an insulator. The insulation in your roof isn’t a heat source either, but it keeps your house warm during a cold winter night.
Please review the greenhouse effect at http://www.friendsofscience.org/index.php?id=483
See Dr. Richard Lindzen’s article, 2nd item.
The greenhouse effect does not conflict with the 2nd law of thermodynamics.
However, I do agree that CO2 is good. The social cost of carbon is about -18 $/tCO2, with a likely range of -20 to -14 $/tCO2, meaning that CO2 is very beneficial. See http://www.friendsofscience.org/index.php?id=2205

• Kaiser Derden says:

your roof does not keep your house warm … ever … with no heater your house would still be freezing …

• Logoswrench says:

If you had a force field containing a large CO2 bubble and tried to grow plants in it like a greenhouse, it wouldn’t work, because CO2 gas does not act like a greenhouse. It is a crap insulator. “Greenhouse gas” is a HUGE misnomer.
It’s kind of like calling climate science, science.

• John Manville,
Never heard of CO2 lasers? Those are CO2 molecules excited with external energy which re-emit that energy at a certain frequency, more than enough to melt steel. Thus CO2 can absorb and re-emit energy, and its re-emitted energy can heat up other objects, including the earth.
Nobody, not even the IPCC, says that CO2 heats the earth on its own; it only retains outgoing energy from the earth, thus indirectly from the sun. Indeed, the expression “greenhouse effect” doesn’t fit the real effect, but it is so commonly used that it is hard to change it into “radiation absorption effect” or something similar.
CO2 absorbs IR at certain frequencies and either re-emits it or transfers that energy to other molecules (O2, N2) when they collide. Which happens first depends on the average emission time and the average collision time. In the second case, CO2 does warm the atmosphere surrounding it. That was already measured by Tyndall over a hundred years ago. See further on the “greenhouse effect” of different molecules:
http://www.warwickhughes.com/papers/barrett_ee05.pdf
That warming effect of CO2 on the surrounding air is completely independent of the warming caused by gravity.
Of course that doesn’t mean that the effect has negative consequences – to the contrary. There are enough negative feedbacks to keep the earth inhabitable…

• WB Wilson says:

Ferdinand,
Please try the link in my reply to John above. It leads to a well-written paper by an Australian geologist named Timothy Casey, entitled “The Shattered Greenhouse: How Simple Physics Demolishes the ‘Greenhouse Effect’”. It is a fairly comprehensive discussion of the thermodynamic errors made by Arrhenius and his followers.
http://greenhouse.geologist-1011.net

• I think the Casey paper is far too complex and contains no actual computations.
The computations for a ball of any arbitrary uniform color (spectrum) in our orbit are at my Calculation of Earth’s Radiative Balance and elsewhere on my Planetary Temperature pages.
There is really only one equation which should elicit any question at all: the ratio of dot products of the relevant spectra:

the essential relationship computed here is to find temperature T such that
dot[ solar ; objSpectrum ] = dot[ Planck[ T ] ; objSpectrum ]
where objSpectrum is the absorptivity=emissivity spectrum of a object

Given any effective spectrum for the planet as seen from the outside (Top of Atmosphere spectrum), this generalization of the computation which produces the endlessly parroted 255K value returns the equilibrium temperature.
That’s it as far as what spectra can explain. No spectral effect can explain the difference between that temperature and the surface temperature. So far as I can see, only gravity, the only other macroscopic force, can (and apparently does) explain that “trapping”.
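The balance stated in that one equation can be sketched in plain Python. The wavenumber grid, the flat 0.7 “gray” spectrum, and the 279 K test source below are illustrative assumptions used only to sanity-check the solver, not values from the comment:

```python
import math

# Physical constants (SI)
H, C, KB = 6.62607015e-34, 2.99792458e8, 1.380649e-23

def planck(T, nu_cm):
    """Planck spectral radiance at wavenumber nu_cm (cm^-1), temperature T (K)."""
    nu = nu_cm * 100.0                      # cm^-1 -> m^-1
    x = H * C * nu / (KB * T)
    return 2.0 * H * C**2 * nu**3 / math.expm1(x)

def dot(spec_a, spec_b):
    return sum(a * b for a, b in zip(spec_a, spec_b))

def equilibrium_T(incoming, emissivity, grid, lo=100.0, hi=500.0):
    """Bisect for T such that dot(incoming, eps) = dot(Planck(T), eps)."""
    target = dot(incoming, emissivity)
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        out = dot([planck(mid, nu) for nu in grid], emissivity)
        lo, hi = (mid, hi) if out < target else (lo, mid)
    return 0.5 * (lo + hi)

# Sanity check: for a flat (gray) emissivity spectrum the solver must
# recover the brightness temperature of the incoming radiation,
# whatever the gray level, since the constant factor cancels.
grid = [1.0 + 2.0 * i for i in range(1500)]     # 1..2999 cm^-1
incoming = [planck(279.0, nu) for nu in grid]
gray = [0.7] * len(grid)
```

Replacing `gray` with a structured absorptivity = emissivity spectrum is exactly the generalization the commenter describes; the bisection then returns the equilibrium temperature for that spectrum.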

• Thomas Homer says:

“In the second case, CO2 does warm the atmosphere surrounding it. That was already measured by Tyndall over a hundred years ago. ”
So there is something that can be measured? If that were true, we’d be measuring it. We’d have charts that track these measurements over time. I want to see how these measurements differ by elevation (Denver vs. Miami). I want to see how atmospheric tides impact these measurements. I want to see these measurements taken from the Mars Rover.
But, there are no charts to refer to, because we don’t measure it! So either all of today’s climate scientists haven’t considered using some of their grant money to take these measurements, or there is nothing to measure.
If there were something to measure, we’d be measuring it.

• Thomas Homer,
It is measured:
https://wattsupwiththat.com/2015/02/25/almost-30-years-after-hansens-1988-alarm-on-global-warming-a-claim-of-confirmation-on-co2-forcing/
What is radiated back down to earth can be separated in the effect of water vapor, CO2 and other GHGs…
Tyndall’s experiment was a direct measurement of the warming of an amount of gas by absorption of IR by different gases: none by O2 and N2, high with water, less with CO2 and other GHGs. See Wiki (for once quite clear):
https://en.wikipedia.org/wiki/John_Tyndall

• WB Wilson,
As Bob Armstrong already said, that paper by Timothy Casey is far too complex, and it interprets what different scientists said in the past as if that were proof that they and/or others were wrong at the time. I have the impression it is more the personal opinion of the writer projected onto the past.
Take e.g. his stance that back-radiation makes a “perpetuum mobile” because it doubles the energy in the system. Nobody, past or present, says or implies that all outgoing radiation of the earth is reflected back (except perhaps to a large extent under low clouds). Half of what is re-radiated is radiated outward, not backward. Thus only part will come back to the surface and help change the radiation balance (see the reference in the previous message), and part is used to warm the surrounding air.
Not the first time he is wrong on several aspects of what he writes: a geologist who doesn’t know that the 13C/12C ratio of volcanic CO2 is much higher than that of fossil fuels?

• As a programmer (in APL, which with respect to things like mapping spectra over a sphere is more concise than traditional textbook notation) I approach the problem as one of constructing a computable “audit trail” from the output of the sun to our observed temperature.
A few expressions handle the geometry from the Sun to us. Another (Stefan-Boltzmann) goes from the total energy impinging on a point in our orbit to the gray-body temperature (~278 +- 2.3 K, given the eccentricity of our orbit).
A last one, dot[ solar ; objSpectrum ] = dot[ Planck[ T ] ; objSpectrum ], handles any uniform spectrum. Give me a Top of Atmosphere spectrum and I’ll give you the equilibrium temperature, and that appears to be lower than the gray-body temperature. Give me a map of spectra over the sphere, and I’ll implement the few additional expressions to handle that complexity, including the necessary Lambertian cosine map which actually induces spherical geometry into the equations.
But at that point one runs into the Divergence Theorem (2nd-year calc), which says that without internal sources or sinks the mean internal temperature (energy density) must match that calculated for the surface (ToA) over which the spectrum is measured.
No feedbacks, no downward versus upward or sideways radiant bouncings can change that.
That is why you have never seen any equations quantifying the “trapping” of heat energy by spectral phenomena beyond that calculated by the radiative balance equation above. Nor any experimental demonstration of the phenomenon which James Hansen claims responsible for “trapping”, over a distance of 250 km, an energy density on Venus’s surface 25 times that of a point next to it in orbit. Surely that massive an effect could be detected in a lab experiment.
And you never will, because if one could, as many have mentioned, that would obviously show the way to making a perpetual heat engine.
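The gray-body figure of ~278 +- 2.3 K quoted above can be reproduced directly; the TSI and eccentricity values are standard published numbers assumed here, not derived:

```python
# A uniform-absorptivity ("gray") sphere in Earth's orbit: absorb over
# the cross-section pi*r^2, emit over the full surface 4*pi*r^2, so
# emissivity cancels and T = (flux / (4*sigma))**0.25.
TSI_1AU = 1361.0          # W/m^2, solar constant at 1 AU (assumed value)
ECC = 0.0167              # orbital eccentricity (assumed value)
SIGMA = 5.670374419e-8    # Stefan-Boltzmann constant, W/m^2/K^4

def gray_body_T(flux):
    return (flux / (4.0 * SIGMA)) ** 0.25

# Flux scales as 1/r^2; r swings between (1-e) and (1+e) AU.
T_mean = gray_body_T(TSI_1AU)                     # annual-mean distance
T_peri = gray_body_T(TSI_1AU / (1.0 - ECC) ** 2)  # perihelion, warmer
T_aph  = gray_body_T(TSI_1AU / (1.0 + ECC) ** 2)  # aphelion, cooler
```

The mean comes out near 278.3 K, with the perihelion-to-aphelion swing of roughly 4.6 K matching the commenter’s “+-2.3” spread.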

8. Nature is full of negative feedbacks because it is a self-regulating system. If positive feedbacks dominated, any system on the earth would go out of control. In the end there is always a negative feedback, also in the case of the climate system. If the temperature started to increase strongly, it would increase evaporation from the ocean. This would increase cloudiness, which has a strong negative effect on the surface temperature (about 0.1 K per cloudiness-%), because the albedo would increase.
Those researchers in favor of man-made global warming had to create the concept of positive water feedback because it doubles the warming effects of GH gases. In practice this effect is included in the climate sensitivity parameter (CSP) and in its official value of 0.5 K/(W/m^2). Multiplying the RF of a doubled CO2 concentration (3.71 W/m^2) by this value gives the official transient climate sensitivity TCS = 0.5 * 3.71 = 1.85 K. This is the basic reason why the IPCC has to maintain the official CSP value as it is; otherwise the whole construction of scary future temperature increases would collapse. The correct value of the CSP, assuming constant water content in the atmosphere, is 0.27 K/(W/m^2).
Those in favour of positive water feedback say that it is not actually imposed in the numerous climate models (GCMs) but is a feature coming out of a system utilizing physical relationships. If that is the case, then these models are wrongly constructed. A simple sanity check shows that there is no positive water feedback, because positive water feedback implies that the relative humidity (RH) would stay constant in changing climate conditions, and the RH measurements since 1948 show no constant RH at any level in the atmosphere.
https://static.wixstatic.com/media/c266e2_cb6631c37af44cea8421b8148715240a.jpg/v1/fill/w_600,h_450,al_c,q_80,usm_0.66_1.00_0.01/c266e2_cb6631c37af44cea8421b8148715240a.jpg
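Taking the comment’s own numbers at face value, the arithmetic checks out as stated (the two CSP values are the commenter’s figures, reproduced here without endorsement):

```python
# Transient response = CSP * RF(2xCO2), as the comment defines it.
RF_2XCO2 = 3.71                  # W/m^2, quoted forcing for doubled CO2
tcs_official = 0.5 * RF_2XCO2    # 1.855 K, rounded to 1.85 in the comment
tcs_no_wv    = 0.27 * RF_2XCO2   # ~1.0 K, the commenter's constant-water case
```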

9. Perhaps they should look at the molar heat capacity of SO2.

• Reality Observer says:

Oh dear, please don’t attract the “Venus” bunch over here. Really.

10. Roger Taguchi says:

1. The Earth is at perihelion (closest to the Sun) in early January, and yet the Earth’s average temperature is lowest then. The obvious reason is that most of the land masses are in the Northern Hemisphere, and the northern lands are covered in highly reflective snow and ice (due to the tilt of the Earth meaning that the Sun is lowest in the sky). The higher surface albedo then means the least Sunlight is absorbed, even though the distance is smallest.
2.(a) Perhaps the 200% positive feedback quoted in the literature (for example, at https://en.wikipedia.org/wiki/Climate_sensitivity ) came from the fact that water vapour is twice as important as CO2 in the atmospheric greenhouse effect (this fact comes from the infrared spectra obtained by satellites looking down on the Earth). So a 1 degree temperature rise on doubling CO2, combined with a 200% positive feedback, results in the literature’s quoted 3 degree rise.
(b) The fatal flaw in this reasoning is that doubling CO2 does not double water vapour. Increasing temperature from 15.0 to 15.6 degrees Celsius increases saturated water vapour from 12.788 to 13.290 mm Hg (see The Handbook of Chemistry and Physics), an increase by a factor of 1.039, or by 3.9%. Multiplying by a weighting factor of 2 (because water vapour is twice as important a greenhouse gas as CO2) would increase this only to 8%, not 200%. This factor of 8% increase would also apply at 50% relative humidity instead of 100%.
(c) Similar calculations starting from 40.0, 30.0, 20.0, 10.0, 0.0, and -10.0 degrees Celsius show increases by only 3.2%, 3.5%, 3.8%, 4.1%, 4.5%, and 4.9% (supercooled water) or 5.5% (ice), respectively. The % increase is roughly constant at 4% because saturated vapour pressure is roughly an exponential function of absolute temperature (see https://en.wikipedia.org/wiki/Clausius-Clapeyron_relation , where it says a temperature increase of 1 degree results in a 7% vapour pressure increase). [For an exponential relation, exp[K(T+0.6)] = exp(KT).exp(0.6K) = constant.exp(KT) for any relevant temperature T, since K is a constant, and therefore 0.6K and exp(0.6K) would both be constants.]
3. It is easy to show that a 1% increase in cloud cover (e.g. due to increased water vapour condensation) would result in a negative feedback (due to increased albedo relative to the cloudless solid and liquid Earth) that would be as great, or greater than the small 8% positive feedback due to increased absorption by increased water vapour. This negative cloud feedback becomes even greater than in my previous Comments if we use an albedo of 0.04 for cloudless ocean, and 0.12 for cloudless land (compared to the overall 0.18 for cloudless surface and 0.37 for clouds I derived earlier from the updated Kiehl & Trenberth energy budget diagram available at https://chriscolose.wordpress.com/2008/12/10/an-update-to-kiehl-and-trenberth-1997 – these overall albedo values of mine include snow-covered areas of the solid surface).
Therefore the net feedback of water vapour and cloud feedback is most likely negative, not positive, so the literature is all wrong about the magnitude of the enhanced greenhouse effect (climate sensitivity). And because there are saturation effects (diminishing returns) on increasing CO2, I estimate that as CO2 increases from the present 400 to 600 ppmv, the IPCC prediction of future warming is likely to be 8 times too large. Hence the theory of CAGW is all wrong, and ought to be dumped in the dustbin of science.
4.(a) Even the 1 degree not including feedbacks, considered “settled science”, is too high. At 288.2 K and emissivity 0.98, the Stefan-Boltzmann law gives 383.34 W/m^2 emitted. If a “radiative forcing” of 3.7 W/m^2 due to doubling CO2 is added (see https://en.wikipedia.org/wiki/Radiative_forcing ), a new total of 387.04 W/m^2 must be emitted for energy balance. Using the Stefan-Boltzmann law backwards, this corresponds at emissivity 0.98 to a new temperature of 288.893 K. The surface of the Earth must then be 288.893 – 288.2 = 0.693 degrees warmer for the temperature sensitivity, not 1 degree.
(b) The MODTRAN spectrum in the Wikipedia article on Radiative forcing states a radiative forcing of 3.39 W/m^2, about 8.4% smaller than the IPCC’s 3.7 W/m^2, and this lower value would mean that the literature value of 1 degree should be 0.63 degrees (and with net negative feedback, an even lower climate sensitivity).
5.(a) The MODTRAN spectrum, which I assume to be accurately computed, nevertheless shows a staggering blunder in the literature re all spectra obtained by satellites (or balloons at high altitude) looking downward on the Earth’s surface. These spectra are called “emission” spectra by Grant Petty in his very good book “A First Course in Atmospheric Radiation, Second edition” [note that the title assumes that the atmosphere radiates] and by Jack Barrett at his otherwise excellent website http://www.barrettbellamyclimate.com/ . Any competent chemist can tell at a glance, however, that these spectra are essentially absorption spectra, not emission spectra, with a little CO2 emission from the stratosphere (powered by incoming Solar UV and visible radiation absorbed by ozone) truncating the downward CO2 absorption ditch with “220 K” emission which is not Planck black body, but only at CO2 resonant frequencies.
(b) This lack of comprehension by the “experts” clearly shows up in their truncation of the horizontal Wavenumber axis at 1600 or 1700 cm^-1 (because “emission” is essentially zero at higher wavenumbers). But the Planck black body spectrum at 288.2 K extends well beyond, to about 2400 cm^-1; the net “emission” appears to be zero because the bond-bending water vapour band extends to 2200 cm^-1 and beyond, and water vapour lines are saturated, showing almost complete absorption of infrared (IR) beyond 1500 cm^-1. In order to see the entire water vapour absorption band from 1200 to 2200 cm^-1, see Fig. 3 at http://smsc.cnes.fr/documentation/IASI/Publications/ClerbauxACP2009.pdf .
(c) As I have previously written, the gases N2, O2 and Ar that make up 99.9% of dry air cannot and do not emit any infrared (IR), so there is no “atmospheric radiation”, black body or otherwise, from these gases. Yes, there is IR back-radiation from CO2 and water vapour molecules, but essentially these photons are equivalent to photons originally emitted from the solid and liquid surfaces of the Earth, and reflected back, after a delay (due to the lifetime of the excited states emitting the photons). And the IR emission from CO2 and ozone to outer space from the 220 K layer in the stratosphere amounts to about 19 W/m^2 (as calculated from the spectrum looking down on a 210 K Thunderstorm Anvil in Petty’s book), nowhere near the 240 W/m^2 required for energy balance. The rest of the required 240 W/m^2 is that portion of the 383.34 W/m^2 emitted from a 288.2 K (15 Celsius) Earth’s surface that is NOT absorbed by CO2 and water vapour molecules in the troposphere. [The absorbed power/m^2 ends up as increased translational and rotational energy of all the gas molecules in the troposphere; i.e. the troposphere warms up, the greenhouse effect, compared to an atmosphere without greenhouse gases.]
(d) So much for the “settled science” of the literature.
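The arithmetic in points 2(b) and 4(a) can be verified directly; the emissivity 0.98, the 3.7 W/m^2 forcing, and the vapour-pressure table values are the commenter’s inputs, and only the algebra is checked here:

```python
# Point 4(a): Stefan-Boltzmann forwards, then backwards with the
# forcing added, at the commenter's emissivity of 0.98.
SIGMA = 5.670374419e-8    # Stefan-Boltzmann constant, W/m^2/K^4
EPS = 0.98
T0 = 288.2                # K, surface temperature used in the comment

emitted = EPS * SIGMA * T0 ** 4                    # ~383.3 W/m^2
T1 = ((emitted + 3.7) / (EPS * SIGMA)) ** 0.25     # new balance temperature
dT = T1 - T0                                       # ~0.69 K, the quoted 0.693

# Point 2(b): saturated vapour pressure ratio for 15.0 -> 15.6 C,
# using the Handbook values cited in the comment.
vp_ratio = 13.290 / 12.788                         # ~1.039, i.e. +3.9%
```

The small residual differences from the comment’s 383.34 and 0.693 come only from the precision of the Stefan-Boltzmann constant used.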

• and this lower value would mean that the literature value of 1 degree should be 0.63 degrees (and with net negative feedback, an even lower climate sensitivity).
Actually you and the literature are both correct. I got to the exact same math as you many years ago when I first started reading the literature, and came to the same conclusion. I even pointed out the math on warmist sites and promptly got my carefully worded comments deleted along with rude remarks about my not understanding the literature. No one bothered to explain what I didn’t understand though.
The answer is that the literature defines sensitivity at the “effective black body temperature of earth”. That’s about 255K. Apply your identical math to that temperature and you’ll instantly see where ~1 degree comes from.
That is of course, a bit of misdirection. The effective black body temperature of earth occurs many kilometers above the surface. Where WE live, it is a completely different (and much smaller) number.
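The “identical math at 255 K” this reply points to works out as claimed, assuming the usual 240 W/m^2 effective emission and the quoted 3.7 W/m^2 forcing at emissivity 1:

```python
# Invert Stefan-Boltzmann at the effective emission level, then again
# with the doubling forcing added; the difference is the canonical
# "about 1 degree" no-feedback figure.
SIGMA = 5.670374419e-8            # W/m^2/K^4

T_eff = (240.0 / SIGMA) ** 0.25   # ~255.1 K, effective black-body temperature
T_new = ((240.0 + 3.7) / SIGMA) ** 0.25
dT = T_new - T_eff                # ~0.98 K
```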

• Hi Dave!
I agree that the 1 degree comes from using the “effective black body temperature of the Earth” which is about 255 K (instead of the surface temperature of 288.2 K), as I derived it myself on first reading the literature. But then I realized that the mechanism in the literature (and also mentioned by Richard Lindzen who I deeply respect) is all wrong in assuming that radiative exchange between gas molecules in the troposphere continues until the infrared (IR) photons finally escape from the TOA (Top Of the Atmosphere), assumed to be a black body surface at 255 K. But the spectrum of outgoing IR is definitely NOT a Planck black body spectrum, so the basic mechanism is wrong! For a sample spectrum, see Fig. 3 at http://climateaudit.org/?p=2572 . Such spectra are very closely modelled by the MODTRAN spectrum available at https://en.wikipedia.org/wiki/Radiative_forcing .
So where does the 255 K value come from? It is the temperature for a perfect Planck black body (emissivity 1) whose spectrum shows a total integrated area exactly equal to that under a real, non-Planck spectrum whose total integrated area corresponds to 240 W/m^2. The two values are exactly equal because you put them equal, and use the Stefan-Boltzmann law backwards assuming emissivity 1 to calculate the 255 K temperature! Then the literature furthers the blunder by calculating the altitude at which this “escape of IR photons to outer space” supposedly occurs; using the lapse rate of -6.8 K/km from a surface whose temperature is 288.2 K, this is at 4.9 km altitude. This is a totally non-physical calculation, because the spectrum is not that of a black body, and the reason for this is that the main molecules of the troposphere (N2, O2, Ar) do not emit any significant IR, as shown in spectra looking upward from the ground at nighttime in the “window” at 900 cm^-1. Note that this 4.9 km altitude is half that of the tropopause, where other arguments in the literature say that the IR photons supposedly escape to outer space (but this latter cannot be right, because at 10 km the temperature is only 220 K, and a perfect 220 K Planck black body emits only 133 W/m^2, nowhere near the 240 W/m^2 needed for energy balance).
Also, as I have argued, complete absorption & emission of central IR frequencies in the CO2 band centered at 667 cm^-1 occurs within metres of the surface, and so sudden escape to outer space is unlikely at 10 km, where the total pressure is only a factor of 4 smaller than at the surface. At 4.9 km altitude, the pressure difference is even smaller, so “sudden escape to outer space” at 4.9 km is even less likely. There are so many things wrong with the literature explanation (as argued by Houghton in the climateaudit article, and repeated by Richard Lindzen) that we ought to totally abandon it.
The MODTRAN spectrum in the Wikipedia article on Radiative_forcing shows a TOA flux of 260 W/m^2, but this is for a cloud-free surface at 288.2 K; the average of 240 W/m^2 is lower because the IR photons from clouded areas largely come from liquid droplets or ice crystals at temperatures below 288.2 K (but the spectra are not Planck black body for tropospheric clouds because CO2 and water vapor continue to absorb some of the emitted IR before they reach outer space). The MODTRAN spectrum also shows the downward CO2 absorption ditch truncated at the same 220 K level, for both 300 ppmv and 600 ppmv CO2. This is wrong, because the integration was done only to 20 km altitude (the stratosphere has a constant 220 K temperature, on average, from 10 to 20 km). When the MODTRAN program is run to 70 km altitude, the photons that do escape when CO2 is at 600 ppmv must come from higher altitudes, and due to the temperature inversion in the stratosphere to 40-50 km (caused by absorption of incoming Solar UV and visible radiation by ozone), this means this will be at a slightly higher temperature, and this means that the Earth’s surface need not be as high for energy balance. This reduces the temperature sensitivity from the 0.693 degrees on doubling CO2 (not including feedbacks). See “The hard bit” at http://www.barrettbellamyclimate.com/ . Note that this argument applies only for central CO2 frequencies, not the entire 0-2400 cm^-1 spectrum, and predicts the opposite effect of that in the literature repeated by Lindzen.

• So where does the 255 K value come from? It is the temperature for a perfect Planck black body (emissivity 1) whose spectrum shows a total integrated area exactly equal to that under a real, non-Planck spectrum whose total integrated area corresponds to 240 W/m^2. The two values are exactly equal because you put them equal, and use the Stefan-Boltzmann law backwards assuming emissivity 1 to calculate the 255 K temperature!

This is another example of the most pervasive falsehood, and the most basic failure to understand the physics of planetary temperature: the 255 K meme.
The temperature of a black (or gray, however light or dark) ball in our orbit is about 278.5 ± 2.3 K, depending on where it is along the ellipse. That is the temperature which corresponds to the total energy impinging on a point in our orbit, that is, the TSI outside our atmosphere.
The derivation of the 255K value and the calculation for any arbitrary spectra are at Calculation of Earth’s Radiative Balance .

• Rogertaguchi,
“But the spectrum of outgoing IR is definitely NOT a Planck black body spectrum, so the basic mechanism is wrong!”
The SB Law is agnostic to the spectrum and relates equivalent temperature to emissions. 1 W/m^2 of photons can do the same amount of work independent of their frequency, so not being a Planck spectrum is not very important to the result.
Also, if you look at the emission spectrum, the attenuation in absorption bands (even those that are completely opaque to surface emissions) is only about 3 dB, so the deviation from a Planck spectrum is not much anyway. Also, plotting the emission spectrum with linear wavenumbers as the X axis is misleading, as this significantly overemphasizes the influence of low frequency, lower energy photons.

• Roger Taguchi says:

1. (a) Thanks for your comment of May 12, 2016 at 8:34 pm. You made me rethink my assumptions and redo calculations which nevertheless come back to the same conclusion, that the climate sensitivity on doubling CO2 is only about 0.6 degrees (not including water vapor and cloud feedbacks), not the 1 degree easily calculated using the 255 K equivalent black body temperature (with which the Stefan-Boltzmann law yields 240 W/m^2, the TOA flux needed for energy balance).
(b) And doubling CO2 does not double water vapor. A 0.6 degree temperature increase results in only a 4% increase in water vapor. Because water vapor absorption is twice as great as CO2 absorption overall, we can accept a positive water vapor feedback of 8% (but not the 200% which boosts 1 degree to the 3 degrees quoted throughout the literature).
(c) Someone with a handle like “abn” (I have forgotten the exact spelling, and can’t find the Comment) stated that the oceans’ albedo is really small (0.04) and the albedo for the overall cloudless Earth’s surface is around 0.08 (meaning the cloudless land albedo is 0.16, probably including a small snow-covered fraction, if we assume the oceans cover 2/3 of the total surface). This means for 62% cloud cover, the cloud albedo is 0.44 [since 0.62(0.44) + 0.38(0.08) = 0.30, the overall Bond albedo]. The ratio of cloud/cloudless albedos is then 0.44/0.08 = 5.5, perhaps more reasonable than the ratio of 0.37/0.18 = 2 using albedos calculated from the data on the left side of the energy budget diagram at https://chriscolose.wordpress.com/2008/12/10/an-update-to-kiehl-and-trenberth-1997 . Using cloud and cloudless albedos of 0.44 and 0.08, a reasonable estimate for an increase in cloud cover on warming due to doubling CO2 yields a negative feedback which overwhelms in magnitude the small positive feedback due to increased water vapor absorption. Thus even the 0.6 degree value for climate sensitivity is too high. No wonder there has been a pause (or very, very small increase) in global warming for 15-18 years, even as CO2 has continued to increase! Yes, there is a small enhanced greenhouse effect on doubling CO2, but the IPCC value of 3 degrees for climate sensitivity predicts future warming too large by a factor of about 8. So the best advice is to do nothing about CO2 emissions, and not worry about it.
……………………………………………………………………………….
2. The following text includes my calculations, which might not be of interest to the casual reader.
(a) The key infrared (IR) spectra obtained by satellites looking down on the Earth were taken in the 1970s, and one is Fig. 3 at http://climateaudit.org/?p=2572 . Others may be found in the section “Emission to space” at Jack Barrett’s excellent website http://www.barrettbellamyclimate.com and in Grant W. Petty’s excellent book “A First Course in Atmospheric Radiation, Second Edition” available at amazon or directly from Petty at the University of Wisconsin-Madison.
(b) A MODTRAN computer-calculated spectrum that very closely models such observed spectra is available at https://en.wikipedia.org/wiki/Radiative_forcing . This shows a TOA (Top Of the Atmosphere) outgoing flux of 260.12 W/m^2 for 300 ppmv CO2 and 256.72 W/m^2 for 600 ppmv. The difference is 3.39 W/m^2, by definition the radiative forcing.
(c) Note that a value of 3.7 W/m^2 is usually quoted (for example, at https://en.wikipedia.org/wiki/Climate_sensitivity ), which is different by 9% from 3.39 W/m^2. Therefore we ought to consider the possible error to be about 9%, and consider factors that result in only 2 or 3% change as insignificant.
(d) For a 288.2 K (15 Celsius) surface with emissivity 0.98, the Stefan-Boltzmann law gives a flux of 383.34 W/m^2. Therefore for 300 ppmv CO2, the total absorption by all greenhouse gases is 383.34 – 260.12 = 123.22 W/m^2. This ends up warming the troposphere, as excited state greenhouse gas molecules transfer energy during radiationless collisions to the main gases (N2, O2, Ar) that outnumber CO2 by 3333:1.
N2, O2 and Ar are non-polar molecules that cannot and do not re-emit any significant IR photons, black body or otherwise.
(e) Note that the two MODTRAN curves (one blue, one green) match almost exactly across the spectrum, except on the sides of the downward CO2 absorption ditch. This means that absorption at water vapor and ozone frequencies are unchanged. So as a first approximation, the extra absorption at 600 ppmv CO2 may be made up by having the Earth’s surface emit 3.39 W/m^2 more, until it equals 383.34 + 3.39 = 386.73 W/m^2. This means for emissivity 0.98, the Stefan-Boltzmann law yields a temperature of 288.83 K, a difference of 288.83 – 288.2 = 0.63 K for climate sensitivity (not including feedbacks).
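The Stefan-Boltzmann arithmetic in Points 2(d)-(e) can be checked with a minimal Python sketch (the value of σ and the text’s emissivity of 0.98 are the only assumed inputs):

```python
# Check of Points 2(d)-(e): surface emission, total greenhouse absorption,
# and the first-approximation climate sensitivity on doubling CO2.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
EPS = 0.98         # surface emissivity assumed in the text

def sb_flux(T):
    """Flux (W/m^2) emitted at temperature T (K) for emissivity EPS."""
    return EPS * SIGMA * T**4

def sb_temp(F):
    """Invert the Stefan-Boltzmann law: temperature (K) that emits flux F."""
    return (F / (EPS * SIGMA)) ** 0.25

F0 = sb_flux(288.2)              # ~383.3 W/m^2 from the 288.2 K surface
absorbed = F0 - 260.12           # ~123.2 W/m^2 absorbed by greenhouse gases
dT = sb_temp(F0 + 3.39) - 288.2  # ~0.63 K sensitivity, no feedbacks
```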
(f) We are justified in using 288.2 K, the temperature of the Earth’s surface (and not 255 K) because it is the surface that acts as a Planck black body that emits the photons that get absorbed. The non-Planck spectra show that the atmosphere does not emit Planck black body photons, except for central CO2 frequencies at 220 K, and this emission is powered by heating due to absorption of incoming Solar UV and visible radiation by ozone in the stratosphere (and not by IR photons escaping upward from the troposphere). For absolute proof, see Petty’s Fig.8.3(c) which shows a 215 K CO2 emission peak poking above the background 210 K Planck black body spectrum emitted by an opaque Thunderstorm Anvil below. Net heat cannot flow from a 210 K black body to power 215 K emission. Note also the exact fit of this 215 K emission peak to the truncation of the downward CO2 absorption ditch in the spectrum observed looking down on a 297 K Tropical Western Pacific cloudless surface in the same Fig. This means that photons emitted from the 288.2 K surface of the Earth at central CO2 frequencies are essentially 100% absorbed & re-emitted in the opaque troposphere/stratosphere, and finally escape to outer space from 20-40 km in the stratosphere (and not at 4.9 km where the temperature is 255 K). The density of the air at 10 km (32,808 ft.) will be similar to that at the top of Mt. Everest (29,000 ft.), about 1/4 that at the surface. Therefore the density at 20, 30 and 40 km will be approx. 1/16, 1/64, and 1/256 that at the surface (and even smaller in all cases, because the temperature inversion caused by ozone’s heating of the stratosphere will expand the air, making it even thinner). The overall emission from CO2 in the stratosphere amounts to about 16 W/m^2, and the 220 K ozone emission peak at 1040 cm^-1 another 3 W/m^2, for a total of 19 W/m^2 emission, nowhere near the 240 W/m^2 required for energy balance.
(g) The rest of the required 240 W/m^2 TOA flux comes from that portion of the photons emitted from the Earth’s surface that is NOT absorbed by greenhouse gases along the way. The literature “experts” have failed to understand that (except for the Thunderstorm Anvil spectrum) these IR spectra are mostly absorption spectra, not emission spectra. To understand the fundamental difference, see
https://en.wikipedia.org/wiki/Emission_spectrum ,
https://en.wikipedia.org/wiki/Fraunhofer_lines and
https://en.wikipedia.org/wiki/Spectral_line .
(h) Now for something new: Because 260.12 W/m^2 escape the TOA when the 288.2 K Earth’s surface emits 383.34 W/m^2, the fraction that escapes (transmission factor) is 260.12/383.34 = 0.6785. Therefore if 3.39 W/m^2 escapes at the TOA, 3.39/0.6785 = 5.00 W/m^2 needs to be emitted at the surface for energy balance. Going through the steps in Point 2(e) above, the climate sensitivity is 0.94 K (not including feedbacks). This gets us back to the simple calculation using emission from a 255 K black body! But the literature is still wrong for the following reason: all these numbers were for a cloud-free surface.
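Point 2(h) is small enough to verify directly; a sketch under the same assumed constants:

```python
# Check of Point 2(h): transmission factor and cloud-free sensitivity.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
EPS = 0.98         # surface emissivity assumed in the text

def sb_temp(F):
    """Temperature (K) whose emission at emissivity EPS equals flux F (W/m^2)."""
    return (F / (EPS * SIGMA)) ** 0.25

trans = 260.12 / 383.34          # ~0.6785, fraction of surface flux escaping the TOA
surface_forcing = 3.39 / trans   # ~5.00 W/m^2 must be emitted at the surface
dT = sb_temp(383.34 + surface_forcing) - 288.2   # ~0.94 K, cloud-free sensitivity
```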
(i) Since the tops of clouds are at higher altitudes where temperatures are lower, their black body emission will be lower than that of a 288.2 K surface, so there will be lower overall absolute absorption, as well as corresponding lower enhanced greenhouse effect. The path length of absorbing molecules is also smaller from cloud tops to 10 km, meaning fewer absorbing molecules. Since central CO2 frequencies are saturated, the extra absorption as CO2 doubles from 300 to 600 ppmv occurs at the far wings of the absorption band. At lower temperatures at altitude, the band width decreases so the number of molecules capable of absorption decreases further.
(j) You may skip this technical point: For a crash course on CO2 IR spectra, see http://www.barrettbellamyclimate.com for Diagrams 1, 2 & 3 in the section “Spectral transitions”, Diagrams 8-11 in the section “Greenhouse gas spectra”, and Figs. 1-7 in the section “Emission levels”. The wing-shapes for the P- and R-branches result from the distribution of molecules with rotational quantum number J at an equilibrium temperature T, and this is proportional to
(2J+1).exp[-BJ(J+1)hc/kT] where J = 0, 1, 2, 3,….
rotational constant B = 0.390 cm^-1
Planck’s constant h = 6.626 x 10^-34 J.s
vacuum speed of light c = 2.998 x 10^10 cm/s [NOT in m/s]
Boltzmann’s constant k = 1.381 x 10^-23 J/K
absolute temperature T is in K
As T decreases, the argument of the decreasing exponential function becomes more negative, so the extreme wings approach zero, and the band width shrinks.
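The narrowing of the band wings with temperature can be illustrated numerically; a sketch using the constants listed above (the choice of J = 40 as a representative far-wing level is mine):

```python
import math

# Rotational population distribution from Point 2(j).
B = 0.390        # rotational constant, cm^-1
H = 6.626e-34    # Planck's constant, J.s
C = 2.998e10     # vacuum speed of light, cm/s
K = 1.381e-23    # Boltzmann's constant, J/K

def pop(J, T):
    """Relative population of rotational level J at temperature T (K)."""
    return (2 * J + 1) * math.exp(-B * J * (J + 1) * H * C / (K * T))

# The distribution peaks near J = 15-16 at 288.2 K, and the far-wing
# population (e.g. J = 40) drops as T decreases, shrinking the band width.
wing_288 = pop(40, 288.2)
wing_277 = pop(40, 277.0)
```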
(k) You might quibble with some of the following approximations, but IMO the net results of minor changes will be essentially the same, well within the possible error of 9%. Thick clouds made up of liquid droplets or ice crystals are essentially opaque to IR (only a few substances, e.g. alkali halides like NaCl, KBr,…,CsI, are transparent to IR and can be used as windows for sample cells and spectrometers), so let’s assume a black body emission surface at 277 K (4 Celsius) at 1.65 km. The relative drop in total IR emission from 288.2 K to 277 K will be (277/288.2)^4 = 0.853.
(l) Assuming an exponential drop in relative atmospheric density from 1 at the Earth’s surface to 0.25 at altitude h = 10 km, density can be modelled by the function exp(-0.1386 h). Integrating this from h=0 to h=10 gives the definite integral 5.411. The definite integral from h=1.65 to h=10 gives 3.936. Therefore the number of molecules in the path length decreases by the factor 3.936/5.411 = 0.727.
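The definite integrals in Point 2(l) follow from the analytic antiderivative of exp(-kh); a quick check:

```python
import math

def column(a, b, k=0.1386):
    """Integral of exp(-k*h) from altitude a to b (km):
    relative number of molecules in the path."""
    return (math.exp(-k * a) - math.exp(-k * b)) / k

full = column(0, 10)               # ~5.411, surface to 10 km
above_cloud = column(1.65, 10)     # ~3.936, cloud tops to 10 km
path_factor = above_cloud / full   # ~0.727
```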
(m) Note that the difference in absorption areas in the MODTRAN green and blue curves available at
https://en.wikipedia.org/wiki/Radiative_forcing are in the sloping flanks of the CO2 absorption ditch, centered at 618 cm^-1 and 721 cm^-1. These absorptions involve photons absorbed by molecules in the bond-bending first excited state, 667.4 cm^-1 above the ground vibrational state [see the vibrational energy level diagram, Diagram 3 in the section “Spectral transitions” at http://www.barrettbellamyclimate.com ]. High energy collisions between ground state CO2 molecules and N2, O2 and Ar can occasionally knock some CO2 molecules into this first excited state, from which they can absorb 618 and 721 cm^-1 photons.
The equilibrium ratio of the number of first excited state to ground state molecules is given by the Boltzmann function, exp(-hcf/kT), where f = 667.4 cm^-1 is the wavenumber for the transition. At T = 288.2 K, this ratio is 0.0355 (only 3.55%), while at T = 277 K, the ratio is 0.0310. Therefore the fraction of molecules in the first excited state drops by a factor of 0.0310/0.0355 = 0.873.
This ratio will also reflect the drop in number of molecules in the wings of the ground state, corresponding to very high values of J, the rotational quantum number. At these low values, roughly 3%, the lines will not be saturated, and Beer-Lambert absorption will approach linearity with concentration (number of molecules).
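The Boltzmann ratios in Point 2(m) can be reproduced directly (same constants as in Point 2(j)):

```python
import math

HC_OVER_K = 6.626e-34 * 2.998e10 / 1.381e-23   # hc/k, ~1.438 cm.K
F = 667.4   # bond-bending transition wavenumber, cm^-1

def excited_fraction(T):
    """Equilibrium ratio of first excited state to ground state CO2 at T (K)."""
    return math.exp(-HC_OVER_K * F / T)

r_surface = excited_fraction(288.2)   # ~0.036
r_cloud = excited_fraction(277.0)     # ~0.031
factor = r_cloud / r_surface          # ~0.873
```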
(n) Combining the three factors listed in Points 2(k)-(m), the absorption from the 277 K cloud tops at 1.65 km altitude relative to that from a 288.2 K surface will drop by a factor of (0.853)(0.727)(0.873) = 0.540.
This means that the forcing over cloud tops will be 0.540(5.00 W/m^2) = 2.70 W/m^2.
And this doesn’t include narrowing of vibration-rotation line widths with lower temperatures and lower pressures at higher altitudes (when lines are essentially 100% saturated, narrowing decreases the total area corresponding to total absorption – see the changes in spectral absorption in Figs. 4-7 in the section
“Emission levels” at http://www.barrettbellamyclimate.com ).
(o) For 62% cloud cover, the net overall forcing is then 0.62(2.70) + 0.38(5.00) = 3.57 W/m^2, very close to our original value of 3.39 W/m^2! This means that climate sensitivity will be much closer to 0.6 degrees than to 1 degree, even if we consider the extra emission necessary to compensate for partial absorption of photons emitted from the 288.2 K surface on their way to escape to outer space.
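Points 2(k)-(o) combine into a short calculation; a sketch using the three reduction factors and the 62%/38% area weighting from the text:

```python
# Combine the three cloud-top reduction factors from Points 2(k)-(m),
# then area-weight clouded vs cloud-free forcing as in Point 2(o).
emission_factor = (277 / 288.2) ** 4   # ~0.853, Point 2(k): lower cloud-top emission
path_factor = 3.936 / 5.411            # ~0.727, Point 2(l): fewer molecules in the path
boltzmann_factor = 0.0310 / 0.0355     # ~0.873, Point 2(m): fewer excited molecules
combined = emission_factor * path_factor * boltzmann_factor   # ~0.540

cloud_forcing = combined * 5.00        # ~2.70 W/m^2 over cloud tops
net_forcing = 0.62 * cloud_forcing + 0.38 * 5.00   # ~3.57 W/m^2 overall
```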
3. The MODTRAN spectra at https://en.wikipedia.org/wiki/Radiative_forcing were calculated for upward irradiance at 20 km, and showed no difference in the truncation levels at 220 K. However, in the section “The hard bit” at http://www.barrettbellamyclimate.com/ , Jack Barrett has rerun the MODTRAN program to 70 km altitude. The results show slightly increasing emission levels at the base of the upward Q-branch spikes: on doubling CO2 from 380 to 760 ppmv, the emission temperature rises from 223 K to 225 K. This means an emission increase by a ratio of (225/223)^4 = 1.036, and for an initial stratospheric CO2 emission of 16 W/m^2 (as estimated from the area of CO2 emission in Petty’s Fig. 8.3(c) Thunderstorm Anvil spectrum), an increase to 16.6 W/m^2. This increase of 0.6 W/m^2 emission at the TOA that need not be produced means that the forcing at the Earth’s surface can be decreased by 0.6/0.6785 = 0.88 W/m^2.
This gives 3.57 – 0.88 = 2.69 W/m^2 as the radiative forcing for the surface temperature. This positive forcing still means a positive enhanced greenhouse effect, because the increase in absorption in the sloping flanks of the CO2 absorption ditch is greater than the slight rise in emission at central frequencies near 667 cm^-1, as Jack Barrett has shown in the section “The hard bit”.
4.(a) Because the climate sensitivity so far corresponds to 0.6 degrees, I used this value to show that positive feedback due to an increase in water vapor of about 4% is about 0.08(2.69) = 0.21 W/m^2. This number includes the transmission factor of 0.6785 from the surface to the TOA. The weighting factor of 2 comes from the fact that total water vapor absorption in the MODTRAN spectrum is about 80.4 W/m^2 compared to 38.1 W/m^2 for CO2 (a factor of 2.1). Ozone net absorption is about 4.7 W/m^2, for a total of 123.2 W/m^2 absorption. The relative absorptions by these greenhouse gases were calculated from the areas of the downward absorption peaks in the MODTRAN spectrum, traced onto graph paper with mm squares. Because the literature has misinterpreted these spectra as “emission” spectra instead of “absorption” spectra, the horizontal axes are truncated at 1600 or 1700 cm^-1. Since the non-zero 288.2 K Planck black body emission function extends to 2400 cm^-1, I calculated a table of values of
f^3/[exp(hcf/kT) – 1] vs wavenumber f at 100 cm^-1 intervals from 0 to 2400 and normalized them to fit the trace of the MODTRAN spectrum at the window at 900 cm^-1.
For a forcing at the surface including water vapor feedback = 2.69 + 0.21 = 2.90 W/m^2, the increased surface emission must be 383.34 + 2.90 = 386.24 W/m^2. At emissivity 0.98, this means the surface temperature must be 288.74 K, for a climate sensitivity of 0.54 K.
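Point 4(a) can be checked end to end; a minimal sketch with the same assumed σ and emissivity:

```python
# Check of Point 4(a): 8% water-vapor feedback on the 2.69 W/m^2 forcing.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
EPS = 0.98         # surface emissivity assumed in the text

def sb_temp(F):
    """Temperature (K) whose emission at emissivity EPS equals flux F (W/m^2)."""
    return (F / (EPS * SIGMA)) ** 0.25

forcing = 2.69                         # W/m^2 surface forcing after stratospheric correction
wv_feedback = 0.08 * forcing           # ~0.21 W/m^2, the 8% water-vapor feedback
total_forcing = forcing + wv_feedback  # ~2.90 W/m^2
dT = sb_temp(383.34 + total_forcing) - 288.2   # ~0.54 K climate sensitivity
```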
(b) Co2isnotevil commented on May 16, 2016 at 9:50 am that “The SB law is agnostic…1 W/m^2 of photons can do the same amount of work…” I agree. The area of one mm x mm square in the CO2 absorption ditch corresponds to the same W/m^2 as one mm x mm square in a water vapor absorption or in an ozone or methane (CH4) absorption.
However, he stated “Also, plotting the emission spectrum with linear wavenumbers as the x axis is misleading as this significantly overemphasizes the influence of low frequency, lower energy photons.”
The great importance of a linear wavenumber x axis (units of cm^-1) is that, multiplied by the vertical scale unit of Radiance in mW/(m^2 sr cm^-1), the area of any shape has units proportional to W/m^2. This would not be the case if the horizontal axis showed wavelength in units of cm or m. IMO it would be the other way around: the area of long wavelengths times radiance would disproportionately blow up the seeming importance of low frequency, low energy photons.
There is one way to discount the importance of low energy low wavenumber photons, however. At 288.2 K, the average translational energy of a gas molecule is 3kT/2 = 5.97 x 10^-21 J, where k is Boltzmann’s constant. A 100 cm^-1 photon has energy E = hcf = (6.63 x 10^-34 J.s)(3.00 x 10^10 cm/s)(100 cm^-1) = 1.99 x 10^-21 J, 33% of the average translational kinetic energy. So these low energy photons can be easily absorbed and emitted between rotational energy levels of molecules with permanent electric dipole moments (e.g. water molecules). If absorption/emission constantly occurs, then the emission to outer space will follow a Planck black body spectrum with a temperature corresponding to the layer where the molecules become rare enough that the photons can escape to outer space (e.g. for water around 220 K near the tropopause).
Since at low wavenumbers the Planck curves converge for all temperatures, the energy for the photons emitted by water molecules below 100 cm^-1 can be thought of as coming from the 288.2 K Earth’s surface essentially unabsorbed.
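The photon-energy comparison in Point 4(b) is a one-liner to verify:

```python
# Energy of a 100 cm^-1 photon vs average translational kinetic energy at 288.2 K.
H = 6.63e-34    # Planck's constant, J.s
C = 3.00e10     # speed of light, cm/s
K = 1.381e-23   # Boltzmann's constant, J/K

photon = H * C * 100             # energy of a 100 cm^-1 photon, ~1.99e-21 J
translational = 1.5 * K * 288.2  # average kinetic energy 3kT/2, ~5.97e-21 J
ratio = photon / translational   # ~0.33
```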
5.(a) An estimate for cloud feedback requires assumptions that might lead to large errors, so bear with me with starting numbers that can be adjusted by a factor of 2 or 3 later.
(b) Suppose a 4% increase in water vapor increases cloud volume by 4%. Since volume varies as the cube of linear dimensions, and area as the square of linear dimensions, an increase in volume by a factor of 1.04 will mean an increase in surface area by a factor of the square of the cube root of 1.04, which equals 1.0265. I.e. cloud cover might increase by 2.65%.
(c) If cloud cover is 62% to begin with, this means that 0.62(2.65%) = 1.64% of the Earth’s total surface will be changed from cloudless to clouded.
(d) For a cloudless 288.2 K surface, the TOA flux is about 260 W/m^2, as shown in the MODTRAN spectrum.
(e) Therefore 228 W/m^2 must be the TOA flux from clouds [since 0.62(228) + 0.38(260) = 240 W/m^2 needed for energy balance].
(f) There are two potentially opposing consequences of replacing cloudless surface by reflective cloud on the average incoming 342.25 W/m^2:
(1) If the difference in Bond albedos is 0.44 – 0.08 = 0.36, then replacing 1.64% of the surface which is cloudless by reflective cloud will decrease the incoming Solar radiation at the surface by
1.64(342.25)(0.36)/100 = 2.02 W/m^2. This would result in cooling.
(2) Because the TOA emission from clouds is only 228 W/m^2 on average, there will be a decrease in the outgoing total TOA emissions by 1.64(260-228)/100 = 0.525 W/m^2, and this would result in warming (since the Earth’s surface emission must make up the deficit).
(g) Therefore the net result is TOA cooling by 2.02 – 0.525 = 1.50 W/m^2.
(h) At 277 K and emissivity 0.98, the Stefan-Boltzmann emission is 327.14 W/m^2. If 228 W/m^2 escapes at the TOA, the transmission factor is 228/327.14 = 0.697. This is very close to the previously derived 0.6785 for IR flux transmission from a 288.2 K cloudless surface.
(i) Therefore the change in surface forcing is 1.50/0.697 = 2.15 W/m^2.
(j) The net surface forcing is then only 2.90 – 2.15 = 0.75 W/m^2.
(k) This means the net climate sensitivity is only 0.14 K when water vapor and cloud feedback are included.
I.e. next to zero.
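The chain of estimates in Points 5(b)-(k) can be followed in a single sketch (all inputs are the text’s assumed values):

```python
# Cloud feedback chain from Points 5(b)-(k).
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
EPS = 0.98         # surface emissivity assumed in the text

def sb_temp(F):
    """Temperature (K) whose emission at emissivity EPS equals flux F (W/m^2)."""
    return (F / (EPS * SIGMA)) ** 0.25

area_growth = 1.04 ** (2 / 3)               # ~1.0265, Point 5(b): area from a 4% volume rise
new_cloud = 0.62 * (area_growth - 1) * 100  # ~1.64 % of total surface newly clouded
cooling = new_cloud * 342.25 * (0.44 - 0.08) / 100   # ~2.02 W/m^2 less sunlight absorbed
warming = new_cloud * (260 - 228) / 100              # ~0.525 W/m^2 less TOA emission
net_toa = cooling - warming                          # ~1.50 W/m^2 net TOA cooling
surface = net_toa / (228 / 327.14)                   # ~2.15 W/m^2 at the surface
net_forcing = 2.90 - surface                         # ~0.75 W/m^2 net surface forcing
dT = sb_temp(383.34 + net_forcing) - 288.2           # ~0.14 K net sensitivity
```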
(l) Obviously, one can question the assumptions in Point 5(b) above. Even if we cut the % cloud cover increase by a factor of 3, the climate sensitivity is only 0.41 K.
(m) The safest, most pessimistic estimate for climate sensitivity would be to use as an upper limit the 0.54 K calculated in Point 4(a), assuming that cloud feedback is zero, so that any mistakes in this calculation will not detract from its acceptability. For example, my previous albedo estimates of 0.18 for land and 0.37 for clouds (calculated from Kiehl & Trenberth’s energy balance diagram) would result in a climate sensitivity of 0.40 K for a 1.64% change in Earth’s surface from cloudless to clouded. If we cut the 1.64% by a factor of 3, the climate sensitivity is 0.57 K. Doubling the 9% error estimate to 18% would hedge our bets, so a value of 0.54 K +/- 0.10 K (which includes water vapor and cloud feedback estimates) would cover all but the result in Point 5(k), the error estimate understood to be within a factor of 2 with probability 95% .
6.(a) It is accepted that climate change is logarithmic with ppmv CO2 due to saturation effects (diminishing returns), with the climate sensitivity the warming on each doubling of CO2.
(b) So how many doublings is the rise from 400 to 600 ppmv? The mathematical question is “What is x, if
2^x = 600/400?” Taking logs of both sides gives x.log2 = log(600/400). Therefore x = [log(600/400)]/log2 = 0.585 doublings.
(c) If climate sensitivity (including water vapor and cloud feedbacks) is 0.54 +/- 0.10 K, then 0.585 (0.54) = 0.32 +/- 0.06 K will be the predicted future warming due to CO2 increasing from 400 to 600 ppmv. This includes feedbacks (although we downplayed the extent of negative cloud feedback).
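The doubling arithmetic in Points 6(b)-(c) in Python:

```python
import math

# How many doublings is the rise from 400 to 600 ppmv, and what warming follows?
doublings = math.log(600 / 400) / math.log(2)   # ~0.585 doublings
warming = doublings * 0.54                      # ~0.32 K predicted future warming
ipcc_ratio = 2.3 / warming                      # ~7, factor by which the IPCC prediction is larger
```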
(d) As CO2 increased from 280 to 400 ppmv during the period 1850 to 2015, the climate warmed only about 0.8 +/- 0.1 degree. Therefore the increase as CO2 went from 300 to 400 ppmv was probably closer to 0.7 degrees. If we use the IPCC’s “best value” of 3 degrees for warming as CO2 doubles from 300 to 600 ppmv, then 3-0.7 = 2.3 degrees would be the predicted warming from 400 (now) to 600 ppmv. This is a factor of
2.3/0.32 = 7 times too large [if we use 0.8 degrees from 300 to 400 ppmv, the factor is 2.2/0.32 = 7 times too large].
(e) Since our value of 0.54 +/- 0.10 K for climate sensitivity is pessimistically on the high side, the IPCC best value could be even greater in error on the high side. It’s obvious.

• Re: my post on May 18, 2016 at 6:50 pm. In Point 5(m), “0.57 K” should have read “0.51 K”. My calculated value for climate sensitivity on doubling CO2, including a positive 8% water vapor feedback, was 0.54 K and must be considered an upper limit (when net cloud feedback is assumed to be zero). As I showed, net cloud feedback is negative, and can easily reduce climate sensitivity to 0.40 K or so, and maybe even as low as 0.14 K. Sticking with the upper limit will eliminate haggling over the magnitude of cloud feedbacks, and it is still low enough that the IPCC “best value” of 3 degrees predicts future warming too large by a factor of at least 7 (as CO2 increases from 400 to 600 ppmv). The upper limit also provides wiggle room to reduce the importance of any laundry list of non-quantified factors such as melting of glaciers, defrosting of tundra, increased high clouds & jet contrails in the stratosphere (which are warming), drowning polar bears, etc. that the innumerate True Believers of the CAGW cult might use to bolster their pathetic, failing case.

• David L. Hagen says:

Roger Re: “gases N2, O2 and Ar that make up 99.9% of dry air cannot and do not emit any infrared (IR)”
Please correct – though small, there is an O2 effect. See: Poulsen, Christopher J., Clay Tabor, and Joseph D. White. “Long-term climate forcing by atmospheric oxygen concentrations.” Science 348.6240 (2015): 1238-1241.

• Roger Taguchi says:

OK, I should have added “significant” before “infrared (IR)”. My argument is based on dipole radiation as being the most important (“significant”) mechanism of radiation. I am aware that there are quadrupole and magnetic dipole mechanisms for radiation, even when dipole radiation is zero. For example, the green and red emission spectra of the aurorae are due to “dipole forbidden” transitions of atomic oxygen, but these occur very high in the outer atmosphere. At any rate, for the enhanced greenhouse effect in the troposphere as CO2 increases from 300 to 600 ppmv, the % oxygen is essentially constant, so any non-zero emission/absorption will be essentially constant.

• Thank you Roger for publishing that. I came to the same conclusion as well, although with a slightly lower value.

• Roger Taguchi says:

Hi! Thanks for your comment; I welcome constructive challenges, as we all ought to strive to improve our understanding, even if it means having to redo some calculations (admitting previous work was incomplete). So please elaborate on “slightly lower amount”, with quantitative estimates. IMO the “radiative forcing” of 3.39 W/m^2 on doubling CO2 from 300 ppmv to 600 ppmv quoted at https://en.wikipedia.org/wiki/Radiative_forcing [see the MODTRAN spectrum] differs from the 3.7 W/m^2 quoted without proof in the literature (e.g. at https://en.wikipedia.org/wiki/Climate_sensitivity ) by 8 or 9% (depending on which you take to be the more accurate), so we ought not to aggressively condemn people for approximations which amount to only 4 or 5% difference overall.
In particular, I would like to thank davidmhoffer for making me rethink my calculations. A more complete calculation gives the same value of climate sensitivity (not including water vapor and cloud feedbacks), of about 0.66 degrees, not 1 degree, but I am tired and will outline the steps in another email reply to David, who ought to be pleased that he was right all along (so the abusive, unethical, cheating, lying reactions he got from the True Believers of the Catastrophic Anthropogenic Climate Change cult were totally unwarranted).

I’m missing something between your previous statement and the above. The numbers you gave are similar to the ones quoted in 2001 by the IPCC. I may be reading out of context.
There is definitely something wrong with AGW. I like to be certain in my arguments with true believers.
No more than 3% of the warming that occurred was caused by AGW; some believe that number to be too high. Translational and kinetic energy is not retained by CO2 molecules. The 2nd law of thermodynamics, and the efficiency with which that heat is retained when it is at a wavelength that can be recovered, are further problems. There are a host of other problems with CAGW as well.

• I’ve been looking at my notes; it seems I had a much lower number for sensitivity from working the forcing of 240 W/m^2 backwards, to something like 0.20 W/m^2~. I have to look at this. At the time it seemed to be in line with some other values I saw published, so I think that I thought the IPCC took some number of degrees they wanted and worked backwards to get the sensitivity. Also, it seems that the difference in TSI from 1368 W/m^2 to 1360 W/m^2 drops the warming from CO2 from 1.2 K to 0.8 K, which is 1/3, and that is in line with a drop from 1 degree to 0.66. No matter how I do the math on this, IPCC looks like smoke and mirrors.

11. Bill Illis says:

“It is possible to surmise major characteristics of the cause of the lag. These are; 1) high relative heat capacity, 2) very large total heat capacity”
What you are describing here is more of a heat-capacity effect, as you noted.
These operate on a millisecond, hourly, daily, monthly and seasonal basis.
Just think of the ground/soil itself. It is absorbing up to 99.7% (nobody really knows) of the solar energy during the sunshine hours and then slowly releasing that at night.
If you turn everything around to a joules/second basis (solar photons/second basis) the numbers are truly weird. In the middle of the day, up to 960 joules/second/m2 can be coming in from the Sun, but the air temperature is only changing by 0.008 joules/second/m2. Where is the 959.992 joules/second/m2 going?
At night, 0.001 joules/second/m2 is coming in, but the air temperature cools off as if 0.005 joules/second/m2 is being emitted back to the upper layers of the atmosphere.
Nobody has really measured how much is being absorbed directly by the land surface molecules on a per second basis versus how much is being emitted directly back to space in the daytime.
But this is exactly what is physically happening. Photons are flying around everywhere all the time and being absorbed and emitted by surface/air molecules every picosecond. Everything is happening at quantum mechanics speeds and the speed of light. In the middle of the day, 2,897,651,675,218,830,000,000 photons of solar energy are coming in from the Sun per second per m2, yet only 24,147,298,521,136,200 photons are showing up in the air temperature.
Climate science ignores all these issues (which is actually the real fundamental physics).

12. kevin kilty says:

A stable system possesses sufficient gain and phase margin, i.e. a loop gain sufficiently smaller than 1 where the phase reaches 180 degrees, and a phase sufficiently short of 180 degrees where the loop gain crosses 1.

13. This begs the question: is there intelligent life in Climatology?
Good Enough for Government thinking accepted an unprecedented positive feedback as the dominant climate forcing, even though such a feedback would have already exterminated most of the life on earth.
The remarkable continuity of life speaks for itself.
One would have thought… oh, never mind; not during mass hysteria and social decay, when feelings trump logic.
The very success of the CAGW meme is proof that far too many so-called educated people did no thinking.

• John Harmsworth says:

Half the people you meet are of lower than average intelligence. In climate science and government this figure rises to 97%.

14. Roger Taguchi says:

1. The Earth is at perihelion (closest to the Sun) in early January, and yet the average temperature is then the lowest. The obvious reason is that most of the land masses are in the Northern Hemisphere, and there is more highly reflective snow and ice covering the Earth’s surface, due to the tilt of the Earth which means the Sun is at a lower altitude in the North. The higher albedo means less of the incoming Solar radiation is absorbed, so temperatures are lowest, despite the close approach. The lag in heating is due to the need to melt some of the snow in Spring, which is a phase change (heat is absorbed without change in temperature) which also changes the albedo.
2.(a) The climate sensitivity in the literature is 1 degree (not counting feedbacks), with an assumed extra 2 degrees (200%) due to positive feedback. [See https://en.wikipedia.org/wiki/Climate_sensitivity ]
(b) Perhaps this 200% positive feedback is derived from the fact that water vapor is twice as important a greenhouse gas as CO2 [this comes from the infrared (IR) spectra obtained by satellites looking down on the Earth].
(c) The fatal flaw in this assumption is that doubling CO2 does not double water vapor. Increasing temperature from 15.0 to 15.6 degrees Celsius increases saturated water vapor from 12.788 to 13.290 mm Hg, an increase by a factor of 1.039, or by 3.9% (see the Handbook of Chemistry and Physics). Multiplying by a weighting factor of 2 (because water vapor is twice as important a greenhouse gas as CO2) would increase this only to 8%, not 200%. This factor of 8% increase would also apply at 50% relative humidity instead of 100%.
(d) Similar calculations starting from 40.0, 30.0, 20.0, 10.0, 0.0, and -10 degrees Celsius show increases by only 3.2%, 3.5%, 3.8%, 4.1%, 4.5% and 4.9% (supercooled water) or 5.5% (ice), respectively. The % increase is roughly constant because saturated vapor pressure is roughly an exponential function of temperature (see https://en.wikipedia.org/wiki/Clausius-Clapeyron_relation , which says that a 1 degree increase in temperature results in a 7% increase in vapor pressure). For an exponential relation, exp[K(T+0.6)] = exp(KT)·exp(0.6K) = constant·exp(KT) for any T, since K is a constant (and therefore 0.6K and exp(0.6K) would both be constants).
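As a rough cross-check of these percentages, here is a sketch using the 7%-per-kelvin Clausius-Clapeyron rule of thumb from the linked Wikipedia article (an assumed approximation, rather than the CRC Handbook table values used above):

```python
import math

# Clausius-Clapeyron rule of thumb: saturation vapor pressure rises
# roughly 7% per kelvin of warming (assumed constant rate).
CC_RATE = 0.07

def vapor_increase(dT):
    """Fractional increase in saturation vapor pressure for warming dT (K)."""
    return math.exp(CC_RATE * dT) - 1.0

# 0.6 K of warming gives roughly a 4% increase, close to the 3.9%
# obtained above from the tabulated vapor pressures.
print(f"{100 * vapor_increase(0.6):.1f}%")
```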
3. From the updated Kiehl & Trenberth energy budget diagram available at https://chriscolose.wordpress.com/2008/12/10/an-update-to-kiehl-and-trenberth-1997 , and the TOA infrared (IR) flux of 260 W/m^2 from a 288.2 K cloudless Earth’s surface (as shown in the MODTRAN spectrum available at https://en.wikipedia.org/wiki/Radiative_forcing , compared to 240 W/m^2 overall for energy balance), the Bond albedo for the cloudless surface is 0.18 (this includes snow-covered surfaces) and for clouds is 0.37. For a cloud cover of 62%, it is easy to show that even a 1% change of the Earth’s total surface from cloudless to clouded will result in a negative feedback which will swamp the 8% positive feedback due to increased water vapor (which is a gas, not to be confused with clouds which are made of macroscopic liquid droplets or ice crystals). Thus there is likely to be a net negative feedback, not a 200% positive feedback.
4. Even the 1 degree warming on doubling CO2 (not including feedbacks) is too large. At 288.2 K and emissivity 0.98, the Stefan-Boltzmann law gives an emission of 383.34 W/m^2. Adding a “radiative forcing” of 3.7 W/m^2 on doubling CO2 gives a new total of 387.04 W/m^2 which must be emitted for energy balance. Using the Stefan-Boltzmann law backwards, this corresponds at emissivity 0.98 to a new temperature of 288.893 K. This gives 288.893 – 288.2 = 0.693 degrees for the temperature sensitivity (not including feedbacks), not 1 degree. The MODTRAN spectrum in the Radiative_forcing Wikipedia article shows a TOA (Top Of the Atmosphere) radiative flux difference of 3.39 W/m^2 on doubling CO2 from 300 to 600 ppmv, so if this value is used instead of 3.7 W/m^2, the temperature sensitivity (not including feedbacks) would be only 0.63 degrees, and even lower when a net negative feedback is included. Because of saturation effects (diminishing returns on increasing CO2), the IPCC prediction of future warming as CO2 increases from the present 400 ppmv to 600 ppmv is a factor of 8 too large. This explains the “hiatus” in global temperatures for the last 15-18 years, even as CO2 has continued to increase, for the CO2 enhanced greenhouse effect (including water vapor and cloud feedbacks) is only a very minor component of the climate. Therefore all the fears of Catastrophic Anthropogenic Climate Change have been unfounded.
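The Stefan-Boltzmann inversion in point 4 is straightforward to reproduce (a sketch using the same constant, emissivity, and forcing values quoted in the comment):

```python
SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W/m^2/K^4
EPS = 0.98         # surface emissivity used in the comment
T0 = 288.2         # starting surface temperature, K

def temp_for_emission(W):
    """Invert the Stefan-Boltzmann law: T = (W / (eps*sigma))^(1/4)."""
    return (W / (EPS * SIGMA)) ** 0.25

W0 = EPS * SIGMA * T0 ** 4                   # ~383.3 W/m^2 emitted at 288.2 K
dT_37 = temp_for_emission(W0 + 3.7) - T0     # with the 3.7 W/m^2 forcing
dT_339 = temp_for_emission(W0 + 3.39) - T0   # with MODTRAN's 3.39 W/m^2

print(round(dT_37, 3), round(dT_339, 3))     # ~0.693 K and ~0.635 K
```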
5.(a) As for the “settled science” of the physics of the atmosphere, even the spectra have been staggeringly misinterpreted by the “experts”, like Grant Petty in his very good book “A First Course in Atmospheric Radiation, Second Edition” and Jack Barrett at this excellent website http://www.barrettbellamyclimate.com/ .
They have merely repeated the literature interpretation of the spectra as “emission” spectra (consider that Petty’s book title includes “Atmospheric Radiation”, i.e. “emission”). Any competent chemist can tell at a glance that most of the spectra are “absorption” spectra, with the truncation of the downward CO2 absorption ditch by a “220 K” CO2 emission from the stratosphere powered by incoming Solar UV and visible radiation absorbed by ozone (this heating by ozone explains the temperature inversion in the stratosphere). For the difference between emission and absorption spectra, see
https://en.wikipedia.org/wiki/Emission_spectrum and https://en.wikipedia.org/wiki/Fraunhofer_lines and
https://en.wikipedia.org/wiki/Spectral_line .
(b) To understand the blunder in the literature, you don’t even have to know the reason for the details of the spectra. Consider that Petty’s Figs. or the spectrum available as Fig. 3 at http://climateaudit.org/?p=2572 (very well modelled by the MODTRAN computed spectrum) truncate the horizontal Wavenumber axis at around 1600 cm^-1 (which is sensible if we interpret the “emission” as going to zero at higher wavenumbers). But the Planck black body emission of a 288.2 K Earth’s surface is not zero at 1600 cm^-1; it extends to 2200 cm^-1. The drop to zero detected by the satellite sensor is due to nearly 100% absorption of those frequencies by the R-branch of the water vapor bond-bending spectrum. To see the full water vapor spectrum to 2100 cm^-1, see Fig. 3 at
http://smsc.cnes.fr/documentation/IASI/Publications/ClerbauxACP2009.pdf .
You will note the very wide absorption band for water vapor bond-bending, from 1200 to 2100 cm^-1, with the “band origin” notch of decreased absorption at 1595 cm^-1 (which sometimes appears as a non-zero “emission” in some spectra).
(c) As I have written previously, the main molecules N2, O2 and Ar that make up 99.9% of dry air cannot and do not emit any significant infrared (IR), black body or otherwise, since the molecules are non-polar (do not possess an electric dipole moment). This is easily seen in the “window” at 900 cm^-1, for a spectrometer looking upward at nighttime sees zero emission from the entire thickness of the atmosphere. By Kirchhoff’s Law, a good emitter is also a good absorber, and vice versa, so N2, O2 and Ar cannot absorb infrared (IR) photons either. They do gain energy during radiationless collisions with vibrationally excited CO2 and water vapor molecules formed after they have absorbed some of the IR emitted from the solid and liquid Earth. Since N2, O2 and Ar outnumber CO2 by a factor of 2500:1, most of the energy absorbed as heat ends up as enthalpy (heat content) of these non-emitting molecules (the heat capacity at constant pressure for N2, O2 and linear CO2 is 7k/2 per molecule, where k is Boltzmann’s constant; Cp for Ar is only 5k/2 per molecule.
See https://en.wikipedia.org/wiki/Heat_capacity ).

15. taxed says:

A big negative impact on the NH could come if the Polar jet splits into two during the summer months. This is what is going to happen over the northern Atlantic during the first half of next week. It will be interesting to see what effect it has both in the Arctic and across the North Atlantic. Am expecting it will bring below average temps across much of this area.

• emsnews says:

It is going to cause SNOW on my little mountain in upstate NY. This is one of the coldest springs I have experienced, rapidly becoming as cold as when we had a massive volcanic eruption roughly 30 years ago in the Philippines.

16. 601nan says:

Oh Dear!
“The initial 1912-1928 data” is the “Mean Sea Level Datum 1929”, which took nearly 50 years to be renamed the National Geodetic Vertical Datum 1929 by act of Congress, and which unfortunately is still widely used (in Alaska, among other places) as the National (vertical) Datum!
The NGVD 29 is NOT mean sea level nor any kind of physical level! It is just … wrong, as demonstrated by the National Geodetic Survey in the network re-adjustment that began in 1982 and finished as GPS survey instruments became available.
The situation is more grave than I ever suspected.

17. Dr. Strangelove says:

In climate science, a negative feedback means a one degree Kelvin change in surface temperature requires a TOA radiative forcing of greater than 3.3 W/m^2; a positive feedback, less than 3.3 W/m^2. No feedback corresponds to 1.1 K per doubling of CO2. Independently, Lindzen and Spencer found a strong negative feedback of 6 W/m^2 in their analyses of satellite data. It means TCR must be less than 1.1 K per doubling of CO2.
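These numbers are related by the simple identity dT = F / lambda, with F about 3.7 W/m^2 per doubling of CO2 (a sketch of that arithmetic; I am reading the 3.3 and 6 figures as feedback parameters in W/m^2 per kelvin, as the context implies):

```python
def sensitivity(lambda_feedback, forcing=3.7):
    """Equilibrium warming per CO2 doubling: dT = F / lambda,
    where lambda is the net feedback parameter in W/m^2 per K."""
    return forcing / lambda_feedback

print(round(sensitivity(3.3), 2))   # the ~1.1 K no-feedback benchmark
print(round(sensitivity(6.0), 2))   # ~0.6 K if the net feedback is strongly negative
```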

18. I agree that highly positive feedbacks are unlikely. This explanation, however, doesn’t hold water.
Peak temperature happens after peak insolation for the simple reason that while insolation may have started to fall from its peak, it is still above the level needed to maintain the equilibrium temperature of that place on earth for a time, so the earth there continues to warm even though insolation is falling, and will continue to warm until insolation drops below that equilibrium level. So peak temps lag peak insolation, and this has nothing to do with any feedback at all. Min temps lag min insolation for precisely the same reason: once min insolation starts to rise, it is still below the level needed to maintain the current temperature, so the earth continues to cool.
I’m in agreement with Germinio and Nick Stokes upthread, and (I think) Joe Born. The author’s observations are of a simple lag, not a feedback.
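This “simple lag” picture can be quantified with a toy single-box model (a sketch, not anything from the head post): a first-order system dT/dt = (F − T)/tau driven by a sinusoid of period P lags the forcing by atan(2·pi·tau/P)·P/(2·pi), so the roughly one-month lag in Figure 1 corresponds to a time constant of about a month, with no feedback assumption anywhere:

```python
import math

P = 12.0   # forcing period in months (the annual cycle)

def lag_months(tau):
    """Phase lag, in months, of a first-order system dT/dt = (F - T)/tau
    driven by a sinusoid of period P months."""
    omega = 2 * math.pi / P
    return math.atan(omega * tau) / omega

# Which time constant reproduces the ~1-month lag seen in Figure 1?
for tau in (0.5, 1.1, 3.0):
    print(f"tau = {tau:>4} months -> lag = {lag_months(tau):.2f} months")
```

A time constant of about 1.1 months gives a one-month lag; the lag saturates at a quarter period (3 months) as tau grows.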

• Kaiser Derden says:

but you have to agree that the sum of all feedbacks must be negative or temperatures would never drop … that is not a lag but a feedback …

• John Harmsworth says:

There is a constant input and a stable temperature. Therefore, the SUM of the feedbacks is negative. End of story and a simple description of the system.

• There is a constant input and a stable temperature. Therefore, the SUM of the feedbacks is negative. End of story and a simple description of the system.

What “stable temperature”? When has it EVER been stable?

19. Ian H says:

The link between lag and feedback is more complicated than you imply, as is clear if you think about what positive feedback might look like.

20. The lag is not due to feedback, but is the consequence of a time constant. The lag would be present whether the net feedback is positive or negative. In any case, Bode’s control theory was misapplied to climate analysis, so the very idea that control theory feedback analysis even applies to the climate system is incorrect.
It takes time for the planet to respond to a change. Once a peak in the stimulus occurs, the planet still needs to warm to catch up with the stimulus. It’s not until the input power drops below the level required to sustain the current temperature that the temperature starts to drop. The reverse happens at the minimum where the temperature doesn’t start to rise until the stimulus rises above the level needed to sustain the current temperature.
When you model the planet as the differential equation Pi = Po + dE/dt, where Pi is the input power, Po is the power leaving the planet and E is the energy stored by the planet, time constants trivially arise. Since the surface temperature T is a function of E, and Po is a function of T (Po = e*o*T^4), this equation takes the same form as the LTI equation describing an RC circuit, which is the very genesis of the concept of a time constant. The equation can be re-written as Pi = E/tau + dE/dt, where tau is an arbitrary amount of time such that the equation becomes true, in which case tau is the time constant.
https://en.wikipedia.org/wiki/Time_constant
The only tangible difference is that tau is a decreasing function of temperature, making the dependence of Po on E more complicated than in the case of an RC circuit and giving the appearance of net negative feedback.
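The differential equation in this comment is easy to integrate numerically (a minimal sketch: blackbody emission assumed, with a heat capacity of 1e7 J/m^2/K — a few meters of water equivalent — picked for illustration, since the true effective value is not given here):

```python
import math

SIGMA, EPS = 5.67e-8, 1.0   # Stefan-Boltzmann constant; assumed emissivity
C = 1.0e7                   # assumed areal heat capacity, J/m^2/K
YEAR = 365 * 86400.0
DT = 3600.0                 # integration step, s

T = 255.0                   # K, near equilibrium for a 240 W/m^2 mean input
T_max, peak_temp_t, t = 0.0, 0.0, 0.0
while t < 3 * YEAR:                                        # run past spin-up
    Pi = 240.0 + 20.0 * math.sin(2 * math.pi * t / YEAR)   # seasonal input power
    Po = EPS * SIGMA * T ** 4                              # emitted power
    T += (Pi - Po) * DT / C      # Pi = Po + dE/dt with E = C*T
    if t > 2 * YEAR and T > T_max:                         # track peak in year 3
        T_max, peak_temp_t = T, t - 2 * YEAR
    t += DT

# the forcing peaks a quarter period into each year
lag_days = (peak_temp_t - YEAR / 4) / 86400.0
print(f"temperature peak lags forcing peak by ~{lag_days:.0f} days")
```

With no feedback term beyond Planck emission, the temperature peak still trails the forcing peak by roughly a month: a pure time-constant effect.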

21. Toneb says:

“Since this seasonal pattern actually repeats every year, the huge lag is actually a characteristic of climate behavior. In systems terms, this lag, from cause to effect, constitutes negative feedback.”
It would help greatly if you understood meteorology.
The primary reason for lagged temp response behind the sun is the uptake of heat into the oceans.
The UK for instance starts spring/early summer with the lowest SST’s around its shores and over the N Atlantic too. This causes air advected over the country to be cooler relative to late summer, so the same TSI input necessarily gives lower max temps. It is NOT a feedback in any way (globally); it is merely the differential temporal transport through a complex climate system, of which the ocean comprises BY FAR the most dominant factor.
A negative feed-back by definition has an attenuating effect on the process CAUSED BY the process.
It is not occurring here.
BTW: this last w/e the country enjoyed a very warm spell for the time of year.
Why?
Because the advected airmass came via an (almost) entirely land trajectory.
No cooling ocean.
By calling it a “negative feed-back” you are essentially saying that because it’s getting warmer, that is causing it to warm more slowly.
NO.
Additionally, when it comes to temp response in continental interiors: the upper air mass is cooler during spring/early summer than later, and although the land will readily warm via increased TSI input, that heat is convected into a cooler air-mass (producing eg April showers). This has a knock-on effect (a local feed-back, if you like) of additionally cooling the surface temps (but that is NOT the dominant effect).
You have it correct here…..
“In effect, the ocean contains such enormous amounts of water (high relative heat capacity) that they represent a vast thermal reservoir for heat absorption and subsequent release as the seasonal solar changes require.”
However as stated above it is NOT a feed-back.
IOW: you have accurately described the climate systems response to seasonal change of absorbed TSI …. and ascribed to it the term “negative feedback”.
It is not.
It is a lag.
Nothing more.
Nothing less.

22. Martin A says:

“In systems terms, this lag, from cause to effect, constitutes negative feedback”
Huh? Sounds like nonsense to me. Lag, from cause to effect, in physical systems, indicates *inertia*, not feedback.

• Toneb says:

Exactly.

• Speaking of lags, changes in CO2 lag changes in global temperature.
Therefore, CO2 is an effect, not a cause (well, maybe a minuscule cause, but since it is too small to measure, the climate Null Hypothesis has not been falsified).

23. I was prepared to save this article based on the title. But soon realized it is not worth much at all. I agree that the heat capacity of the ocean far exceeds that of the atmosphere and the land. But surely we have learned from borehole studies that land has a non-negligible heat capacity.
In some places in the northern US and southern Canada, the land freezes in winter to a depth of one meter (40 inches). This would cause a substantial lag in seasonal ambient temperature relative to insolation. But it’s not what we regard as a feedback.
The following would be classed as negative feedback. Insolation causes evaporation of water from land and sea surfaces. The water vapor rises and forms clouds, some of which reflect back into space part of incoming short wave radiation. In this process, some clouds contribute to Earth’s albedo denoted by A in the formula
Ri = Rt / 4 * (1-A)
where Rt is total radiation falling upon the Earth seen as a disc (about 1360 Wm-2 ) and Rt/4 is about 340 Wm-2 averaged over the Earth’s entire surface.
Earth’s albedo A is known to vary. (Pallé et al. 2009)
If total incoming solar radiation is 341.4 Wm-2 and albedo A is 0.30, the net incoming radiation is 239 Wm-2 (values for illustration). If albedo changes by +/-1% (to 0.303 or 0.297), then (1-A) changes by -/+0.003, to 0.697 or 0.703, giving 238.0 Wm-2 or 240.0 Wm-2.
This means that a one per cent change in albedo would result in a change in incoming radiative energy of +/-1 Wm-2. Producing the increase in energy imbalance estimated from ocean heat content (OHC), as reported by Hansen et al. (2011) and updated to 0.5 Wm-2 by Loeb et al. (2012), would be equivalent to a net decline in albedo of 0.5%. (Not a statement that a decline in albedo is or was the cause.)
Thus a decrease in albedo of 0.5% could produce a positive increase in net radiative imbalance (warming) equal to what NASA scientists have observed.
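The albedo arithmetic can be sketched directly from the Ri = Rt/4 * (1-A) formula above (illustrative values, as the comment says):

```python
RT4 = 341.4          # mean TOA solar input, W/m^2 (Rt/4 in the text)

def net_incoming(albedo):
    """Net absorbed solar flux for a given planetary albedo."""
    return RT4 * (1.0 - albedo)

base = net_incoming(0.30)                      # ~239 W/m^2
# a 1% relative increase in albedo (0.30 -> 0.303) cuts the budget by ~1 W/m^2
delta_1pct = base - net_incoming(0.303)
# a 0.5% relative decrease (0.30 -> 0.2985) adds ~0.5 W/m^2, matching the
# OHC-derived imbalance figure quoted above
delta_half_pct = net_incoming(0.2985) - base

print(round(delta_1pct, 2), round(delta_half_pct, 2))
```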
If, as Kevin Trenberth has asserted, the warming should have been greater according to AGW theory, then we might ask ‘What could account for the missing heat?’
Consider the possibility that CO2 does tend to warm the Earth and thus increases cloud cover and thus increases albedo. This would tend to cool the Earth because more incoming short wave radiation would be reflected back into space.
That is an illustration of negative feedback and that might account for the missing warming. The climate system has an imperfect thermostat that cancels part of the tendency for CO2 to warm the planet by increasing cloud formation and thus albedo.
Svensmark and others point out that cloud formation does not work unless there are sufficient particles in the atmosphere to act as condensation nuclei. Opponents claim that his explanation is wrong because galactic cosmic rays (GCRs) cannot form such nuclei. The experiment at CERN was set up to test the physics.
In my opinion, the CERN experiment successfully demonstrated that GCRs can promote the formation of cloud condensation nuclei. Note my wording. GCRs are not themselves the condensation nuclei, but rather act upon substances in the atmosphere to cause clumping to a critical size to form condensation nuclei and cloud formation in supersaturated air.
To what extent GCRs actually do promote cloud formation, increased albedo, and negative feedback is still controversial.(KIRKBY, 2007.)
There it is, on the table. The concepts are easy to understand. The main issues with AGW are the cloud and water vapor feedbacks.
Furthermore, there is a huge gap in the AGW consensus in regard to clouds. By now, everybody knows the IPCC’s spaghetti graphic that shows that the leading models (GCMs) have not converged for over 20 years. Those who have compared the models tell us that the main cause of the differences in the model outputs is the treatment of clouds.
Pallé, E., P. R. Goode, and P. Montañés-Rodríguez. “Interannual variations in Earth’s reflectance 1999-2007.” Journal of Geophysical Research: Atmospheres 114.D10 (2009). http://www.leif.org/EOS/Earthshine_Palle_2008.pdf
Hansen, James, et al. “Earth’s energy imbalance and implications.”Atmospheric Chemistry and Physics 11.24 (2011): 13421-13449.
Loeb, Norman G., et al. “Observed changes in top-of-the-atmosphere radiation and upper-ocean heating consistent within uncertainty.” Nature Geoscience 5.2 (2012): 110-113.
Kirkby, Jasper. “Cosmic rays and climate.” Surveys in Geophysics 28.5-6 (2007): 333-375.
http://arxiv.org/pdf/0804.1938.pdf
http://www.hephy.at/user/mjeitler/LECTURES/Astroparticles/CLOUD_experiment_cern_colloquium_kirkby.pdf

24. Why do we keep doing this? Misleading headlines that get us reading are a staple of tabloid journalism and should have no place here. The actual article has no real connection with feedbacks and little useful analysis, and as a result the comments have been all over the place…
There are compelling, though circumstantial, reasons to believe that feedbacks of a ‘restoring moment’ type do occur through the WV system and I’ve been waiting a long time now for a proper discussion of this.

• “Why do we keep doing this?”
Of course, I don’t know what goes on in Mr. Watts’ mind, but this is another example that tends to support my theory: he just doesn’t have the wherewithal to make sound technical judgments regarding potential content.
This post reached a congenial conclusion, and he ran it. Other pieces, which express viewpoints not favored by more-favored contributors, are spiked despite their being better reasoned.
Again, I can’t see into others’ minds, but this could explain what I see as a regrettable trend away from scientific inquiry and toward mere cheerleading.

• Couldn’t agree more, Joe.
Despite my admiration for Mr Watts I feel that too much off-boresight content lessens the impact of the more relevant stuff and gives a ready excuse for detractors to sneer at the site, or ignore it altogether.
It’s also a puzzle to me why S. Mosher rises to every bit of bait that he sniffs out here. He’s capable of a very controlled and sound critique and could bring many discussions back on track if he chose to do so, and didn’t show his contempt so obviously.
The better debate is now more often at Judith Curry’s site, because Judith selects each topic sensibly and also sets out a context. Still gets derailed annoyingly easily, though!

• Joe and moth,
I’ve been reading WUWT since it began. Anthony links to articles that he thinks will interest readers. Based on the site traffic, he has an eye for interesting articles.
Anthony also posts articles that readers submit. Anyone is free to submit an article, like this one. However, Anthony routinely notes that he may not agree with them. Some articles that he specifically disagrees with are posted anyway, for readers’ consideration.
I like that, because I often learn as much or more from the comments as I do from the original article. Readers have a talent for boiling down an article to the basics, rejecting all the extraneous fluff that can creep in.
So don’t blame Anthony for what someone else wrote, unless Anthony specifically says he agrees with it. It’s an unwarranted assumption, and besides, you’re commenting about it, aren’t you?
Rip the article to shreds, if you can. That’s what skeptics do. Anything left standing after the smoke clears is worth considering as being factual. But unloading on the guy who provides the forum isn’t fair, IMHO.

• dbstealey:
1) I don’t blame him for posting something that turns out to be flawed, but I think we’re justified in expecting some minimum level of intelligibility.
2) I think he actually does screen for correctness; he has spiked each and every proposed post I’ve submitted since I persisted in criticizing last year’s Monckton et al. paper. I don’t really think he understands the technical issue; I think he just figures what I said is wrong because Monckton et al. say it is.

25. Peter Sable says:

If you follow the argument to its logical conclusion, El Nino should be impossible…
Peter

26. Roy Spencer says:

The features you describe will also be exhibited by climate models with strongly positive climate feedbacks. The time lags are due to the heat capacity of the system, not feedbacks. While I agree climate feedbacks are likely negative, they cannot be estimated this way. It’s not clear that they can be measured at all… we’ve spent years trying.

27. The seasonal lag from heat capacity is not a negative feedback, but only a lag. The stability on the ~1-month time scale arises because many of the positive feedbacks, such as the albedo feedback from sea ice, have longer response times owing to the large heat capacity of the upper ocean. On shorter time scales the Planck feedback, a negative one, is dominant.

• Your comment highlights one of the problems that bedevils discussions of whether feedback is positive or negative; in a sense the conclusion is arbitrary, depending on what one includes in the open-loop system that the feedback closes.
The Planck feedback is, as you say, negative, and it closes an otherwise open-loop system whose equilibrium gain is infinite. But in many discussions the resultant, finite-equilibrium-gain system is treated as itself being the open-loop system upon which other feedbacks act, and the question becomes whether they are on balance positive or negative. It is at the latter level of abstraction that Dr. Spencer says, “I agree climate feedbacks are likely negative.”
Failure to specify context (a failure of which I am not always innocent myself) tends to muddy the waters in blogosphere feedback discussions.

• Joe Born says, “Your comment highlights one of the problems that bedevils discussions of whether feedback is positive or negative; in a sense the conclusion is arbitrary, depending on what one includes in the open-loop system that the feedback closes.”
The point is well taken. You might be interested in the feedback discussion in this paper, which I think makes a similar point.
J. Ray Bates, Estimating Climate Sensitivity Using Two-zone Energy Balance Models, 2015.
http://onlinelibrary.wiley.com/doi/10.1002/2015EA000154/pdf

• Frederick Colbourne:
Thank you for the link. I haven’t read it all yet, but it may suggest a feedback taxonomy I hadn’t considered, so I look forward to reading the whole thing.

28. The first indication that the AGW claim had reached me , and that I was skeptical , is my 1984 CoSy MidWinter party invitation :
http://cosy.com/Science/CG84-tempsEnhanced.jpg
The temperature data is from Buffalo , ~ 43 degrees North , the closest I could find to Rochester NY . I just approximated the insolation with a sine . ( I’d like it if someone would point me to the correct function wrt latitude , particularly in APL . It takes the sort of detailed thought I ration because tasks like getting the reference counted recursive lists-of-lists in 4th.CoSy crystalline sucks lots . )
My estimate is that the temperature lag is around 2 to 3 weeks . Note there appears to be a ( 3rd harmonic ) bulge in the fall versus the spring .
The lag is not feedback ; it is inertia . And what impressed me at the time was how damned little there is . How little heat the atmosphere holds . My immediate thought was that if the Sun went out on Friday , we’d be a lifeless snowball before Monday . No way some change could take decades to show itself .
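The insolation function he asks about can be sketched from the standard daily-mean TOA formula (sunrise hour angle plus a simple cosine fit for declination; orbital eccentricity ignored). Python rather than APL, and the declination fit is an approximation:

```python
import math

S0 = 1361.0   # solar constant, W/m^2

def daily_insolation(lat_deg, day_of_year):
    """Daily-mean top-of-atmosphere insolation, W/m^2, from the standard
    sunrise-hour-angle formula. Declination via a simple cosine fit."""
    phi = math.radians(lat_deg)
    decl = math.radians(-23.44 * math.cos(2 * math.pi * (day_of_year + 10) / 365.0))
    x = -math.tan(phi) * math.tan(decl)
    if x >= 1.0:       # polar night: sun never rises
        return 0.0
    h0 = math.pi if x <= -1.0 else math.acos(x)   # sunrise hour angle
    return (S0 / math.pi) * (h0 * math.sin(phi) * math.sin(decl)
                             + math.cos(phi) * math.cos(decl) * math.sin(h0))

# Buffalo is ~43 N: note the curve is close to, but not exactly, a sine
print(round(daily_insolation(43.0, 172)))   # near the June solstice
print(round(daily_insolation(43.0, 355)))   # near the December solstice
```

At 43 N this gives roughly 500 W/m^2 at the June solstice and 130 W/m^2 in late December, and it handles polar night and midnight sun, which a pure sine cannot.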

The image ( and the text ) are all constructed as a bitmap in Phil Van Cleave’s APL68000 on this Sage computer :
http://www.cosy.com/language/cosyhard/sagehds.gif
which had 500k of memory in a single address space . This was a few years before the InferiorButMarketable 8088 PC with its 16k segments ate the world and Apple soldered 256k into its 68000 based Mac defeating the potential of the chip’s 24 bit address space and making it useless for business .

• Henry Galt says:

Nice, thanks for that Bob. A lot can happen in half a lifetime 😉

• Well , 0.43 anyway 🙂
Now I’m seeking other minds to form the language community fleshing out 4th.CoSy in all their different goals , including , I hope , an understandable because it is APL succinct model of planetary physics .

29. David L. Hagen says:

Leland Park
Thanks for exploring the temperature lag behind the solar.
May I recommend the work by David Stockwell modeling a 90 degree lag for the 11 year solar cycle etc.
David R.B. Stockwell, “Key evidence for the accumulative model of high solar influence on global temperature” 4 August 23, 2011 http://vixra.org/pdf/1108.0032v1.pdf

Firstly, variations in global temperature at all time scales are more correlated with the accumulated solar anomaly than with direct solar radiation. Secondly, accumulated solar anomaly and sunspot count fits the global temperature from 1900, including the rapid increase in temperature since 1950, and the flat temperature since the turn of the century. The third, crucial piece of evidence is a 90 deg shift in the phase of the response of temperature to the 11 year solar cycle. These results, together with previous physical justifications, show that the accumulation of solar anomaly is a viable explanation for climate change without recourse to changes in heat-trapping greenhouse gasses.

See especially Fig. 3 http://vixra.org/pdf/1108.0032v1.pdf
David R.B. Stockwell shows that the direct correlation of solar irradiance with temperature R^2 is only 0.028 while the cumulative solar irradiance has a correlation R^2 of 0.72 and solar + volcanic has R^2 of 0.78. See Fig. 4 in
David R.B. Stockwell “On the Dynamics of Global Temperature” August 2, 2011 http://vixra.org/pdf/1108.0004v1.pdf
David R.B. Stockwell, “Accumulation of Solar Irradiance Anomaly as a Mechanism for Global Temperature Dynamics” 9 Aug. 2011
http://vixra.org/abs/1108.0020

. . . empirical and physically-based auto-regressive AR(1) model, where temperature response is the integral of the magnitude of solar forcing over its duration. . . The model explains 76% of the variation in GT from the 1950s by solar heating at a rate of 0.06 ± 0.03 K/(W/m^2)/yr relative to the solar constant of 1366 W/m^2.

Stockwell further shows a 2.75 year Phase Shift in Spencer’s Data

• As I mentioned in connection with Willis Eschenbach’s post comparing daily with yearly time lags, the 2.75-year lag that Stockwell observed is interesting because it suggests more of a lumped-parameter character than does the shorter-term behavior described by Mr. Eschenbach and the head post here.

30. J. Gorman says:

Maybe I’m being dense here, but it seems to me the time lag is indicative of a reactive element in the system. The heat capacity of the system may be that “reactive element”, acting as a feedback element in the control loop. Therefore the time lag is indicative of some kind of feedback, which is what the article attempts to identify. The results of that feedback are probably debatable.

31. Craig Loehle says:

Sorry but the article does not show anything about negative feedbacks. If I take a large object and apply heat to it in a sinusoidal pattern (including no solar at night), it will take a long time to heat up. The maximum temperature may occur AFTER the maximum heat application. But this lag has nothing to do with feedback. To see feedback, one can refer to clouds. If, as Willis postulates, high temperatures cause thunderstorms which dissipate more heat to space, THIS is a negative feedback. The fourth-power of temperature radiative heat loss law is a simple negative feedback (hot objects lose heat faster). But the seasonal lag has nothing to do with it.

32. …this lag has nothing to do with feedback.
I think ‘lag’ has nothing to do with anything but a time delay. Some lags may be due to feedback. Some are not.
For example, ∆CO2 lags ∆temperature. That observation is seen on many different time scales. But it isn’t a forcing. Is it a feedback? I’m not certain.
The problem is the language. ‘Lag’ is a temporal term; nothing more, unless it’s defined further.

33. b fagan says:

The author writes:
“For a major change in equilibrium conditions, however, a correspondingly large change in the system fundamentals would be required as well as considerable time for the change to take effect. Suffice it to say that minor changes in atmospheric trace gases would not be likely to force an equilibrium change in the system.”
The first sentence – absolutely correct. The effect of our CO2 emissions, for example, is a large change in system fundamentals, and the impact is creating a long-term change before the climate comes to equilibrium – at a higher temperature.
The second sentence is misleading. There isn’t a minor change in trace gas that we’re experiencing right now. There is a “correspondingly large change” in concentrations of persistent greenhouse gases.
Persistent greenhouse gases ARE “system fundamentals” in the Earth climate. They are why the surface of the planet is 33C warmer than it should be at this distance from the sun. This has been known for hundreds of years (see Fourier, 1824 and 1827). Without them, the oceans would be frozen over, with liquid below the ice surface, like on Europa.
The significance of a component in a system is not always based upon its ratio to the bulk. That persistent GHGs make up a small portion of the bulk atmosphere is just as irrelevant as saying that less than a microgram of botulin toxin isn’t a lethal dose for a human because it is proportionally a nearly undetectable trace. That tiny trace would certainly create an equilibrium change.

34. Frank says:

Leland: The “existence of negative climate feedback” is fully understood by climate science, but the quantitative aspects are somewhat uncertain. Your post suggests an engineering background, so I’ll use calculus.
“Negative climate feedback” is provided by the fact that all materials emit more thermal radiation as they get warmer. In the case of a simple blackbody, W = -oT^4, and the increase in radiation emitted with temperature is dW/dT = -4oT^3 = -3.8 W/m2/K when the temperature is 255 K. (dW/dT is negative because it is the increase in heat LOST with increasing temperature. The average photon escaping to space is emitted from about 5 km above the surface, where the temperature is about 255 K.) This is also called “Planck feedback”, but I prefer the non-standard term “Planck response” to avoid confusion with other feedbacks. The reciprocal (dT/dW = 0.26 K/(W/m2)) is called the “no-feedbacks” (or, more accurately, “no-additional-feedbacks”) climate sensitivity. This value is usually multiplied by 3.7 (W/m2)/doubling of CO2 to give 1 K/doubling.
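These round figures can be checked in a few lines; a minimal sketch assuming only the Stefan-Boltzmann constant and the 255 K effective emission temperature quoted in the comment:

```python
# Planck response of a 255 K blackbody and the resulting "no-feedbacks" sensitivity.
sigma = 5.670374419e-8                    # Stefan-Boltzmann constant, W/m^2/K^4

T = 255.0                                 # effective emission temperature, K
planck_response = 4 * sigma * T**3        # dW/dT, comes out near 3.8 W/m^2/K
no_feedback_sens = 1 / planck_response    # dT/dW, near 0.26 K/(W/m^2)
per_doubling = 3.7 * no_feedback_sens     # times 3.7 W/m^2 per doubling: ~1 K

print(planck_response, no_feedback_sens, per_doubling)
```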
The Earth doesn’t emit like a blackbody. As temperature rises, more water vapor probably will be found in the atmosphere, allowing less thermal emission to reach space. That water vapor will also affect the albedo from clouds and the lapse rate. The surface of a warmer planet will probably be covered by less ice and snow, changing the planetary albedo. These changes are also called feedbacks, and they have units of W/m2/K, just like the Planck response.
If you want to think of the Earth as being a graybody with emissivity e, then W = -eoT^4 and dW/dT = -4eoT^3 – oT^4*(de/dT). dW/dT for the Earth is called the “climate feedback parameter”, and every climate scientist believes it is negative. The de/dT term represents the change in radiation to space associated with a change in the Earth’s emissivity – the sum of the WV, LR, cloud, and albedo feedbacks in the previous paragraph. If de/dT is negative (making -oT^4*(de/dT) positive) – as most climate scientists believe – they say that positive feedback exists. However, that does not imply that dW/dT – the climate feedback parameter for the whole planet – is positive. It simply means that -oT^4*(de/dT) is positive and negates SOME of the negative Planck response (-4eoT^3). As long as the sum of these two terms is negative – as all climate scientists believe – the overall negative climate feedback you expect to see exists.
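The two terms in this decomposition can be put side by side numerically. This is only an illustrative sketch: the emissivity e and especially de/dT below are hypothetical round numbers chosen to show a positive feedback term that partially offsets, but does not overwhelm, the Planck term.

```python
# dW/dT = -4*e*sigma*T^3 - sigma*T^4*(de/dT): Planck term plus emissivity-change term.
sigma = 5.670374419e-8   # Stefan-Boltzmann constant, W/m^2/K^4
T = 288.0                # surface temperature, K
e = 0.61                 # illustrative gray-body emissivity
de_dT = -0.005           # HYPOTHETICAL per-K emissivity decrease (e.g. more water vapor)

planck_term = -4 * e * sigma * T**3    # negative Planck response, ~-3.3 W/m^2/K
feedback_term = -sigma * T**4 * de_dT  # positive when de/dT < 0, ~+2.0 W/m^2/K
dW_dT = planck_term + feedback_term    # net climate feedback parameter, still negative

print(planck_term, feedback_term, dW_dT)
```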
If -oT^4*(de/dT) is big enough, then the climate feedback parameter can be zero or positive. That is called a runaway greenhouse effect. Climate scientists don’t believe that dW/dT is positive, but some believe that the positive feedbacks come close to canceling the negative Planck response. That is how they come up with a high climate sensitivity (the reciprocal of the climate feedback parameter dW/dT). The whole debate about climate change comes down to the sign and magnitude of de/dT, and there is uncertainty about this subject.
Therefore, you and climate scientists are in complete agreement: dW/dT (the climate feedback parameter) is negative. However, they believe that it isn’t as negative as expected for Planck feedback alone. They call the phenomena that make the climate feedback parameter LESS NEGATIVE, BUT NOT GREATER THAN ZERO, “positive feedbacks”.

• Why can’t people get past this 255 K meme? The temperature calculated by simply summing the energy impinging on a point in our orbit (virtually all from the sun) is about 278.5 ± 2.3 K from perihelion to aphelion. That is the black body temperature in our orbit, and it is exactly the same for any gray (defined as flat-spectrum) body in our orbit.
I see no hope for any forward progress in “climate science” until this most fundamental fact is understood.

• Frank says:

Bob asked: “Why can’t people get past this 255K meme?”
I don’t particularly care for the 255 K meme either. Therefore I said: “If you WANT to think of the Earth as being a graybody with emissivity e” … emissivity changes with temperature (de/dT), and this change is the result of feedbacks that modify the Planck response/feedback. What does “emissivity” mean when you are talking about a gas? Engineering types like this approach, and it is easier to discuss than the alternative used by chemists and physicists, who are used to thinking in terms of absorption and emission by groups of molecules. I judged that the author of this post didn’t think in terms of the behavior of molecules.
At a more fundamental level, CHANGES in the spectral intensity of radiation (dI) at a particularly wavelength (lambda) traveling through a layer of the atmosphere (dz thick, but thin enough that temperature, pressure and radiation can be treated as constants) are calculated using the Schwarzschild eqn:
dI = n*o*B(lambda,T)*dz – n*o*I*dz
where n is the density of the GHG, o is the absorption cross-section for the GHG at wavelength lambda, B(lambda,T) is the Planck function, and I is the spectral intensity of the radiation entering the layer. The first term is the emission and the second term is the absorption by the molecules in the layer. This equation is numerically integrated over a path, usually from the surface to space for OLR or from space to the surface for DLR, and then the spectral intensity is integrated over all relevant wavelengths to produce the power flux. Using the “plane-parallel approximation”, only the component of the flux perpendicular to the surface is used, since the fluxes in the other two directions cancel. Online programs like MODTRAN use the HITRAN database of absorption cross-sections and automate the numerical integration – at the cost of making the whole process seem like as much of a “black box” as the graybody approach. If you are in the laboratory using a powerful light source, the emission term can be neglected and integration affords Beer’s law. Additional terms are added when scattering is important. Above the stratosphere, molecular collisions are less frequent, a Boltzmann distribution of energy states (or local thermodynamic equilibrium, LTE) no longer exists, and the emission term needs to be modified. Fortunately, all the processes important to climate occur where LTE exists.
When absorption and emission have come into equilibrium and dI is zero, the radiation has “blackbody” spectral intensity, B(lambda,T). Planck’s Law and the S-B equation tell us what happens when radiation is in equilibrium with its surroundings (absorption = emission). Such an equilibrium does not exist everywhere in the atmosphere at all wavelengths. The Schwarzschild eqn tells us what happens when such equilibrium doesn’t exist. Hopefully, this helps get past the 255 K meme.
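The relaxation toward blackbody intensity can be seen by stepping the Schwarzschild equation through an isothermal layer. A sketch only: the GHG density and absorption cross-section below are round illustrative numbers, not measured CO2 values, chosen to give a large optical depth.

```python
import math

# Physical constants (SI)
h, c, k = 6.62607015e-34, 2.99792458e8, 1.380649e-23

def planck(lam, T):
    """Planck spectral radiance B(lambda, T)."""
    return (2 * h * c**2 / lam**5) / math.expm1(h * c / (lam * k * T))

lam, T = 15e-6, 255.0   # 15-micron band, isothermal 255 K layer
n = 1e22                # illustrative GHG number density, m^-3
xsec = 1e-24            # illustrative absorption cross-section, m^2

B = planck(lam, T)
I = 1.5 * B             # radiation entering the layer, brighter than blackbody
dz = 1.0                # 1 m steps
for _ in range(10000):  # 10 km path; total optical depth n*xsec*z = 100
    I += n * xsec * (B - I) * dz   # dI = n*o*B(lambda,T)*dz - n*o*I*dz

# After many optical depths, absorption balances emission and I has relaxed to B.
print(I / B)
```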
http://climatemodels.uchicago.edu/modtran/
In the end, the planet emits more radiation to space as it warms and therefore has a negative climate feedback parameter. Climate scientists believe that the Planck response/feedback is made less negative by the effect of “positive feedbacks”. Confusing. If you want to get past the flawed science presented at WUWT, try scienceofdoom.com. Despite the name, the host is generally agnostic about CAGW, but fanatic about getting the science correct.

• Sorry, I work at the most superficial level. So we agree that, before one starts descending into the atmosphere, the “temperature” in our orbit which must be matched by any purported graph of temperature as a function of altitude is ~278.5 ± 2.3 K.
All my energies right now are going into fleshing out 4th.CoSy to make it more accessible and useful to other minds, and perhaps seduce some of them into joining in implementing such necessary vocabulary as the Schwarzschild absorption equation and the vocabulary of physics generally.
Looking more closely at what is being claimed for this Schwarzschild equation, I see it is claimed to be the crux equation for the trapping of heat by a spectral phenomenon. That certainly raises my interest in implementing it.
However, tip the standard image on its side and consider a tube with a surface of some particular absorptive, reflective character at one end, and a series of spectral filters between it and a radiant source of some particular power spectrum at the other. The claim is that, integrating across the stack of filters, the energy density, i.e. temperature, next to the surface can be made higher than the energy density between the first filter and the radiant source.
Surely, if you can construct such a surface and such a stack of spectral filters, we have the makings of a perpetual heat engine. And surely you can specify what the optimal absorption/reflection spectrum of the surface and the absorption/transmission spectra of the filters have to be, and how much of a temperature gradient will be created.
Actually, since all the filters can be collapsed into one through the appropriate multiplication, can you give us an example surface (absorption; reflection) spectrum, dz filter, and radiant source to demonstrate the effect?
David Appell claims it’s intrinsically not possible to replicate the effect which is claimed to trap an energy density at Venus’s surface 25 times (400 K greater than) that just 250 km away from it in its orbit.
Surely you could construct such a tube to conduct such an experiment in the tunnel alongside the SLAC 3.2 km accelerator for far less money than the cost of the accelerator, creating over a 5 C temperature difference if you can come close to matching Venus’s gradient.
Can you show us a numerical example of a surface, a filter dz, and an input power spectrum creating a greater energy density, i.e. temperature, on the side of the filter away from the source?

• “Climate scientists believe that Planck response/feedback is made less negative by the effect of “positive feedbacks”.
This is something that climate science has so wrong it’s absurd. Feedback, positive or negative, has little effect on the sensitivity associated with the Planck response (dT/dW). The dependence of the sensitivity on everything (the slope of the relationship between temperature and input, where input == emissions at LTE) goes as 1/(e*T^3), where e is the ‘effective’ emissivity. The sensitivity decreases far more rapidly owing to the temperature term than decreasing the effective emissivity of the planet can increase it. Both increasing clouds and increasing GHGs will linearly decrease the effective emissivity, and nothing else has any effect.
Another problem is that climate science is absolutely clueless about feedback system analysis and has horribly misapplied Bode’s feedback analysis to the climate system. Bode’s amplifier measures the input and feedback to determine how much output to deliver from an implied, infinite source. The climate system ‘amplifier’ consumes the input and feedback to produce the output, and this COE constraint has never been accounted for; it is why many believe that positive feedback has such a large effect. In fact, the very idea of runaway feedback is precluded by the absence of the infinite source assumed by Bode. Note that the assumption of a climate system ‘amplifier’ with an implied power supply was baked into AR1 and has never been fixed.
Earth is not an ideal black body like the Moon (after albedo), but a non-ideal black body, which is called a gray body. If you consider that the temperature of this gray body is the surface temperature and its output is the emissions by the planet, the measured response of the planet to LTE changes in solar input (changes to planet output in LTE) is so close to what you would expect from a gray body, especially in the long-term averages, that to claim anything else is to deny the data and the physics. I’ve shown this plot before, and as far as I’m concerned, the only way to explain the data (3 decades of satellite data from ISCCP/GISS) is that the planet acts like a nearly ideal gray body whose temperature is the average surface temperature (about 287 K) and whose effective emissivity is about 0.62 (the green line in the picture is the theoretical response).
Each little dot is 1 month of average emissions vs. average surface temperature for constant-latitude slices of the planet. The larger dots are the 3-decade averages per slice. The solid lines represent the SB relationship plotted at the same scale as the data. Note that this plot also shows that the IPCC sensitivity of 0.8C per W/m^2 seems to have arisen as the result of a linearization error (i.e., ignoring the T^4 dependence of emissions on temperature).
Mainstream climate science (and many skeptics) deny the applicability of gray bodies because they think that the system cannot be this simple. In fact, much of the complexity was added to IPCC ARs as assumptions without adequate peer review, in order to provide the necessary wiggle room to accommodate the desired result of a high sensitivity.
If Earth had an atmosphere containing only N2 and O2, it would behave like a nearly ideal black body. Adding trace amounts of GHGs plus clouds will not cause the system to deviate so far away from the requirements of physical laws that the absurdly high sensitivity claimed can be supported. As an exercise, start from a black body planet with an atmosphere containing only N2/O2 and incrementally add GHGs and clouds to see what happens.
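The gray-body claim above is easy to spot-check using only the two numbers quoted in the comment (emissivity ~0.62, mean surface temperature ~287 K): e*sigma*T^4 lands near the roughly 240 W/m^2 of mean OLR measured by satellites. This does not settle the surrounding argument; it only shows the arithmetic behind the claim.

```python
# Spot-check: gray-body emission with the comment's quoted emissivity and temperature.
sigma = 5.670374419e-8       # Stefan-Boltzmann constant, W/m^2/K^4
T_surface = 287.0            # average surface temperature from the comment, K
e_eff = 0.62                 # effective emissivity from the comment

olr = e_eff * sigma * T_surface**4   # ~239 W/m^2, near the measured mean OLR
print(olr)
```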

• Frank says:

Bob said: ” I work at the most superficial level.”
At the most superficial level, you need to consider conservation of energy. The energy flux passing downward through the atmosphere needs to equal the energy flux passing upward through the atmosphere at every altitude – or the temperature (internal energy) is changing and a steady state doesn’t exist. There are two major mechanisms by which energy can travel through the atmosphere: radiation and convection (mostly the latent heat associated with the evaporation and condensation of water).
The Schwarzschild eqn tells us how both LWR and SWR behave in a clear atmosphere. (To a first approximation, clouds can be modeled as semi-transparent surfaces that transmit, absorb, reflect and emit radiation.) To use the Schwarzschild eqn, you need to know how the GHG density (especially humidity), temperature and absorption coefficients vary with altitude. In this equation, it would be more accurate to write dI(z), I(z), n(z), T(z), and o(z). So one needs to INPUT the state of the atmosphere (everywhere on the planet) to calculate the radiative fluxes passing through it. Locations where absorption is greater than (or less than) emission are being warmed (or cooled) by the net flux of radiation into or out of them. The Schwarzschild eqn is what allows us to predict that an instantaneous doubling of CO2 (leaving all other parameters the same) will change the net flux of radiation by -3.7 W/m2. That implies that the atmosphere will warm, because it is now absorbing more radiation than it is emitting. If you calculate the temperature vs. altitude profile for an atmosphere heated from below by SWR and cooled only by emission of LWR, the temperature rises exponentially as one approaches the surface. If no convection existed and radiative equilibrium controlled temperature, the surface of the planet would be about 350 K.
Convection involves fluid flow, so we can’t calculate convective flux of energy in W/m2 from first principles, (as we can with radiation). However, we can calculate from first principles the maximum “lapse rate” (-dT/dz) that can exist without creating instability to buoyancy-driven convection. Instability occurs when a rising packet of air expands and cools, but still remains warmer and less dense than the surrounding air. The maximum stable lapse rate is 9.8 K/km for “dry air” and as low as 4.9 K/km for the most humid air on the planet, which releases heat as water condenses due to falling temperature. Frustratingly, we can only calculate when convection will occur (an unstable lapse rate), but not how much heat is transported upwards (in W/m2) when an unstable lapse rate develops.
So purely radiative equilibrium says that the lapse rate increases exponentially as you go lower in the atmosphere, but there is a maximum lapse rate compatible with stability to buoyancy-driven convection. In the 1960s Manabe combined these two principles into the concept of radiative-convective equilibrium: whenever the net radiation flux alone can’t maintain a stable lapse rate, convection will carry heat upwards fast enough to produce a marginally stable average lapse rate. Probes descending through the Venusian atmosphere show a nearly constant lapse rate agreeing with theory, from near the top of the atmosphere at 250 K to the surface at 750 K. On Earth, the lapse rate averages 6.5 K/km through the troposphere. The tropopause is where radiation can cool enough to maintain a stable lapse rate without any assistance from convection. At the surface, net LWR (OLR-DLR = 390-333 = 57 W/m2) needs to be supplemented by about 100 W/m2 of convection to equal the SWR absorbed by the surface. Average rainfall of about 1 m per year represents about 80 W/m2 of latent heat.
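The dry adiabatic lapse rate quoted above is just g/c_p; a minimal check using standard gravity and the specific heat of dry air:

```python
# Dry adiabatic lapse rate: -dT/dz = g / c_p for a dry rising parcel of air.
g = 9.80665    # standard gravity, m/s^2
cp = 1004.0    # specific heat of dry air at constant pressure, J/(kg K)

dry_lapse = g / cp * 1000.0   # K per km, ~9.8
print(dry_lapse)
```

The moist value is smaller because condensing water releases latent heat into the rising parcel; computing it requires the Clausius-Clapeyron relation, so it is not reproduced here.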
Pressure differences also drive horizontal mass flow via winds.
AOGCMs and weather prediction programs calculate mass, radiative, and convective fluxes between grid cells. Because condensation and turbulent flow occur on scales much smaller than the size of a grid cell, adjustable parameters are needed to represent these processes. Do clouds begin to form in a grid cell when the average relative humidity is 98% or 99%? The planet’s albedo changes dramatically with this parameter. How much turbulent mixing occurs between nearby rising and falling air masses? ECS can change 1 K/doubling by adjusting this parameter! Increasing aerosols reduce the initial droplet size when clouds condense, increasing their reflection, but these smaller droplets evaporate and condense many times to produce thermodynamically more stable larger droplets. What parameters best describe the radiative properties of a cloudy grid cell? The rate of evaporation is controlled by the speed of the wind, the relative humidity (and therefore temperature), and a tunable parameter. One can tune such parameters one at a time and find a set that provides a reasonable representation of today’s climate and the historical temperature record. One can’t find a set of parameters that optimally describes all aspects of today’s climate at the same time. The record of surface and troposphere warming is statistically inconsistent with the projections of the mean of the IPCC’s models. Observations of the seasonal cycle in OLR and SWR associated with seasonal changes in each hemisphere (which produce a 3.5 K increase in GMST not seen with temperature anomalies) show that all of today’s models are flawed. This paper is by the Manabe who first published radiative-convective equilibrium and the first AOGCM:
http://www.pnas.org/content/110/19/7568.full.pdf
People like the author of this post spread the idea that climate science is plagued by massive errors that even amateurs can spot. For example, the lack of negative feedback in climate PROVES climate science is a hoax. Their ignorance is appalling, and they don’t want to know why they may be wrong. (My reply got no response.) The deceptions by the other side are equally appalling, and many of them are real scientists who understand the science but can’t accept that science can’t reliably describe a future with rising CO2 and thereby save the planet. David Appell is an example of a partisan press that provides more propaganda than accurate science.
ScienceofDoom.com was the biggest help on my journey to this level of understanding. Most WUWT readers believe that it is a hoax, another sign of ignorance. I journeyed there at the recommendation of Steve McIntyre. (Real Climate sent me to ClimateAudit by insulting my intelligence with their defense of AIT and insults of McIntyre. If they hated him so much, I wanted to know why.)

• I’m superficial, but I like equations and computations.
An equation is worth a thousand words, a computation demonstrating it ten thousand, and an experimental demonstration the whole bible.
I really find it mentally useful to tip the problem on its side. Let’s say the surface is on the left and the radiant source on the right, with a filter following the Schwarzschild equation in between. You are claiming that for some set of parameter values the Schwarzschild differential will cause a higher temperature to be maintained on the left of the filter than between it and the radiant source.
Could you please give us a set of parameters, that is, surface, filter and source spectra, quantitatively demonstrating this effect.
I’m sorry, I’ve read far too many words that go wandering off talking about all sorts of things, like the climate models your link does, or word-waving about pressure (a gravitational phenomenon) or clouds.
Can we prove and test just this one equation, which is claimed to be the crux? That’s the way real classical physics is done.

• Frank says:

Bob wrote: “Can we prove and test just this one equation, which is claimed to be the crux? That’s the way real classical physics is done.”
Great question.
ScienceofDoom has several series of posts on understanding and visualizing atmospheric radiation, some of which include comparisons of theory and observation. The best might be the post linked below. The relevance of the Schwarzschild eqn to our atmosphere has been demonstrated by studying the spectrum of OLR reaching satellites in space and planes in the upper atmosphere, as well as the spectrum of DLR reaching the ground. Usually the temperature, density and humidity of the atmosphere at all altitudes are collected by a radiosonde as part of the study.
Some of SOD’s material is taken from Grant Petty’s $39 text for meteorologists (AGW is not mentioned), “A First Course in Atmospheric Radiation”. If you want to be able to look up information from a reliable source, this is the place to start. I own a copy.
dI = n*o*B(lambda,T)*dz – n*o*I*dz
The atmosphere is a difficult place to accurately study fundamental physics. Carefully controlled experiments in the laboratory have many advantages. Using hot high intensity light sources in the laboratory makes the first term negligible.
dI = – n*o*I*dz
I/I_0 = exp(-n*o*z) (Beer’s Law)
When n and o are constant, the equation has an analytical solution: Beer’s Law. This equation is used in laboratories every day. The absorption cross-sections (o) for GHGs have been studied for decades in the laboratory using Beer’s Law and the HITRAN database containing the best values was started in the 1960s so aeronautical engineers could calculate radiative heat flux around high performance aircraft and spacecraft. One paper I read used apparatus with a path length of 190 m via multiple reflections so samples could be studied at the low pressures and temperatures found at the tropopause.
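The analytic solution can be checked against a step-by-step integration of the absorption-only equation. A sketch: the density and cross-section below are illustrative round numbers, not values from HITRAN.

```python
import math

n = 1e22          # illustrative absorber density, m^-3
xsec = 2.5e-25    # illustrative absorption cross-section, m^2
z = 1000.0        # path length, m

tau = n * xsec * z            # optical depth = 2.5
analytic = math.exp(-tau)     # Beer's law: I/I_0 = exp(-n*o*z)

# Same result by stepping dI = -n*o*I*dz down the path.
I, dz = 1.0, 0.01
for _ in range(int(z / dz)):
    I -= n * xsec * I * dz

print(analytic, I)
```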
It is hard to study emission of thermal infrared in the laboratory, because your sample and everything else in the lab emits thermal infrared. The first term comes from the study of the blackbody radiation emitted by heated black cavities (designed to promote equilibrium between absorption and emission) that led to the development of quantum mechanics. The Planck function B(lambda,T) tells us how the relative intensity of such radiation varies with wavelength. Conservation of energy demands that the Planck function be multiplied by n*o*dz in the Schwarzschild eqn, so that light of blackbody intensity, I = B(lambda,T), for which absorption equals emission, produces dI = 0.
FWIW, I was intensely frustrated for a long time, because I recognized that doubled CO2 would absorb AND EMIT twice as much radiation. No one could convincingly explain why the net result of two large opposing changes would be a slight reduction in OLR at the TOA. Within minutes of seeing the Schwarzschild eqn, everything became clear. For example, dI varies with n, but what determines whether dI is negative or positive? (:))
When people used to talk about “settled science”, the only part that was really settled IMO was the Schwarzschild eqn. Nobody wants to write it down, because there is nothing intuitive about an equation that requires numerical integration. So the consensus waves their hands about “trapping” heat in the atmosphere – which is a gross and embarrassing over-simplification.

• 1sky1 says:

I’m glad to see the highly ambiguous, simplistic notion of “climate feedback” examined by others from the standpoint of first principles. Two caveats with respect to climate “sensitivity” seem to be in order: 1) what climate science calls feedback is more akin to time-variable changes in system-response characteristics (or to lagged recirculation of thermal energy) than to any rigorous concept of instantaneous feedback in time-invariant control systems with op-amps and 2) on the basis of available empirical evidence there is no compelling reason to assume thermodynamic equilibrium beyond LTE.
IMHO, co2isnotevil comes very close to portraying the actual physical situation. I cannot agree, however, with Frank’s assessment of scienceofdoom as devoted to physical rigor, when it portrays the effect of atmospheric backradiation as being the equivalent of a “second sun.” Unlike any sun, the atmosphere does NOT produce any energy; it merely absorbs and isotropically re-emits (recirculates) LWIR arising from thermalization of insolation at the surface.
Such are the popular myths of “climate science.”

• Frank says:

Frank wrote: “Climate scientists believe that Planck response/feedback is made less negative by the effect of “positive feedbacks”.
CO2isnotevil says: “This is something that climate science has so wrong its absurd. Feedback, positive or negative has little effect on the sensitivity associated with the Planck response (dT/dW).”
Frank responds: CO2isnotevil doesn’t have the slightest idea of what he is talking about. Planck feedback (dW/dT, not dT/dW) is the response (increased emission of radiation) of a blackbody to warming. Although Planck feedback varies slightly with temperature (dW/dT = 4oT^3 = 3.8 W/m2/K at 255 K and 5.4 W/m2/K at 288 K), other feedbacks do not change Planck feedback. The tendency of all matter to emit more radiation as it gets warmer is innate and not changed by outside factors. Other feedbacks change the climate feedback parameter for the planet, not Planck feedback. If you surround a blackbody with some material (like our atmosphere) that changes how much radiation escapes from our planet to space (or changes how much incoming SWR is reflected back to space), the amount of net radiation leaving the planet will not be the amount expected for Planck feedback alone.
CO2isnotevil shows us a graph of surface temperature at various locations on the planet and the amount of OLR exiting the planet to space above that location. From that information dT/dW is deduced for the planet. First, he has the wrong variable on the wrong axis: the independent variable is surface temperature and the dependent variable is OLR. Differences in local surface temperature cause differences in local TOA OLR, not vice versa. Now let’s look at the temperature range: 100 K to 300 K. Where on the planet is the temperature between 100 K and 250 K? Mostly Antarctica and perhaps some Arctic regions in winter. So the slope of this graph is mostly determined by what happens in Antarctica when it warms, and it contains little information about what is happening on the rest of the planet. Antarctica is extremely dry, so water vapor feedback, lapse rate feedback and cloud feedback between 100 K and 250 K are not typical of what happens over most of the planet.
If you switch axes and look at OLR from the tropics (T greater than 295 K), you see a lot of points that are not on the curve that roughly fits the rest of the data. OLR ranges from 250-330 W/m2 almost independently of temperature. This region represents about half of the surface of the planet.
What we really want to know is how total GLOBAL OLR AND reflected SWR (not shown on this graph) change when the surface temperature rises a few K, not how local OLR varies with local surface temperature across 200 K. The best information on this subject comes from the seasonal 3.5 K increase in GMST associated with summer in the NH (which is due to its lower heat capacity). (The process of converting to temperature anomalies removes this large annual change from the data we normally see.) We have observed this cycle from space for more than 20 years. OLR increases about 2.2 W/m2/K from both clear skies (water vapor and lapse rate feedback only) and all skies, not the 3.8 W/m2/K we expect for Planck feedback at 255 K. There is no doubt that water vapor + lapse rate feedback reduces OLR from that expected for Planck feedback alone. However, seasonal warming is not global warming – one hemisphere is cooling while the other is warming. There are also changes in reflected SWR from surface snow cover, but there is little seasonal snow cover in the SH. A full interpretation is difficult. However, one thing is clear: the data in this paper show that all AOGCMs get some aspects of the seasonal change in OLR and reflected SWR wrong, and that they are all mutually INCONSISTENT.
http://www.pnas.org/content/110/19/7568/F1.medium.gif
http://www.pnas.org/content/110/19/7568.full.pdf
CO2isnotevil also says: “Another problem is that climate science is absolutely clueless about feedback system analysis and have horribly misapplied Bode’s feedback analysis to the climate system. Bode’s amplifier measures the input and feedback to determine how much output to deliver from an implied, infinite source. The climate system ‘amplifier’ consumes the input and feedback to produce the output and this COE constraint has never been accounted for and is why many believe that positive feedback has such a large effect. In fact, the very idea of runaway feedback is precluded by the absence of the infinite source assumed by Bode. Note that the assumption of a climate system ‘amplifier’ with an implied power supply was baked into AR1 and has never been fixed.”
Frank responds: More nonsense from CO2isnotevil. Unlike an amplifier, which needs a power source, the Earth has a power source – the sun. The Earth can warm because OLR escapes to space more slowly or because less SWR is reflected to space. A doubling of CO2 will reduce OLR escaping to space by about 4 W/m2. Using just a Planck feedback of about 4 W/m2/K, temperature will need to rise about 1 K to emit the same amount of radiation as before doubling. Warming of 1 K will change the amount of water vapor in the atmosphere (and thereby change the lapse rate), change cloud cover, and reduce seasonal snow cover (surface albedo). Suppose the total of these non-Planck feedbacks reduces OLR and reflected SWR by 2 W/m2/K. 1 K of warming will then produce a radiative imbalance of 2 W/m2. That change in radiative balance will cause an additional 0.5 K of warming. That 0.5 K of warming will reduce OLR and reflected SWR by 0.5 K * 2 W/m2/K = 1 W/m2. That 1 W/m2 change in radiative balance will produce an additional 0.25 K of warming. The result is an infinite geometric series: 1 + 1/2 + 1/4 + 1/8 … = 2 K of warming. If all of the non-Planck feedbacks total 3 W/m2/K, each round of warming is 3/4 of the last, and the series is 1 + 3/4 + 9/16 + 27/64 … = 4 K. If the total of all non-Planck feedbacks is only 1 W/m2/K, then the series is 1 + 1/4 + 1/16 + 1/64 … = 4/3 K. One doesn’t need to draw any analogies to Bode amplifiers to work out the consequences of feedback, but the mathematics is similar.
However, one doesn’t need to consider “amplification” by feedback at all. The average photon escaping to space is emitted from about 5 km, where the temperature is 255 K. If we think in terms of a blackbody + feedbacks, the Earth loses an additional 4 W/m2 for every 1 K rise in surface temperature: -4 W/m2/K. The additional water vapor in the atmosphere blocks about 2 W/m2 for every 1 K rise in surface temperature: +2 W/m2/K. The fact that more humidity causes more warming in the upper troposphere than at the surface (lapse rate feedback) adds an additional 1 W/m2 to OLR: -1 W/m2/K. During the winter, a 1 K increase in temperature will reduce seasonal snow cover and reflect a few tenths of a W/m2 less SWR from the surface to space, and if we wait for centuries, changes in polar ice caps will also reflect a little less: perhaps +0.3 W/m2/K. Total: -2.7 W/m2/K before accounting for changes in clouds. We know that cloud cover diminishes in the summer, so cloud feedback is likely positive: +0.7 W/m2/K, to make a wild guess. Total: -2 W/m2/K. Take the reciprocal: 0.5 K per W/m2. A doubling of CO2 is 4 W/m2, making ECS = 2 K/doubling. This is about what Otto (2013) and Lewis & Curry (2014) deduce from the change in forcing and the observed change in surface temperature. If cloud feedback is increased to +1.7 W/m2/K, the sum of all feedbacks is -1 W/m2/K and climate sensitivity rises to 4 K/doubling.
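[Ed. note: the budget in the paragraph above is just a signed sum followed by a reciprocal. A toy sketch, with the values copied from the comment rather than from any measurement:]

```python
# Frank's back-of-envelope feedback budget, W/m2/K (negative means more
# energy escapes to space per K of surface warming, i.e. stabilizing).
feedbacks = {
    "Planck": -4.0,
    "water vapor": +2.0,
    "lapse rate": -1.0,
    "surface albedo": +0.3,
    "clouds (wild guess)": +0.7,
}
net = sum(feedbacks.values())     # -2.0 W/m2/K
sensitivity = -1.0 / net          # 0.5 K per W/m2
ecs = sensitivity * 4.0           # 4 W/m2 per CO2 doubling -> ECS of 2 K
print(f"net {net:.1f} W/m2/K, ECS {ecs:.1f} K/doubling")
```

Bumping the cloud entry to +1.7 makes the net -1 W/m2/K and the same arithmetic gives 4 K/doubling, as the comment says.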
Slightly different values are sometimes used for the round numbers above. Observations of OLR through clear skies from space (Planck + water vapor + lapse rate feedbacks) amount to about -2 W/m2/K. Climate scientists don’t have the ability to distinguish between an ECS of 2 or 4+ K/doubling. Anything higher gets us too close to a runaway greenhouse effect to be believable.
None of CO2isnotevil’s scientific criticism is accurate. This post is wrong: in the eyes of the consensus, climate feedback is negative, whether you are an engineer who thinks of the atmosphere as a black box with a single temperature and emissivity or a chemist who thinks in terms of molecules of GHGs and uses the Schwarzschild eqn – which has been validated by observations of the atmosphere. The existence of positive feedbacks that reduce the negative Planck feedback is not controversial, but the uncertainty in the magnitude of these feedbacks is high.

• Frank says:

1Sky1 wrote: “I’m glad to see the highly ambiguous, simplistic notion of “climate feedback” examined by others from the standpoint of first principles. Two caveats with respect to climate “sensitivity” seem to be in order: 1) what climate science calls feedback is more akin to time-variable changes in system-response characteristics (or to lagged recirculation of thermal energy) than to any rigorous concept of instantaneous feedback in time-invariant control systems with op-amps and 2) on the basis of available empirical evidence there is no compelling reason to assume thermodynamic equilibrium beyond LTE.”
Except for changes in ice caps, most feedbacks occur quickly – within one or perhaps two months. If you take the average amount of water in the atmosphere (25.3 mm) and divide by average rainfall (2.7 mm/day), you will find that the average water molecule remains in the atmosphere for about 9 days. So feedbacks associated with humidity and cloud cover respond to changes in surface temperature in weeks, not years. When you are looking at monthly temperature averages, the response appears instantaneous. Typical surface prevailing wind speed is around 10 m/s, or 36 km/hr, or roughly 1000 km/day – around the world in a month. Jet streams in the upper atmosphere move far faster. Seasonal snow cover (surface albedo) responds quickly to changes in local temperature. The atmosphere is effectively stirred on a time scale of 1 month, and local feedbacks respond about as fast as we record temperature – i.e. monthly. There is little lag between temperature and most feedbacks.
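[Ed. note: the residence-time figure is a simple division; it works in millimetres (~25 mm of column water against ~2.7 mm/day of rain), which is what yields the quoted 9 days.]

```python
# Mean residence time of a water molecule in the atmosphere:
# column water divided by the precipitation rate.
column_water_mm = 25.3       # global mean precipitable water, mm (= kg/m2)
precip_mm_per_day = 2.7      # global mean rainfall, mm/day
residence_days = column_water_mm / precip_mm_per_day
print(f"about {residence_days:.0f} days")
```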
Total precipitable water vapor: http://www.esrl.noaa.gov/gmd/publications/annual_meetings/2014/slides/22-140327-C.pdf
(However, Roy Spencer claims that it takes about 2 months for the heat from El Nino to rise to the top of the troposphere.)
1sky1 wrote: IMHO, co2isnotevil comes very close to portraying the actual physical situation. I cannot agree, however, with Frank’s assessment of scienceofdoom as devoted to physical rigor, when it portrays the effect of atmospheric backradiation as being the equivalent of a “second sun.” Unlike any sun, the atmosphere does NOT produce any energy; it merely absorbs and isotropically re-emits (recirculates) LWIR arising from thermalization of insolation at the surface.
Frank responds: The graph provided by co2isnotevil is dominated by signals from the Antarctic and ignores feedbacks in SWR (see above). I provided a graph and a link to a paper on the GLOBAL response to warming – which is far more relevant than local OLR emitted in response to local temperature.
Whether you believe in a 2-way flux of photons (OLR and DLR) or some form of cancellation producing a 1-way flux, we agree on the net flux (390-333 = 57 W/m2 for example). So the world of SOD isn’t significantly different from yours.
Of course, anyone who has taken statistical mechanics realizes that the behavior of individual photons and molecules is not constrained by the 2LoT. The laws of thermodynamics are not fundamental; they are a consequence of a universe comprising a very large number of molecules and photons following the laws of quantum mechanics. Temperature is only defined thermodynamically for large collections of colliding molecules, and the NET FLUX of photons is always from hot to cold when the terms hot and cold have meaning. It makes no difference whether we have OLR and DLR or a net flux equal to OLR-DLR, but it is far more practical to calculate OLR and DLR using consensus physics.
Or, as Feynman says below: if you can’t accept the way nature really behaves, go to another universe where the rules are simpler and more philosophically pleasing.

Or go to Venus, where the 740 K surface emits 17,000 W/m2 of thermal radiation and almost no sunlight penetrates to the surface. Why doesn’t that surface cool rapidly?

• Frank wrote :

Or go to Venus, where the 740 K surface emits 17,000 W/m2 of thermal radiation and almost no sunlight penetrates to the surface. Why doesn’t that surface cool rapidly?

Venus is the test . Its surface temperature is about 2.25 times that of a gray body in its orbit as opposed to our 1.03 . Our 3% could possibly be explained as a spectral phenomenon ( altho our ToA spectrum is apparently in the other direction giving an equilibrium temperature of about the endlessly parroted 255K ) . But no material spectrum can create a 225% solar heat gain . And as you point out , only about 3% of the impinging solar energy reaches the surface .
When talking about temperature in a situation like Venus’s surface , I prefer to think in terms of energy density which is simply the power times a light-second . It’s a presumption to assert the energy is going anywhere .
This is what leads me to conclude that HockeySchtick’s and others’ computations based on constant total energy balance , including gravity , are the only possible answer .
Of course , that is outside the entire GHG paradigm and means any “forcing” calculations not based on effects on the ToA spectrum are void . It is also why the Schwarzschild absorption differential , which is the only equation I have seen claimed to explain the “trapping” of energy in excess of that calculated for the ToA spectrum on the side of a filter away from the source , is the next thing I’d like to see implemented in an APL — so its parameter space can be easily played with and explored .
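[Ed. note: Bob's 2.25 and 1.03 ratios can be reproduced from the 278.5 K gray-body figure he cites plus an inverse-square scaling with orbital distance; the 0.723 AU value for Venus is an assumption added here, not from the comment.]

```python
# Gray-body temperature scales as (1 / distance_in_AU) ** 0.5, anchored
# to the ~278.5 K gray-body value cited for Earth's orbit.
T_gray_1AU = 278.5                  # K, gray body at 1 AU
venus_au = 0.723                    # Venus orbital distance, AU (assumed)
T_gray_venus = T_gray_1AU / venus_au ** 0.5
print(740.0 / T_gray_venus)         # Venus surface vs gray body: ~2.26
print(288.0 / T_gray_1AU)           # Earth surface vs gray body: ~1.03
```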

• Frank says:

Bob commented: “the next thing I’d like to see implemented in an APL — so its parameter space can be easily played with and explored.”
There is a group in England (climateprediction.net) that recruited thousands of volunteers to donate unused time on their personal computers to running large (1000+) ensembles of climate models with parameters chosen from within a physically reasonable range. They have shown modelers – but not the public or even the wider climate science community – that the output from the IPCC’s models represents only a tiny fraction of future climates that are compatible with physics. Most of their papers are linked, not behind paywalls, and accessible through this web page:
http://www.climateprediction.net/publications/?letter=&type=&theme=
Looking through them, I see a lot of recent politicized science trying to attribute extreme weather to aGHGs (a scientifically dubious and meaningless activity). But interspersed are papers that demonstrate that the parameters of climate models are not globally optimum values and that no single set of parameters provides a superior representation of today’s climate. In fact, systematic exploration proved that they couldn’t narrow the range of ANY parameter, because some part of that range consistently produced inferior results.
http://www.climateprediction.net/wp-content/publications/nature_first_results.pdf
http://www.cgd.ucar.edu/ccr/bsander/subgrid.pdf
http://rsta.royalsocietypublishing.org/content/365/1857/2145.short
I suspect most skeptics will find these papers too close to the consensus position, but they illustrate the problems with the IPCCs models. Recognition of greater uncertainty means expansion at both the high and low ends of the IPCC range.

• 1sky1 says:

The problem with “consensus physics” as portrayed by climate scientists is that it stands basic principles on their head and ignores empirical observations. Nowhere is that more evident than in the failure to recognize that the climate system responds, albeit in a very complicated way, to the virtually sole forcing of insolation as a FEED-THROUGH system, wherein the heat produced by thermalization of the surface is transferred primarily by moist convection to the atmosphere and thereupon by radiation to space. Feedback in any rigorous sense of looped response is not at play in that process. Treatments relying upon radiative transfer alone are simply inadequate analyses of the thermodynamic problem on an aqueous planet.

• 1sky1 ,
I’ve been working on an image to make the equation and calculation I want to see before I’ll buy off on the GHG heat trapping hypothesis . It’s been complained that my analysis of a simple uniformly colored ball is too simple to represent a planet . This complaint is made by people who parrot the 255K meme without understanding that that calculation is just the special case of a particular step function spectrum with 0.7 absorptivity=emissivity ae over the solar power peak and 1.0 over longer wavelengths .
So I’ve gone even simpler : a 1 dimensional representation :
http://cosy.com/Science/1DeqDiagram640.jpg
The computation for a simple colored ball , eg , the lumped earth-atmosphere spectrum , is modeled on the lower row . The surface either absorbs=emits or reflects as a function of wavelength . The filter either absorbs=emits or transmits as a function of wavelength .
The spectrum is the result of what’s absorbed and reflected by the surface and atmosphere together and the temperature works out to whatever the Top of Atmosphere spectrum is measured to be . Call that T .
The top line has separated out the effective atmospheric filter , the same on both sides . The GHG claim is that there exists a filter function such that ( b + c ) % 2 , ie , the average temperature at the surface , is greater than T .
Ok , show me a power spectrum for the source , the absorption=emission spectrum of the surface and absorption=emission , transmission spectrum of the filter which displays this phenomenon .
As I say , this is just a draft I’m whipping up . But I think it clarifies the essential physical equations I have never seen presented and don’t ever expect to .

• 1sky1 says:

Bob Armstrong:
I’m not sure that I’m following your logic. Since the “filtration” done by the atmosphere is very much different in the SW range than in the LWIR, it would seem that the two processes would not combine readily into one effective filter. BTW, in modeling, I’ve found it very effective to treat the system as an LRC filter, rather than just a RC filter, to account for the internal storage of thermal energy.

• 1sky1
An RC (or RL) filter is either a low pass or high pass filter depending on where the C (or L) is and both C and L store energy. An LC circuit is a bandpass filter while an RLC circuit is a lossy bandpass filter. The potentially relevant feature of an LC or RLC circuit model is the periodic transfer of energy back and forth between being energy stored in the L and being stored in the C (resonance).
The Earth/atmosphere system could be modelled with the C being the surface and L being the atmosphere. The size of the relevant R can be determined by how long it takes this periodic exchange, once initiated by some change, to decay down to nothing. This can be precluded, or at least considered to be insignificant, as the difference in energy stored by the surface and the atmosphere is orders of magnitude. One thing we don’t see is the characteristic ringing expected for this kind of filter if the L was of any consequence relative to R and C.
Another possibility is to model cold as C and hot as L. The total amount of hot and total amount of cold on the planet at any one time varies periodically with the seasons (the p-p magnitude in the N hemisphere is not exactly cancelled by that in the S), although the average across a year is relatively constant and the average seasonal variability is easily cancelled out (anomaly analysis depends on this). As long as analysis covers a multiple of 12 months, an RC circuit model should be more than accurate enough.
How are you mapping the R, L and C to the actual properties of the physical system? What is the resonant frequency? If it matches a circuit with a resonant frequency of 1 cycle per year, yearly averaging will eliminate the requirement for an L.
For the RC model, you can trivially map the product of RC (the time constant) to the amount of time it would take for all of the energy stored by the system to be emitted at its current average rate (see my reply to Esilex in these comments). The way this maps physically is that the C represents where energy is stored by the planet (ocean, land and atmosphere) and the R represents the limiting factor for energy emitted into space (Planck emissions from the energy store, some of which are prevented from escaping out into space and instead are returned back to the surface).
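[Ed. note: the gain and lag of such a first-order RC model under a sinusoidal seasonal drive follow directly from the time constant. A sketch with a purely hypothetical tau of one month; note that a tau near one month gives a phase lag close to the roughly one-month daylight-to-Tmax lag noted in the head post.]

```python
import math

# First-order (RC) low-pass response to a sinusoidal seasonal drive:
# amplitude gain = 1 / sqrt(1 + (w*tau)^2), phase lag = atan(w*tau) / w.
tau = 1.0                       # time constant in months (hypothetical)
period = 12.0                   # months per seasonal cycle
w = 2 * math.pi / period        # angular frequency, rad/month
gain = 1 / math.sqrt(1 + (w * tau) ** 2)
lag_months = math.atan(w * tau) / w
print(f"gain {gain:.3f}, lag {lag_months:.2f} months")
```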

• I got comfortable computing with spectra when I was learning the math required to think about visual psychophysics in the ’70s . It’s when I gained respect for the dot : { +/ * } | sum across products of corresponding pairs  ( generalized in APL to  { f/ g } ) as one of the most important computations in any quantitative field . Thus the equation for the radiative balance for any source power spectrum and any particular color , absorptivity=emissivity , spectrum and any particular sink , in terms of energy is simply the ratio of the dot products
( dot[ Source ; object ] % dot[ sink ; object ] ) * TotalIrradiance
This computation applies to any measured spectra , and is easily extended to any arrangement of source and object and sink spectra . It turns into the 4th root of the ratio times the gray body temperature in our orbit of about 278.5 ± 2.3 .
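[Ed. note: Bob's ratio-of-dot-products balance can be sketched in Python rather than APL. The three spectra below are made-up toy arrays, included only to show the shape of the computation.]

```python
# Radiative balance as a ratio of dot products over spectral bands:
# T = T_gray * ( dot(source, object) / dot(sink, object) ) ** 0.25
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

source = [1.0, 0.8, 0.2]   # toy source power spectrum (arbitrary bands)
sink   = [0.2, 0.6, 1.0]   # toy thermal-emission (sink) spectrum
obj    = [0.7, 0.9, 1.0]   # toy absorptivity=emissivity spectrum

T_gray = 278.5             # K, gray-body temperature in Earth's orbit
ratio = dot(source, obj) / dot(sink, obj)
T = T_gray * ratio ** 0.25
print(f"{T:.1f} K")
```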
Given that actual spectra are available , in particular the spectrum of the lumped earth and atmosphere as seen from space and the solar spectrum , why isn’t that observed and computed temperature cited instead of the endlessly parroted 255 value ? That crude approximation is useless in quantitatively understanding the 4th decimal place variations we’ve observed over the century .
While models in terms of simple filter functions may have some explanatory value , they too are hopelessly crude when the actual measured spectra are available .
Note : I’d love to see a description of just how the ae spectrum of the planet is measured . Certainly satellites at the Lagrangian points should provide the essential data .

• 1sky1 says:

co2isnotevil:
The only reason I brought up LRC filters as an aside is because they have impulse responses that maximize not at zero, but at a lagged time. That seems to be the characteristic of linearized planetary temperature response to variations in TSI over intervals much longer than a year. It’s purely an empirical “blackbox” relationship involving a critically damped series filter, without any obvious association with specific physical variables (although I have my ideas). Without imposing sharp resonance, it provides a mathematical means of exploring weakly oscillatory second-order response that is unavailable with first-order RC models. Let’s not get distracted by this side comment.

35. Stephen Obeda says:

Sometimes the net feedback is negative, and sometimes the net feedback is positive. This means that the level of feedback varies as conditions vary. Why is it not logical to presume that net feedback effects increase as the temperature level increases?
And if the net feedback is a variable, and not a constant, how can 100 years possibly hope to be enough data to determine the function, especially when we’re talking about multiple functions all interacting together?

• Stephen Obeda,
The net feedback can never be positive. If it was, there would be runaway global warming (or cooling, depending on the positive feedback).
There has never been runaway global warming, despite CO2 being more than 15X higher than it is now.

• b fagan says:

The net feedback can certainly be positive while the climate system is out of equilibrium – that’s the current problem. We’re warming the planet, and that kicks off positive feedbacks in albedo change, water vapor content in the atmosphere and, possibly, in net cloud impact. This will continue until the top-of-atmosphere radiation again matches incoming radiation, and that equilibrium keeps getting pushed farther out in time as we continue increasing the persistent greenhouse gases.
No researchers say we are headed into a runaway greenhouse, in fact, they specifically rule it out for a very long time. Here’s from Pierrehumbert’s Principles of Planetary Climate – section 4.6, page 284:
“Runaway greenhouse on Earth: With present absorbed solar radiation (adjusted for net cloud effects) of 265 W/m², the Earth at present is comfortably below the Kombayashi-Ingersoll limit for a planet of Earth’s gravity. According to Eq. (1.1), as the solar luminosity continues to increase, the Earth will pass the 291 W/m² threshold where a runaway becomes possible in about 700 million years. In 1.7 billion years, it will pass the 310 W/m² threshold where a runaway becomes inevitable for an atmosphere with 1 bar of N2 and no greenhouse gases other than water vapor.”
So the science doesn’t say “runaway greenhouse is coming soon”, the science says “our increasing greenhouse gas concentrations will raise the planet’s temperature a certain amount as it has done when greenhouse gases increased in the past”. That’s all.

• B Fagan.,
During its summer, each hemisphere is receiving more than it’s emitting and warms; during its winter, each is emitting more than it’s receiving and cools. Around the spring and fall equinoxes, the planet is in near perfect instantaneous equilibrium. The sinusoidal seasonal variability, delayed slightly from the sinusoidal seasonal stimulus from the Sun, is significantly larger in the N hemisphere than in the S (larger fraction of land in the N), so the planet exhibits a net variability dominated by the N, which again passes through perfect balance in the spring and fall. This is complicated by the fact that the profile of solar variability owing to the 20 W/m^2 average difference (80 W/m^2 total) between perihelion and aphelion is 180 degrees out of phase with the N hemisphere’s dominant seasonal response. We can analyze the hemispheres independently as long as we are considering yearly averages, since when averaged across a multiple of years, there is very little net energy crossing the equator relative to what’s coming from the Sun.
The imbalance you speak of is not real given the approximate seasonal variability around balance of about +/- 100 W/m^2 per hemisphere. The planet adapts very quickly to change; otherwise, we would see little difference between summer and winter, and the onset of volcanic-related cooling would take decades. About the best you can claim regarding any imbalance consequential to emissions is that, given measured seasonal time constants on the order of a year, the equivalent amount of CO2 emissions from all prior years that is not already accounted for in the balance is about equal to the prior year’s emissions, which at the measured sensitivity of about 0.2C per W/m^2 works out to an unrealized temperature gain of almost nothing. Even at the highly inflated 0.8C per W/m^2 claimed by the IPCC, it results in only a slightly larger nothing.

• b fagan says:

I don’t know if it’s a coincidence that you mention “measured sensitivity of about 0.2C per W/m^2”, but there is a measured increase in surface forcing of that same number – 0.2 W/m^2 – due to the change in CO2 concentrations over a decade. So that measured increase in downwelling radiation from the CO2 increase over ten years is just a fraction of what the overall increase will be as we shoot past doubling and eventually stop adding more.
First Direct Observation of Carbon Dioxide’s Increasing Greenhouse Effect at the Earth’s Surface
Berkeley Lab researchers link rising CO2 levels from fossil fuels to an upward trend in radiative forcing at two locations
News Release • FEBRUARY 25, 2015
http://newscenter.lbl.gov/2015/02/25/co2-greenhouse-effect-increase/
That’s an open-access summary to a Letter in Nature titled “Observational determination of surface radiative forcing by CO2 from 2000 to 2010” Nature (2015) doi:10.1038/nature14240
Over a decade they measured changing IR radiation on the North Slope in Alaska, and at a site in Oklahoma.
From the final section of the paper:
“Increasing atmospheric CO2 concentrations between 2000 and 2010 have led to increases in clear-sky surface radiative forcing of over 0.2 W m−2 at mid- and high-latitudes. Fossil fuel emissions and fires contributed substantially to the observed increase. The climate perturbation from this surface forcing will be larger than the observed effect, since it has been found that the water-vapour feedback enhances greenhouse gas forcing at the surface by a factor of three and will increase, largely owing to thermodynamic constraints. The evolving roles of atmospheric constituents, including water vapour and CO2 (ref. 30), in their radiative contributions to the surface energy balance can be tracked with surface spectroscopic measurements from stand-alone (or networks of) AERI instruments”

• B Fagan
The W/m^2 figure cannot be 265. Using CAGW math it’s about 255. The only way for a runaway greenhouse is for either of two things to occur, or both: the magnetic field goes away, and as a result water gets reduced and blown away, and/or the surface of the planet becomes hot. The other thing that you’ve implied is that co2 hangs around in the atmosphere for centuries; it doesn’t. Look at the current sink rate and the amount of co2 increase each year.
I don’t think anyone would doubt AGW if it were getting as warm as claimed. The measured amount of incoming is 363. If we were in fact holding on to an additional 25 W/m^2, up from the 240 stated back in 2002, everybody would notice. You wouldn’t have to play with adjustments.

• b fagan says:

Hi, rishrac.
First thing – I didn’t say there would be a runaway greenhouse; dbstealey brought it into the conversation as a possibility if net feedbacks were positive, and I typed in a few sentences from my copy of “Principles of Planetary Climate” to show that the scientists who study this stuff know we’re not running ourselves into a runaway greenhouse – which is a relief.
https://wattsupwiththat.com/2016/05/12/negative-climate-feedbacks-are-real-and-large/#comment-2214863
Note that net feedbacks don’t have to be permanent – net feedbacks for a period of time can be, and have been, either positive or negative over the history of the earth. Negative feedbacks have given us icehouse earths that only thawed when volcanic CO2 emissions (not drawn down by biological uptake or silicate weathering) reached concentrations high enough to turn the cycle to positive feedbacks.
I also disagree with the terms CAGW math or AGW math. There’s greenhouse gas and its effect, and the various feedback cycles that ensue until the global energy balance comes to equilibrium with the change in GHG concentration (and, of course, other forcings and variations acting during the same time). How the gas comes to be emitted now doesn’t matter to the physical processes – it’s just a gas.
I didn’t imply anything about CO2’s residence time – again, the science is there for us to point to possible paths. According to some information in Chapter 4 of David Archer’s “The Global Carbon Cycle” (Princeton Press):
– average CO2 molecule – maybe 3.5 years before uptake by a plant or the ocean
– drawing down a huge change in atmospheric concentration becomes more difficult. The residence of a large burst of CO2 will see the concentration dropping as more and more goes into plants, and especially as it goes into the deeper ocean, but here feedback loops also complicate things. The atmospheric concentration drops a lot within a fairly short time (a century or more?) of emission stopping, but then there’s the long tail.
Warmer water holds less gas, declines in pH slow CO2 intake, ocean circulation changes could slow transport to the polar regions where deepwater is formed, etc. On land, warming might increase soil outgassing, boreal or tropical forests might reach a growth limit based on other nutrients, etc.
But for the lifetime of an increased CO2 concentration, it is reasonable to think hundreds to tens of thousands of years or longer. Why? Because while the CO2 fraction in the atmosphere might decline, it will then result in outgassing of CO2 that had been pulled into deeper water, and by the time we’re done adding CO2, the ocean will be absorbing a pretty hefty slug.
The long-term reduction of CO2 in nature is by combining with minerals in rock – so subject enough mountains to weathering and you solve the problem. Read up on CO2 concentrations during the PETM for a preview – relatively sudden spike in concentration followed by warming followed by long, slow drawdown.

Per year, the sink for co2 is one and a half times greater now than all of the man-made co2 in 1965. That requires some explaining. The long slow drawdown is speculation. Supporting an idea from speculation is not science. A great deal of things have been said about what sinks co2. Supposedly tropical rainforests were a big sink. Are they bigger today than they were in 1965? A warmer ocean cannot hold as much co2 as a cooler one. Is the ocean warmer or cooler? And before you get into a long-winded explanation of how that’s possible, let me point out that the el nino year of 1998 saw the highest level of co2 added of any year, in spite of the fact that we have since added increasing amounts – over an extra billion metric tons of co2 per year.
Next, a runaway greenhouse effect is implied in the ever increasing retention of heat. The “window” the IPCC talks about is the amount of net heat that escapes into space. With more co2, less heat escapes and more warming ensues. There isn’t any way of separating the two in AGW.
The surface warming I am referring to is the internal heat from the earth. Such as the volcanic activity under the ice sheet that is melting in the west Antarctic ice sheet, not due to atmospheric warming.

Seems to me that a reasonable definition of feedback doesn’t mean just a -k * dx  simple spring-like response to deviation from equilibrium . These asserted feedbacks are asserted to move the equilibrium or eliminate it in a catastrophe a la Thom and Zeeman .
Considering that thermal radiation is a T ^ 4 function , it’s rather hard , ie : bonkers , to imagine suppressing its 4 * T ^ 3  derivative below 0 .

I’ve thought about it a lot, and I don’t know. We can have a good estimate of how much is man-made; natural is a question. One feature in the co2 record that bothers the heck out of me is that early on there are no negative numbers. It bothers me because, without a huge input of natural co2, the carbon cycle would be close to zero in 100 years without the Industrial Revolution. The cause of this huge sink is a mystery. Either it’s always been there and functioning, or it’s developed in response to increased co2, or a combination of both.
I wonder what kind of assessment will be made if we are producing as much co2 as we are and the amount of co2 does go negative or close to zero. That’s a scary thought.
One other thought I had is that co2 follows temperature, as many here have stated. Until recently the highest yearly increase in co2 was 1998, at 2.93 ppm. The following year, 1999, it dropped to 0.93. (That’s like saying in today’s sink an extra 24 billion metric tons vanished.) In 2008 and 2009, which I believe were colder years, the levels were 1.60 and 1.89 respectively. With this current el nino dying, it’ll be interesting to see if the ppm drops from this year.
Fractal geometry probably plays a big role in this, similar to the laws of supply and demand with diminishing returns. We have to produce increasingly more co2 to get a small increase in ppm.

• rishrac,
Correctomundo, compadre. Although B.E.S.T. claims to have measured AGW, they haven’t. If/when someone produces a verifiable measurement of AGW, it will be accepted across the board by scientists on both sides of the debate.
That hasn’t happened. If it did, then for one thing the climate sensitivity number would be accurately known for 2xCO2, and future global warming due to a rise in CO2 could be accurately predicted.
But as we know, no one was able to predict the most significant global temperature event of the past century: the fact that global warming stopped for nearly twenty years.
Conclusion: although AGW may well exist (I think it does), the fact that it has never been accurately quantified means that it must be quite small. In that case, for all practical purposes AGW can be disregarded as a non-problem.

• b fagan says:

Hi, dbstealey. My understanding is that BEST simply verified that the surface is warming, that it’s warming more or less as the other surface records show, and that even rural sites show similar patterns of warming. Muller wasn’t trying to measure “AGW”; he was looking at warming without the adjective up front.
But as I reminded you here https://wattsupwiththat.com/2016/05/12/negative-climate-feedbacks-are-real-and-large/#comment-2216346 there HAS been a measurement of the change in greenhouse warming from the change in CO2 concentrations over a decade. Again, the instruments measured IR radiation in CO2 bands, showing an increase in IR as the CO2 concentration increased by about 20 ppm during the study. They didn’t look at the molecules for “anthropogenic” tags; they were measuring the greenhouse effect.
So, when do scientists on “both sides” now accept it?
By the way, let me reframe your conclusion to point to a bit of a hole in your logic. You said: “Conclusion: although AGW may well exist (I think it does), the fact that it has never been accurately quantified means that it must be quite small. In that case, for all practical purposes AGW can be disregarded as a non-problem.”
So, to use an analogy, you bring your car to the shop for the mechanic to look at it, because the engine light came on and it’s running funny. He looks for a while and can say conclusively that something is wrong, but that:
a) it’s not the bulb ( a small risk to your wallet)
b) it’s not the entire engine ( a large risk to your wallet)
but he’s going to run some more diagnostics to find out what it really is. Your conclusion is “can’t immediately quantify risk, so the car is fine”?. I’d see the same thing as “it’s going to cost more than that light bulb and less than an engine”.
Not having an exact number is part of life. But the overall range of sensitivity to doubled CO2 has been in a fairly constant range since scientists started looking at it. Look at Manabe and Wetherald 1967. Or lots of other places.
I have no idea what the final answer will be, and it’s complicated by the other feedbacks, by the strong likelihood that we’ll pass a doubled concentration, and by the response of natural systems like the frozen peat that’s much of global permafrost, what the boreal forests will do, etc.
I’d love that the risk be at the low end of the spectrum, but there’s no reason to pin our hopes realistically on that being the result. Great if it happens, but in the meantime, we need to think otherwise.

• Hi b fagan,
First, your analogy is simply a version of the Precautionary Principle. If there was evidence of any global damage or harm as a result of the rise in CO2, you would have an argument.
But there isn’t any evidence of global harm from CO2. In fact, all the evidence indicates that the rise in CO2 has been completely harmless, and very beneficial to the environment. At what point will you admit that the “carbon” scare has no basis in reality? If we applied the Precautionary Principle to everything, there would be no progress at all.
Next, the debate is over anthropogenic global warming (AGW), not over global warming, which by all available evidence is natural, not man made. And if Muller and his B.E.S.T. team want to show that a rise in CO2 causes global warming, they’ve certainly failed to demonstrate it. For nearly twenty years CO2 has steadily increased, by more than one-third, while global warming stopped during that time (the “pause”, or the “hiatus”).
There is no correlation between rising CO2 emissions and global warming:
http://jonova.s3.amazonaws.com/guest/de/temp-emissions-1850-web.jpg
As we see, rising global T and rising CO2 are simply coincidental events.
You say:
Not having an exact number is part of life.
But there is no credible number at all! Again: at what point will you admit that any effect from CO2 is too small to matter?
Next, you say:
…the overall range of sensitivity to doubled CO2 has been in a fairly constant range since scientists started looking at it
I don’t know how you can claim that, when the guesstimates of the sensitivity number are all over the map:
http://jo.nova.s3.amazonaws.com/graph/models/climate-sensitivity/climate_sensitivity5.png
Climate sensitivity to 2xCO2 ranges from 6ºC, down in steps to ±3ºC, to 2º – 3ºC, to ≈1ºC, to a fraction of a degree. Dr. Ferenc Miskolczi published a peer reviewed paper showing that sensitivity to CO2 is 0.00ºC. And some scientists say that more CO2 causes cooling. So it all depends on who you ask.
…when do scientists on “both sides” now accept it?
Answer: All sides will accept it when a verifiable, replicable, testable measurement quantifying CO2=AGW is produced.
Currently there is no widely agreed-upon sensitivity number. The reason is clear: global warming due to rising CO2 (assuming CO2 causes warming) is still too small to measure. Therefore, it is a non-problem. And that’s why no one in the alarmist crowd ever discusses any kind of cost/benefit analysis. If they did, the “carbon” scare would crash and burn.
Finally, you say:
I’d love that the risk be at the low end of the spectrum, but there’s no reason to pin our hopes realistically on that being the result. Great if it happens, but in the meantime, we need to think otherwise.
The ‘risk’ you’re claiming is not only not at “the low end of the spectrum”; there is no observed risk at all. On balance, the rise in CO2 has been entirely beneficial, with no observed downside.
The biosphere is still starved of CO2, so even the very small rise in that airborne fertilizer has resulted in a measurable greening of the planet. Yet the alarmist crowd still argues that another one part in ten-thousand rise in CO2 is a bad thing!
It’s obvious that the alarmist argument is entirely emotion-based, since they have no credible evidence showing that the increase in CO2 is a problem of any kind. All their objections amount to: “Wait, what if…?” But science doesn’t operate on “What ifs”. Science operates on data. But there is no data quantifying AGW — which is the central question in the debate.
The entire “carbon” alarm is based on the Precautionary Principle. But if we gave that logical fallacy any credence, we would have never gone to the moon. So again: at what point would you admit that there is no credible, verifiable, or observational evidence to support your belief in the Precautionary Principle? Or do you think it should apply, no matter what?

36. Stein Bergsmark says:

This input-output relationship is perfectly modelled by the first-order open-loop transfer function H(s) = K/(s + a). A well-known physical realization of this transfer function is the resistor-capacitor low-pass filter, where the capacitor is an energy store. This seems to fit the solar-input, temperature-output situation perfectly.
There is no way the observed sinusoidal physical quantities can be taken as proof of a feedback system, be it positive or negative.
Esilex Montagrius
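[Editor's note: as an illustrative sketch of the point above (not part of the original comment), the first-order open-loop system H(s) = K/(s + a) can be integrated numerically and driven by a sinusoid. The pole location a and forcing frequency omega below are assumed, arbitrary values. The output lags the input even though there is no feedback loop at all, which is the commenter's point:]

```python
import math

def first_order_response(a, K, omega, dt=1e-3, t_end=50.0):
    """Forward-Euler integration of dy/dt = -a*y + K*sin(omega*t),
    the time-domain form of the open-loop transfer function
    H(s) = K/(s + a) driven by a sinusoidal input."""
    y, t, samples = 0.0, 0.0, []
    while t < t_end:
        y += dt * (-a * y + K * math.sin(omega * t))
        t += dt
        samples.append((t, y))
    return samples

# Assumed illustrative values (not from the comment): unit-amplitude
# forcing at omega = 1 rad/s, pole at a = 0.5.
a, K, omega = 0.5, 1.0, 1.0
samples = first_order_response(a, K, omega)

# Steady-state theory for a first-order lag:
#   amplitude = K / sqrt(a^2 + omega^2),  phase lag = atan(omega / a).
amp_theory = K / math.sqrt(a**2 + omega**2)
lag_theory = math.atan(omega / a)

# Measure the simulated amplitude over the final forcing cycle.
tail_start = samples[-1][0] - 2 * math.pi
amp_sim = max(y for t, y in samples if t > tail_start)
print(f"simulated amplitude {amp_sim:.3f} vs theory {amp_theory:.3f}")
print(f"phase lag {lag_theory:.3f} rad -- a lag, with no feedback loop")
```

With these values the output lags the forcing by atan(2) ≈ 1.11 rad, analogous to the roughly one-month lag of temperature behind daylight hours in Figure 1: a lag is the signature of energy storage in an open-loop low-pass system, not by itself evidence of feedback.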

• Esilex,
Another way to express this is with the differential equation Pi(t) = Po(t) + dE(t)/dt, where Pi(t) is the instantaneous solar power arriving at the planet, Po(t) is the instantaneous power leaving the planet, and E(t) is the energy stored by the planet. When Po > Pi, dE/dt is negative and the planet cools; otherwise the planet warms. We also know that dT/dE is a constant, where T is the average surface temperature (i.e. 1 calorie -> 1 g H2O -> 1 C).
We can define an amount of time, tau, such that all of E could be emitted in tau seconds at the rate Po. Rewriting, we get Pi(t) = E(t)/tau + dE/dt, which you should recognize as the linear time-invariant (LTI) equation describing an RC circuit, where tau is the time constant. The only difference is that the RC time constant is a true constant, while the climate time constant has a significant negative temperature coefficient: as T increases linearly, Po increases as T^4, so tau must decrease as T^3.
Pi can be rewritten as a function of albedo, which can itself be expressed as a function of ice extent, cloud coverage, and surface and cloud reflectivity. Po can similarly be rewritten as a function of the clear-sky and cloudy-sky emissions weighted by cloud coverage. Each of these can be further decomposed spatially and temporally. I've done the comparison to satellite data (ISCCP), and the data conforms to the constraints of this formulation to a remarkable degree across a hierarchy of spatial and temporal averages, from pixels to hemispheres and from days to years.