The climate sensitivity and the surface temperature record question – answers from major players

Image from Warren Meyers - click for his post describing this

Dr. Roger Pielke Senior posted this today. Since he has no comments on his blog, I felt it would be good to repost it here to allow discussion – Anthony

Repost Of Weblog Climatequotes.com “Climate Scientists Answer Question: Should Climate Sensitivity Be Measured By Global Average Surface Temperature Anomaly?”

There is an excellent collection of interviews posted by Sam Patterson on April 23, 2011 on the weblog Climatequotes.com titled

Climate Scientists Answer Question: Should climate sensitivity be measured by global average surface temperature anomaly?

I have reposted his very informative set of interviews and commentary below.

_________________________________________________

From: Climatequotes.com

Note: I wrote this post many weeks ago and never posted it because I was waiting for some more feedback. However, Pielke Sr. has posted specifically on this issue recently and Watts ran it also, so I feel now is a good time to post it.

This post deals with the question of whether or not climate sensitivity should be measured by global average surface temperature anomaly. I asked multiple climate scientists for their opinions, and their responses are below. First, some background.

Over at The Blackboard there is an interesting guest post by Zeke. He attempts to find areas where agreement can take place by laying out his beliefs and putting a certain confidence level on them. This idea was commented upon by several blogs and scientists. Judith Curry, Anthony Watts, Jeff Id, and Pielke Sr. all contributed. I want to focus on Pielke’s response, because he challenges a core assumption of the exercise.

In Zeke’s post, he gives his position on climate sensitivity:

Climate sensitivity is somewhere between 1.5 C and 4.5 C for a doubling of carbon dioxide, due to feedbacks (primarily water vapor) in the climate system…

Here is Pielke’s response to this claim:

The use of the terminology “climate sensitivity” indicates an importance of the climate system to this temperature range that does not exist. The range of temperatures of “1.5 C and 4.5 C for a doubling of carbon dioxide” refers to a global annual average surface temperature anomaly that is not even directly measurable, and its interpretation is even unclear…

Pielke goes on to explain that he has dealt with this issue previously in the paper entitled “Unresolved issues with the assessment of multi-decadal global land surface temperature trends.” Here is the main thrust of his response:

This view of a surface temperature anomaly expressed by “climate sensitivity” is grossly misleading the public and policymakers as to what are the actual climate metrics that matter to society and the environment. A global annual average surface temperature anomaly is almost irrelevant for any climatic feature of importance.

So we know Pielke’s position. He is adamantly opposed to using surface temperature anomaly when discussing climate sensitivity, for various reasons, not the least of which is that it ignores metrics that actually matter to people.

I haven’t heard this view expressed very often, so I decided to contact other climate scientists and find out their opinions on this issue. I asked the following questions and invited them to give their general impressions:

1. Do you believe that global annual average surface temperature anomaly is the best available metric to discuss climate sensitivity?

If yes to Question 1, then:

2. Could you briefly explain why you consider global annual average surface temperature anomaly the best available metric to discuss climate sensitivity?

If no to question 1, then:

2. What do you believe is the proper metric to discuss climate sensitivity, and could you briefly explain why?

John Christy

1. Do you believe that global annual average surface temperature anomaly is the best available metric to discuss climate sensitivity?

No. The surface temperature, especially the nighttime minimum, is affected by numerous factors unrelated to the global atmospheric sensitivity to enhanced greenhouse forcing (I have several papers on this.) The ultimate metric is the number of joules of energy in the system (are they increasing? at what rate?). The ocean is the main source for this repository of energy. A second source, better than the surface, but not as good as the ocean, is the bulk atmospheric temperature (as Roy Spencer uses for climate sensitivity and feedback studies.) The bulk atmosphere represents a lot of mass, and so tells us more about the number of joules that are accumulating.
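Christy’s preferred metric, joules accumulating in the system, can be made concrete with a small sketch. This is purely illustrative: the constants are standard textbook values for seawater, and the temperature-anomaly profile is made up, not data or his method.

```python
# Ocean heat content anomaly for a 1 m^2 column of ocean, integrating a
# temperature anomaly profile over depth. Illustrative numbers only.

RHO_SEAWATER = 1025.0   # kg/m^3, typical seawater density
CP_SEAWATER = 3990.0    # J/(kg K), typical seawater specific heat

def column_heat_anomaly(layers):
    """layers: list of (thickness_m, delta_T_kelvin) from the surface down.
    Returns the heat content anomaly in J per m^2 of ocean surface."""
    return sum(RHO_SEAWATER * CP_SEAWATER * dz * dT for dz, dT in layers)

# Hypothetical profile with warming concentrated near the surface:
profile = [(100.0, 0.10), (200.0, 0.05), (400.0, 0.01)]
q = column_heat_anomaly(profile)
print(f"{q:.3e} J/m^2")  # ~9.8e7 J/m^2 for this made-up profile
```

The point of the metric is visible in the arithmetic: a small temperature change over a large mass of water represents an enormous number of joules, which is why Christy calls the ocean the main repository.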

Patrick Michaels

I think it is a reasonable metric in that it integrates the response of temperature where it is important–i.e. where most things on earth live. However, it needs to be measured in concert with ocean measurements at depth and with both tropospheric and stratospheric temperatures. For example, if there were no stratospheric decline in temperature, then lower tropospheric or surface rises would be hard to attribute to ghg changes. Because we don’t have any stratospheric proxy (that I know of) for the early 20th century, when surface temperature rose about as much as they rose in the late 20th, we really don’t know the ghg component of that (though I suspect it was little to none).

Having said that, I suspect that where we do have such data, it is indicative that the sensitivity is lower than generally assumed, but not as low as has been hypothesized by some.

Gavin Schmidt

Your questions are unfortunately rather ill-posed. This is probably not your fault, but it is indicative of the confusion on these points that exist.

“Climate sensitivity” is *defined* as being the equilibrium response of the global mean surface temperature to a change in radiative forcing while holding a number of things constant (aerosols, ice sheets, vegetation, ozone) (c.f. Charney 1979, Hansen et al, 1984 and thousands of publications since). There is no ambiguity here, no choice of metrics to examine, and no room for any element of belief or non-belief. It is a definition. There are of course different estimates of the surface temperature anomaly, but that isn’t relevant for your question.

There are of course many different metrics that might be sensitive to radiative forcings that one might be interested in: Rainfall patterns, sea ice extent, ocean heat content, winds, cloudiness, ice sheets, ecosystems, tropospheric temperature etc. Since they are part of the climate, they will be sensitive to climate change to some extent. But the specific terminology of “climate sensitivity” or the slightly expanded concept of “Earth System Sensitivity” (i.e Lunt et al, 2010) (that includes the impact on the surface temperature of the variations in the elements held constant in the Charney definition), are very specific and tied directly to surface temperature.

People can certainly hold opinions about which, if any, of these metrics are of interest to them or are important in some way, and I wouldn’t want to prevent anyone from making their views known on this. But people don’t get to redefine commonly-understood and widely-used terms on that basis.

I sent a response to Gavin clarifying my questions and including Pielke Sr.’s comments. Here is his response to Pielke’s comments:

I disagree. Prof. Pielke might not find the global temperature anomaly interesting, but lots of other people do, and as an indicator for other impacts, it is actually pretty good. Large-scale changes in rainfall patterns, sea ice amount, etc. all scale more or less with SAT. (They can vary independently of course, and so ‘one number’ does not provide a comprehensive description of what’s happening).
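For readers who want to see how the Charney-style definition Gavin describes gets applied in practice, here is a minimal sketch. It uses the widely cited logarithmic forcing approximation (roughly 5.35·ln(C/C0) W/m², from the Myhre et al. fit) and expresses sensitivity as warming per doubling; the numbers are illustrative, not anyone’s published calculation.

```python
import math

# Sketch: CO2 forcing is approximately logarithmic in concentration, and the
# equilibrium surface temperature response scales with the number of doublings.

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate radiative forcing (W/m^2) of CO2 at c_ppm vs a baseline."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def equilibrium_warming(c_ppm, sensitivity_per_doubling, c0_ppm=280.0):
    """Equilibrium global-mean surface temperature change (K), for a climate
    sensitivity expressed as K per doubling of CO2."""
    doublings = math.log2(c_ppm / c0_ppm)
    return sensitivity_per_doubling * doublings

# Zeke's stated range, applied to one doubling (280 -> 560 ppm):
for s in (1.5, 3.0, 4.5):
    print(s, equilibrium_warming(560.0, s))  # 1.5, 3.0, 4.5 K by construction
```

Note how the definition works: for exactly one doubling the answer is the sensitivity itself, which is why the whole debate here is about whether surface temperature is the right quantity to attach that number to.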

Kevin Trenberth

1. Do you believe that global annual average surface temperature anomaly is the best available metric to discuss climate sensitivity?

This is not a well-posed question. This relates to definition: the sensitivity is defined that way. It is not necessarily the best metric for climate change.

If yes to Question 1, then:

2. Could you briefly explain why you consider global annual average surface temperature anomaly the best available metric to discuss climate sensitivity?

I think the best metric overall is probably global sea level as it cuts down on weather and related noise. But global mean temperature can be carried back in time more reliably and it is reasonably good as long as decadal values are used.

If no to question 1, then:

2. What do you believe is the proper metric to discuss climate sensitivity, and could you briefly explain why?

However, it is all variables collectively that make a sound case.

Pielke Sr.

We have already discussed Pielke’s position, but I contacted him to find out what metrics he would prefer to use. Here is his response:

1. Do you believe that global annual average surface temperature anomaly is the best available metric to discuss climate sensitivity?

NO

If yes to Question 1, then:

2. Could you briefly explain why you consider global annual average surface temperature anomaly the best available metric to discuss climate sensitivity?

If no to question 1, then:

2. What do you believe is the proper metric to discuss climate sensitivity, and could you briefly explain why?

The term “climate sensitivity” is not an accurate term to define how the climate system responds to forcing, when it is used to state a response in just the global average surface temperature. This is more than a semantic issue, as the global average surface temperature trend has been the primary metric used to communicate climate effects of human activities to policymakers. The shortcoming of this metric (the global average surface temperature trend) was discussed in depth in

National Research Council, 2005: Radiative forcing of climate change: Expanding the concept and addressing uncertainties. Committee on Radiative Forcing Effects on Climate Change, Climate Research Committee, Board on Atmospheric Sciences and Climate, Division on Earth and Life Studies, The National Academies Press, Washington, D.C., 208 pp. http://www.nap.edu/openbook/0309095069/html/

but has been mostly ignored in assessments such as the 2007 IPCC WG1 report.

A more appropriate metric to assess the sensitivity of the climate system heat content to forcing is the response in Joules of the oceans, particularly where most of the heat changes occur. I discuss this metric in

Pielke Sr., R.A., 2008: A broader view of the role of humans in the climate system. Physics Today, 61, Vol. 11, 54-55.

http://pielkeclimatesci.files.wordpress.com/2009/10/r-334.pdf

Pielke Sr., R.A., 2003: Heat storage within the Earth system. Bull. Amer. Meteor. Soc., 84, 331-335. http://pielkeclimatesci.files.wordpress.com/2009/10/r-247.pdf

More generally, in terms of true climate sensitivity, more metrics are needed as we discussed in the 2005 NRC report. The Executive summary includes the text [http://www.nap.edu/openbook.php?record_id=11175&page=4]

“Despite all these advantages, the traditional global mean TOA radiative forcing concept has some important limitations, which have come increasingly to light over the past decade. The concept is inadequate for some forcing agents, such as absorbing aerosols and land-use changes, that may have regional climate impacts much greater than would be predicted from TOA radiative forcing. Also, it diagnoses only one measure of climate change “global mean surface temperature response” while offering little information on regional climate change or precipitation. These limitations can be addressed by expanding the radiative forcing concept and through the introduction of additional forcing metrics. In particular, the concept needs to be extended to account for (1) the vertical structure of radiative forcing, (2) regional variability in radiative forcing, and (3) nonradiative forcing. A new metric to account for the vertical structure of radiative forcing is recommended below. Understanding of regional and nonradiative forcings is too premature to recommend specific metrics at this time. Instead, the committee identifies specific research needs to improve quantification and understanding of these forcings.”

It is, therefore, time to move beyond the use of the global annual average surface temperature trend as the metric to define “climate sensitivity”.

Differing views

There are clearly differing views on this subject.

John Christy does not support the metric. He points out that the surface temperature is affected by numerous things other than greenhouse forcing, and then gives two metrics which he prefers. The first is the change in joules in the system, with particular emphasis on the oceans. The second is bulk atmospheric temperature.

Patrick Michaels supports using the metric. He points out that the metric is important because it addresses the area where people live. However, he emphasizes that the surface temperature must be taken in concert with measurements such as ocean temperature at depth, and tropospheric and stratospheric temperatures. Without these other measurements, it would be difficult to assess the impact of GHGs on surface temperature.

Gavin Schmidt supports the metric unreservedly. He and Trenberth rightly point out that climate sensitivity is defined in terms of global average surface temperature anomaly. Of course, the point of my question is to challenge whether this is the best definition. Gavin seems to think so, and points out that the metric is “commonly-understood and widely-used”. He states that other metrics such as rainfall patterns and sea ice amount scale more or less with surface air temperature.

Trenberth is very brief, but states that global average surface temperature anomaly is not necessarily the best metric to use for climate change. He considers that global sea level is a better metric because it cuts down on weather related noise. However, he also points out that global average surface temperature anomaly is useful because it can be applied to the past more reliably. He also states that all variables taken together make a sound case.

Pielke Sr. is adamantly opposed to using this metric. We’ve already discussed his reasons. He also proposes a different metric for assessing climate sensitivity, “A more appropriate metric to assess the sensitivity of the climate system heat content to forcing is the response in Joules of the oceans”. He supports these claims with several of his own papers as well as a NRC report.

Conclusion

Pielke and Christy want to stop assessing climate sensitivity by using global average surface temperature anomaly, and both recommend using a change of joules (particularly in the ocean) as a better metric.

Michaels and Trenberth support the metric while emphasizing that other metrics must also be taken into account. Schmidt does not mention any drawbacks and emphasizes that the metric is already widely used and it works well with other metrics.

It seems to me the main problem here isn’t the metric itself, but the emphasis placed on it. I don’t believe that Pielke or Christy think the metric has no value at all, only that it is a poor choice as the main metric when discussing CO2’s impact on climate. In Pielke’s case, the emphasis on CO2 itself is a problem, as he believes that other human impacts are far more important.

Climate science so frequently focuses on CO2 and temperature that it seems natural climate sensitivity would be measured by global average surface temperature anomaly. A shift away from this metric seems unlikely. However, if it can be shown in the future that a change in joules in the ocean directly contradicts other metrics, then I’m sure this discussion will come up again. Pielke’s paper mentions an apparent contradiction found by Joshua Willis of JPL, although the measurements were taken over only a four-year period. Only time will tell which metric is most valuable.

107 Comments
Doug
April 26, 2011 12:27 pm

How can global sea level even be considered to be a good metric for CO2 sensitivity? We know sea level has been going up at about the same rate before and after increases in CO2 levels, so that alone means that it’s a poor metric. There could very well be a delayed feedback that we have not yet seen, but it’s clear that right now it cannot be used, unless the conclusion you draw is that the sensitivity is exactly zero.
I agree that the ocean heat content is the best one available.

Jeremy
April 26, 2011 12:29 pm

The sniff test on that plot:
250PPM of CO2 would cause cooling?
🙂

MarkW
April 26, 2011 12:30 pm

I’m not a climate scientist, though I do play one on the blogs.
I would not use the surface temperature anomaly for two reasons.
The only globally viable temperature anomaly available is the satellite record, which only goes back 30 years and is still subject to dispute regarding adjustments. The surface station network is such a mangled hash that I do not believe a reliable global value can be recovered from it.
The second reason is that even once we do manage to agree on what the actual surface temperature anomaly is, there is still no agreement as to how much of that anomaly is attributable to CO2 and how much is from other factors. Heck, we still don’t even know what all the other factors might be.

MarkW
April 26, 2011 12:36 pm

Gavin seems to think so, and points out that the metric is “commonly-understood and widely-used”.

Kind of reminds me of the way they used to try and shout down skeptics by declaring that there was a consensus in favor of their position.
The answer above is, in my opinion, the kind of answer a politician gives, not a scientist. Just because a metric is widely used is not evidence that it is right.

BarryW
April 26, 2011 12:39 pm

Dr Schmidt’s definition would be like defining corn crop yield by the average height of the corn stalks. Sure you could define it that way but in actuality it means nothing.

Jacob
April 26, 2011 12:50 pm

An excellent discussion. Pielke Sr. has often advocated a more regional approach to examining climate change as well, focusing on local-to-regional changes rather than the global metric of sensitivity most commonly used. Roy Spencer has advocated the use of the global average temperature (see http://www.drroyspencer.com/2010/05/in-defense-of-the-globally-averaged-temperature/ ) to aid in diagnosing climate change and sensitivity. I think the regional approach should be pursued on top of the global, with the many regional effects of forcings and the various oscillations taken into account in context. That’s just me, however.

Jeremy
April 26, 2011 12:52 pm

Based on Gavin’s response, I’m left to wonder if he doesn’t understand what a measurement of temperature is. What matters to people on the ground is what weather to expect on their crops in the next 6 months. There’s countless major influences on that, but probably the easiest one to point to is whatever dominant oceanic energy cycle affects your area. That is entirely dependent on where you live. Global surface temperature anomaly doesn’t tell you anything about where energy in the form of heat is going between the ocean and atmosphere, so it tells you nothing about what to expect in terms of rainfall or sunshine for the near future.
More importantly, if we’re speaking long-term, as I would expect a climate change person to be speaking of. The energy budget is *all* you should care about. The temperature measured is simply not going to give you the information you want if you’re trying to track where the temperature will be tomorrow. It is a talking point, a footnote, a blurb that has no bearing on the work of someone trying to demonstrate that the world will get warmer.
His reply frankly reads like a form of flailing created by either ignorance of thermodynamics, or an awkward political position. Here’s hoping it’s the latter.

Steve
April 26, 2011 1:02 pm

What about climate scientist sensitivity? That is, the increase in blood pressure for each doubling in contradictions to their preferred theories.

tom in indy
April 26, 2011 1:04 pm

Scientists who don’t like the term ‘climate sensitivity’ should just define a new term, ‘climate responsiveness’, in terms of joules. Or in terms of a weighted average of several factors, where the weights are determined by empirical studies.
Scientists can push their results on ‘climate responsiveness’ out to the press. They won’t know the difference.

Jeff Carlson
April 26, 2011 1:05 pm

next time replace Gavin with a real scientist and not a propagandist like Gavin clearly is …

April 26, 2011 1:09 pm

I’ll go with temperature.
Climate is where I live, and until I get over my fear of sharks, the heat content of the ocean doesn’t do it for me. The oceans moderate the climate, certainly, and likely regulate it, but they aren’t any measure of it. Sea level is not even in the ball park.
Temperature, humidity, precipitation, and their seasonal changes are the climate.

April 26, 2011 1:33 pm

I have checked temps. so far in Europe, Australia, southern Africa and south America.
(35-year records). Everywhere I looked (so far) I did not find the minima rising faster than maxima and mean temps. when looking at them on an annual basis. So draw your conclusions?
http://www.letterdash.com/HenryP/more-carbon-dioxide-is-ok-ok

Anything is possible
April 26, 2011 1:44 pm

“People can certainly hold opinions about which, if any, of these metrics are of interest to them or are important in some way, and I wouldn’t want to prevent anyone from making their views known on this.”
__________________________________________________________
Well, thank you, Gavin. Very generous of you to allow other people to have, and express, opinions on this matter.
The sheer arrogance of the man never fails to amaze me………

Don K
April 26, 2011 1:47 pm

Excellent article. I’m in no way, shape, or form a climate scientist. My problem with surface temperature anomaly as a metric is pragmatic. The value is noisy and apparently sensitive to many things — some of which are poorly understood. As a result, otherwise intelligent people spend an awful lot of time arguing about small changes in values that are very likely some combination of measurement noise, measurement biases, and exogenous events like the movement of warm and cool ocean water masses.

Mike from Canmore
April 26, 2011 1:56 pm

Out of curiosity, how does one measure joules in a system if not through temp?

David, UK
April 26, 2011 1:56 pm

Doug says:
April 26, 2011 at 12:27 pm
How can global sea level even be considered to be a good metric for CO2 sensitivity? We know sea level has been going up at about the same rate before and after increases in CO2 levels, so that alone means that it’s a poor metric.

Or it means that CO2 sensitivity is small and/or overridden by myriad other factors.

Steeptown
April 26, 2011 2:03 pm

Gavin is a mathematician, not a scientist. He doesn’t understand that temperature is an intensive variable and thus that the concept of global temperature is a nonsense.

April 26, 2011 2:10 pm

My metric for climate sensitivity is the value of damage inflicted on humanity per year, caused by a doubling CO2, divided by Global GDP. After a 40% rise of CO2 from pre-industrial times this is 1/1000th of diddly-squat and my projection is that it will continue to be 1/1000th of diddly-squat even after 100% is passed.

Espen
April 26, 2011 2:13 pm

As I’ve commented on a couple of occasions: If we for a moment ignore the fact that most of the heat capacity is in the oceans: Global average temperature isn’t even a valid measure of the heat content of the atmosphere, since a 1 degree air temperature increase represents a larger amount of added energy if the temperature is high to begin with than if it is low to begin with. A +0.5 C global anomaly which is mainly located in temperate or tropic zones may represent about the same “excess heat” as a +1 C (or even more) global anomaly which is mainly located in the Arctic.

David C. Greene
April 26, 2011 2:14 pm

I don’t see the sense of even using the term “climate sensitivity.” Climate is not a single measurable. Even if it were a single measurable quantity, it would logically be subject to several “sensitivities” – in addition to the various gaseous elements of the earth’s atmosphere. Allowing the widespread use of the concept that global “climate” (meaning temperature) is only a function of CO2 , describable by a single number is not science.

George E. Smith
April 26, 2011 2:17 pm

“”””” “Climate sensitivity” is *defined* as being the equilibrium response of the global mean surface temperature to a change in radiative forcing while holding a number of things constant (aerosols, ice sheets, vegetation, ozone) (c.f. Charney 1979, Hansen et al, 1984 and thousands of publications since). There is no ambiguity here, no choice of metrics to examine, and no room for any element of belief or non-belief. It is a definition. “””””
Well that is totally wonderful Gavin. Aerosols, ice sheets, vegetation, and Ozone, are ALL variables of “The Climate”.
How does one observe the response of the climate to a single variable (the radiative forcing) while at the same time insisting that other variables of the climate system shall not change?
Reminds me of some rocket scientist British University researchers (can’t recall who), who did a study (computer modelling) of what happens, when you double the atmospheric CO2 abundance; which I presume is the same change, as a CO2 doubling; while at the same time holding the surface Temperature constant. Evidently their study was an analysis of how the laws of Physics change; how else could you double the atmospheric abundance of CO2; or cause a CO2 doubling; your choice, and at the same time have no change in the surface Temperature; which after all, we are told, is a direct consequence of the CO2 abundance.
I learn something every day. Today I learned from Dr Schmidt that the global mean surface Temperature is (or can be) in equilibrium. Well I suppose if Kevin Trenberth can have the earth conduct heat so fast as to remain an isothermal body; then the global mean surface Temperature can be in equilibrium. Gavin doesn’t seem to be sure whether he is talking about the “global mean surface Temperature” ; or the global mean surface Temperature anomaly ! Is this the sloppy way NASA does climate science; perhaps NASA needs to get back to Aeronautics, and Space.
And by the way; presuming that the actual Warming degree C as a function of the PPM CO2 actually falls somewhere in the range between the blue and the red curves; blue for cold, and red for hot; how can one be sure that the function isn’t a straight line. Can you prove it doesn’t follow a curve of the form:- y = exp(-1/x^2) ?

JeffT
April 26, 2011 2:34 pm

To determine whether there is net change happening, instead of some redistribution, we need a global measure. Global average surface temperature is surely imperfect, but it is difficult to find a better indicator. Total system (ocean at all depths, ice and atmosphere) energy content would be great, but we don’t have such a measurement yet. When we do have total system energy, we won’t have a history with which to compare it.
Sea level is attractive as a global measure, because it is sensitive to the total ocean heat content. As Trenberth says, it has a decent signal/noise ratio. However, the volume change of sea water caused by adding some quantity of heat depends on pressure and temperature. Consequently, the distribution of changes in heat content affects sea-level. Of course, glacial meltwater and the drainage of aquifers also add to sea-level. So sea-level requires careful interpretation, which some would deride as “adjustments.”
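JeffT’s point about sea level needing careful interpretation can be sketched numerically. The expansion coefficients below are representative order-of-magnitude values for warm surface water versus cold deep water, not a real equation of state; the layer and warming figures are made up for illustration.

```python
# Sketch: the sea-level contribution of a given warming depends on WHERE the
# heat goes, because seawater's thermal expansion coefficient varies strongly
# with temperature (and pressure). Representative coefficients, not real data.

def thermosteric_rise_mm(layer_thickness_m, delta_T, alpha_per_K):
    """Sea-level contribution (mm) from warming a layer by delta_T kelvin,
    using a fixed thermal expansion coefficient alpha for that layer."""
    return layer_thickness_m * alpha_per_K * delta_T * 1000.0

# The same 0.1 K warming over a 700 m layer, placed in warm vs cold water:
warm = thermosteric_rise_mm(700.0, 0.1, 3.0e-4)  # warm tropical surface water
cold = thermosteric_rise_mm(700.0, 0.1, 1.0e-4)  # cold deep/polar water
print(warm, cold)  # 21.0 vs 7.0 mm: roughly the same joules, very different signal
```

This is exactly why the distribution of heat content changes matters to the sea-level record, and why raw altimetry needs the interpretation (or “adjustments”) JeffT mentions.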

1DandyTroll
April 26, 2011 2:35 pm

So, essentially, I’m to believe that image depicting climate sensitivity to CO2 is real, good to honest, truest of science, and not the para-climatological modeled garbage from the house of Hansen, the preamble of the IPCC church of climate.
And, of course, everything assumes the static value of 280 ppm, the only statistical value that never change the higher resolution the future brings and, apparently, a value nobody seem to know if there is any actual scientific proof for no more.
But what does it all matter we do so have the global temperature from 1750, for, apparently, because, I mean, we got it for, well, London, and like the crazed climate communist hippies always says: One (tree)ring to to rule ’em all (and one localized ancient temperature measurement for global reference.)

rpielke
April 26, 2011 2:47 pm

Mike from Canmore – Excellent question. The joule is the unit of heat, and heat content depends on mass; for example, an object at 10C has twice the heat content of another object at 10C with half the mass. In the atmosphere, the contribution of water vapor to the heat content must also be included; e.g. see:
Pielke Sr., R.A., C. Davey, and J. Morgan, 2004: Assessing “global warming” with surface heat content. Eos, 85, No. 21, 210-211. http://pielkeclimatesci.wordpress.com/files/2009/10/r-290.pdf
and Section 3 in
Pielke, R.A. Sr., K. Wolter, O. Bliss, N. Doesken, and B. McNoldy, 2006: The July 2005 Denver heat wave: How unusual was it? Nat. Wea. Dig., 31, 24-35.
http://pielkeclimatesci.wordpress.com/files/2009/10/r-313.pdf
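The surface heat-content idea in the papers Pielke cites can be sketched with the moist enthalpy per unit mass of air, h = Cp·T + Lv·q, which counts the latent heat carried by water vapor alongside the sensible heat. The constants are standard textbook values and the humidity figures are illustrative, not taken from those papers.

```python
# Sketch: moist enthalpy of air, h = Cp*T + Lv*q. Two air masses at the same
# thermometer reading can hold very different amounts of heat if one is humid.

CP_AIR = 1005.0  # J/(kg K), specific heat of dry air at constant pressure
LV = 2.5e6       # J/kg, latent heat of vaporization of water

def moist_enthalpy(temp_k, specific_humidity):
    """Moist enthalpy (J/kg) of air at temp_k with the given specific
    humidity (kg of water vapor per kg of air)."""
    return CP_AIR * temp_k + LV * specific_humidity

# Hypothetical dry vs humid air, both at 30 C (303.15 K):
dry = moist_enthalpy(303.15, 0.002)
humid = moist_enthalpy(303.15, 0.020)
print(humid - dry)  # 45000.0 J/kg more in the humid air, at the same temperature
```

This is the crux of Pielke’s objection to temperature-only metrics: a thermometer alone misses the water-vapor term entirely.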

DeNihilist
April 26, 2011 3:11 pm

Averaging is like the old saw about statistics. That is why I agree with Dr. Pielke.
In my household, the average height for the four of us is 5′ 9″. If I was to build my doorways at 6′ heights, only two of us could walk through these doorways without ducking.

RockyRoad
April 26, 2011 3:11 pm

Jeremy says:
April 26, 2011 at 12:29 pm

The sniff test on that plot:
250PPM of CO2 would cause cooling?
🙂

Correct. With the baseline in the graph above established at zero at the inception of the industrial revolution, any concentration of CO2 less than that would indeed cause “cooling”.
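RockyRoad’s point follows directly from the logarithmic shape of the CO2 forcing curve: anything below the chosen baseline comes out negative. A quick sketch, using the common 5.35·ln(C/C0) W/m² approximation (the baseline of 280 ppm is the conventional pre-industrial value assumed in the graph):

```python
import math

# With the baseline at 280 ppm, the logarithmic forcing is negative for any
# concentration below 280 ppm, so the curve shows "cooling" at 250 ppm.

def co2_forcing(c_ppm, baseline_ppm=280.0):
    """Approximate CO2 radiative forcing (W/m^2) relative to the baseline."""
    return 5.35 * math.log(c_ppm / baseline_ppm)

print(round(co2_forcing(250.0), 3))  # -0.606 W/m^2: below baseline -> negative
print(round(co2_forcing(560.0), 3))  # 3.708 W/m^2: one doubling above baseline
```

So the negative branch of the plot is an artifact of the baseline choice, not a claim that 250 ppm air actively cools anything.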

icecover
April 26, 2011 3:12 pm

The answer is sticking out like a sore thumb: a combination/index of sea surface temperature + 600mb TLT temperatures combines the physical measurement with the supposed CO2 gas effect, if any. Unfortunately we only have 30 years of data, and that per se is not showing anything anomalous. Excuse the upmanship.

April 26, 2011 3:16 pm

I think the best metric would be ocean heat content because the energy retaining capability of the oceans is so vast.
Furthermore my description of the climate system relies on solar induced variations in energy input to the oceans as the primary driver of subsequent variations in the climate system with the oceans simply operating to ration the rate of release of that energy to the air and thereby modulating the solar effect.
At its simplest the system works as follows:
Where the consensus went wrong was firstly in attributing the observed climate effects to CO2 and secondly in failing to realise that the variations in ozone quantities in the atmosphere at different levels were solar induced and nothing to do with human activity (or at least a miniscule proportion due to human activity).
Furthermore it is really nothing to do with the properties of greenhouse gases per se.
The consensus view is that GHGs simply retain more energy and so warm the planet. It doesn’t work like that at all. It was proposed that an active sun warmed all the layers of the atmosphere together and that a quiet sun cooled them all together but that is not what we have seen.
In reality solar changes from above and oceanic changes from below change the surface pressure distribution to alter cloudiness and albedo so as to alter energy input to the oceans and it is that which changes the system energy content.
So as per my Hot Water Bottle Effect the role of the oceans as an energy retaining component of the system vastly outweighs the so called greenhouse effect in the air. Furthermore the CO2 portion of the total GHG effect in the air is tiny and the human portion even tinier.
Also the system response is reversed from that of the consensus view. That says that more GHGs warm the atmosphere. However what actually happens is that a more active sun COOLS the stratosphere and mesosphere by reducing GHGs (principally ozone) in the mesosphere above 45km (as per Joanna Haigh’s comments) so as to cool the stratosphere too despite MORE ozone in the stratosphere (from more UV) thereby drawing the jets poleward and allowing more solar energy into the oceans.
The active sun therefore allows a faster energy loss to space from the atmosphere (by destroying more ozone above 45km) but in doing so allows more solar energy into the oceans (by shifting the jets poleward) than the extra radiative energy lost to space.
So it really is nothing like the consensus view as to how GHGs work on the system energy budget and the surface temperature metric is of little use in ascertaining current system energy content or even the current trend in such energy content.

icecover
April 26, 2011 3:19 pm

Agree with R. Pielke — a temperature-humidity index (THI) should be included in the equation, possibly derived from sea temps (SST) + LTL (lowest troposphere possible) temps. It might be SST/LTL × 100/H or something similar; I leave that to the experts. My 10 cents' worth.

April 26, 2011 3:38 pm

It appears very similar to what happens in the medical field, some merely focus on the symptoms whilst others dig deeper to determine the underlying causes.

Arfur Bryant
April 26, 2011 3:40 pm

Here’s what the IPCC says about ‘climate sensitivity’:
8.6.1
“Climate sensitivity is a metric used to characterise the response of the global climate system to a given forcing. It is broadly defined as the equilibrium global mean surface temperature change following a doubling of atmospheric CO2 concentration…”
http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch8s8-6.html
So… the IPCC considers climate sensitivity to be a metric in itself. Gavin didn’t mention CO2 but the IPCC does. Most other definitions of climate sensitivity will include CO2 in the definition, and the implication is clear – it is the response of global temperature to a doubling in CO2 concentration.
Let's look at that in more detail:
Since the 'baseline level' of 280 ppmv in 1850, CO2 has increased by about 40%. Other radiative GHGs have increased even more. At the same time, but not necessarily dependent upon it, the global temperature (as far as it can be judged) has increased by about 0.8 deg C.
The climate sensitivity, as ASSUMED by Gavin and the IPCC, is currently running at 2 deg C. However, and this is a big however, this figure ASSUMES that all the warming since 1850 has been caused by radiative forcing. As no-one can verify that assumption, we are left with the probability that an unknown portion of the warming MAY have been caused by radiative forcing, so the actual climate sensitivity has to be LESS than 2 deg C. It could be anything from 0.1 C to 1.9 C. Unfortunately, there is no proof that the increase in CO2 has made any quantifiable difference in the real planet’s global temperature. As the next doubling has to have a lower ‘climate sensitivity’ (due to the log effect), I have to ask “What the **** is the problem?”
The discussion has no meaning. Inventing a term based on an assumption and then arguing over how valid the term is, is an exercise in futility.
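As an editorial aside, the implied-sensitivity arithmetic in the comment above is easy to reproduce. The figures (280 → ~390 ppm, ~0.8 °C observed) are the comment's own assumptions, and the logarithmic scaling is the standard simplified treatment, so this is a sketch of the arithmetic rather than an endorsement of either number:

```python
import math

# Figures taken from the comment above (assumed, not measured here):
# CO2 up ~40% since 1850 (280 -> ~390 ppm), ~0.8 C of observed warming.
C0, C, dT_obs = 280.0, 390.0, 0.8

# Treating the response as proportional to the fractional CO2 increase
# gives roughly the comment's "2 deg C" figure:
s_linear = dT_obs / ((C - C0) / C0)

# Under the standard logarithmic forcing law the implied per-doubling
# figure comes out a little lower:
s_log = dT_obs * math.log(2) / math.log(C / C0)

print(f"linear scaling: {s_linear:.2f} C per doubling")
print(f"log scaling:    {s_log:.2f} C per doubling")
```

Either way, the number is an upper bound under the comment's stated assumption that all observed warming was radiatively forced.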

April 26, 2011 3:46 pm

Let’s see, the measurements are inadequate at best and probably garbage or have had many questionable adjustments made to them and/or are over too short a time period and the error bands are larger than the signal we’re looking for.
I think the whole field of “climate science” can best be summed up as “never has so much been made out of so little, by dint of vast expense and effort”.
Enough already!

Lady Life Grows
April 26, 2011 3:47 pm

Humans are the only creatures known to care about life forms and places far from them, such as the arctic or the ocean depths. We do care.
But we still care 100 times more about those things that relate more directly to our own survival. The difference in fisheries will mean much less than terrestrial differences. That is why the current definition relates to surface temperatures and that is why it will remain so.
It might be easier to get a precision metric of ocean heat in joules, and that item will be measured. But the metric both scientists and laymen will pay the most attention to will remain surface temperatures, with emphasis on temperatures in inhabited land areas.

Robert of Ottawa
April 26, 2011 3:54 pm

Well, Hot Damn!

Robert of Ottawa
April 26, 2011 3:55 pm

Absolutely agree with you, Stephen Wilde; ocean heat content is THE definitive indicator of planetary "warming".

Jer0me
April 26, 2011 4:12 pm

Very, very interesting.
Agnostics prefer not to rely on such a variable metric as 'average' 'global' 'temperature' (all words that could be misinterpreted or mis-measured are quoted – ooops, that's all of them!). Believers seem to want to use it, and want to disregard others.
I particularly like Gavin's 'it is the definition because it is the definition' blinkered response. That is very telling. To me, it really says: "This says what we want it to say, so this is the one we have to use".
Sea levels, though? Really? Is there any possible signal that could be more easily swamped by daily noise than sea levels? They vary by tide, currents and air pressure, and none of these effects can be fully determined! Perhaps because it is such a small metric, and needs so much 'adjusting'? It is heading down fast now, however, so maybe he will change his mind soon.
I have to agree that sea temperature is the only solid and defensible metric. That is where most of the 'energy' in the system is, and that is what we want to measure, is it not? The rest is just natural variation and noise, surely?

Matthew W.
April 26, 2011 4:21 pm

At the very end……………….. What difference does it make???
The planet will do fine either way.

Robert of Ottawa
April 26, 2011 4:21 pm

rpielke April 26, 2011 at 2:47 pm
Joules is a unit of energy. Just being picky, but words mean things.

Robert of Ottawa
April 26, 2011 4:22 pm

OK OK Joule is a unit of energy. Joules are several units of energy. Bad grammar; bad boy; Mea Culpa!

April 26, 2011 4:35 pm

Given the variety of measurement errors affecting surface temperature measurements… which exceed the so-called sensitivity range in many instances… and given the regional distribution of both heating and cooling patterns, it would seem that climate changes might better be monitored by various proxies in strictly rural/wilderness areas.
Climate is not only temperature, but precipitation, number of sunny days, and a variety of other influences on life forms. So, once you eliminate the effects of pollution and artificial sources of heat and drought, you can then examine the “natural” phenomena affecting the “biosphere.”
But we seem to like simple, easily understood, wrong answers to the complex, insightful analyses of the dynamics of our environment. Dr. Pielke, Sr. pushes in the direction of this complexity while alarmists push in the direction of simplicity. And once you begin to grasp the quantitative aspect of this complexity, you can begin to answer the question of whether or not the various changes are beneficial or detrimental toward supporting more or less life on earth. Hint: would longer growing seasons above 45 degrees latitude be beneficial or harmful to most life forms in that zone?

Dr A Burns
April 26, 2011 4:40 pm

Trenberth above claims sea levels are the best climate sensitivity measure. He said to me in an email correspondence:
Trenberth: The rates (of sea level rise) have not been steady and picked up markedly in the mid 20th century and even more since 1990 or so. CO2 has been increasing since 1750 although mainly since 1850.
One wonders where he gets his data ?

George E. Smith
April 26, 2011 5:09 pm

“”””” Arfur Bryant says:
April 26, 2011 at 3:40 pm
Here’s what the IPCC says about ‘climate sensitivity’:
8.6.1
“Climate sensitivity is a metric used to characterise the response of the global climate system to a given forcing. It is broadly defined as the equilibrium global mean surface temperature change following a doubling of atmospheric CO2 concentration…” “””””
You see the varmints can’t even get it straight among themselves; or izzat amongst themselves !
Gavin; excuse me; that’s Dr Schmidt says it is the “”””” “Climate sensitivity” is *defined* as being the equilibrium response of the global mean surface temperature to a change in radiative forcing while holding a number of things constant “””””
And the IPCC says it is: “”””” It is broadly defined as the equilibrium global mean surface temperature change following a doubling of atmospheric CO2 concentration…” “””””
So which the hell is it that drives the "Global Mean Surface Temperature"? Is it a doubling of the atmospheric CO2 abundance, or is it a change in "Radiative Forcing"?
One of those would seem to be in "PPM CO2 ratio", and the other would seem to be in "Watts per square metre"; and either one would seem to require that the Global Mean Temperature come to some sort of equilibrium, which it never has, and never does.
I always thought that it was the late Dr Stephen Schneider who invented “Climate Sensitivity”, and defined it as the increase in Global mean Surface Temperature for a doubling of the atmospheric CO2 molecular abundance; thereby enshrining forever the myth that somehow the Temperature is a logarithmic Function of the CO2 abundance; which it clearly isn’t. Going from 1 PPM of CO2 , to 2 PPM of CO2 is surely not going to have the same result as going from 280 PPM to 560 PPM; which a logarithmic relationship would demand; or is it only the current doubling of CO2 from its millions of years history of being 280 PPM to its soon to be value of 560 PPM.
Yes Climatism surely is an exact science; so is bar room darts.
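The logarithmic claim being questioned above can be made concrete. A short sketch using the standard simplified forcing expression ΔF = 5.35 ln(C/C0) — which is a curve fit quoted only for a limited concentration range, and that restricted domain is essentially the commenter's 1 ppm → 2 ppm objection:

```python
import math

def co2_forcing(c_new, c_old, alpha=5.35):
    """Simplified logarithmic CO2 forcing in W/m^2: dF = alpha * ln(C/C0)."""
    return alpha * math.log(c_new / c_old)

# Under a pure log law, equal concentration RATIOS give equal forcing:
f_tiny = co2_forcing(2, 1)       # 1 ppm -> 2 ppm
f_real = co2_forcing(560, 280)   # 280 ppm -> 560 ppm

print(f"{f_tiny:.2f} W/m^2 vs {f_real:.2f} W/m^2")  # identical values
```

Because the fit is only claimed valid in the broad vicinity of modern concentrations, extrapolating it down to a few ppm is outside its stated domain rather than a prediction of the relationship.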

Brian H
April 26, 2011 5:58 pm

It’s almost meaningless, and is actually around 0.5.
😉

Quis custoddiet ipos custodes
April 26, 2011 6:05 pm

A report was just published that talks about the POTENTIAL effects of higher temperatures from climate change and how that might affect rainfall, snowfall and runoff – Interior Releases Report Highlighting Impacts of Climate Change to Western Water Resources http://www.doi.gov/news/pressreleases/Interior-Releases-Report-Highlighting-Impacts-of-Climate-Change-to-Western-Water-Resources.cfm and noted here as well http://www.scpr.org/news/2011/04/25/report-climate-change-worsens-western-water-woes/?
I personally find the "global annual average surface temperature anomaly" to be a fairly useless metric when it comes down to making policy decisions on what I (or the policy makers out here in CA) should be focusing on. Local metrics are needed.

stevo
April 26, 2011 6:11 pm

For how long have you been interested in climate issues, Mr Watts? Some years, maybe? How, then, are you still unable to distinguish between a transient response and an equilibrium response? The graph you post is meaningless.
I think there are three possibilities here:
1. You can’t understand the basics of climate science
2. You could understand, but you have a closed mind and refuse to understand.
3. You do understand, but you’re deliberately trying to mislead others.
Which is it?

KevinK
April 26, 2011 6:36 pm

Temperature is MOST DEFINITIVELY NOT A MEASURE OF ENERGY!!!!!!
Temperature is a measure of the rate at which molecules are vibrating. Temperature is the RESULT of the amount of energy absorbed by a volume of material AND the thermal capacity of said material.
I suggest a simple experiment; place a 1 inch sphere of copper (or 25 mm for those outside of the USA) in a pot of boiling water (212 F/100 C). Also place a sphere of polystyrene (same size, in whichever units you prefer) in the same pot of boiling water. Wait 30 minutes or so for the spheres to reach "equilibrium" with the boiling water. Now, remove both with a set of tongs and firmly hold one in your right hand and one in your left hand. Please report back: which hand has a nasty burn?
Conclusions;
TEMPERATURE DOES NOT EQUAL ENERGY; it depends on the thermal capacity of the materials involved………………………..
THE SPEED OF HEAT TRAVELLING THROUGH A SYSTEM IS CRITICAL; if you leave a metal pipe wrench in your yard in August (sunny day in the Northern Hemisphere) and pick it up after lunch the chances are you will wince and drop it. If you leave a plastic pool toy next to it and pick it up at the same time you will probably not wince. WHY ? the metal has a higher thermal capacity AND a higher speed of heat (aka thermal diffusivity). So the metal (which is at about the same temperature as the plastic) has absorbed much more energy and can transfer it to your skin much faster than the plastic can.
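The sphere experiment can be put in numbers. A rough sketch with nominal handbook values — the densities and specific heats below are assumed round figures, not measurements:

```python
import math

def stored_energy(radius_cm, density_g_cm3, c_J_gK, dT):
    """Heat taken up by a sphere warmed by dT degrees: Q = m * c * dT (joules)."""
    volume = (4.0 / 3.0) * math.pi * radius_cm**3   # cm^3
    mass = volume * density_g_cm3                   # g
    return mass * c_J_gK * dT                       # J

dT = 80.0   # roughly 20 C room temperature -> 100 C boiling water
r = 1.27    # a 1 inch diameter sphere has a ~1.27 cm radius

q_copper = stored_energy(r, 8.96, 0.385, dT)   # copper: dense, low specific heat
q_styrene = stored_energy(r, 1.05, 1.30, dT)   # solid polystyrene (assumed values)

print(f"copper:      {q_copper:.0f} J")
print(f"polystyrene: {q_styrene:.0f} J")
```

Same size, same temperature, yet the copper sphere holds roughly two and a half times the energy — which is the comment's point that temperature alone does not tell you the energy content.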
As long as the climate scientists insist that they can calculate “net energy gains” and use temperature as their means for ”observing” these gains they are just travelling further down their own little rabbit hole.
Those of us in the engineering field that have prepared proper energy budgets (yes, they do exist for Earth orbiting satellites among other applications) use the detection of “energy gain” as a RED FLAG telling us that our calculations are WRONG AND CANNOT REPRESENT REALITY.
If you doubt this, I suggest you do an online patent search on the following terms;
“voltage amplifier” (a device that produces “net voltage gain”)
Or
“current amplifier” (etc….)
Or
“torque amplifier”, or “power amplifier”, “signal amplifier”, “pressure amplifier”, or just “amplifier”.
NOW SEARCH ON “ENERGY AMPLIFIER”………………
So the “Net Energy Gain” calculated by the climate scientists has yet to be discovered by the engineers and I bet after 30 years of this AGW BS we would have figured out how to make BILLIONS out of it……….
The only useful (but still NOISY) signal that tells us anything about the energy content of the Earth is the temperature of the oceans (which have the largest thermal capacity and a mostly constant thermal capacity over the temps of interest i.e. below freezing to 100F or so).
Cheers, Kevin.

Frank K.
April 26, 2011 7:03 pm

It is ironic that Gavin Schmidt calls the question "ill-posed," while not apparently realizing that the climate "models" (which attempt to render a numerical mathematical "solution" to ad hoc collections of coupled, non-linear, partial differential equations) are themselves "ill-posed."
And someday, we may even find out what equations are really being “solved” by Model E…

Mike from Canmore
April 26, 2011 7:10 pm

Dr. Pielke:
Thanks for responding. I read on your blog a long time ago how you believe joule content is the key measure and it made a whole lot of sense to me. (FYI; Your blog was one of the first blogs I ever read when I started looking closer at this whole AGW thing)
I also understand what a joule is. (it hasn’t been that long since I graduated from mech eng!!) I just don’t know if any to measure joules other than using a temperature proxy.
Is there another way?
Thanks in advance.
Mike Hodges

Mike from Canmore
April 26, 2011 7:11 pm

Now to make sense of my last sentence:
I just don’t know OF any way to measure joules other than using a temperature proxy.
Cheers

Keith Minto
April 26, 2011 7:15 pm

If you assume the NH and SH hold their water stable as ice over a reasonable period of time, and assume precipitation neither adds to nor subtracts from the equation, then Trenberth's suggestion of using a long term, easily understood metric like sea level as a proxy for heat gained/heat lost, on a global basis, may make sense.

MikieN
April 26, 2011 7:25 pm

I have a problem with that graph you put up. Tamino told me that the models show acceleration in warming; your chart shows deceleration.

Doug Badgero
April 26, 2011 7:27 pm

Gavin and Trenberth are trivially correct and I believe we should all welcome their engagement on the subject. However, the problem with this metric is that in a deterministically chaotic system the knowledge of what the “climate sensitivity” is yesterday tells us nearly nothing about what it is today or will be tomorrow. What it is not, is constant on any time scale.
The problem with basing any attempted determination of climate response on any temperature, and especially an “air” temperature, is that energy moving around within the system will confound attempts to determine the magnitude and sign of even an instantaneous sensitivity parameter. The “global mean surface temperature” can change on both short and medium time scales and it doesn’t tell us what we think it tells us about what external and internal forcings are doing.

sky
April 26, 2011 7:40 pm

KevinK says:
April 26, 2011 at 6:36 pm
“As long as the climate scientists insist that they can calculate “net energy gains” and use temperature as their means for ”observing” these gains they are just travelling further down their own little rabbit hole.”
Well put! For basic physical reasons that few in climate science seem to understand properly, the entire question of "climate sensitivity" to CO2 changes is largely an empty one. Because they produce no energy, GHGs cannot "force" anything in any strict physical sense. What we should be asking is what the effect of different concentrations may be upon the rate of thermal energy transfer from atmosphere to space, when the principal mode of transfer from surface to atmosphere is moist convection. In other words, is the climate system's Lyapunov exponent being affected? Globally averaged surface temperatures do not address that crucial question.

rpielke
April 26, 2011 7:43 pm

Mike Hodges – Thank you for your comment. In the oceans, layer averaged temperatures (i.e. mass weighted) are used to compute Joules. Similarly, John Christy’s suggestion to use tropospheric temperatures is a mass weighted average and can be directly converted to Joules of heat (although without the water vapor component). Mass must be part of any calculation of heat content.
The global annual average surface temperature trend (besides not considering water vapor trends) has negligible mass associated with it. The use of such a temperature by itself is not the same as heat.
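The mass-weighting point can be illustrated with a toy ocean heat content calculation. Everything below — the layer thicknesses, the temperature changes, and the rounded constants — is illustrative, not data:

```python
RHO = 1025.0    # nominal seawater density, kg/m^3
CP = 3990.0     # nominal seawater specific heat, J/(kg K)
AREA = 3.6e14   # approximate global ocean surface area, m^2

# Illustrative layers: (thickness in m, layer-average temperature change in K)
layers = [
    (100, 0.10),
    (200, 0.05),
    (400, 0.02),
]

# Heat content change: each layer's mass (rho * area * thickness)
# times its specific heat times its temperature change.
heat_joules = sum(RHO * AREA * h * CP * dT for h, dT in layers)
print(f"heat content change: {heat_joules:.2e} J")
```

The same temperature changes in a thin air layer of negligible mass would correspond to vastly fewer joules, which is the asymmetry the comment describes.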

April 26, 2011 8:03 pm

stevo says:

For how long have you been interested in climate issues, Mr Watts? Some years, maybe? How, then, are you still unable to distinguish between a transient response and an equilibrium response? The graph you post is meaningless.

For how many years have you been reading English, stevo? Anthony wrote (at the very top):

Dr. Roger Pielke Senior posted this today, since he has no comments on his blog, I felt it would be good to repost it here to allow discussion – Anthony

In other words, Anthony is simply providing a service, a forum for discussion on an article he didn’t write and expressed no opinion about. You, OTOH, gave us:
“The graph you post is meaningless.” Do tell. I mean it: do tell. Explain why it is MEANINGLESS (as opposed to imprecise, wrong, inaccurate, misleading, leaves out crucial information, etc.). After all, when you insult the host of the blog with your condescending nasty little list of ‘alternatives’, you had better be accurate yourself. And since the invitation by Anthony was for discussion, not for baseless nastiness, you have posted under false pretences by not including any clear argument we can have a discussion about. Typical anonymous astroturfing coward.

Ron
April 26, 2011 8:14 pm

Am I the only one who finds it remarkable that a meaningful discussion on CLIMATE SENSITIVITY does not anywhere (according to the Firefox “find” facility) include the name Lindzen?

naturalclimate
April 26, 2011 8:16 pm

The chart shows sensitivity at 1.2°C per doubling, often also called the no-feedback sensitivity. One could argue that the CO2 “equivalent” for all practical purposes is already at double (or it will be in a few years). So the 0.6°C change which is often quoted for the total surface temperature change since pre-industrial times, is the climate sensitivity, including all feedbacks, assuming that you’ve already discounted anything natural (which is almost by definition for any good AGW proponent). This, of course, also means that feedback is negative since the “no feedback” sensitivity is 1.2°C, and anything that reduces total system response must be a negative feedback (-0.5 in this case). So even assuming the entire change is caused by CO2 and equivalents, positive feedback is hopelessly missing.
The atmosphere responds virtually instantly to a forcing, compared with the time constant of the oceans (no matter what mixing assumptions you want to make). Ocean heat capacity is so vast you’re almost safe assuming any change in radiative forcing of the atmosphere has not been significantly buffered by it. In other words the time constant of air is so fast, and its mass so small in comparison to the oceans, that the total change in heat of the atmosphere is representative of the influence of the sum of the forcings at work. I don’t see evidence for positive feedback in any of it.
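Taking the comment's two numbers at face value (a 1.2 °C no-feedback response and a 0.6 °C total response attributed entirely to CO2-equivalent forcing — both the comment's assumptions), the implied feedback can be written in either of the two common conventions; the −0.5 in the comment corresponds to the additive form:

```python
dT0 = 1.2   # assumed no-feedback response per doubling, C (from the comment)
dT = 0.6    # assumed total response, C (from the comment)

gain = dT / dT0            # system gain; a value below 1 means net negative feedback
f_additive = dT / dT0 - 1  # convention dT = dT0 * (1 + f), as in the comment
f_loop = 1 - dT0 / dT      # loop convention dT = dT0 / (1 - f)

print(f"gain = {gain:.2f}, f (additive) = {f_additive:.2f}, f (loop) = {f_loop:.2f}")
```

The sign of the conclusion is the same in both conventions; only the numerical value of the feedback factor differs.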

rbateman
April 26, 2011 8:53 pm

Using the Global Surface Temperature is like trying to estimate the runoff from snowpack using data obtained at a single altitude. Each season brings differing amounts of water content at differing altitudes.

Cementafriend
April 26, 2011 9:37 pm

On slide 26 of the following http://climategate.nl/wp-content/uploads/2010/09/KNMI_voordracht_VanAndel.pdf it is stated that Trenberth knows that the radiation window is 66 W/m2 (as measured by satellites) rather than the 40 W/m2 in his heat balance papers (where is the missing heat?). Is his non-correction of his paper then scientific fraud? Trenberth has no credibility in my eyes and it seems he has no understanding of the basic theory of heat transfer. Much the same applies to Gavin Schmidt. On another blog Nusselt numbers and the Schmidt number were mentioned. It appeared Gavin had to look up what the Schmidt number was on Wiki, and he then replied asking what these numbers had to do with climate assessment.
Dr Pielke has it right that climate sensitivity to CO2 is a useless number. However, there is a more important reason, reflected in the lag of CO2 behind temperature changes found in the long term (800 yrs in ice cores), medium term (about 5 yrs – Beck and some other researchers), and short term, seasonal and daily (eg in Kreutz 1941): there is practically zero sensitivity. This is indicated by Miskolczi and Van Andel http://climategate.nl/wp-content/uploads/2011/02/CO2_and_climate_v7.pdf and also in Van Andel's peer reviewed papers in Energy and Environment (same issue as Willis's paper about thunderstorms). I have calculated the potential energy absorption of CO2 using Prof Hoyt Hottel's equation and find it insignificant. Climate scientists appear not to understand heat transfer by radiation, convection and phase change (evaporation and condensation). As indicated in many papers, all the climate models give wrong results because the authors included CO2 and left out other important mechanisms.
I repeat: the climate sensitivity to CO2 should be close to zero. Will anyone prove, by unadjusted accurate measurements, that it is significant, ie greater than 0.2 K/100 years?

Eric Anderson
April 26, 2011 10:03 pm

stevo (and others who have missed it), the graph is from Warren Meyers and is included in the post, I presume, for a catchy graphical representation of the sensitivity issue. For those who are interested in the substance of the graph, see Warren Meyers’ blog and related post, where he makes some interesting observations about the alleged climate sensitivity.

Mike from Canmore
April 26, 2011 10:50 pm

Dr. Pielke: Thanks
Mike Hodges

Roy Clark
April 26, 2011 10:56 pm

There is no ‘climate equilibrium’ on any time scale.
There is no climate sensitivity to CO2.
The changes in the meteorological surface temperature record are caused by changes in ocean surface temperatures, urban heat island effects and plain old 'fixin' of the temperature records – 'homogenization' etc.
Climate sensitivity to CO2 is part of climate astrology not climate science.

AlanG
April 26, 2011 10:58 pm

Firstly, the chart is nonsense. The red and blue lines for climate sensitivity should meet at 0 ppm CO2 and not 280 ppm.
Secondly, talk of surface temperature as a metric would be OK if the metrology (measurement) were any good, but it isn't. You would think that with all the money spent on climate science there would be a grid of well placed, very accurate thermometers all around the world measuring temperature to +/- 0.001 degree. But there isn't, as there is no budget for it. Instead we have a moving set of poorly placed, inaccurate thermometers with no calibration or quality control. All the charts of average temperature produced contain a different set of thermometers at the start and end – apples-and-pears charts – so are just useless. Averaging readings from 100 thermometers that are accurate to 1 degree and pretending the average is accurate to +/- 0.01 degree is deliberately misleading. There is NO usable record of average surface temperature.
Lastly, there is the problem of averaging, whether you are averaging min and max temperatures or temperatures from multiple sites. Take two places, one where the min is 10 and the max is 20, the other where the min is 5 and the max is 25. Both have an average temperature of 15. But apply the Stefan–Boltzmann law and you will see that the second place emits more LW radiation.
It's as plain as day that climate sensitivity is highly non-linear and there is no single value for climate sensitivity. Even if there were, you would never discover it from any available metric because the metrology is just not accurate enough.
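The min/max averaging point is easy to verify with the Stefan–Boltzmann law (treating both sites as blackbodies, a simplifying assumption):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def mean_emission(t_min_c, t_max_c):
    """Average blackbody emission over the min and max temperatures (given in C)."""
    kelvins = (t_min_c + 273.15, t_max_c + 273.15)
    return sum(SIGMA * t**4 for t in kelvins) / 2

site_a = mean_emission(10, 20)  # daily mean 15 C
site_b = mean_emission(5, 25)   # daily mean also 15 C

print(f"site A: {site_a:.1f} W/m^2")
print(f"site B: {site_b:.1f} W/m^2")
```

Site B radiates about 2 W/m² more despite the identical 15 °C average, because the T⁴ dependence weights the warm hours more heavily than a simple mean does.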

don penman
April 26, 2011 11:57 pm

I think that the best metric to use in order to detect the effect of increased CO2 on the earth's radiation balance depends on the time scale over which you believe it has an effect. If the effect is immediate, we should see it in a changing SAT; but if that change will occur in a thousand years, then the best place to look would be the oceans. I don't believe we are seeing a change in SAT that would increase SAT in 50 years by the amount predicted by AGW theory.

April 27, 2011 1:02 am

Lots of confusion in this thread. The various heavyweights are commenting from the perspective of their own research, and there are a number of assumptions about what the IPCC says that have to be put into context. So: a quick physics lesson and a quick deconstruction of what the IPCC says in that context.
Joule – measure of energy. Takes about 4.2 joules to heat one cubic centimeter of water by 1 degree C assuming a starting temp of 20 C.
Watt – measure of joules per second.
Forcing – Increase in joules per second per square meter by increases in CO2. So a doubling of CO2 results in a forcing of 3.7 joules per second per square meter of the earth which results in 1 degree of warming. Summarized, the IPCC position is that direct forcing from CO2 doubling is as follows:
CO2 Doubling = +3.7w/m2 = +1 degree C. Repeated all over AR4. Here’s the out of context stuff they don’t make easy to find:
1. That sensitivity calculation relies on the forcing of 3.7 W/m2 against the effective black body temperature of the earth as seen from space. This is explained in AR3 in some detail, but in AR4 they drop that explanation and make vague references to the statement meaning "earth surface". This is incorrect. The Stefan–Boltzmann law defines how much a given body will change in temperature (at equilibrium) if additional forcing in W/m2 is added to it. The formula is P (W/m2) = 5.67 × 10^-8 × T^4, with T in degrees K. Working backwards from that, we discover that 3.7 watts results in a rise in temperature of 1 degree, provided that the starting temperature was -20 C. But the average surface temp of earth is +15 C. So the fine print in AR3 has been dropped and then misquoted in AR4. In fact, if you do the calcs, it should read like this:
CO2 Doubling = +3.7w/m2 = +1 degree at temperature of earth as seen from space which is -20C.
CO2 Doubling = +3.7w/m2 = +0.6 degree at earth surface which is +15C.
The next thing to keep in mind is that the doubling quote is almost always in the context of "if CO2 doubles from levels in 1920, temps will rise 1.0 degrees". We're not in 1920, we're 90 years later in 2010. During the time period from 1920 to now, CO2 has gone from 280 ppm to 390 ppm. They quote the effects of doubling from 280, conveniently forgetting to differentiate what has already happened from what will happen in the future, by bundling the two together as if they were one and the same. Bull. That's not how it should be presented. There are two proper ways to present it.
1. CO2 doubling from 280 to 560 will result in an AT SURFACE temperature rise of 0.6 degrees. We’ve gone from 280 to 390 already, so over half (due to the logarithmic curve) should already have happened, about 0.4 of it. So saying 280 to 560 = +1 degree is meaningless. Saying 390 to 560 would describe how much we have left to go if we do double from 280 to 560, but prorating so we’re looking at 390 (where we are now) to 560 yields an increase of just 0.2 degrees at surface.
2. Doubling of CO2 from where we are now would add 3.7w/m2 = +1.0 as seen from space = +.6 at surface. That would be 390 doubling to 780 would cause +0.6 degrees at surface.
But the IPCC carefully avoids these far more practical and meaningfull ways to present the data. Consider the final numbers once we adjust for base starting point and if we are looking from space or at the surface. What does that yield?
In 1) above, going from NOW, 390ppm to 560PPM (double 1920) = 0.2 degrees at surface. At the higest rates of CO2 production that we have ever hit, that’s nearly two centuries from now. And 0.2 degrees, Whoopdeedoo.
But 2) becomes even more rediculous. We’re at 390 now. If we double to 780 ppm from where we are now, we’ll get another 3.7 w/m2 = +0.6 degrees at surface from where we are now. CO2 has been rising almost linearly at about 1.5 ppm per year. So in 250 years we could expect another 0.6 degrees higher surface temps.
But the IPCC would rather include what has already happened in their calculations of what is going to happen, because not doing so exposes the fact that CO2's effect is logarithmic. Extend that to CO2 QUADRUPLING from 390 to 1560 ppm. How much temperature change is that at surface? 1.2 degrees higher than we are now, and at 1.5 ppm per year it will take nearly 800 years to achieve it. Whoopdeedoo.
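The prorating argument above is easy to check against the logarithm directly. A rough sketch, assuming the commenter's 0.6 deg C at-surface response per doubling (on these numbers the split comes out nearer 0.29 realized / 0.31 remaining than the round 0.4/0.2 figures, but the logarithmic point stands either way):

```python
import math

def doublings(c_new, c_old):
    """Number of CO2 doublings between two concentrations (ppm)."""
    return math.log2(c_new / c_old)

SURFACE_PER_DOUBLING = 0.6  # deg C, the commenter's assumed at-surface response

realized = doublings(390, 280)   # fraction of a doubling already behind us
remaining = doublings(560, 390)  # fraction still to come before 560 ppm

print(f"realized:  {realized:.2f} doublings -> {realized * SURFACE_PER_DOUBLING:.2f} C")
print(f"remaining: {remaining:.2f} doublings -> {remaining * SURFACE_PER_DOUBLING:.2f} C")
```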

April 27, 2011 1:09 am

I am also more inclined now towards thinking that the temperature of the oceans may be a better indicator of any "global warming" (yuck) than surface temperature on land. After all, the earth is 70% water and 30% land. However, can somebody here direct me to places where the temperatures of the seas and oceans are measured, and where we have some good reliable records going back in time a bit?

April 27, 2011 1:56 am

HenryP
Try this:
http://climexp.knmi.nl/select.cgi
Almost any climate record you can think of downloadable from one site.

Cirrius Man
April 27, 2011 2:01 am

It’s obvious from Gavin’s response that he suffers from Climate Sensitivity !

JohnB
April 27, 2011 2:27 am

I have to agree with DRs Schmidt and Trenberth. “Climate Sensitivity” is defined as the equilibrium temperature change for a doubling of CO2. It’s not like the term is just now being defined, it was defined in the past and as such is what it is.
Whether it is the best metric or not is another question. Heat content in joules of the ocean/atmosphere may be more accurate for some purposes, so perhaps some bright person can define a new metric using that figure. The downside to joules is the “So what?” factor. It gives a baseline figure but very little detail. Do the extra joules in the system mean the oceans are warmer? The winds stronger? The ocean currents faster? All these represent extra energy in the system, but which one is changing? It’s rather imprecise for practical purposes because it doesn’t tell you what you are measuring.
OTOH, using the GMST is using a usable metric, temperature. Yes there are problems with the surface temps, we know that, but at least it is measurable for change.
The thing is that climate (as everybody realises) is incredibly complex and as such I doubt that there is one single metric that can be applied. It is far more likely that a number of metrics are needed, each focussing on a different part of the climate system.
For example, I doubt that an engineer designing a boiler and steam transport system uses just one metric, as temps, pressures, volumes and flow rates all come into play. Climate is a similar situation but a much more chaotic system.
So one of the metrics for climate, “Climate Sensitivity” has been defined in the general world. Rather than complaining that it might not be the “best” metric (is there a best one?) help work out what the others are and how they should be defined.
Because Gavin et al. deal with surface temps, the current definition of climate sensitivity is the "best" for his work, because it's about temperatures. Other climate scientists dealing with other facets of the climate would be expected to use other metrics. Rather than being condescending, that was all Gavin was getting at in his comment. There's no "one number" that fits everything, but temps give a good starting point that other metrics can improve on in other areas.
I hope I made sense, but I’ve had the “Dreaded Lurgi” for a week, so I’m not sure.

Geoff Sherrington
April 27, 2011 4:18 am

Please describe the mathematical equations for the blue and red curves in the leader prepared by Warren Meyers. We are repeatedly told that these curves are (like) logarithmic or exponential curves. What is the intercept of the curves on the Y-axis at X = zero? How can they be different? Are the curves proven logarithmic, or are they exponential, or of another form? What is the power(s) to which they are raised, and the equation coefficients? Doubling is a doubling of what? How do we know that we are not already on or passing the plateau part of a curve of diminishing returns? What evidence allows both curves to be pegged at 280 ppm pre-industrial? Why is there but one mention of the word “noise” in the articles, when the whole diagram might be within the range of natural variation? What assumptions are made about the change of rate of depletion of atmospheric CO2 with concentration? Why are Mauna Loa values used when, in the valleys of land nearer sea level, CO2 can be 10 times as high as shown here, thereby adding to the mean (but not included)?
Surely this essay takes poor signal:noise ratios to a misleading simplicity.

Bill Illis
April 27, 2011 7:10 am

How long are we supposed to wait until we can start talking about what is right and what is wrong with this theory?
CO2 increased above the equilibrium level of 280 ppm around 1800.
It is 211 years later and we should be able to start measuring this darn thing by now. It should be clear enough.
Someone mentioned Transient above. Even the definition of this is up in the air. Sometimes 7 years, sometimes 23 years, sometimes 30, sometimes 150 years and even 1500 years.
The top of the atmosphere emission layer, the lower troposphere, the surface, ocean heat content – we don’t even know what to measure.
What about water vapour which is supposed to contribute 2.0C of the 3.0C per doubling. We should at least have solid numbers on this given it is the most important aspect. My numbers going back to 1948 show no change at all in water vapour levels but nobody can agree on what numbers to use.
It is time to make the call on what we are going to measure and how it will be measured so we can start seeing how much of this theory is correct.

April 27, 2011 7:16 am

Geoff Sherrington says:
April 27, 2011 at 4:18 am
Please describe the mathematical equations for the blue and red curves in the leader prepared by Warren Meyers. We are repeatedly told that these curves are (like) logairthmic or exponential curves.
I believe the IPCC uses the following: dF = 5.35 ln(C/Co). So a change in forcing equals the natural log of the present CO2 reading (380 ppm) divided by the starting point reading (280 ppm), times a constant. So where the red and blue lines cross is 280 ppm, i.e. ln 1 = 0, and we go from there.
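mkelly's expression is easy to put into runnable form; a quick sketch (the 5.35 coefficient is the one quoted above):

```python
import math

def co2_forcing(c, c0=280.0):
    """Simplified forcing expression dF = 5.35 * ln(C/C0), in W/m^2."""
    return 5.35 * math.log(c / c0)

print(co2_forcing(380))      # forcing at the 380 ppm used in the comment
print(co2_forcing(2 * 280))  # a full doubling: 5.35 * ln 2, about 3.7 W/m^2
```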

Dave in Delaware
April 27, 2011 7:23 am

An instructive comment on Temperature versus Energy in the atmosphere – the Humidity makes a big difference, as shown in this comment
from – Max Hugoson says: June 7, 2010 at 9:49 am
http://wattsupwiththat.com/2010/06/07/some-people-claim-that-theres-a-human-to-blame/#more-20260
But to all the people playing “average temperature”, and in the spirit of trying to do GOOD ENGINEERING WORK… “average temperature” is a FICTION and MEANINGLESS. Here is why: Go to any online psychrometric calculator. (Heh, heh, I use the old English units; if you are fixated on Metric, get a calculator!)
Put in 105 F and 15% R.H. That’s Phoenix on a typical June day.
Then put in 85 F and 70% RH. That’s MN on many spring/summer days.
What’s the ENERGY CONTENT per cubic foot of air? 33 BTU for the PHX sample and 38 BTU for the MN sample.
So the LOWER TEMPERATURE has the higher amount of energy.
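Max's comparison is easy to reproduce. The sketch below assumes the Magnus saturation-pressure formula and the standard moist-air enthalpy approximation; it works per pound of dry air rather than per cubic foot, so the absolute numbers differ a little from his 33 and 38 BTU, but the ordering is the same:

```python
import math

def moist_air_enthalpy(t_f, rh, p_hpa=1013.25):
    """Approximate enthalpy of moist air, BTU per lb of dry air.

    Magnus formula for saturation vapor pressure plus the usual
    moist-air enthalpy approximation; good enough to compare the
    two cities, not for engineering work.
    """
    t_c = (t_f - 32.0) * 5.0 / 9.0
    e_sat = 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))  # hPa
    e = rh * e_sat                                          # actual vapor pressure
    w = 0.622 * e / (p_hpa - e)                             # humidity ratio, lb/lb
    return 0.240 * t_f + w * (1061.0 + 0.444 * t_f)

phoenix = moist_air_enthalpy(105, 0.15)    # hot and dry
minnesota = moist_air_enthalpy(85, 0.70)   # cooler but humid

print(f"Phoenix:   {phoenix:.1f} BTU/lb")
print(f"Minnesota: {minnesota:.1f} BTU/lb")
```

The cooler, more humid air still carries the larger energy content, which is Max's point.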

April 27, 2011 8:09 am

Dave in Delaware says:
April 27, 2011 at 7:23 am
The weather channel gives a “feels like” temperature when it is humid. So even the weather channel knows, but won’t admit that water is far more important than CO2.

Richard M
April 27, 2011 8:13 am

It seems to me that any exercise that ignores the rest of the GHGs could lead to confusion. If we don’t know the changes in methane, ozone, H2O, etc. over the same time period then we don’t know squat about CO2 either.
And don’t forget the rest of the air must also have some absorption/radiation and, since there’s so much more of it, how can any analysis ignore it?

April 27, 2011 9:21 am

Thanks to DavidMHoffer.
Your logical workup was excellent.

Tom in Florida
April 27, 2011 9:43 am

mkelly says: (April 27, 2011 at 8:09 am)
“The weather channel gives a “feels like” temperature when it is humid. So even the weather channel knows, but won’t admit that water is far more important than CO2.”
Perhaps there should also be a “should be” temperature given, one that tells us what the temperature would be if CO2 was back at the 280 ppm level. It should be easy enough to do, after all, the believers claim to know how much heating we have added since that era.

April 27, 2011 9:54 am

great comments from everyone here
As I have said before
1) there is some radiative warming caused by Co2 (14-15um)
2) there is some radiative cooling caused by CO2 (various places between 0-5 um)
3) there is also cooling caused by CO2 taking part in photosynthesis. There is evidence that the earth has become greener over the past 50 years or so. I mean, did you ever see a forest grow where it is very cold? Greenery and forests need energy from their surroundings.
The IPCC never considered either 2) or 3).
My question was : what is the net effect of 1) and 2) and 3)?
I suspect that 3) is considerable compared to 1)
I did not get an answer to my question on where to get reliable records of the temps. of the oceans and the seas. At what point below the sea level is the temp. measurement taken?

George E. Smith
April 27, 2011 10:02 am

“”””” JohnB says:
April 27, 2011 at 2:27 am
I have to agree with DRs Schmidt and Trenberth. “Climate Sensitivity” is defined as the equilibrium temperature change for a doubling of CO2. It’s not like the term is just now being defined, it was defined in the past and as such is what it is. “””””
Well John B, that’s the problem; it was “defined” to be that, and that is exactly what I take those two words “Climate Sensitivity” to mean; nothing more and nothing less.
BUT !!! implicit in that quite arbitrary definition is the very definitive statement that: T - T0 = (CS) log2(CO2/CO2_0), where my resource constrained terminology is self evident.
That means that T(560) – T(280) is identically equal to T(2) – T(1) where the numbers in brackets are PPM atmospheric molecular abundance of CO2.
Now if I choose to DEFINE a newly discovered Martian rock mineral sample to be “Amalgamese Tobunganate”, then from now and henceforth, that is exactly what that mineral shall be called.
BUT !!! I do NOT have license to simply define laws of Physics to be just whatever I want them to be; and expect to get general acceptance of my practice. The establishment of “Laws of Physics” requires experimental observations of the pertinent phenomena, and verification that the observed values of variables, is in agreement with those values predicted from application of those laws.
So humans have been “reliably” measuring the atmospheric molecular abundance of CO2 at Mauna Loa now since about 1957/58 (basically the Geo-Physical Year), and those pretty much universally accepted results have seen the CO2 value go from 315 PPM in 1957/58 to a present value of 390 PPM; a ratio of 1.238, whose log base two is 0.308.
So we have observed 0.3 of one doubling in CO2, so the increase in Global Mean Temperature is 0.36 deg C for the blue line, and 0.9 deg C for the red line; based on the Schmidt/Trenberth insisted definition of “Climate Sensitivity” and its IPCC mandated 3:1 fudge factor ratio of uncertainty (+/- 50%).
Well since 1957/58, the Global Mean Surface Temperature has gone up and down all over the place; so for anyone to define “Climate Sensitivity” based on that observed behavior of the real actual operational Physical Laws; the ones that Mother Gaia adheres to; is simply farcical.
I’m perfectly happy to defend the Schmidt/Trenberth claim to that definition of “Climate Sensitivity”, since it is exactly the one I use myself; but Unlike them; it is for me just a definition of words; I acknowledge no proof of any validity to the implied Physical Law behind that definition.
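George's Mauna Loa arithmetic can be sketched directly; this uses the canonical 1.5 to 4.5 C range rather than the chart's exact blue/red slopes, so the endpoints are my choices, not his:

```python
import math

# Fraction of one CO2 doubling observed at Mauna Loa, 1957/58 to ~2011
fraction = math.log2(390 / 315)
print(f"{fraction:.3f} of a doubling")

# Implied warming since 1958 for a few candidate sensitivities (deg C per doubling)
for sensitivity in (1.5, 3.0, 4.5):
    print(f"S = {sensitivity}: {sensitivity * fraction:.2f} C expected")
```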

George E. Smith
April 27, 2011 10:28 am

“”””” KevinK says:
April 26, 2011 at 6:36 pm
Temperature is MOST DEFINITIVELY NOT A MEASURE OF ENERGY!!!!!! “””””
Well that is correct; “Kelvins” are a measure of Temperature; and “Joules” are a measure of energy; two different scientists, and two different measures named after them.
In the case of gases they are related: the total kinetic energy of N particles is W = (3/2)NkT, where k is 1.38066E-23 Joules per Kelvin and N is the number of particles in the sample containing that much kinetic energy.
That would seem to imply that if you know one you know the other.
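For the curious, George's relation is easy to put numbers on; a sketch for one mole of ideal gas at a typical surface temperature (the mole and the 288 K are my illustrative choices, not his):

```python
# Total translational kinetic energy of one mole of ideal gas at 288 K,
# W = (3/2) N k T; with N = Avogadro's number, N*k = R = 8.314 J/(mol K)
R = 8.314   # gas constant, J/(mol K)
T = 288.0   # a typical surface temperature, K
W = 1.5 * R * T
print(f"{W:.0f} J per mole")
```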

R. Craigen
April 27, 2011 11:52 am

It seems clear that, Mr. Schmidt notwithstanding, the problem is semantic, and it is a serious semantic problem in that, having infused a particular meaning into the term, alarmists such as him then use that metric to draw conclusions not entailed by it in any way.
He is quite right that one does not measure impact of “climate change” on surface ecosystems and human civilization by metrics involving upper-atmosphere or oceanic changes — you measure this impact by what directly impacts it: surface temperature. He, and apparently some others such as Mr. Trenberth, have chosen to take this as the definition of climate sensitivity. Ok, I’m fine with this. As a mathematician I am quite comfortable with the idea that terminology is arbitrary; you can use whatever labels you like.
The problem comes when you attempt to use this particular metric, as they appear wont to do, as a predictive variable. Surface temperature is an effect, not a cause. The number cannot be fed back into the system or climate models as if it has some important feedback effects. And here is where the semantics of this particular choice of terminology becomes severely problematic.
The term “climate sensitivity” suggests that the variable in question is an important determiner of how the overall climate responds to inputs. It is a serious mistake to infer such a relationship, if surface temperature is the only, or primary, datum involved in the metric. Yes, it is a datum in which we have some practical interest. But, no, it is not a number with any particularly important predictive value; nor does it represent in any direct way any important element of the evolution of our climate system. Ocean heat content, in contrast, does both.
Perhaps I’m conflating terms, but I’m afraid none of your responders’ answers jibed very closely with my understanding of sensitivity. I have always understood “sensitivity” in this context to refer to the ratio of actual, observed response to derived, theoretical response.
Thus, if (for example) radiative physics predicts (on whatever simplistic assumptions are invoked) that the change in atmospheric CO2 content over the last 100 years should lead to 0.2 C in warming, and 0.6 C is observed, then one infers a sensitivity of 0.6/0.2 = 3, which is specific to the CO2/temperature relationship, and has no broader meaning than this. I have understood this to be a deliberately naive, back-of-the-envelope calculation that involves far-too-simplistic assumptions such as that the atmosphere is a brick with no internal dynamics and that ALL temperature change can be attributed to CO2 content variation. The factor, 3 (in my example) represents some effects unknown in the sense that they have not been accounted for in the theoretical model used. Perhaps they ARE known, however — one might attribute the amplification by a factor of 3 to H2O responding to CO2, for example.
But none of this is important in understanding the long-term evolution of the climate. I do think these guys are talking past each other.

Brian H
April 27, 2011 3:21 pm

Brian H says:
April 26, 2011 at 5:58 pm
It’s almost meaningless, and is actually around 0.5.

Cementafriend says:
April 26, 2011 at 9:37 pm

I repeat: the climate sensitivity to CO2 should be close to zero. Please, anyone, prove by un[ad]justed accurate measurements that it is significant, i.e. greater than 0.2 K/100 years.

I apologize for my gross overestimation!

Frank
April 27, 2011 5:15 pm

Speaking both sarcastically and seriously, a more useful metric for climate change is DISTANCE per doubling of CO2. Most of the people being asked to reduce GHG emissions live in the northern temperate zone, and many of these people are somewhat familiar with how the climate changes as one moves north and south. For example, the mean annual temperature in North Dakota and Montana is about 41 degF, while about 1000 miles to the south the mean annual temperature is about 60 degF in Oklahoma and Arkansas. On this scale, a 1 degC temperature change is equivalent to moving 100 miles or 150 kilometers to the south, a trend which is reasonably accurate for most of the US (except Oregon and Washington). So the IPCC is telling us that climate sensitivity is 150-450 miles or 230-690 kilometers south for 2X CO2. (We can adapt by moving the same distance to the north. Despite warming over the past 30 years, more people are moving south than north in the US.)
Given that global warming is anticipated to be greatest at night and in the winter – when high temperature is not usually a problem – the effective climate sensitivity is probably about half of this distance.
For the environmentally-aware residents of coastal California – who live where the ocean moderates summer temperature so that the daily high rises about 1 degF for every mile one moves away from the ocean – climate sensitivity is about 3-8 miles for 2X CO2. In the atmosphere or mountains, where temperature tends to drop about 6.5 degC per vertical kilometer, 2X CO2 is equivalent to 230-690 vertical meters or 750-2300 ft (which would be significant for some mountain snowpacks).
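Frank's conversion in code form; the 100 miles-per-degC figure is his own rough North Dakota-to-Oklahoma gradient, not an official number:

```python
# Frank's "distance per doubling" conversion: roughly 100 miles south per deg C,
# from his ~19 degF per 1000 miles north-south US gradient
MILES_PER_DEGC = 100

low, high = 1.5, 4.5  # IPCC sensitivity range, deg C per CO2 doubling
print(f"2x CO2 is like moving {low * MILES_PER_DEGC:.0f}-{high * MILES_PER_DEGC:.0f} miles south")
```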

JohnB
April 27, 2011 6:02 pm

George, I’m still not quite with it so I didn’t fully follow your argument. I’ll read it again later.
As you noted, my point was simply that the definition was what the definition was. Your putting in the maths did give rise to a thought on the limitations of the current definition though.
Perhaps a better metric would be “Temperature Sensitivity” defined as the change in GMST per Watt/Metre Squared of forcing. Whether the forcing is from CO2 or the Sun or whatever, the end result is going to be roughly the same although there would need to be “feedback” and “no feedback” versions of the definition. Which solves the problem you outlined in your earlier comment in this thread.
Similarly you might be able to derive a “Humidity Sensitivity” expressed as the change in humidity per Watt/Metre Squared of forcing. Other sensitivity rules might also be derived, each applying to specific areas.
Thoughts?
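JohnB's proposed metric is straightforward to compute; a sketch assuming the commonly quoted 3.7 W/m^2 forcing per CO2 doubling (the sensitivity values are the canonical range, not anything JohnB stated):

```python
# JohnB's proposed "Temperature Sensitivity": deg C of GMST change per W/m^2
# of forcing, regardless of the forcing's source
FORCING_PER_DOUBLING = 3.7  # W/m^2, commonly quoted for 2x CO2

sensitivities = {s: s / FORCING_PER_DOUBLING for s in (1.5, 3.0, 4.5)}
for s, lam in sensitivities.items():
    print(f"S = {s} C/doubling -> lambda = {lam:.2f} C per W/m^2")
```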

KevinK
April 27, 2011 7:37 pm

Sky says;
“For basic physical reasons that few in climate science seem to understand properly, the entire question of “climate sensitivity” to CO2 changes is largely an empty one. Because they produce no energy, GHG’s cannot “force” anything in any strict physical sense. What we should be asking is what the effect of different concentrations may be upon the rate of thermal energy transfer from atmosphere to space, when the principal mode of transfer from surface to atmosphere is moist convection. In other words, is the climate system’s Lyapunov exponent being affected? Globally averaged surface temperatures do not address that crucial question.”
Thank you. As I have said a few times, the important factor at play here is the “speed of heat” through the climate system. Since GHG’s CANNOT produce energy, the only remaining question is whether they can slow the flow of heat through the system enough to affect the “equilibrium temperature”. I posit that since increases in GHG’s displace NON-GHG’s in the system, the “speed of heat” through the system becomes more weighted towards the speed of light (IR radiation) as against convection, which is quite a bit slower.
The Earth (including the gases in the atmosphere) is most accurately called a “Blackbody RE-RADIATOR”, which is only capable of changing the rate at which energy flows from a TRUE Blackbody Radiator (i.e. the SUN) through the Earth/Atmosphere System to the cold vacuum of space.
Cheers, Kevin.

Larry in Texas
April 28, 2011 12:04 am

Perhaps guys like Schmidt and Trenberth think they are doctors: when the temperature of a human body goes up, there is generally something wrong inside. As we all know, the rise in body temperature is certainly an effect, not a cause, of disease (I’m not a doctor, so if any doctor sees a need to correct me or add anything to this observation, go ahead; but that is what I’ve learned). So it would probably seem logical to some simpler minds than mine that surface temperature would likewise be a relevant measure of the state of the climate equilibrium.
However, the climate, as the good guys (like Roger Pielke, Sr. and George E. Smith above) realize, is not quite like the human body. There are way too many other features in the dynamic, chaotic, non-linear climate system that act (and interact) for us to be able to focus in on any one effect like surface temperature anomalies. Also, holding all of the other so-called “variables” constant in that definition seems more like cheating to me.
I agree with Pielke, Sr. that this definition is way too narrow and appears to be misleading in many respects. The argument has always been over definition, and the argument must continue. I hope Pielke, Sr. persists and prevails in his argument.

Gary Swift
April 28, 2011 8:26 am

Okay, it seems clear to me that if you want to pin the tail on human caused CO2 then you certainly want to use GAAST as the primary measurement.
What if you want to measure the forcing from another first order human influence such as land use? Would you still want to use GAAST, or would you want to use some other measurement?
My point is that there’s quite a bit of gamesmanship going on here. Depending on what agenda you are trying to push, the ideal frame of conversation will change. If you are trying to guide the conversation towards man made CO2, then you don’t want to talk about the ocean because it’s too slow, and vice versa.

George E. Smith
April 28, 2011 11:49 am

“”””” JohnB says:
April 27, 2011 at 6:02 pm
George, I’m still not quite with it so I didn’t fully follow your argument. I’ll read it again later.
As you noted, my point was simply that the definition was what the definition was. Your putting in the maths did give rise to a thought on the limitations of the current definition though. “””””
Well John, perhaps you are looking for something that is a lot more subtle; but the reality is it is quite simple.
As you say; the definition is what the definition is; that is really the point that Trenberth and Schmidt made in their comments.
They in effect said :- “”””” climate sensitivity has already been defined, so live with it “””””
That is the whole crux of the “gay marriage” hullabaloo. Some folks say; “Marriage has already been defined many thousands of years ago.” So we know what it is. You want to create a different structure; fine: go for it. but you’ll have to come up with your own name for it, because that one has already been used.
The “RMS” value of a varying Voltage, for example, is DEFINED as:- “The square root of the sum of the squares of all the instantaneous Voltage values” (along with the sample time normalization of course).
Now it doesn’t matter a hoot why we may want to know such a number; who cares what the reason is; that’s what those letters are defined to mean, so we can use them, whenever we want to express that functional value; doesn’t matter why we care. They have things called dictionaries that are simply chock full of formal definitions of what words mean; live with it.
So ok; somebody; maybe it was Dr. Stephen Schneider; maybe not, defined “Climate Sensitivity” to be the increase in global mean surface Temperature for a doubling of atmospheric CO2 abundance. Doesn’t matter to me why they decided to define such a term, and I’ll use their definition if I want to refer to such a thing.
My point was, that it is ONE thing to define in a dictionary, what the meaning of a word is. It is entirely a different matter if that definition enshrines some claim about the laws of physics or mathematics, or any other discipline.
Suppose I wanted to DEFINE “Solar Sensitivity” to be the increase in the value of the TSI, (Total Solar Irradiance) for a doubling of the percentage of humans wearing two piece bathing suits at any time. Well if nobody else has already claimed “Solar Sensitivity”, then I can put that definition in the dictionary, and people should use it.
In fact why don’t I do that. Anthony, please make a note of that. I have defined, here at WUWT, that henceforth:-
“”””” “Solar Sensitivity” , shall be the change in the TSI (at earth’s mean orbit) for a doubling of the percentage of humans wearing two piece bathing suits, at any time. “””””
Now I have given those previously unused words, a meaning; to be used by anyone who wants to refer to that phenomenon.
Now I have neither any experimental observational evidence that if people wear more two piece bathing suits the value of TSI will increase; nor do I know of any theoretical modelled reason why one would expect that to be so; nor of the magnitude of the purported effect.
But so what ! Same thing is true of “Climate Sensitivity”.
We have absolutely no observational experimental evidence that the global mean surface Temperature accurately follows, and is theoretically expected to follow a straight line proportionality to the Logarithm (base 2) of the atmospheric molecular abundance (or vol percent if you like) of Carbon dioxide.
Note that Gavin Schmidt also added the requirement that other variables be held constant; so ONLY CO2 IS ALLOWED TO CHANGE WHEN OBSERVING CLIMATE SENSITIVITY.
So can we all just quit mumbling on about water vapor being just a feedback amplification of a CO2 caused effect.
Gavin Schmidt defines CS as the effect due to CO2 alone sans any variations due to H2O or any other interactions. He says these other variables like aerosols, and ozone, and ice sheets; presumably all the other climate variables stay put; you double the CO2 and you observe the Temperature increase; that’s it.
I accept his definition; Trenberth’s too. I just don’t accept that it is a real physical phenomenon that can be observed here on earth. The definition mandates a log base 2 function. That’s not some “non linear” association of numbers. It’s an exactly defined mathematical function, as is “Climate Sensitivity” as well as “Solar Sensitivity” which I just defined.
So don’t mess with it, and claim that log base 2 is just some amorphous non-linear number association. We know what log base 2 means; just as we know what the word “marriage” means; it’s in the dictionary.

KevinK
April 28, 2011 8:42 pm

To George E. Smith;
You wrote;
“That would seem to imply that if you know one you know the other.”
With respect I would like to suggest that we could rephrase this as;
“If we know something about one item we also know something about another item”
Do we know EVERYTHING about the other item? Perhaps, but possibly not! For example electrical current is certainly related to electrical voltage, but as the frequency increases many interesting things occur (i.e. look up “skin effect”). So if we know some things about an effect, do we indeed know everything about that effect?
So, in summary, temperature IS related to energy, but there are many other factors involved, including the thermal capacities of the materials involved.
Cheers, Kevin.

April 29, 2011 1:57 am

‘When I use a word,’ Humpty Dumpty said, in rather a scornful tone, ‘it means just what I choose it to mean — neither more nor less.’
‘The question is,’ said Alice, ‘whether you can make words mean so many different things.’
‘The question is,’ said Humpty Dumpty, ‘which is to be master — that’s all.’

Geoff Sherrington
April 29, 2011 5:56 am

mkelly says: April 27, 2011 at 7:16 am About sensitivity curve shapes.
That is but part of the answer, but I thank you for it.
Do you have a lead as to why the blue and red curves cross at 280 ppm CO2 (thus setting part of the equation) when it would be more logical for them to cross at some more distant time when the temperature could be assumed to be the same for each? Put another way, by forcing the cross at 280 ppm, and by using this dubious equation, we derive vastly different temperature regimes for the decades and centuries before then.
Personally, I think the curves derive from a sine-like shape a quarter of its way through a cycle. The problem is (as Dr Strangelove raises his leather-clad fist) “NOBODY CAN PROVE ME WRONG!!”

cba
April 29, 2011 8:20 am

It is foolish to attribute all the warming to co2 since 1850 since albedo is as important as solar incoming flux to the temperature. There are no albedo measurements for virtually the entire time frame and what little there is is all over the map, swamping out the effect of a co2 variation.
What can be done is to determine the contributions of current cloud cover with today’s albedo and compare that to an Earth with the same albedo and no atmosphere – strictly a thought experiment. One finds a black body 33 deg C below that of today’s actual Earth. One can determine how much power is blocked by the atmosphere from the surface, given an average surface T of 288.2 K (which emits about 391 W/m^2) and an actual Earth emission value of about 235 W/m^2. This means about 156 W/m^2 emitted by the surface is blocked by the atmosphere (ghgs, aerosols, clouds, etc.), and this blockage results in the 33 deg C T rise. The average contribution of an individual W/m^2 is about a 0.21 deg C rise for the real Earth. The IPCC says a forcing is a forcing is a forcing and has the same results. For every W/m^2 that results in more than a 0.21 deg C increase, there must be a forcing W/m^2 that produces even less than 0.21 deg C.
By simply using the fraction of surface emission that makes it out of the atmosphere (235/391 = 60%), one can see what a small change must result in: for one additional W/m^2 to escape, the surface must be warmed enough to radiate 1/0.6 W/m^2 more. 1/0.6 = 1.7 W/m^2 of added emission results roughly in 1 W/m^2 additional escaping. Running this through Stefan’s law backwards yields a new temperature of 288.49 K, an increase of 0.29 deg C, which is definitely in the vicinity of the average value, 0.21 deg C per W/m^2. Note that the 0.21 value is the real Earth average value while the 0.29 sensitivity comes from a simple, radiative-only model.
Clear sky co2 doubling forcing is usually estimated at 3.7 w/m^2. Using the above sensitivities, one has about 0.78 deg C rise for the average Earth value and a little over 1 deg C for the simple model. Note that over 60% of the surface is covered in clouds at any one time. Co2 effects are going to be far less, barely over half that of clear skies.
Also note that two factors exist with the theory concerning h2o vapor. First, the h2o vapor is the major feedback of the system. Second, as temperature rises the average relative humidity will stay roughly the same while the absolute humidity will increase.
Water vapor has a substantially greater impact per doubling than does co2. It is roughly 2 to 3 times as much W/m^2 as a co2 doubling. For constant RH, one can see from absolute humidity tables that an increase of 5 deg C will result in a 30% increase in the total amount of h2o vapor present, and for a 2 deg C rise h2o will increase by a whopping 13%, roughly 1/8th of a doubling. A 30% increase, almost 1/3 of a doubling, will result in around 3.1 w/m^2 added forcing or feedback, not quite as much as a co2 doubling. A 13% increase is far less than that. This means that if we increase the atmospheric column temperatures by 5 deg C, we can increase the h2o vapor enough to contribute less than 1 deg C along with the co2 doubling that provides just over 3/4 of a deg C. Consequently, we’re missing over 3 degrees C worth of forcing/feedback. For the 2 deg C rise case, we’re now down to missing about half of what is needed after considering the forcing and the supposed main feedback of h2o vapor. And this is for the clear sky only case, because the effects will be smaller for cloudy sky conditions.
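cba's back-of-envelope Stefan-Boltzmann inversion can be reproduced in a few lines. This is only a sketch of his radiative-only thought experiment, using his 288.2 K and 235 W/m^2 figures; it lands near 0.3 deg C, consistent with his quoted 0.29 once rounding is allowed for:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

T_SURF = 288.2                    # cba's assumed mean surface temperature, K
emitted = SIGMA * T_SURF ** 4     # surface emission, ~391 W/m^2
escaping = 235.0                  # outgoing longwave at top of atmosphere, W/m^2
transmitted = escaping / emitted  # fraction of surface emission escaping, ~0.60

# To push one extra W/m^2 out the top, the surface must emit 1/0.60 W/m^2 more;
# invert Stefan's law to find the surface temperature that does so
extra_surface = 1.0 / transmitted
t_new = ((emitted + extra_surface) / SIGMA) ** 0.25
print(f"surface emits {emitted:.0f} W/m^2, transmits {transmitted:.2f}")
print(f"warming per escaping W/m^2: {t_new - T_SURF:.2f} C")
```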

George E. Smith
April 29, 2011 8:41 am

“”””” KevinK says:
April 28, 2011 at 8:42 pm
To George E. Smith;
You wrote;
“That would seem to imply that if you know one you know the other.”
With respect I would like to suggest that we could rephrase this as;
“If we know something about one item we also know something about another item” “””””
What is the point in rephrasing my words? They then become NOT my words, so they don't have my meaning.
But YOU are free to say anything you want to say, and use your own words saying it. It just wouldn’t be my words, or have my meaning.
So on the Absolute Thermodynamic scale of Temperature the mean kinetic energy per degree of freedom of all of the particles present IS the measure of the Temperature.
E = kT = h(nu) = mc^2
What else is there to know about the relationship between energy and Temperature? Their numerical values differ only by the factor that converts units of energy (Joules) into units of Temperature (Kelvins).
Some Physicists even use a system of units in which c = h = k = 1 (actually they use hbar, and measure frequency (nu) in radians per second). In that case E = T = nu = m
Ask anna v about that system.
I actually have a Major in Radio-Physics, so I’m hep to Volts and Amps, and even skin effect. None of those are related to either Temperature or energy.
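For readers unfamiliar with the E = kT shorthand, the conversion factor between the two scales is just Boltzmann's constant; a two-line sketch (the 288 K surface temperature is an illustrative choice):

```python
k_B = 1.380649e-23    # Boltzmann constant, J/K
eV = 1.602176634e-19  # joules per electron-volt

T = 288.0  # K, roughly the global mean surface temperature
E = k_B * T
print(f"kT at {T:.0f} K = {E:.3e} J = {1000 * E / eV:.1f} meV")
```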

April 29, 2011 3:29 pm

cba says: April 29, 2011 at 8:20 am

Note that over 60% of the surface is covered in clouds at any one time.

cba,
I think you’ve got the cloud cover % wrong, should be circa 40%. The Earth’s albedo is 0.39, remember Vangelis?
Here is a link to the World Sunlight Map displayed with a Mollweide projection. The cloud cover shown is updated every 3 hours with current weather satellite imagery.

cba
April 29, 2011 8:21 pm

Philip,
It's a habit, quoting Trenberth's numbers from KT97, even though they tend to be screwed up quite often. Older albedo values tended to push 0.40, while more recent ones claim around 0.30, give or take a bit. Over 80% of the total albedo is atmospheric (clouds, scattering, etc.), and only a little is surface contribution.
However, albedo is definitely a variable.

Bill Illis
April 30, 2011 5:40 am

In terms of Albedo, the number which is used to control satellites (and they do need to take into account all radiation levels in the satellite’s environment including reflected solar to maintain proper orbits) is 29.83%.
This happens to be the exact same number used by Trenberth in his latest Earth Energy Budget paper.
Clouds make up about 16 to 17 percentage points of this number and the Surface makes up about 13 to 14 percentage points.
The Earth is about 65% cloud covered on average (including haze from very thin cirrus clouds which we might not count as cloudy), but on average 75% of the sunlight (including UV) either gets through the clouds or is still reflected down to the Earth.
Generally, the surface which is most important to Albedo is Ice and Snow and Sea Ice which is the farthest from the Earth average and can vary the most over time.
The Earth’s Albedo in time looks like it can vary from about 24.5% (Pangea and no Ice on the planet) to about 50.0% (in Snowball Earth conditions). If you want to crunch the numbers on these estimates, you can get pretty close to the historical temperature estimates with just Albedo alone.
Then put 33.3% in for the Last Glacial Maximum. Hansen and climate theory have used 31.0%, which is clearly too low in my opinion; that value is tuned so that CO2/GHGs produce 3.0 C per doubling. The maps of the extent of glaciers, snow, and sea ice at the Last Glacial Maximum show it was much higher.
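Bill's invitation to crunch the numbers can be sketched with the standard effective-temperature formula. This only captures the albedo term (the greenhouse offset between effective and surface temperature is held fixed), and the modern TSI value of 1361 W/m^2 is my assumption, not his:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4
S = 1361.0       # solar constant, W/m^2 (modern TSI value)

def effective_temp(albedo):
    """Blackbody effective temperature of a planet with the given albedo."""
    return (S * (1.0 - albedo) / 4.0 / SIGMA) ** 0.25

scenarios = [("modern", 0.2983), ("ice-free (Pangea)", 0.245),
             ("Last Glacial Maximum", 0.333), ("Snowball Earth", 0.50)]
for label, a in scenarios:
    print(f"{label}: albedo {a:.3f} -> Te = {effective_temp(a):.1f} K")
```

Relative to the modern value (about 254.7 K), the ice-free case comes out roughly 5 K warmer and the snowball case roughly 20 K colder, before any change in the greenhouse term.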

April 30, 2011 6:07 am

Bill Illis says April 30, 2011 at 5:40 am
Bill,
Thanks for your detailed explanation and correction and the update to modern numbers 🙂

cba
April 30, 2011 2:40 pm

Bill,
The numbers I see are somewhat along the lines of KT97: 10-30 W/m^2 of surface albedo out of over 100 W/m^2 total, so clouds and atmosphere are the vast majority of the albedo. The numbers you present are two to three times the surface albedo of those values. There is not that much cryosphere, and it is way off at the edges where there is not much solar power. Your 17% attribution to clouds (which must include atmospheric scattering too) leaves cloud reflectivity at under 30%; what type of cloud reflects (scatters) under 35% at the very least? Your 14% surface contribution, considering only 35% of the surface is exposed (out from under the clouds), results in an average exposed-surface albedo of 40%, way higher than Mars or the Moon, and this with 70% of the surface covered in water, which runs less than 0.04 albedo. That means the 30% land coverage must have an albedo of 0.40 / 0.3 = 1.3; show me anything with an albedo above 0.8. Essentially, the conditions you present would require at least a full glaciation period, or even snowball Earth conditions, for clouds to have that little effect and the surface that much. In fact, you'd need a substantial glaciation covering roughly 50% of the surface, including a lot of the Earth's oceans, assuming fresh snow at maximum snow albedo.
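cba's arithmetic is easy to reproduce; all the inputs below are the figures quoted in this exchange, not measurements. Including the ocean term explicitly gives an implied land albedo of about 1.24 rather than 1.3, which is still physically impossible (albedo cannot exceed 1):

```python
surface_points = 0.14  # claimed surface share of planetary albedo (Bill's figure)
exposed_frac = 0.35    # fraction of the surface not under cloud
ocean_frac, land_frac = 0.70, 0.30
ocean_albedo = 0.04

# Average albedo the exposed surface would need to supply 14 points:
exposed_albedo = surface_points / exposed_frac
print(f"required exposed-surface albedo: {exposed_albedo:.2f}")

# Albedo the land fraction would then need, given near-black ocean:
land_albedo = (exposed_albedo - ocean_frac * ocean_albedo) / land_frac
print(f"implied land albedo: {land_albedo:.2f}")
```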

Bill Illis
April 30, 2011 5:30 pm

cba says:
April 30, 2011 at 2:40 pm
———
You have to think of it this way: the majority of the solar radiation is still getting through the clouds. So it gets through the clouds and then still reflects off the surface.
What does cloud cover do over the Greenland Glacier? Nothing really, because the ice is reflecting up to 83% (in some places) of whatever solar radiation (including UV, which goes straight through most clouds) back out to space.
These amounts are higher than any cloud has, so technically cloud cover contributes 0 to Albedo over the Greenland Glaciers. Take the clouds away and the Albedo is exactly the same.
In fact, take all the clouds away across the whole planet and the average surface Albedo is 13 to 14 percentage points (actually it is more like 16, but then there is re-reflection back from the clouds, which drops the number a little).

cba
April 30, 2011 7:54 pm

Bill,
70% of the surface is water, and that's less than 0.04 albedo for most of the surface where the majority of incoming power arrives. Reflectivity for non-thin clouds tends to be given in the range of 40% to 80%, depending on type and other conditions. What gets through a cloud, after reflection (scattering) and internal absorption of a few percent, is going to be more like half or less of the total striking the cloud. While Greenland will see little cloud effect, 80% surface albedo only happens with fresh snow. What fraction of the average incoming power strikes Greenland? What fraction of the incoming solar power falls on snow or ice?
Of the less-than-half of the solar power coming through the clouds, the very small average surface albedo reflects a little back up through the clouds, a good portion of which is reflected back down again, so only a small part gets scattered outward through the cloud. In any case, atmospheric scattering back to space exceeds the average surface albedo. Even if snow and ice albedo were 100%, there is just not enough ice and snow to compensate for all that ultra-low-albedo ocean area.

May 1, 2011 1:08 am

cba says:
and this is with 70% of the surface covered in water, which runs less than 0.04 albedo
Henry@cba
I remember often standing at the beach and getting the sun's reflection from the water in my face, where the water acts as a mirror, causing me to squint or turn my eyes away. Counting also the waves that can come to a right angle, I doubt that the albedo from "sea shine" is only 0.04. Where did you get that value, and how was it arrived at?
I think what you also forget is that a considerable % of that 70% is covered with ice and snow again.

Bill Illis
May 1, 2011 6:02 am

I had a look at the CERES data for March.
I picked out two 1.4 degree by 1.4 degree blocks at 1.4S (effectively, the Sun was directly overhead) that were 1) the most cloud-covered with the highest Albedo (100% – Indian Ocean) and 2) the least cloud-covered (18% – near the International Dateline in the Pacific).
1) The 100% cloud-covered block (and had the highest Albedo/solar radiation reflected of several other blocks which were also at 100% – ie, it was thick high cloud) had a net Albedo of 52%.
2) The 18% cloud-covered block had an Albedo of 13%.
Just doing the math, and using the 52% Albedo block as a maximum since it must have had the highest, thickest cloud possible, the Ocean Albedo couldn't be lower than 10% when the Sun is directly overhead on the real Ocean.
Remember there are waves on the Ocean. The amount of solar energy reflected off water increases steeply as the incidence angle increases. In practice, it only becomes important at about 70 degrees and higher (so this matters for open water at the poles, for example, and for waves on the Ocean).

cba
May 1, 2011 9:52 am

Henry P, Bill
The wiki article on albedo has pretty good information:
http://en.wikipedia.org/wiki/Albedo
You’ll note that even at 60 degrees from the vertical, one has under 0.1 albedo for water and that is the point where the atmospheric thickness is double that of straight up. At the poles one sees that the water reflections are much greater due to the higher angles.
Bill, your numbers look reasonable, but remember too that the reflectivity of clouds varies with the droplet size of the cloud particles. One can get significant variation in the albedo both from cloud fractional coverage and from the actual cloud reflectivity – the origin of Lindzen's Iris theory. For 100% cloud cover, 0.52 sounds reasonable or typical; that would be an average cloud reflectivity of 52%, pretty much the middle of the road for typical clouds.
Using the same value for 18% cloud cover and simply doing a weighted average with the surface being 0.04 reflectivity, one comes up with an average of 12.6% albedo, very close to your value of 13%.
If you take the idea that some significant portion of the cloud's reflectivity is coming up through the cloud from the ground, say for the 100% coverage case: about 50% of the light is reflected from the cloud, leaving more like 45% striking the surface after absorption. Since the surface reflects only 4%, that sends about 1.8% back up to the clouds, and assuming 50% transmission through the cloud outbound, under 1% of the total incoming escapes this way.
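The two checks cba describes here can be written out explicitly; the 0.52 cloud and 0.04 ocean albedos are the thread's own figures:

```python
def scene_albedo(cloud_frac, cloud_albedo=0.52, surface_albedo=0.04):
    """Naive area-weighted albedo, ignoring transmission through the clouds."""
    return cloud_frac * cloud_albedo + (1.0 - cloud_frac) * surface_albedo

print(f"18% cloud: {scene_albedo(0.18):.3f}")  # vs. the observed 13%
print(f"100% cloud: {scene_albedo(1.00):.3f}")

# Multi-bounce path for 100% cover: ~50% reflects off the cloud top, ~45%
# reaches the surface after absorption, 4% of that reflects upward, and
# ~50% of that makes it back out through the cloud.
via_surface = 0.45 * 0.04 * 0.50
print(f"surface contribution under full cloud: {via_surface:.1%}")
```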

May 1, 2011 10:21 am

Just to comment here that the wiki article does not say the albedo of "seashine" is only 0.04; it is clearly indicated at almost double that. But I still wonder how they measured it. Perhaps by looking at pictures like the one from WUWT above? (If you look carefully at the picture, you could say the yellow/orange area is around 10%.)

cba
May 1, 2011 3:54 pm

Henry,
Check out the section on water in wikipedia’s albedo article, especially the graph and the part about waviness.
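For anyone who wants the numbers behind that graph, the angular dependence comes straight from the Fresnel equations for a flat air-water interface (n ≈ 1.33). This sketch is for glassy water only; real-ocean albedo runs somewhat higher once waves, foam, and subsurface scattering are included:

```python
import math

def water_reflectance(theta_deg, n=1.33):
    """Unpolarized Fresnel reflectance of a flat air-water interface."""
    ti = math.radians(theta_deg)
    tt = math.asin(math.sin(ti) / n)  # refraction angle from Snell's law
    rs = ((math.cos(ti) - n * math.cos(tt)) / (math.cos(ti) + n * math.cos(tt))) ** 2
    rp = ((math.cos(tt) - n * math.cos(ti)) / (math.cos(tt) + n * math.cos(ti))) ** 2
    return (rs + rp) / 2.0

for angle in (0, 30, 60, 80):
    print(f"{angle:2d} deg from vertical -> R = {water_reflectance(angle):.3f}")
```

This gives about 0.02 with the Sun overhead and still under 0.1 at 60 degrees, only climbing past 0.3 near grazing incidence, consistent with the discussion above.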