
Dr. Roger Pielke Sr. posted this today. Since he has no comments on his blog, I felt it would be good to repost it here to allow discussion. – Anthony
There is an excellent collection of interviews posted by Sam Patterson on April 23 2011 on the weblog Climatequotes.com titled
I have reposted his very informative set of interviews and commentary below.
_________________________________________________
From: Climatequotes.com
Note: I wrote this post many weeks ago and never posted it because I was waiting for some more feedback. However, Pielke Sr. has posted specifically on this issue recently and Watts ran it also, so I feel now is a good time to post it.
This post deals with the question of whether or not climate sensitivity should be measured by global average surface temperature anomaly. I asked multiple climate scientists their opinion, and their responses are below. First, some background.
Over at The Blackboard there is an interesting guest post by Zeke. He attempts to find areas where agreement can take place by laying out his beliefs and putting a certain confidence level on them. This idea was commented upon by several blogs and scientists. Judith Curry, Anthony Watts, Jeff Id, and Pielke Sr. all contributed. I want to focus on Pielke’s response, because he challenges a core assumption of the exercise.
In Zeke’s post, he gives his position on climate sensitivity:
Climate sensitivity is somewhere between 1.5 C and 4.5 C for a doubling of carbon dioxide, due to feedbacks (primarily water vapor) in the climate system…
Here is Pielke’s response to this claim:
The use of the terminology “climate sensitivity” indicates an importance of the climate system to this temperature range that does not exist. The range of temperatures of “1.5 C and 4.5 C for a doubling of carbon dioxide” refers to a global annual average surface temperature anomaly that is not even directly measurable, and its interpretation is even unclear…
Pielke goes on to explain that he has dealt with this issue previously in the paper entitled “Unresolved issues with the assessment of multi-decadal global land surface temperature trends.” Here is the main thrust of his response:
This view of a surface temperature anomaly expressed by “climate sensitivity” is grossly misleading the public and policymakers as to what are the actual climate metrics that matter to society and the environment. A global annual average surface temperature anomaly is almost irrelevant for any climatic feature of importance.
So we know Pielke’s position. He is adamantly opposed to using surface temperature anomaly when discussing climate sensitivity, for various reasons, not the least of which is it ignores metrics which actually matter to people.
I haven’t heard this view expressed very often, so I decided to contact other climate scientists and find out their opinions on this issue. I asked the following questions and invited them to give their general impressions:
1. Do you believe that global annual average surface temperature anomaly is the best available metric to discuss climate sensitivity?
If yes to Question 1, then:
2. Could you briefly explain why you consider global annual average surface temperature anomaly the best available metric to discuss climate sensitivity?
If no to question 1, then:
2. What do you believe is the proper metric to discuss climate sensitivity, and could you briefly explain why?
John Christy
1. Do you believe that global annual average surface temperature anomaly is the best available metric to discuss climate sensitivity?
No. The surface temperature, especially the nighttime minimum, is affected by numerous factors unrelated to the global atmospheric sensitivity to enhanced greenhouse forcing (I have several papers on this.) The ultimate metric is the number of joules of energy in the system (are they increasing? at what rate?). The ocean is the main source for this repository of energy. A second source, better than the surface, but not as good as the ocean, is the bulk atmospheric temperature (as Roy Spencer uses for climate sensitivity and feedback studies.) The bulk atmosphere represents a lot of mass, and so tells us more about the number of joules that are accumulating.
Patrick Michaels
I think it is a reasonable metric in that it integrates the response of temperature where it is important–i.e. where most things on earth live. However, it needs to be measured in concert with ocean measurements at depth and with both tropospheric and stratospheric temperatures. For example, if there were no stratospheric decline in temperature, then lower tropospheric or surface rises would be hard to attribute to ghg changes. Because we don’t have any stratospheric proxy (that I know of) for the early 20th century, when surface temperature rose about as much as they rose in the late 20th, we really don’t know the ghg component of that (though I suspect it was little to none).
Having said that, I suspect that where we do have such data, it is indicative that the sensitivity is lower than generally assumed, but not as low as has been hypothesized by some.
Gavin Schmidt
Your questions are unfortunately rather ill-posed. This is probably not your fault, but it is indicative of the confusion that exists on these points.
“Climate sensitivity” is *defined* as being the equilibrium response of the global mean surface temperature to a change in radiative forcing while holding a number of things constant (aerosols, ice sheets, vegetation, ozone) (c.f. Charney 1979, Hansen et al, 1984 and thousands of publications since). There is no ambiguity here, no choice of metrics to examine, and no room for any element of belief or non-belief. It is a definition. There are of course different estimates of the surface temperature anomaly, but that isn’t relevant for your question.
There are of course many different metrics that might be sensitive to radiative forcings that one might be interested in: Rainfall patterns, sea ice extent, ocean heat content, winds, cloudiness, ice sheets, ecosystems, tropospheric temperature etc. Since they are part of the climate, they will be sensitive to climate change to some extent. But the specific terminology of “climate sensitivity” or the slightly expanded concept of “Earth System Sensitivity” (i.e. Lunt et al., 2010) (that includes the impact on the surface temperature of the variations in the elements held constant in the Charney definition), are very specific and tied directly to surface temperature.
People can certainly hold opinions about which, if any, of these metrics are of interest to them or are important in some way, and I wouldn’t want to prevent anyone from making their views known on this. But people don’t get to redefine commonly-understood and widely-used terms on that basis.
I sent Gavin a response clarifying my questions and including Pielke Sr.’s comments. Here is his reply to Pielke’s comments:
I disagree. Prof. Pielke might not find the global temperature anomaly interesting, but lots of other people do, and as an indicator for other impacts, it is actually pretty good. Large-scale changes in rainfall patterns, sea ice amount, etc. all scale more or less with SAT. (They can vary independently of course, and so ‘one number’ does not provide a comprehensive description of what’s happening).
Kevin Trenberth
1. Do you believe that global annual average surface temperature anomaly is the best available metric to discuss climate sensitivity?
This is not a well-posed question. This relates to definition: the sensitivity is defined that way. It is not necessarily the best metric for climate change.
If yes to Question 1, then:
2. Could you briefly explain why you consider global annual average surface temperature anomaly the best available metric to discuss climate sensitivity?
I think the best metric overall is probably global sea level as it cuts down on weather and related noise. But global mean temperature can be carried back in time more reliably and it is reasonably good as long as decadal values are used.
If no to question 1, then:
2. What do you believe is the proper metric to discuss climate sensitivity, and could you briefly explain why?
However, it is all variables collectively that make a sound case.
Pielke Sr.
We have already discussed Pielke’s position, but I contacted him to find out what metrics he would prefer to use. Here is his response:
1. Do you believe that global annual average surface temperature anomaly
is the best available metric to discuss climate sensitivity?
NO
If yes to Question 1, then:
2. Could you briefly explain why you consider global annual average
surface temperature anomaly the best available metric to discuss
climate sensitivity?
If no to question 1, then:
2. What do you believe is the proper metric to discuss climate
sensitivity, and could you briefly explain why?
The term “climate sensitivity” is not an accurate term to define how the climate system responds to forcing, when it is used to state a response in just the global average surface temperature. This is more than a semantic issue, as the global average surface temperature trend has been the primary metric used to communicate climate effects of human activities to policymakers. The shortcoming of this metric (the global average surface temperature trend) was discussed in depth in
“National Research Council, 2005: Radiative forcing of climate change: Expanding the concept and addressing uncertainties. Committee on Radiative Forcing Effects on Climate Change, Climate Research Committee, Board on Atmospheric Sciences and Climate, Division on Earth and Life Studies, The National Academies Press, Washington, D.C., 208 pp. http://www.nap.edu/openbook/0309095069/html/“
but has been mostly ignored in assessments such as the 2007 IPCC WG1 report.
A more appropriate metric to assess the sensitivity of the climate system heat content to forcing is the response in Joules of the oceans, particularly where most of the heat changes occur. I discuss this metric in
Pielke Sr., R.A., 2008: A broader view of the role of humans in the climate system. Physics Today, 61, Vol. 11, 54-55.
http://pielkeclimatesci.files.wordpress.com/2009/10/r-334.pdf
Pielke Sr., R.A., 2003: Heat storage within the Earth system. Bull. Amer. Meteor. Soc., 84, 331-335. http://pielkeclimatesci.files.wordpress.com/2009/10/r-247.pdf
More generally, in terms of true climate sensitivity, more metrics are needed as we discussed in the 2005 NRC report. The Executive summary includes the text [http://www.nap.edu/openbook.php?record_id=11175&page=4]
“Despite all these advantages, the traditional global mean TOA radiative forcing concept has some important limitations, which have come increasingly to light over the past decade. The concept is inadequate for some forcing agents, such as absorbing aerosols and land-use changes, that may have regional climate impacts much greater than would be predicted from TOA radiative forcing. Also, it diagnoses only one measure of climate change (global mean surface temperature response) while offering little information on regional climate change or precipitation. These limitations can be addressed by expanding the radiative forcing concept and through the introduction of additional forcing metrics. In particular, the concept needs to be extended to account for (1) the vertical structure of radiative forcing, (2) regional variability in radiative forcing, and (3) nonradiative forcing. A new metric to account for the vertical structure of radiative forcing is recommended below. Understanding of regional and nonradiative forcings is too premature to recommend specific metrics at this time. Instead, the committee identifies specific research needs to improve quantification and understanding of these forcings.”
It is, therefore, time to move beyond the use of the global annual average surface temperature trend as the metric to define “climate sensitivity”.
Differing views
There are clearly differing views on this subject.
John Christy does not support the metric. He points out that the surface temperature is affected by numerous things other than greenhouse forcing, and then gives two metrics which he prefers. The first is the change in joules in the system, with particular emphasis on the oceans. The second is bulk atmospheric temperature.
Patrick Michaels supports using the metric. He points out that the metric is important because it addresses the area where people live. However, he emphasizes that the surface temperature must be taken in concert with measurements such as ocean temperature at depth, and tropospheric and stratospheric temperatures. Without these other measurements, it would be difficult to assess the impact of GHGs on surface temperature.
Gavin Schmidt supports the metric unreservedly. He and Trenberth rightly point out that climate sensitivity is defined by global average surface temperature anomaly. Of course, the point of my question is challenging whether or not this is the best definition. Gavin seems to think so, and points out that the metric is “commonly-understood and widely-used”. He states that other metrics such as rainfall patterns and sea ice amount track very well with surface air temperature.
Trenberth is very brief, but states that global average surface temperature anomaly is not necessarily the best metric to use for climate change. He considers that global sea level is a better metric because it cuts down on weather related noise. However, he also points out that global average surface temperature anomaly is useful because it can be applied to the past more reliably. He also states that all variables taken together make a sound case.
Pielke Sr. is adamantly opposed to using this metric. We’ve already discussed his reasons. He also proposes a different metric for assessing climate sensitivity, “A more appropriate metric to assess the sensitivity of the climate system heat content to forcing is the response in Joules of the oceans”. He supports these claims with several of his own papers as well as a NRC report.
Conclusion
Pielke and Christy want to stop assessing climate sensitivity by using global average surface temperature anomaly, and both recommend using a change of joules (particularly in the ocean) as a better metric.
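To make the joules-based metric concrete, here is a minimal sketch (in Python, with illustrative numbers of my own choosing, not figures from the post) that converts an assumed yearly ocean heat gain into the global-mean radiative imbalance it implies:

```python
# Back-of-the-envelope: convert a yearly heat gain in joules into the
# global-mean radiative imbalance (W/m^2) it implies. Numbers are illustrative.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.156e7 s
EARTH_SURFACE_AREA_M2 = 5.1e14         # ~510 million km^2

def heat_uptake_to_flux_wm2(joules_per_year):
    """Average radiative imbalance implied by a yearly heat gain."""
    watts = joules_per_year / SECONDS_PER_YEAR
    return watts / EARTH_SURFACE_AREA_M2

# An assumed gain of 1e22 J/yr works out to roughly 0.6 W/m^2:
flux = heat_uptake_to_flux_wm2(1e22)
```

Dividing by Earth's whole surface area treats the heat gain as if it were spread over the globe; a per-ocean-area figure would be roughly 40% larger.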
Michaels and Trenberth support the metric while emphasizing that other metrics must also be taken into account. Schmidt does not mention any drawbacks and emphasizes that the metric is already widely used and it works well with other metrics.
It seems to me the main problem here isn’t the metric itself, but the emphasis placed on it. I don’t believe that Pielke or Christy think the metric has no value at all, only that it is a poor choice as the main metric when discussing CO2’s impact on climate. In Pielke’s case, the emphasis on CO2 itself is a problem, as he believes that other human impacts are far more important.
Climate science so frequently focuses on CO2 and temperature that it seems natural that climate sensitivity would be measured by global average surface temperature anomaly. A shift away from this metric seems unlikely. However, if it can be shown in the future that a change in joules in the ocean directly contradicts other metrics, then I’m sure this discussion will come up again. Pielke’s paper mentions an apparent contradiction found by Joshua Willis of JPL, although the measurements cover only a four-year period. Only time will tell which metric is most valuable.
It seems to me that any exercise that ignores the rest of the GHGs could lead to confusion. If we don’t know the changes in methane, ozone, H2O, etc. over the same time period then we don’t know squat about CO2 either.
And don’t forget that the rest of the air must also have some absorption/radiation and, since there’s so much more of it, how can any analysis ignore it?
Thanks to DavidMHoffer.
Your logical workup was excellent.
mkelly says: (April 27, 2011 at 8:09 am)
“The weather channel gives a “feels like” temperature when it is humid. So even the weather channel knows, but won’t admit that water is far more important than CO2.”
Perhaps there should also be a “should be” temperature given, one that tells us what the temperature would be if CO2 was back at the 280 ppm level. It should be easy enough to do, after all, the believers claim to know how much heating we have added since that era.
great comments from everyone here
As I have said before
1) there is some radiative warming caused by Co2 (14-15um)
2) there is some radiative cooling caused by CO2 (various places between 0-5 um)
3) there is also cooling caused by CO2 taking part in photosynthesis. There is evidence that the earth has become greener over the past 50 years or so. I mean, did you ever see a forest grow where it is very cold? Greenery and forests need energy from their surroundings.
The IPCC never considered either 2) or 3).
My question was : what is the net effect of 1) and 2) and 3)?
I suspect that 3) is considerable compared to 1)
I did not get an answer to my question on where to get reliable records of the temperatures of the oceans and seas. At what depth below sea level is the temperature measurement taken?
“”””” JohnB says:
April 27, 2011 at 2:27 am
I have to agree with DRs Schmidt and Trenberth. “Climate Sensitivity” is defined as the equilibrium temperature change for a doubling of CO2. It’s not like the term is just now being defined, it was defined in the past and as such is what it is. “””””
Well John B, that’s the problem; it was “defined” to be that, and that is exactly what I take those two words “Climate Sensitivity” to mean; nothing more and nothing less.
BUT !!! implicit in that quite arbitrary definition is the very definitive statement that T − T0 = (CS) × log2(CO2/CO2_0), where my resource-constrained terminology is self-evident.
That means that T(560) − T(280) is identically equal to T(2) − T(1), where the numbers in brackets are PPM atmospheric molecular abundance of CO2; the log law assigns exactly the same warming to every doubling, however small the absolute amounts.
Now if I choose to DEFINE a newly discovered Martian rock mineral sample to be “Amalgamese Tobunganate”, then from now and henceforth, that is exactly what that mineral shall be called.
BUT !!! I do NOT have license to simply define laws of Physics to be just whatever I want them to be; and expect to get general acceptance of my practice. The establishment of “Laws of Physics” requires experimental observations of the pertinent phenomena, and verification that the observed values of variables, is in agreement with those values predicted from application of those laws.
So humans have been “reliably” measuring the atmospheric molecular abundance of CO2 at Mauna Loa since about 1957/58, basically the Geophysical Year, and those pretty much universally accepted results have seen the CO2 value go from 315 PPM in 1957/58 to a present value of 390 PPM; a ratio of 1.238, whose log base two is 0.308.
So we have observed 0.3 of one doubling of CO2, so the increase in global mean temperature is 0.36 deg C for the blue line and 0.9 deg C for the red line, based on the Schmidt/Trenberth-insisted definition of “Climate Sensitivity” and its IPCC-mandated 3:1 fudge-factor ratio of uncertainty (±50%).
Well since 1957/58, the Global Mean Surface Temperature has gone up and down all over the place; so for anyone to define “Climate Sensitivity” based on that observed behavior of the real actual operational Physical Laws; the ones that Mother Gaia adheres to; is simply farcical.
I’m perfectly happy to defend the Schmidt/Trenberth claim to that definition of “Climate Sensitivity”, since it is exactly the one I use myself; but Unlike them; it is for me just a definition of words; I acknowledge no proof of any validity to the implied Physical Law behind that definition.
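George's Mauna Loa arithmetic can be checked in a few lines. This sketch uses the canonical 1.5–4.5 C range quoted earlier in the post rather than the blue/red-line values; the function name is mine:

```python
from math import log2

def warming_c(co2_start_ppm, co2_end_ppm, sensitivity_c_per_doubling):
    """Warming implied by the log2 definition: dT = (CS) * log2(C / C0)."""
    return sensitivity_c_per_doubling * log2(co2_end_ppm / co2_start_ppm)

doublings = log2(390 / 315)      # ~0.308 of one doubling, as in the comment
low = warming_c(315, 390, 1.5)   # ~0.46 C at the low end of the range
high = warming_c(315, 390, 4.5)  # ~1.39 C at the high end
```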
“”””” KevinK says:
April 26, 2011 at 6:36 pm
Temperature is MOST DEFINITIVELY NOT A MEASURE OF ENERGY!!!!!! “””””
Well that is correct; “Kelvins” are a measure of Temperature; and “Joules” are a measure of energy; two different scientists, and two different measures named after them.
In the case of gases they are related: the total translational kinetic energy of a sample of N particles is W = (3/2)NkT, where k is 1.38066E-23 Joules per Kelvin; that is, a mean of (3/2)kT per particle.
That would seem to imply that if you know one you know the other.
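The relation just quoted can be put into a tiny sketch (function names are mine; the value of k is the one quoted in the comment):

```python
BOLTZMANN_K = 1.38066e-23  # J/K, the value quoted in the comment

def mean_ke_per_particle_j(temperature_k):
    """Mean translational kinetic energy of one gas particle: (3/2) k T."""
    return 1.5 * BOLTZMANN_K * temperature_k

def total_ke_j(n_particles, temperature_k):
    """Total translational kinetic energy of N particles: (3/2) N k T."""
    return n_particles * mean_ke_per_particle_j(temperature_k)

# At roughly 288 K (a commonly quoted global mean surface temperature):
per_particle = mean_ke_per_particle_j(288.0)  # ~6.0e-21 J
```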
It seems clear that, Mr. Schmidt notwithstanding, the problem is semantic, and it is a serious semantic problem in that, having infused a particular meaning into the term, alarmists such as him then use that metric to draw conclusions not entailed by it in any way.
He is quite right that one does not measure impact of “climate change” on surface ecosystems and human civilization by metrics involving upper-atmosphere or oceanic changes — you measure this impact by what directly impacts it: surface temperature. He, and apparently some others such as Mr. Trenberth, have chosen to take this as the definition of climate sensitivity. Ok, I’m fine with this. As a mathematician I am quite comfortable with the idea that terminology is arbitrary; you can use whatever labels you like.
The problem comes when you attempt to use this particular metric, as they appear wont to do, as a predictive variable. Surface temperature is an effect, not a cause. The number cannot be fed back into the system or climate models as if it has some important feedback effects. And here is where the semantics of this particular choice of terminology becomes severely problematic.
The term “climate sensitivity” suggests that the variable in question is an important determiner of how the overall climate responds to inputs. It is a serious mistake to infer such a relationship, if surface temperature is the only, or primary, datum involved in the metric. Yes, it is a datum in which we have some practical interest. But, no, it is not a number with any particularly important predictive value; nor does it represent in any direct way any important element of the evolution of our climate system. Ocean heat content, in contrast, does both.
Perhaps I’m conflating terms, but I’m afraid none of your responders’ answers jibed very closely with my understanding of sensitivity. I have always understood “sensitivity” in this context to refer to the ratio of actual, observed response to derived, theoretical response.
Thus, if (for example) radiative physics predicts (on whatever simplistic assumptions are invoked) that the change in atmospheric CO2 content over the last 100 years should lead to 0.2 C in warming, and 0.6 C is observed, then one infers a sensitivity of 0.6/0.2 = 3, which is specific to the CO2/temperature relationship, and has no broader meaning than this. I have understood this to be a deliberately naive, back-of-the-envelope calculation that involves far-too-simplistic assumptions such as that the atmosphere is a brick with no internal dynamics and that ALL temperature change can be attributed to CO2 content variation. The factor, 3 (in my example) represents some effects unknown in the sense that they have not been accounted for in the theoretical model used. Perhaps they ARE known, however — one might attribute the amplification by a factor of 3 to H2O responding to CO2, for example.
But none of this is important in understanding the long-term evolution of the climate. I do think these guys are talking past each other.
I apologize for my gross overestimation!
Speaking both sarcastically and seriously, a more useful metric for climate change is DISTANCE per doubling of CO2. Most of the people being asked to reduce GHG emissions live in the northern temperate zone, and many of them are somewhat familiar with how the climate changes as one moves north and south. For example, the mean annual temperature in North Dakota and Montana is about 41 degF, while about 1000 miles to the south the mean annual temperature is about 60 degF in Oklahoma and Arkansas. On this scale, a 1 degC temperature change is equivalent to moving 100 miles or 150 kilometers to the south, a trend which is reasonably accurate for most of the US (except Oregon and Washington). So the IPCC is telling us that climate sensitivity is 150-450 miles or 230-690 kilometers south for 2X CO2. (We can adapt by moving the same distance to the north. Despite warming over the past 30 years, more people are moving south than north in the US.)
Given that global warming is anticipated to be greatest at night and in the winter, when high temperature is usually not a problem, the effective climate sensitivity is probably about half of this distance.
For the environmentally-aware residents of coastal California, who live where the ocean moderates summer temperature so that the daily high rises about 1 degF for every mile one moves away from the ocean, climate sensitivity is about 3-8 miles for 2X CO2. In the atmosphere or mountains, where temperature tends to drop about 6.5 degC per vertical kilometer, 2X CO2 is equivalent to 230-700 vertical meters or 750-2300 ft (which would be significant for some mountain snowpacks).
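The distance metric is easy to reproduce. Here is a minimal sketch using the gradients quoted in the comment (names and structure are mine):

```python
# Gradients quoted in the comment: ~100 miles south per 1 degC across the
# central US, and a tropospheric lapse rate of ~6.5 degC per vertical km.
MILES_PER_DEGC = 100.0
LAPSE_RATE_C_PER_KM = 6.5

def miles_south(delta_t_c):
    """Horizontal displacement with the same mean annual temperature change."""
    return delta_t_c * MILES_PER_DEGC

def vertical_meters(delta_t_c):
    """Vertical displacement with the same temperature change."""
    return delta_t_c / LAPSE_RATE_C_PER_KM * 1000.0

# For the 1.5-4.5 C per doubling range quoted in the post:
low_mi, high_mi = miles_south(1.5), miles_south(4.5)        # 150-450 miles
low_m, high_m = vertical_meters(1.5), vertical_meters(4.5)  # ~230-690 m
```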
George, I’m still not quite with it so I didn’t fully follow your argument. I’ll read it again later.
As you noted, my point was simply that the definition was what the definition was. Your putting in the maths did give rise to a thought on the limitations of the current definition though.
Perhaps a better metric would be “Temperature Sensitivity” defined as the change in GMST per Watt/Metre Squared of forcing. Whether the forcing is from CO2 or the Sun or whatever, the end result is going to be roughly the same although there would need to be “feedback” and “no feedback” versions of the definition. Which solves the problem you outlined in your earlier comment in this thread.
Similarly you might be able to derive a “Humidity Sensitivity” expressed as the change in humidity per Watt/Metre Squared of forcing. Other sensitivity rules might also be derived, each applying to specific areas.
Thoughts?
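JohnB's proposed per-forcing sensitivity can be sketched if one assumes a figure for the radiative forcing of a CO2 doubling. The commonly cited ~3.7 W/m^2 used below is my assumption, not a number from the thread:

```python
# Assumed: ~3.7 W/m^2 of radiative forcing per CO2 doubling, a commonly
# cited figure that links the two formulations (not from the post).
FORCING_PER_DOUBLING_WM2 = 3.7

def sensitivity_per_wm2(cs_c_per_doubling):
    """Convert degC-per-doubling into degC per W/m^2 of forcing."""
    return cs_c_per_doubling / FORCING_PER_DOUBLING_WM2

low = sensitivity_per_wm2(1.5)   # ~0.41 C per W/m^2
high = sensitivity_per_wm2(4.5)  # ~1.22 C per W/m^2
```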
Sky says;
“For basic physical reasons that few in climate science seem to understand properly, the entire question of “climate sensitivity” to CO2 changes is largely an empty one. Because they produce no energy, GHG’s cannot “force” anything in any strict physical sense. What we should be asking is what the effect of different concentrations may be upon the rate of thermal energy transfer from atmosphere to space, when the principal mode of transfer from surface to atmosphere is moist convection. In other words, is the climate system’s Lyapunov exponent being affected? Globally averaged surface temperatures do not address that crucial question.”
Thank you. As I have said a few times, the important factor at play here is the “speed of heat” through the climate system. Since GHGs CANNOT produce energy, the only remaining question is whether they can slow the flow of heat through the system enough to affect the “equilibrium temperature”. I posit that since increases in GHGs displace NON-GHGs in the system, the “speed of heat” through the system becomes more weighted towards the speed of light (IR radiation) and away from convection, which is quite a bit slower.
The Earth (including the gases in the atmosphere) is most accurately called a “blackbody RE-RADIATOR”, which is only capable of changing the rate at which energy flows from a TRUE blackbody radiator (i.e. the Sun) through the Earth/atmosphere system to the cold vacuum of space.
Cheers, Kevin.
Perhaps guys like Schmidt and Trenberth think they are doctors; when the temperature of a human body goes up, there is generally something wrong inside. As we all know the rise in body temperature is certainly an effect, not a cause, of disease (I’m not a doctor, so if any doctor sees a need to correct me or add anything to this observation, go ahead; but that is what I’ve learned). So it would probably seem logical to some simpler minds than mine that surface temperature would likewise be a relevant measure of the state of the climate equilibrium.
However, the climate, as the good guys (like Roger Pielke, Sr. and George E. Smith above) realize, is not quite like the human body. There are way too many other features in the dynamic, chaotic, non-linear climate system that act (and interact) to enable us to focus in on any one effect like surface temperature anomalies. Also, assuming all of the other so-called “variables” constant in that definition seems more like cheating to me.
I agree with Pielke, Sr. that this definition is way too narrow and appears to be misleading in many respects. The argument has always been over definition, and the argument must continue. I hope Pielke, Sr. persists and prevails in his argument.
Okay, it seems clear to me that if you want to pin the tail on human caused CO2 then you certainly want to use GAAST as the primary measurement.
What if you want to measure the forcing from another first order human influence such as land use? Would you still want to use GAAST, or would you want to use some other measurement?
My point is that there’s quite a bit of gamesmanship going on here. Depending on what agenda you are trying to push, the ideal frame of conversation will change. If you are trying to guide the conversation towards man-made CO2, then you don’t want to talk about the ocean because it’s too slow, and vice versa.
“”””” JohnB says:
April 27, 2011 at 6:02 pm
George, I’m still not quite with it so I didn’t fully follow your argument. I’ll read it again later.
As you noted, my point was simply that the definition was what the definition was. Your putting in the maths did give rise to a thought on the limitations of the current definition though. “””””
Well John, perhaps you are looking for something that is a lot more subtle; but the reality is it is quite simple.
As you say, the definition is what the definition is; that is really the point that Trenberth and Schmidt made in their comments.
They in effect said :- “”””” climate sensitivity has already been defined, so live with it “””””
That is the whole crux of the “gay marriage” hullabaloo. Some folks say; “Marriage has already been defined many thousands of years ago.” So we know what it is. You want to create a different structure; fine: go for it. but you’ll have to come up with your own name for it, because that one has already been used.
The “RMS” value of a varying voltage, for example, is DEFINED as “the square root of the mean of the squares of all the instantaneous voltage values” (the averaging supplies the sample-time normalization, of course).
Now it doesn’t matter a hoot why we may want to know such a number; who cares what the reason is; that’s what those letters are defined to mean, so we can use them, whenever we want to express that functional value; doesn’t matter why we care. They have things called dictionaries that are simply chock full of formal definitions of what words mean; live with it.
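For what it's worth, the RMS definition George describes is a one-liner. A small sketch, sampling one full cycle of a sine wave, where the RMS should come out to 1/sqrt(2):

```python
from math import sqrt, sin, pi

def rms(samples):
    """Square root of the mean of the squared samples."""
    return sqrt(sum(v * v for v in samples) / len(samples))

# One full cycle of a unit-amplitude sine wave has RMS 1/sqrt(2) ~ 0.707:
wave = [sin(2 * pi * n / 1000) for n in range(1000)]
value = rms(wave)
```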
So ok; somebody, maybe it was Dr. Stephen Schneider, maybe not, defined “Climate Sensitivity” to be the increase in global mean surface temperature for a doubling of atmospheric CO2 abundance. It doesn’t matter to me why they decided to define such a term, and I’ll use their definition whenever I want to refer to such a thing.
My point was that it is ONE thing to define in a dictionary what the meaning of a word is. It is entirely a different matter if that definition enshrines some claim about the laws of physics or mathematics, or any other discipline.
Suppose I wanted to DEFINE “Solar Sensitivity” to be the increase in the value of the TSI, (Total Solar Irradiance) for a doubling of the percentage of humans wearing two piece bathing suits at any time. Well if nobody else has already claimed “Solar Sensitivity”, then I can put that definition in the dictionary, and people should use it.
In fact why don’t I do that. Anthony, please make a note of that. I have defined, here at WUWT, that henceforth:-
“”””” “Solar Sensitivity” , shall be the change in the TSI (at earth’s mean orbit) for a doubling of the percentage of humans wearing two piece bathing suits, at any time. “””””
Now I have given those previously unused words, a meaning; to be used by anyone who wants to refer to that phenomenon.
Now I have neither any experimental observational evidence that if people wear more two piece bathing suits the value of TSI will increase, nor any theoretical modelled reason why one would expect that to be so, nor any knowledge of the magnitude of the purported effect.
But so what ! Same thing is true of “Climate Sensitivity”.
We have absolutely no observational experimental evidence that the global mean surface Temperature accurately follows, and is theoretically expected to follow, a straight-line proportionality to the Logarithm (base 2) of the atmospheric molecular abundance (or vol percent if you like) of Carbon dioxide.
Note that Gavin Schmidt also added the requirement that other variables be held constant; so ONLY CO2 IS ALLOWED TO CHANGE WHEN OBSERVING CLIMATE SENSITIVITY.
So can we all just quit mumbling on about water vapor being just a feedback amplification of a CO2 caused effect.
Gavin Schmidt defines CS as the effect due to CO2 alone, sans any variations due to H2O or any other interactions. He says these other variables, like aerosols, and ozone, and ice sheets, and presumably all the other climate variables, stay put; you double the CO2 and you observe the Temperature increase; that’s it.
I accept his definition; Trenberth’s too. I just don’t accept that it is a real physical phenomenon that can be observed here on earth. The definition mandates a log base 2 function. That’s not some “non linear” association of numbers. It’s an exactly defined mathematical function, as is “Climate Sensitivity” as well as “Solar Sensitivity” which I just defined.
So don’t mess with it, and claim that log base 2 is just some amorphous non-linear number association. We know what log base 2 means; just as we know what the word “marriage” means; it’s in the dictionary.
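For reference, the exactly defined mathematical function being discussed can be written out; a minimal Python sketch of the conventional form ΔT = S · log2(C/C0), where the 3.0 C sensitivity value and the CO2 concentrations are illustrative stand-ins, not claims:

```python
import math

def delta_t(c_ppm, c0_ppm=280.0, sensitivity=3.0):
    """Temperature anomaly under the log-base-2 definition:
    'sensitivity' deg C per doubling of CO2 relative to c0_ppm.
    The 3.0 C default is purely illustrative."""
    return sensitivity * math.log2(c_ppm / c0_ppm)

print(delta_t(560.0))  # one full doubling -> exactly the sensitivity, 3.0
print(delta_t(390.0))  # a partial 'doubling', roughly 2011-era levels
```

Whether any real temperature record actually follows this function is, of course, exactly the point in dispute above; the code only shows what the definition mandates.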
To George E. Smith;
You wrote;
“That would seem to imply that if you know one you know the other.”
With respect I would like to suggest that we could rephrase this as;
“If we know something about one item we also know something about another item”
Do we know EVERYTHING about the other item? Perhaps, but possibly not! For example electrical current is certainly related to electrical voltage, but as the frequency increases many interesting things occur (i.e. look up “skin effect”). So if we know some things about an effect do we indeed know everything about that effect?
So, in summary, temperature IS related to energy, but there are many other factors involved, including the thermal capacities of the materials involved.
Cheers, Kevin.
mkelly says: April 27, 2011 at 7:16 am About sensitivity curve shapes.
That is but part of the answer, but I thank you for it.
Do you have a lead as to why the blue and red curves cross at 280 ppm CO2 (thus setting part of the equation) when it would be more logical for them to cross at some more distant time when the temperature could be assumed to be the same for each? Put another way, by forcing the cross at 280 ppm, and by using this dubious equation, we derive vastly different temperature regimes for the decades and centuries before then.
Personally, I think the curves derive from a sine-like shape a quarter of its way through a cycle. The problem is (as Dr Strangelove raises his leather-clad fist) “NOBODY CAN PROVE ME WRONG!!”
It is foolish to attribute all the warming since 1850 to CO2, since albedo is as important as incoming solar flux to the temperature. There are no albedo measurements for virtually the entire time frame, and what little there is is all over the map, swamping out the effect of a CO2 variation.
What can be done is to take the current cloud cover and today’s albedo and compare that to an Earth with the same albedo and no atmosphere – strictly a thought experiment. One finds a black body 33 deg C below today’s actual Earth. One can then determine how much power emitted by the surface is blocked by the atmosphere, given an average surface T of 288.2 K (which emits about 391 W/m^2) and an actual Earth emission value of about 235 W/m^2. This means about 156 W/m^2 emitted by the surface is blocked by the atmosphere (GHGs, aerosols, clouds, etc.), and this blockage results in the 33 deg C rise. The average contribution of an individual W/m^2 is thus about 0.21 deg C for the real Earth. The IPCC says a forcing is a forcing is a forcing and has the same results. For every W/m^2 that results in more than a 0.21 deg C increase, there must be a forcing W/m^2 that produces even less than 0.21 deg C.
By simply using the fraction of surface emission that makes it out of the atmosphere (235/391 = 60%), one can see what a small change must produce: for one additional W/m^2 to escape, the surface must radiate an additional 1/0.6 = 1.7 W/m^2. Running this through Stefan’s law backwards yields a new temperature of 288.49 K, an increase of 0.29 deg C, which is definitely in the vicinity of the average value of 0.21 deg C per W/m^2. Note that the 0.21 value is the real-Earth average, while the 0.29 sensitivity comes from a simple radiative-only model.
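The “Stefan’s law backwards” step can be reproduced directly; a minimal Python sketch of the same arithmetic, taking the 288.2 K starting point and the 235/391 transmission fraction from the comment above:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

T0 = 288.2                   # mean surface temperature, K
F0 = SIGMA * T0 ** 4         # surface emission, ~391 W/m^2
transmitted = 235.0 / 391.0  # fraction escaping to space, ~0.60

# For one extra W/m^2 to escape, the surface must emit 1/0.6 W/m^2 more.
F1 = F0 + 1.0 / transmitted
T1 = (F1 / SIGMA) ** 0.25    # Stefan's law run backwards

print(T1 - T0)  # ~0.3 deg C, close to the 0.29 figure quoted
```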
Clear-sky CO2 doubling forcing is usually estimated at 3.7 W/m^2. Using the above sensitivities, one has about a 0.78 deg C rise for the average-Earth value and a little over 1 deg C for the simple model. Note that over 60% of the surface is covered in clouds at any one time, so CO2 effects are going to be far less, barely over half that of clear skies.
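Those two doubling numbers follow from simple multiplication; a quick sketch using the sensitivities and the 3.7 W/m^2 clear-sky forcing quoted in the comment:

```python
co2_doubling_forcing = 3.7  # W/m^2, the usual clear-sky estimate cited

avg_earth = 0.21     # deg C per W/m^2, real-Earth average from above
simple_model = 0.29  # deg C per W/m^2, radiative-only model from above

print(co2_doubling_forcing * avg_earth)     # ~0.78 deg C
print(co2_doubling_forcing * simple_model)  # ~1.07 deg C, 'a little over 1'
```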
Also note that two factors exist in the theory concerning H2O vapor. First, H2O vapor is the major feedback of the system. Second, as temperature rises the average relative humidity will stay roughly the same while the absolute humidity will increase.
Water vapor has a substantially greater impact per doubling than does CO2: roughly 2 to 3 times as many W/m^2 as a CO2 doubling. For constant RH, one can see from absolute-humidity tables that an increase of 5 deg C will result in a 30% increase in the total amount of H2O vapor present, while for a 2 deg C rise H2O will increase by a whopping 13%, roughly 1/8th of a doubling. A 30% increase, almost 1/3 of a doubling, will result in around 3.1 W/m^2 of added forcing or feedback, not quite as much as a CO2 doubling; a 13% increase is far less than that. This means that if we increase the atmospheric column temperatures by 5 deg C, we can increase the H2O vapor enough to contribute less than 1 deg C, along with the CO2 doubling that provides just over 3/4 of a deg C. Consequently, we’re missing over 3 deg C worth of forcing/feedback. For the 2 deg C rise case, we’re down to missing about half of what is needed after considering the forcing and the supposed main feedback of H2O vapor. And this is for the clear-sky-only case, because the effects will be smaller for cloudy-sky conditions.
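The “fraction of a doubling” arithmetic above can be sketched in a few lines. Two assumptions are baked in here that are not in the comment verbatim: that the feedback scales with log base 2 of the H2O amount, and that one H2O doubling is worth 8.2 W/m^2 (a mid-range pick from the “2 to 3 times” CO2’s 3.7 W/m^2):

```python
import math

H2O_PER_DOUBLING = 8.2  # W/m^2; assumed midpoint of 2-3x the CO2 value

def h2o_feedback(fractional_increase):
    """Forcing/feedback from a fractional rise in water vapor,
    counted as a log-base-2 fraction of a full doubling."""
    doublings = math.log2(1.0 + fractional_increase)
    return doublings * H2O_PER_DOUBLING

print(h2o_feedback(0.30))  # 30% more vapor -> ~3.1 W/m^2, as quoted
print(h2o_feedback(0.13))  # 13% more vapor -> well under half of that
```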
“”””” KevinK says:
April 28, 2011 at 8:42 pm
To George E. Smith;
You wrote;
“That would seem to imply that if you know one you know the other.”
With respect I would like to suggest that we could rephrase this as;
“If we know something about one item we also know something about another item” “””””
What is the point in rephrasing my words ? They then become NOT my words, so they don’t have my meaning.
But YOU are free to say anything you want to say, and use your own words saying it. It just wouldn’t be my words, or have my meaning.
So on the Absolute Thermodynamic scale of Temperature the mean kinetic energy per degree of freedom of all of the particles present IS the measure of the Temperature.
E = kT = h(nu) = mc^2
What else is there to know about the relationship between energy and Temperature? Their numerical values differ only in the factor that converts units of energy (Joules) into units of Temperature (Kelvins).
Some Physicists even use a system of units in which c = h = k = 1 (actually they use hbar, and measure frequency (nu) in radians per second). In that case E = T = nu = m
Ask anna v about that system.
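The “differ only in a conversion factor” point is easy to illustrate; a minimal sketch using Boltzmann’s constant to express a temperature in energy units (the 288.2 K input is just the surface-temperature figure used elsewhere in this thread):

```python
K_BOLTZMANN = 1.380649e-23  # J/K (exact in the 2019 SI)
EV = 1.602176634e-19        # J per electron-volt (exact in the 2019 SI)

def kelvin_to_joules(t_kelvin):
    """E = kT: the same physical quantity, restated in energy units."""
    return K_BOLTZMANN * t_kelvin

e = kelvin_to_joules(288.2)
print(e)       # ~3.98e-21 J
print(e / EV)  # ~0.025 eV, the familiar 'kT is about 1/40 eV' near 290 K
```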
I actually have a Major in Radio-Physics, so I’m hep to Volts and Amps, and even skin effect. None of those are related to either Temperature or energy.
cba says: April 29, 2011 at 8:20 am
cba,
I think you’ve got the cloud cover % wrong, should be circa 40%. The Earth’s albedo is 0.39, remember Vangelis?
Here is a link to the World Sunlight Map displayed with a Mollweide projection. The cloud cover shown is updated every 3 hours with current weather satellite imagery.
Philip,
It’s a habit, quoting Trenberth’s numbers from KT97, even though they tend to get screwed up quite often. Older albedo values tend to push 0.40 while more recent ones claim around 0.30, give or take a bit. Over 80% of the total albedo is going to be atmospheric (clouds, scattering, etc.) and only a little is surface contribution.
However, albedo is definitely a variable.
cba says April 29, 2011 at 8:21 pm
Agree!
22/Dec/1968 View of earth visible from Apollo 8 spacecraft during lunar orbital mission
16/Apr/1972 Apollo 16 view of the earth from translunar injection
07/Dec/1972 View of the Earth seen by the Apollo 17 crew traveling toward the moon
In terms of Albedo, the number which is used to control satellites (and they do need to take into account all radiation levels in the satellite’s environment including reflected solar to maintain proper orbits) is 29.83%.
This happens to be the exact same number used by Trenberth in his latest Earth Energy Budget paper.
Clouds make up about 16 to 17 percentage points of this number and the Surface makes up about 13 to 14 percentage points.
The Earth is about 65% cloud covered on average (including haze from very thin cirrus clouds, which we might not count as cloudy), but on average 75% of the sunlight (including UV) either gets through the clouds or is still reflected down to the Earth.
Generally, the surface component most important to Albedo is Ice and Snow and Sea Ice, which is the farthest from the Earth average and can vary the most over time.
The Earth’s Albedo in time looks like it can vary from about 24.5% (Pangea and no Ice on the planet) to about 50.0% (in Snowball Earth conditions). If you want to crunch the numbers on these estimates, you can get pretty close to the historical temperature estimates with just Albedo alone.
Then put in 33.3% for the Last Glacial Maximum. Hansen and climate theory have used 31.0%, which is clearly too low in my opinion; this value is tuned so that CO2/GHGs produce 3.0 C per doubling. The maps of the extent of glaciers and snow and sea ice at the Last Glacial Maximum show it was much higher.
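Bill’s invitation to “crunch the numbers with just Albedo alone” can be sketched quickly. This is a rough check, under the strong simplifying assumption that the ~33 C greenhouse offset stays fixed while only albedo varies, and with an assumed solar constant of 1366 W/m^2:

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/m^2/K^4
SOLAR = 1366.0     # W/m^2, assumed solar constant
GREENHOUSE = 33.0  # K, assumed fixed greenhouse offset (a simplification)

def surface_temp(albedo):
    """Effective radiating temperature for a given Bond albedo,
    plus a fixed greenhouse offset."""
    absorbed = SOLAR * (1.0 - albedo) / 4.0  # spherical averaging
    return (absorbed / SIGMA) ** 0.25 + GREENHOUSE

print(surface_temp(0.2983))  # ~288 K, today's value quoted above
print(surface_temp(0.245))   # ice-free 'Pangea' estimate, a few K warmer
print(surface_temp(0.500))   # 'snowball Earth' estimate, ~20 K colder
```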
Bill Illis says April 30, 2011 at 5:40 am
Bill,
Thanks for your detailed explanation and correction and the update to modern numbers 🙂
Bill,
The numbers I see are somewhat along the lines of KT97: 10-30 W/m^2 of surface albedo out of over 100 W/m^2 total. Clouds and atmosphere are the vast majority of the albedo; the numbers you present are two to three times the surface albedo of these values. There is not that much cryosphere, and it is way off at the edge where there is not that much solar power. Your 17% attribution to clouds (which must include atmospheric scattering too) leaves cloud reflectivity at under 30%; what type of cloud reflects (scatters) under 35% at the very least? Your 14% surface contribution, considering only 35% of the surface is exposed (out from under the clouds), results in an average surface albedo of 40%, way higher than Mars or the Moon, and this with 70% of the surface covered in water, which runs less than 0.04 albedo. That means the 30% land coverage must have an albedo of 0.40 / 0.3 = 1.3. Show me anything with an albedo above 0.8. Essentially, the conditions you present would require at least a full glaciation, or even snowball Earth conditions, for clouds to have that little effect and for the surface to have that great an effect. In fact, you’d have to have a substantial glaciation covering a lot of the Earth’s oceans to achieve this, roughly 50% of the surface, assuming fresh snow at maximum snow albedo.
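The implied-albedo arithmetic in that rebuttal can be laid out step by step; a minimal sketch using only the percentages quoted in the comment:

```python
surface_contribution = 0.14  # surface share of total planetary albedo
exposed_fraction = 0.35      # fraction of surface not under cloud

# Average albedo the exposed surface would need: 0.14 / 0.35 = 0.40
implied_surface_albedo = surface_contribution / exposed_fraction
print(implied_surface_albedo)

# With ~70% of the surface being near-zero-albedo ocean, the rebuttal's
# rough step charges the whole 0.40 to the 30% land coverage:
land_fraction = 0.30
implied_land_albedo = implied_surface_albedo / land_fraction
print(implied_land_albedo)  # ~1.33: impossible, since albedo cannot exceed 1
```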