Pielke Sr. on Zeke's zinger

Guest post by Dr. Roger Pielke Senior

Missing The Major Point Of “What Is Climate Sensitivity”

There is a post by Zeke on the Blackboard titled Agreeing [see also the post on Climate Etc, Agreeing(?)].

Zeke starts the post with the text

“My personal pet peeve in the climate debate is how much time is wasted on arguments that are largely spurious, while more substantive and interesting subjects receive short shrift.”

I agree with this view, but conclude that Zeke is missing a fundamental issue.

Zeke writes

“Climate sensitivity is somewhere between 1.5 C and 4.5 C for a doubling of carbon dioxide, due to feedbacks (primarily water vapor) in the climate system…”

The use of the terminology “climate sensitivity” implies an importance of this temperature range to the climate system that does not exist. The range of temperatures of “1.5 C and 4.5 C for a doubling of carbon dioxide” refers to a global annual average surface temperature anomaly that is not even directly measurable, and its interpretation is even unclear, as we discussed in the paper

Pielke Sr., R.A., C. Davey, D. Niyogi, S. Fall, J. Steinweg-Woods, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, R. Mahmood, S. Foster, R.T. McNider, and P. Blanken, 2007: Unresolved issues with the assessment of multi-decadal global land surface temperature trends. J. Geophys. Res., 112, D24S08, doi:10.1029/2006JD008229.

This framing of a surface temperature anomaly as “climate sensitivity” grossly misleads the public and policymakers as to which climate metrics actually matter to society and the environment. A global annual average surface temperature anomaly is almost irrelevant for any climatic feature of importance.

Even with respect to the subset of climate effects that is referred to as global warming, the appropriate climate metric is heat change, measured in Joules. The global annual average surface temperature anomaly is useful only to the extent that it correlates with the global annual average climate system heat anomaly [most of which occurs within the upper oceans]. Such heating, if it occurs, is important as it is one component (the “steric component”) of sea level rise and fall.
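As a rough illustration of why the units matter, the sketch below converts an assumed upper-ocean temperature anomaly profile into a heat-content change in Joules; the layer anomalies, ocean area and seawater constants are illustrative assumptions, not measured values. The point is that the conversion requires density, specific heat and volume (i.e. mass), which a bare surface temperature anomaly does not carry.

```python
# Minimal sketch (illustrative numbers only): converting an upper-ocean
# temperature anomaly profile into a heat-content change in Joules.
# The constants and the anomaly profile below are assumptions for this sketch.

RHO_SEAWATER = 1025.0   # kg/m^3, typical seawater density
CP_SEAWATER = 3990.0    # J/(kg K), typical specific heat of seawater
OCEAN_AREA = 3.6e14     # m^2, approximate global ocean surface area

# Hypothetical layer-mean temperature anomalies (K) for the upper 700 m,
# in 100 m slabs -- purely illustrative values.
layer_thickness_m = 100.0
layer_anomalies_K = [0.10, 0.08, 0.05, 0.03, 0.02, 0.01, 0.01]

# Heat anomaly = sum over layers of (rho * cp * dT * volume), in Joules.
heat_anomaly_J = sum(
    RHO_SEAWATER * CP_SEAWATER * dT * layer_thickness_m * OCEAN_AREA
    for dT in layer_anomalies_K
)

print(f"Upper-700 m heat content anomaly: {heat_anomaly_J:.2e} J")
# A surface temperature anomaly has no mass attached to it, so no comparable
# number in Joules can be derived from it alone.
```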

For other societally and environmentally important climate effects, it is the regional atmospheric and ocean circulations patterns that matter. An accurate use of the terminology “climate sensitivity” would refer to the extent that these circulation patterns are altered due to human and natural climate forcings and feedbacks. As discussed in the excellent post on Judy Curry’s weblog

Spatio-temporal chaos

finding this sensitivity is a daunting challenge.

I have proposed  definitions which  could be used to advance the discussion of what we “agree on”, in my post

The Terms “Global Warming” And “Climate Change” – What Do They Mean?

As I wrote there

Global Warming is an increase in the heat (in Joules) contained within the climate system. The majority of this accumulation of heat occurs in the upper 700m of the oceans.

Global Cooling is a decrease in the heat (in Joules) contained within the climate system. The majority of this loss of heat occurs in the upper 700m of the oceans.

Global warming and cooling occur within each year as shown, for example, in Figure 4 in

Ellis et al. 1978: The annual variation in the global heat balance of the Earth. J. Geophys. Res., 83, 1958-1962.

Multi-decadal global warming or cooling involves a long-term imbalance between the global warming and cooling that occurs each year.

Climate Change involves any alteration in the climate system [schematically illustrated in a figure from NRC, 2005, in the original post] which persists for an (arbitrarily defined) long enough time period.

Shorter term climate change is referred to as climate variability. An example of a climate change is if a 20-year average growing season of 100 days was reduced by 10 days in the following 20 years. Climate change includes changes in the statistics of weather (e.g. extreme events such as droughts, landfalling hurricanes, etc), but also includes changes in other climate system components (e.g. alterations in the pH of the oceans, changes in the spatial distribution of malaria-carrying mosquitoes, etc).

The recognition that climate involves much more than global warming and cooling is a very important issue. We can have climate change (as defined in this weblog post) without any long-term global warming or cooling. Such climate change can occur due to both natural and human causes.”

It is within this framework of definitions that Zeke and Judy should solicit feedback in response to their recent posts. I recommend a definition of “climate sensitivity” as

Climate Sensitivity is the response of the statistics of weather (e.g. extreme events such as droughts, landfalling hurricanes, etc) and of other climate system components (e.g. alterations in the pH of the oceans, changes in the spatial distribution of malaria-carrying mosquitoes, etc) to a climate forcing (e.g. added CO2, land use change, solar output changes, etc). This more accurate definition of climate sensitivity is what should be discussed, rather than the dubious use of a global annual average surface temperature anomaly for this purpose.
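To make the proposed definition concrete, here is a minimal sketch of treating “sensitivity” as a change in a societally relevant weather statistic, using the growing-season example quoted above; the yearly values are synthetic, invented purely for illustration, and no attribution to a particular forcing is implied.

```python
import random

# Minimal sketch (synthetic data): "climate sensitivity" expressed as a change
# in a weather statistic -- here the 20-year mean growing-season length in
# days, echoing the example in the post -- rather than a temperature anomaly.

random.seed(0)
baseline_period = [random.gauss(100, 8) for _ in range(20)]  # years 1-20
later_period = [random.gauss(90, 8) for _ in range(20)]      # years 21-40

def mean(values):
    return sum(values) / len(values)

change_in_statistic = mean(later_period) - mean(baseline_period)
print(f"Change in 20-year mean growing season: {change_in_statistic:+.1f} days")

# Under the definition above, the quantity of interest is this kind of change
# in a statistic, attributed (with its own uncertainty) to a specified
# forcing, not a global annual average surface temperature anomaly.
```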

Comments
Albert Frankenstein
February 28, 2011 10:41 am

Even more basically, it starts with the assumption that feedback is in fact positive for CO2. What if it isn’t?

vukcevic
February 28, 2011 10:51 am

In not too distant future graphics of the above type may have Geomagnetic field added, or at least I hope….
http://www.vukcevic.talktalk.net/NFC1.htm

AnonyMoose
February 28, 2011 11:02 am

Maybe someone is saying climate sensitivity when temperature sensitivity is intended?

Dave D
February 28, 2011 11:04 am

What would be the collective climate science response to the statement, “It could be that the atmospheric sensitivity to CO2 is essentially zero”?
I believe that scientists measured and listed the following heat absorption figures: 1) Oxygen = X, Nitrogen = X, yes they were very close to each other; CO2 = Y > X; a mixture of 20% O2 and 80% N2 was then measured and yielded a value VERY close to Y.
I did NOT bookmark this article, but I believe it was published in Nature nearly a year ago. Has it since been debunked? Has it been buried? Is it totally unfounded? I would think this type of study would be central to resolving whether any greenhouse gas theory around CO2 even applies. Science has the tools to quantify and repeat or discredit this type of thing – is anyone aware of any results?
Experts, more learned men/women than me and Faux-Mountebanks, all are welcome to reply! Dave

commieBob
February 28, 2011 11:21 am

yabut if you put more joules into the ocean, it gets warmer. In any event, how are you going to measure those joules if not with a thermometer. (commieBob now ducks and hides under his desk) 😉

Rob R
February 28, 2011 11:28 am

This is a much more rational way of looking at the issue. Thanks Roger.

George E. Smith
February 28, 2011 11:35 am

Well I have been trying to discover for years just who or what first described Climate Sensitivity in terms of the relation between the global mean surface Temperature and the logarithm of the atmospheric CO2 abundance.
I was under the impression that this was a creation of the late Dr Stephen Schneider at Stanford.
I take a dim view of people who say that some relationship is “logarithmic”, unless they can demonstrate, either by measurement or theory, that such a relationship is maintained over at least several orders of magnitude (three minimum) of the variable whose logarithm tracks the other variable.
My point is simply that just because a relationship is non-linear does not automatically mean it is logarithmic. When somebody says it IS logarithmic, that to me means that a CO2 increase from 280 ppm to 560 ppm produces the same Temperature increase as going from 1 ppm to 2 ppm. That IS a logarithmic relationship.
Mathematical curve fitting of experimental data to some function is a well practised art. In the Mauna Loa era, we have somewhat less than 1/3 of one doubling of the ML CO2 number, from 325 ppm to today’s 390 ppm.
That data can easily be fitted to an ordinary power series over some restricted range, with as accurate a fit to the data as, say, a logarithmic equation.
Usually one looks for a logarithmic relationship when some theoretical basis for believing it is logarithmic exists, such as radioactive decay, for example, or the current-voltage relationship of a semiconductor diode junction at constant Temperature.
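A quick numerical sketch of this curve-fitting point, using an assumed response that is logarithmic by construction: over the narrow 325-390 ppm interval, an ordinary low-order power series fits it essentially as well as the logarithm itself, so curve fitting over this range cannot by itself establish a logarithmic law.

```python
import numpy as np

# Sketch: over ~325-390 ppm (less than a third of a doubling), a logarithmic
# curve and a low-order power series are practically indistinguishable.
# The "response" here is simply ln(C/C0); no physical claim is intended.

c = np.linspace(325.0, 390.0, 100)     # ppm
log_response = np.log(c / 325.0)       # logarithmic by construction

# Fit an ordinary quadratic (power series) to the same points.
coeffs = np.polyfit(c, log_response, deg=2)
quad_fit = np.polyval(coeffs, c)

max_mismatch = np.max(np.abs(quad_fit - log_response))
print(f"Max |quadratic fit - log curve| over 325-390 ppm: {max_mismatch:.1e}")
# The mismatch is a tiny fraction of the curve's full range (~0.18), which is
# why a fit over this interval cannot distinguish "logarithmic" from other
# smooth non-linear forms.
```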
GHG absorptance of LWIR radiation has its roots in Beer’s Law relating the absorptance of chemical solutions to the molar concentration.
But one must not confuse optical absorptance with energy transmission.
The energy transmitted by an optically absorbing medium can be orders of magnitude higher than (1-a), where (a) is the optical absorptance for some spectral region, and can have a Beer’s Law relationship.
That transmitted energy will likely be in a quite different spectrum, from that represented by the Beer’s law absorption.
This situation exists in the CO2 GHG absorption case. CO2 has well understood infra-red absorption spectra; but it is also well known that the absorbing CO2 molecule seldom gets a chance to re-emit the photon that sent it to an excited state. The energy will become thermalized through molecular collisions that occur thousands of times faster than the spontaneous emission decay of the resonance absorption modes, and that energy simply raises the Temperature of the main atmospheric gases. They in turn radiate a purely thermal (BB-like) continuum spectrum that just happens to overlap the same range as the absorption band (of lines) for CO2.
That radiation is emitted isotropically from every level in the atmosphere, from the very surface to the end of the atmosphere, and only in the highest, coldest, rarest atmospheric regions does the CO2 get a chance to spontaneously emit its own characteristic band of frequencies. And at every level in the main body of the atmosphere, about half the emitted radiation is emitted upwards towards space, and half downwards to the surface. It matters not whether the surrounding layers are transmissive or absorptive. The transmitted portions of the spectrum still travel about half in each direction, and the absorbed portions simply lead to a subsequent emission of some thermal spectrum, which likewise splits into two hemispheres.

bob
February 28, 2011 11:43 am

Temperature is an intensive variable, which means that it is possible to measure the temperature of the earth’s systems that vary with the seasons, the sun’s cycles, orbital parameters, etc. These would be the atmosphere, the oceans, the ice caps and the solid earth’s surface to a depth that changes with the seasons.
Possible, yet difficult, so using the anomaly to measure the changes rather than the absolute value is acceptable scientifically.
To replace that metric with a conglomeration of statistical measures, where it would probably be impossible to agree on which metrics to monitor in the data, seems without merit, considering that finding long-term data trends for some of the suggested metrics (for example the geographic range of malaria-carrying mosquitoes), and agreeing on their validity, would be difficult.
What would the units be?
mosquito bite*hurricane * droughts per ppm CO2* rainforest acres cut down * solar variability
I mean we can track the changes in those variables, but to define climate sensitivity in terms of them is difficult.
If I am misunderstanding the point here then you can have my most sincere
Roseanna Roseannadanna
“never mind”

Carrick
February 28, 2011 11:48 am

Roger Pielke:

The range of temperatures of “1.5 C and 4.5 C for a doubling of carbon dioxide” refers to a global annual average surface temperature anomaly that is not even directly measurable, and its interpretation is even unclear, as we discussed in the paper

Why do you think the fact that it is an indirect measurement is an issue? There are plenty of quantities that get measured indirectly in meteorology, climate and other sciences. (More get measured indirectly than not, I’d think.)
The question of interpretation (and utility, which is really what you are driving at) depends on what you are planning on using the number for. We live in the surface layer, most of the land biosphere does too, so knowing how the temperature responds to CO2 does act as a proxy for the impact of CO2 emissions on humans (changes in growing seasons, shifts in dry belts, etc).
Are there other numbers? Sure, and any policy maker needs to read further than just global mean temperature change.
You are right to say that there is perhaps too much fixation on CO2 sensitivity, but your more general definition of climate sensitivity could be quantified by including a list of global values, which would have to include global mean temperature, as well as mean precipitation, mean ocean pH and so forth.

Max Albert
February 28, 2011 11:49 am

One question: Wouldn’t it be rather difficult to empirically demonstrate a cause-and-effect relationship between the broad definition of “weather statistics and other climate components” and “a climate forcing,” particularly on a global scale? I mean, we’ve yet to do that even when it comes to a single component, say heat content vs CO2.
You’re talking a lot of time and scientific shoe leather, which is fine by me, but I wouldn’t expect your definition to be embraced by the warmist camp anytime soon. They’re in too big a hurry.

R. Gates
February 28, 2011 11:56 am

I agree that global average annual temperature is not an incredibly useful metric of “climate” sensitivity, but only of temperature sensitivity. But if one understands the implications of that temperature sensitivity, one can see the implications for the longer-term (but not daily) statistics of weather. For example, if we know that at 2 degrees C of global warming we will get 7 to 10 C above the Arctic Circle, then that also tells you to expect sea ice and permafrost to decline. This decline in sea ice has implications for atmospheric circulation patterns, which of course affect longer-term weather patterns. What it comes down to is that experts can understand the potential implications of global temperature changes, and thus sensitivity to GH gas increases measured in this manner can be very useful in constructing future potential scenarios based on that sensitivity. Those scenarios could be completely wrong of course, but they require some measurement of sensitivity to output anything at all.

Roger Andrews
February 28, 2011 12:01 pm

I agree that OHC is the metric we should be using to measure the AMOUNT of global warming, assuming we can ever come up with a reliable way of measuring it. But the physical impacts of added heat (I hasten to add, if any) are felt at the land surface, which is where we humans and all the animals, trees, plants and crops we depend on live. So we can’t dismiss surface temperature as “almost irrelevant”. It’s the basic metric that measures the IMPACT of global warming, which is not the same thing as the amount. (And it’s also the more important. No one would care about global warming if it wasn’t going to do anything.)

Paddy
February 28, 2011 12:02 pm

Bishop Hill has a relevant post regarding climate forcings that quotes another blogger’s question to Freeman Dyson and his answer about important climate issues:
“1. What questions should intelligent lay-people be asking about climate, and to whom should these questions be directed?”
“Answer to your question”
“1. Both for humans and for wildlife, water is more important than temperature. Intelligent lay-people should be asking questions about water. What are the trends over the last hundred years for total rainfall, total water vapor in the atmosphere, total volume of ice on the continents, volume of fresh water in aquifers, distribution of droughts and floods, and last but not least, sea-level? Why did the Sahara change from green six thousand years ago to desert today? What are the implications for public health and natural ecology of the changes in flow and abundance of water?”
http://bishophill.squarespace.com/blog/2011/2/28/some-correspondence-with-freeman-dyson.html

Charles Higley
February 28, 2011 12:05 pm

I keep coming back to the fact that the IPCC altered a thermodynamic constant for CO2, bumping a 0.12 degree effect with doubling to 1.2—1.5 deg by multiplying the constant by 12-fold, all the while marveling at how consistent this constant had been while actually changing it at the same time – quite backhanded. Then, they used water vapor as a positive forcing factor to arrive at much higher values for warming with doubling, here, specifically pretending that atmospheric convection did not exist as part of a global heat engine, a powerful negative feedback machine.
Why is so little made of this duplicity and fraud?

Ken Harvey
February 28, 2011 12:12 pm

“Climate Sensitivity is the response of the statistics of weather (e.g. extreme events such as droughts, landfalling hurricanes, etc) and of other climate system components (e.g. alterations in the pH of the oceans, changes in the spatial distribution of malaria-carrying mosquitoes, etc)”
Hmmm. Changes in the spatial distribution of malaria may conceivably occur as a result of climate sensitivity, but the emphasis must be on the “may”. Certainly malaria has moved well southwards in Southern Africa since the mid ‘seventies, but without any detectable change in climate. The southern drift that has occurred can all be attributed to anthropological causes. The population has grown tremendously over that time period, resulting in considerably greater movement of people and consequently greater carrying of the insect. Anti-malaria preparations that worked unfailingly in the mid ‘seventies are no longer nearly as effective. Governmental chemical actions to fight the mosquito’s spread have in some parts (Zimbabwe in particular) either become less effective or broken down altogether.
I feel that the scientific community should avoid this type of bland statement, if for no other reason than that there will be no lack of governments who will happily claim compensation for the damage done to them by the West’s nasty CO2.

February 28, 2011 12:16 pm

There is enough disagreement about the temperature variation, and getting such obscure things involved like “landfalling hurricanes” will only escalate the difficulty in defining what is going on.
Temperature should not be the problem that it is, but the way it is used is a problem. Variations and trends in the different regions would be a better, but more complicated, approach. Most of the warming has been in the Arctic region. That is not a global effect, but the polar regions are important to climate.
Not sure what is the best way to define sensitivity, but less subjective is better than more subjective, and weather events are very, very subjective.

Espen
February 28, 2011 12:30 pm

Great post by dr. Pielke!
Carrick: Why do you think the fact that it is an indirect measurement is an issue?
“Global mean temperature” is not only indirect, it’s also wrong. Even if we covered the earth with a grid of more than a billion temperature sensors of previously unmatched quality, the arithmetic mean of those readings would be a bad measure of “global warming”, since +10 C in the Arctic corresponds to much less incremental atmospheric heat content than +10 C in the tropics.

Zeke
February 28, 2011 12:32 pm

Dr. Pielke,
I agree with you that the surface temperature response to doubling CO2 (e.g. standard climate sensitivity) is not a full description of the effects of a climate forcing, but rather a limited subset of those effects. However, discussing how the full effects (which are characterized by even more uncertainty than sensitivity itself) should be communicated to the public and policy makers is a separate issue from what our best understanding of the science says about climate sensitivity as generally defined. It hardly amounts to a wittily alliterative “zinger” on my part that I referred to the current notion of climate sensitivity, rather than your more encompassing definition. Even were we to broaden the definition, we would still need a term for the surface temperature response to forcings, as it is an important piece of the puzzle.

Lew Skannen
February 28, 2011 12:33 pm

Judy Curry’s weblog article Spatio-temporal chaos is exactly what I have been looking for for years!
I am glad that someone like Judy Curry has expressed so well the unavoidable facts about chaotic systems and how they relate to climate.

Paul Coppin
February 28, 2011 12:34 pm

I have to take exception to the semi-definition of Climate Change. There is the persistent predication that the Climate is static, and as a consequence, any alteration of this stasis constitutes “Climate Change”. The climate is not static; climate is a complex of events that isn’t even in equilibrium, let alone static. Acknowledging and accepting “climate change” as a definable, measurable parameter is a mug’s game, and we’ve already been beaten to death with it. There is NO universally agreeable definition for climate change that has any useful scientific value whatsoever.

rpielke
February 28, 2011 12:36 pm

Thanks for the feedback that has been sent. I have just two further comments here. First, while ocean heat content requires the measurement of temperature, it also includes mass, so that the proper unit of global warming (Joules) can be calculated. This is not true with the two-dimensional analyses of surface temperatures which are used to calculate the global average surface temperature anomaly. There is no mass associated with this determination of heat.
Secondly, even with respect to surface temperature, it is a flawed metric. For example, what height is this temperature measured at (e.g. the surface, 2m, etc)? This matters in terms of the anomaly values that are computed, as we have shown, for example, in our recent paper
Steeneveld, G.J., A.A.M. Holtslag, R.T. McNider, and R.A Pielke Sr, 2011: Screen level temperature increase due to higher atmospheric carbon dioxide in calm and windy nights revisited. J. Geophys. Res., 116, D02122, doi:10.1029/2010JD014612.
http://pielkeclimatesci.files.wordpress.com/2011/02/r-342.pdf
There is the issue that anomalies in absolute humidity alter the anomalies in temperature even if the total heat of a volume of air does not change, as we showed in our paper
Pielke Sr., R.A., C. Davey, and J. Morgan, 2004: Assessing “global warming” with surface heat content. Eos, 85, No. 21, 210-211.
http://pielkeclimatesci.wordpress.com/files/2009/10/r-290.pdf
These are just two of a number of issues with this data set. We can add all of the seminal work that Joe D’Aleo, Anthony Watts and the surface stations project volunteers have done on the quality of the siting, as well as the failure of the land part of the surface temperature record to clearly define how the data are massaged in order to create what is called a “homogenized” analysis.
We will have more to report on the surface temperature data set soon. My recommendation, however, is to scrap the use of surface temperatures to indirectly diagnose global warming and replace them with upper-ocean heat content change, which is a direct measure of climate system heat changes.

February 28, 2011 12:38 pm

Until the temperatures get above the Holocene Optimum everything we are talking about is within normal variation with or without CO2.

Orson
February 28, 2011 12:43 pm

Zeke’s problem is formulating “climate change sensitivity” in terms laid out by the IPCC authorities.
In an important sense, Zeke is parsing a different problem from the narrower one Pielke, Sr., attacks above. Zeke and Lucia’s friends attempt to declare and decide on the range of probable and improbable problems pertinent to climate change science. Zeke hopes to move beyond debate to zero-in on genuine scientific problems; this makes agreement (or “consensus”) a priority before progress can be made.
Yet all of this is highly speculative, dependent upon the context of one’s selective knowledge of the literature, and beside the point of genuine scientific criticism. These are “how many angels can dance on the head of a pin” theological speculations, dressed up as science. Furthermore, Zeke shows us that Lindzen’s lessons about the corrupt fatuities of climate science have not yet been learned by his cohorts (see “Climate Science: Is it currently designed to answer questions?”, Nov 2008). Instead, they must be papered over and denied as genuine problems.
This is not science, but politics IPCC-style, plain and simple.

etudiant
February 28, 2011 12:50 pm

It is striking that the level of water vapor in the upper atmosphere (300 mb level) has dropped steadily and substantially (20+%) in the past 60 years: see http://www.climate4you.com/ (the greenhouse gases section). Yet there has been no effort, afaik, to evaluate either the causes or the consequences of this change.
At a minimum, it seems inconsistent with the idea of increasing CO2 raising the atmospheric water vapor level.
More generally, if this kind of change in the Earth’s primary greenhouse gas has so little measurable effect, why would a comparable change in a vastly less abundant trace gas make a difference?

Eric (skeptic)
February 28, 2011 1:04 pm

R Gates: you said “I agree that global average annual temperature is not an incredibly useful metric…” You made up a new term there. Perhaps you meant global average temperature anomaly?
I would ask for support for “For example, if we know that at 2 degrees C of global warming we will get 7 to 10 C above the Arctic Circle…”, but your answer is probably climate models. Are those the same models that predicted that the AO would increase? The ones that were trumpeted as validated when the AO was increasing in the 90’s? And now that the AO is decreasing, those models are quietly tossed aside for new speculations that ice loss causes negative (decreased) AO? Bottom line: your claim is an unknown. The decline in sea ice was a well known prediction in the 90’s, yet those models predicted the opposite of what is now happening.

Stephen Richards
February 28, 2011 1:18 pm

Zeke has an agenda to drive. He believes that by making those statements they must be true. Unfortunately, he and his fellow AGWs have provided no empirical evidence for their ‘belief’ and probably never will. I just wish, like Willis, that these people would provide the science with their absolute statements so that we can all become believers like them.

don penman
February 28, 2011 1:20 pm

I don’t think co2 has had that much effect on temperature in the last hundred years.
wattsupwiththat.com/2011/01/31/some-people-claim-there’s-a-human-to-blame-gw-tiger

vukcevic
February 28, 2011 1:21 pm

To the reader from Huntsville: this is the one you may wish to consider:
http://www.vukcevic.talktalk.net/LFC2.htm

James Sexton
February 28, 2011 1:44 pm

Dave D says:
February 28, 2011 at 11:04 am
What would be the collective climate science response to the statement, “It could be that the atmospheric sensitivity to CO2 is essentially zero”?
I believe that scientists measured and listed the following heat absorption figures: 1) Oxygen = X, Nitrogen = X, yes they were very close to each other; CO2 = Y > X; a mixture of 20% O2 and 80% N2 was then measured and yielded a value VERY close to Y.
======================================================
I’m not familiar with the way this is expressed; there may be a problem with the labeling. But if one were to look at the IR absorption spectrum, http://joannenova.com.au/globalwarming/graphs/CO2_infrared_sorption_Tom.jpg
then one would see the overlap of nitrous oxide and H2O vs. CO2. Note how little unique absorption CO2 provides.

steven mosher
February 28, 2011 1:47 pm

I think many people, including Pielke, misunderstand the purpose of Zeke’s questions.
Before you start a discussion about what you disagree about, it’s important to find out what you agree about.
It would be a good thing if Pielke and others explained what they agreed about.
This is not about taking a poll and deciding the truth. It’s about seeing what agreement there is and what disagreement.

geo
February 28, 2011 1:52 pm

It seems to me there is some psychological need in the human system generally for “one number-itis” to track and explain important issues. And, further, as soon as there is even moderate complexity in the system being tracked by “one number-itis”, all manner of bad things begin happening, both from a communications perspective and from the broader perspective of even thinking about the issue.
Climate sensitivity expressed as a global temperature anomaly seems just another example of this.
At some level the urge to simplify is understandable. Over-simplifying is no friend to anyone, however. Four or five metrics ought to be few enough to still have an understandable framework to explain to laymen, without introducing the really quite significant communication issues (and fights about same) that having just one introduces.

George E. Smith
February 28, 2011 1:55 pm

Related to this issue that Roger raises, is a featured article in the Jan 2011 issue of Physics Today, which is a free Journal of the American Institute of Physics. It’s a quite extensive article authored by Peter Humbug; who lurks around c-R. It’s a very readable article and one I think everyone should find and read. It has been mentioned here at WUWT before.
In the very first sentence of this paper, we learn that the Temperature of the earth would be nearly 800,000 K after a billion years of absorbing sunlight with no means of escape. Well I dare say that if the earth is actually 4.5 billion years old, then it’s Temperature would likely be closer to 3 million Kelvins, than 800,000.
Of course it wouldn’t have any of its present atmosphere, and no oceans of course (of H2O). At that 800,000 Kelvins, the mean molecular velocity would likely be greater than the escape velocity, so all of those molecules would be gone; and since mass goes down faster than surface area, as the surface layers of the earth ablated, the incoming solar energy rate, would not decline as fast as the earth mass does, so the Temperature would likely continue to increase indefinitely, until the whole earth had evaporated and blown away.
But of course that model of earth (Peter is a model maker evidently) assumes it has no way of getting rid of any energy.
That’s a strange assumption for a model; because later in the paper, he plots a graph, that purports to be an actual theoretical calculation for the Radiance for LWIR emissions from the earth. Well he calls it “flux”, and he has it overlaid with actual measured “flux” from the AIRS Satellite Instrument.
According to his graph, the earth actually CAN radiate a nearly infinite amount of energy. His actual plotted data, both measured and calculated, have a peak value of close to 0.12 Watts per (square metre-Steradian) at around 550 cm^-1 wavenumber and again at around 750 cm^-1 on the other side of the CO2 hole. The Radiance drops to about 0.07 on the same scale at the Ozone hole at 1050 cm^-1, and then at 1250 cm^-1, it drops to about 0.2 or less for the water hole.
He has similar graphs for both Mars, and Venus; since his theory is applicable to any planet. Of course the detail numbers are different for different planets.
So how is it that such a graph predicts a near infinite amount of radiated energy?
Well let me repeat, the Units of his “flux” are WATTS PER SQUARE METRE PER STERADIAN; or if you prefer W/(m^2.sr). You get the idea; those are units of Radiance; the radiant equivalent to “candle power”.
What they specifically ARE NOT is units of SPECTRAL RADIANCE; so that presumably is the calculated and measured radiance at each and every possible frequency (wave number) within the range of roughly 100 to 2000 cm^-1 as plotted, and there is an infinite number of such frequencies (the thermal spectrum is a continuum), so the total radiated energy is infinite.
The author also states that planets with Temperatures between about 50 K and 1000 K emit IR spectra in the range of 5-50 microns in the far infrared; which he calls the Thermal IR (fine with me).
This thermal IR spectrum is presumably somewhat black body like; modified by some appropriate (spectral) emissivity.
This is a rather weird piece of information, since we know that for planet earth which roughly averages 288 K and emits 390 W/m^2 total radiant emittance, the corresponding BB spectrum would contain 98% of its energy between the wavelengths of 5.0 microns to 80 microns, with the spectral peak at about 10 microns.
Wow! That is an even longer spectral range than the author claims for the whole planetary Temperature range from 50 K to 1000 K.
The way I figure it (I do not have a Teracomputer), the 1000 K spectrum, would have its peak emittance at 2.88 microns, and 98% of the energy would be between 1.44 microns, and 23 microns; whereas the 50 K planet, would have 98% of its energy between about 29 microns, and 460 microns, with a peak at 58 microns.
Apparently my stick in the sand is not quite as accurate as his Teracomputer.
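For what it is worth, the peak-wavelength figures in the preceding paragraphs can be checked with a few lines of Wien's displacement law; this sketch only checks the peaks, not the 98%-energy bounds.

```python
# Sketch checking the "stick in the sand" peak wavelengths above with Wien's
# displacement law: lambda_peak = b / T, with b ~= 2898 micron*kelvin.

WIEN_B_UM_K = 2897.8  # micron * kelvin

for temp_K in (1000.0, 288.0, 50.0):
    peak_um = WIEN_B_UM_K / temp_K
    print(f"T = {temp_K:6.1f} K  ->  peak emission near {peak_um:5.1f} microns")

# Expected output: ~2.9 microns at 1000 K, ~10.1 microns at 288 K and
# ~58 microns at 50 K, consistent with the figures quoted above.
```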
Well I could go on, and mention a few other glaring questionable statements; but I would not want anyone to be put off from reading the entire paper.
The most important sentence in the whole paper is the second to last sentence which says:-
“The precise magnitude of the resulting warming depends on the fairly well known amount of amplification by water-vapor feedbacks and the less known amount of cloud feedback.”
The final sentence says:- “There are indeed uncertainties in the magnitude and impact of anthropogenic global warming, but the basic radiative physics of the anthropogenic greenhouse effect is unassailable.”
I would suggest to the author that those two statements are quite incompatible. The radiative Physics cannot be unassailable so long as the cloud feedback is unknown; and it certainly is unknown. Even the most battle hardened AGW warrior can’t tell you how the clouds work to control the Temperature of the earth.
It’s a shame, that such a respected Teramodeller, can write a paper like this, and even more surprising that Physics Today chose to publish it.
But do read it anyway; he does address some of the myths; such as the CO2 bands being saturated; but I think his explanation leaves a lot to be desired. There’s that “shoulders” of the CO2 absorption band explanation of what adding more CO2 does. Nothing about the same amount of absorption occurring in a thinner atmosphere layer, if the CO2 abundance increases.

nc
February 28, 2011 2:28 pm

So how many hundreds of billions have been spent, and the arguments still cover the basic principles? Amazing.

Bruce of Newcastle
February 28, 2011 2:34 pm

Apologies to Zeke but 2XCO2 has been empirically measured by Dr Spencer and others and found to be about 0.4-0.6 C.
I’ve cross checked this a couple of ways, one using SST’s and another by difference after solar effects are controlled for, and a value around 0.6 C seems to be the number. The feedback must therefore be negative not positive. None of this is modelling, it is a straight analysis of recorded and easily available data.
It may be that the problem is climate scientists seeming to ignore the effects of the sun, even though these are easy to see if you plot HadCRUT vs SCL (or similar) yourself.

steven mosher
February 28, 2011 2:44 pm

err bruce, ya cant get a sensitivity from observations that way.
Start with Schwartz and avoid his mistakes.

February 28, 2011 2:45 pm

James sexton.
see the stratosphere.

February 28, 2011 2:52 pm

Here’s a nice little presentation that will give you some ideas.. using Ocean heat content
http://www.newton.ac.uk/programmes/CLP/seminars/120812001.html

Carrick
February 28, 2011 2:54 pm

Espen:

“Global mean temperature” is not only indirect, it’s also wrong. Even if we covered the earth with a grid of more than a billion temperature sensors of previously unmatched quality, the arithmetic mean of those readings would be a bad measure of “global warming”, since +10 C in the Arctic corresponds to much less incremental atmospheric heat content than +10 C in the tropics.

Sorry but this description of how global mean temperature is calculated is wrong, and I’m pretty sure Roger would not endorse this explanation. Rather than launch into a tirade or a sermon about how to calculate it correctly, I encourage you to read some of the posts of Zeke on Lucia’s blog.
The short answer to why you don’t need one billion sensors comes from sampling theory, the same theory that is the basis for digitization of audio and video signals. If there were an error with this theory, we would viscerally see it and hear it. If you ask how many samples you need to fully capture the Earth’s temperature field, based on a correlation length of 500 km (this is a smaller number than is quoted by Hansen’s group for the correlation length), that works out to around 2000 instruments worldwide. By comparison, there are nearly 10,000 fixed instruments located on about 30% of the surface of the Earth from land-based systems alone. The oceans are less well covered, but the correlation lengths go way up.
If there are problems, they are from other sources (at least post 1950).
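A back-of-the-envelope sketch of where station counts like these come from: tile the Earth's surface with cells one assumed correlation length on a side. Whether 500 km (or 1200 km, or 25 km) is the right length for century-scale trends is exactly what is disputed later in the thread; the code only does the area arithmetic.

```python
import math

# Back-of-the-envelope station counts: Earth's surface area divided by the
# area of a square cell one correlation length on a side. The correlation
# lengths are the values discussed in this thread, not independent estimates.

EARTH_RADIUS_KM = 6371.0
earth_area_km2 = 4.0 * math.pi * EARTH_RADIUS_KM ** 2  # ~5.1e8 km^2

for corr_length_km in (1200.0, 500.0, 25.0):
    n_cells = earth_area_km2 / corr_length_km ** 2
    print(f"correlation length {corr_length_km:6.0f} km -> ~{n_cells:,.0f} cells")

# ~350 cells at 1200 km, ~2,000 at 500 km, and ~800,000 at 25 km
# (the grid Al Tekhasski argues for further down the thread).
```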
Roger Pielke Sr:

Secondly, even with respect to surface temperature, it is a flawed metric. For example, what height is this temperature measured at (e.g. the surface, 2m etc). This matters in terms of the anomaly values that are computed as we have shown, for example, in our recent paper …

I will raise to you the same challenge that I have to others. If it matters in a substantial fashion, this error should rear its ugly head in the comparison of surface based measurements to satellite based ones.
It does not.
(There is a difference in the long-term trends among the series, mostly between surface versus satellite, but I believe this is explained by the difference in where the satellites are sampling the atmosphere, at roughly 8000 m up, compared to 1-2 m for surface measurements. Even if we posit that the difference is due to an error in accuracy of the surface measurements, that still just amounts to 10% of the total observed trend in temperature.)

Al Tekhasski
February 28, 2011 3:07 pm

Zeke wrote:
“… surface temperature response to doubling CO2 (e.g. standard climate sensitivity) is not a full description of the effects of a climate forcing, but rather a limited subset of those effects. However, discussing how the full effects (which are characterized by even more uncertainty than sensitivity itself) should be communicated to the public and policy makers is a separate issue from what our best understanding of the science says about climate sensitivity as generally defined.”
This is what really pisses me off. Maybe before trying to “communicate” some “generally defined” concepts to “the public”, one needs to communicate to himself that change in globally-averaged surface temperature IS NOT A PROXY FOR CHANGES IN HEAT CONTENT of climate system. It was shown mathematically on JCurry blog that a three-dimensional FIELD of temperature cannot be meaningfully characterized by a single number (“global temperature”) when the field is not spherically symmetrical. It is quite obvious and can be shown with simple examples that global average can go up while system is globally cooling, and global average can go down even if system is gaining heat. More importantly, the global average temperature can go up or can go down even if the heat content does not change at all. Please communicate this to yourself first.

Roy Clark
February 28, 2011 3:29 pm

The concept of CO2-induced ‘climate sensitivity’ is based on a fundamental misunderstanding of climate energy transfer. Over the last 50 years or so, the atmospheric CO2 concentration has increased by about 70 ppm to 380 ppm. This has produced an increase in the downward long wave IR (LWIR) atmospheric flux at the Earth’s surface of about 1.2 W.m-2. Over the last 200 years these numbers go up to 100 ppm and 1.7 W.m-2.
There is no such thing as a ‘global average temperature’. Each point on the Earth’s surface interacts with the local energy flux to produce a local surface temperature. This changes continuously on both a daily and a seasonal time frame. The solar flux at the surface can easily vary from zero to 1000 W.m-2. The night time LWIR emission can vary between zero and 100 W.m-2, depending on cloud cover and humidity.
Under full summer sun conditions the ‘dry’ land cooling balance is about 200 W.m-2 in excess LWIR emission and 800 W.m-2 in moist convection at a surface temperature that can easily can reach 50 C.
In terms of a daily energy budget, the solar flux can reach up to ~25 MJ.m-2.day-1. This is balanced by about ~15 MJ.m-2.day-1 in moist convection and ~10 MJ.m-1.day-1 in LWIR emission. The heat capacity of the ground is ~1.5 MJ.m-3 and ~3 MJ of the total daily flux is stored and released by the daily surface heating. At 1.7 W.m-2, the total daily LWIR flux increase from CO2 is ~0.15 MJ.m-2.day-1. This corresponds to about 2.5 minutes of full summer sun or the evaporation of a film of water 65 micron thick over an area of 1 m^2. It is simply impossible to detect the surface temperature change from an increase of 100 ppm in atmospheric CO2 concentration.
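The unit conversions in the paragraph above can be checked with a few lines; the sketch below only verifies the arithmetic (seconds per day, latent heat of vaporization and water density are standard values) and takes no position on the physical interpretation.

```python
# Sketch checking the conversions above: 1.7 W/m^2 sustained for a day,
# expressed as MJ per day, as minutes of 1000 W/m^2 sunshine, and as the
# depth of water that much energy could evaporate.

SECONDS_PER_DAY = 86400.0
LATENT_HEAT_J_PER_KG = 2.45e6   # latent heat of vaporization of water
WATER_DENSITY_KG_M3 = 1000.0

flux_W_m2 = 1.7                                  # the figure quoted above
daily_energy_J_m2 = flux_W_m2 * SECONDS_PER_DAY  # ~1.5e5 J/m^2 per day

minutes_full_sun = daily_energy_J_m2 / 1000.0 / 60.0          # at 1000 W/m^2
evap_depth_um = (daily_energy_J_m2 / LATENT_HEAT_J_PER_KG
                 / WATER_DENSITY_KG_M3) * 1e6                 # metres -> microns

print(f"{daily_energy_J_m2 / 1e6:.2f} MJ/m^2/day, "
      f"~{minutes_full_sun:.1f} min of 1000 W/m^2 sun, "
      f"~{evap_depth_um:.0f} microns of evaporated water")
# Roughly 0.15 MJ, ~2.4 minutes and ~60 microns: close to the figures quoted.
```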
Over the oceans it is even easier. The LWIR emission from CO2 is absorbed within 100 micron of the ocean surface and is coupled directly into the surface evaporation. An increase of 65 micron per day in water evaporation is just too small to detect in the wind and surface temperature driven variations in ocean surface evaporation. The long term average uncertainty in ocean evaporation measurements is 2.7 cm per year. It is impossible for a 100ppm increase in CO2 concentration to cause any change in ocean temperatures.
The other problem is that the surface temperature was not usually measured before satellite observations, and the meteorological surface air temperature (MSAT), measured in an enclosure at eye level above the ground, has been substituted for the surface temperature. The observed ‘climate sensitivity’ over the last 50 years or so is just the changes in ocean surface temperature with urban heat island effects added, along with downright fraudulent manipulation of the ‘climate averages’. Ocean surface temperatures have been decreasing for at least the last 10 years and will probably continue to do so for another 20.
The climate prediction models have been ‘fixed’ to predict global warming by using a set of fraudulent ‘radiative forcing constants’. The hockey stick has been used as ‘calibration’ to predict more hockey stick. This is just climate astrology.
The only climate sensitivity is from the sun. A sunspot index of 100 corresponds to an increase in the solar ‘constant’ of about 1 W.m-2. The change in ellipticity of the Earth’s orbit over a Milankovitch Ice Age cycle is about -1 W.m-2. Over 10,000 years this is sufficient to bring the Earth in or out of an Ice Age, just through the change in ocean heat content. A Maunder Minimum requires about 50 to 100 years with few sunspots. We are going to find out about real climate sensitivity if the sunspot index stays low.

Al Tekhasski
February 28, 2011 3:41 pm

Carrick wrote:
“The short answer to why you don’t need one billion sensors comes from the sampling theory: This theory is the basis for digitization of audio and video signals. If there were an error with this theory, we would viscerally see it and hear it. If you ask how many samples you need to fully capture the Earth’s temperature field, based on a correlation length of 500-km (this is a smaller number than is quoted by Hansen’s group for the correlation length), that works out to around 2000 instruments world wide.”
Totally wrong. We could “viscerally see and hear” disconnects in video and audio signals, but it doesn’t affect much of the information we receive. We might see some “snow” or “popping”, yet old VHS tapes or vinyl records could still be quite enjoyable. More, while the information content is still fine, estimates of some average of the signal could be through the roof, a technique that was used in early copy protection of VHS tapes. Even more, modern communication channels (such as USB) explicitly define “isochronous streams” where it is not necessary to retry blocks of signal if the checksum does not match; the data block is simply ignored. These are the streams dedicated to video and audio information. So you could not be more wrong.
Regarding temperatures, you do need much higher density of sampling. As I reported elsewhere, there are many pairs of stations that are just 50-60km apart while exhibiting OPPOSITE century-long temperature trends. It means that you/we have no clue what is actually happening to climate-scale trends between stations that are 300-500 miles apart. It means that the field is undersampled. How badly? Opposite warming-cooling trends on a distance of 50km means that you need AT LEAST 25km REGULAR grid, which translates into about 800,000 stations world-wide.
So, your assertion that “it works out” to 2000 stations is unconditionally, blatantly wrong. Please get yourself familiar with Nyquist-Shannon-Kotelnikov-Whittaker sampling theorem, and study the effect called “aliasing” when sampling density does not conform to the sampling theorem.

rpielke
February 28, 2011 3:50 pm

Carrick – An anomaly of +10 C at cold temperatures has a smaller effect on outgoing radiation than an anomaly of +10 C at warm temperatures. This is due to the Stefan-Boltzmann relationship: the outgoing emission is proportional to the 4th power of temperature.
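A minimal sketch of that point, treating both cases as ideal blackbodies (emissivity and atmospheric effects ignored), just to show the size of the asymmetry:

```python
# Sketch: the same +10 K anomaly changes ideal blackbody emission far more at
# a warm baseline than at a cold one (Stefan-Boltzmann, emission = sigma*T^4).

SIGMA = 5.670e-8  # W m^-2 K^-4

def emission(temp_K):
    return SIGMA * temp_K ** 4

for base_K, label in ((250.0, "polar-like"), (300.0, "tropical-like")):
    delta = emission(base_K + 10.0) - emission(base_K)
    print(f"+10 K at {base_K:.0f} K ({label}): emission rises by {delta:.1f} W/m^2")

# ~38 W/m^2 at a 250 K baseline versus ~64 W/m^2 at 300 K, which is why equal
# temperature anomalies at different latitudes are not energetically equal.
```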
With respect to the relationship between surface and lower tropospheric temperatures, and their divergence from one another, please see our paper
Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841. http://pielkeclimatesci.wordpress.com/files/2009/11/r-345.pdf
Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2010: Correction to: “An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841”, J. Geophys. Res., 115, D1, doi:10.1029/2009JD013655.
http://pielkeclimatesci.wordpress.com/files/2010/03/r-345a.pdf

Carrick
February 28, 2011 3:55 pm

Alex, you and I aren’t going to see eye to eye on this, if for no other reason than you refuse to use an analytic approach to buttress your arguments, and you prefer to substitute ugly personal attacks for reasoned argument. Sorry but I just find that very boring.
My comments about the number of locations needed, based on a correlation length of 500 km and a (band-limited) sampling period of 1 month/sample, follow straight from the sampling theorem. So, either there’s something wrong with the sampling theorem, or there’s something wrong with the statement that the correlation length for a 1-month band-limited signal is 500 km.
Those are the only two possibilities.
Your discussion about old VHS systems, etc are so incoherent or OT I have no idea what you are even trying to drive at, other than some petty desire to show me “totally wrong”.
Address the science, or expect to be ignored.

Bruce of Newcastle
February 28, 2011 4:09 pm

steven mosher says:
February 28, 2011 at 2:44 pm
Steven – I appreciate your work very much, but sorry, I respectfully disagree. Are you rejecting the correlation between SCL and temperature? It looks quite clear to me, statistically and visually. It’s there in the CET too, clearer post-Laki. We can argue about the magnitude and causation, but unless someone finds a new dataset that correlation isn’t going away.

johanna
February 28, 2011 4:10 pm

Steven Mosher said:
It would be a good thing if Pielke and others explained what they agreed about.
This is not about talking a poll and deciding the truth. It’s about seeing what agreement there is and what disagreement.
—————————————————————
It is perfectly reasonable to ask Dr Pielke what his views on a particular topic are, and decide whether or not you agree with them, and debate them. But, there are no ‘others’, at least in a real scientific discussion. Convenient as it would be for those who do not buy the IPCC view of the world to form a nice homogeneous bloc, it is not the case. Given the vast range of disciplines and often uncharted territory involved, there would be something very wrong if it was the case.

Carrick
February 28, 2011 4:17 pm

Roger Pielke, thanks for the reply.
First, to follow up on my previous comment, I was unintentionally a bit misleading in stating the difference as 10% (sorry, that was my age-degenerated recollection at work).
Using a robust estimator (minimum of absolute deviations) for the 1980-2010 period inclusive, I get 1.62°C/century for GISTEMP and 1.57°C/century for HADCRUT3vGL for the surface measurements, and 1.32°C/century for UAH and 1.67°C/century for RSS for the satellites. I think a more accurate statement would be that the difference in trend could be as large as 25% (though RSS largely agrees with the surface temperature sets).
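For readers unfamiliar with the estimator mentioned above, here is a minimal sketch of a least-absolute-deviations trend fit; the anomaly series is synthetic (an assumed 0.016 °C/yr slope plus noise), so substitute the actual annual GISTEMP/HadCRUT/UAH/RSS anomalies to reproduce a comparable analysis.

```python
import numpy as np
from scipy.optimize import minimize

# Sketch of a least-absolute-deviations ("robust") linear trend fit.
# The anomaly series below is synthetic; it is NOT any published data set.

rng = np.random.default_rng(1)
years = np.arange(1980, 2011)  # 1980-2010 inclusive
anomalies = 0.016 * (years - 1980) + rng.normal(0.0, 0.1, years.size)

def total_abs_deviation(params):
    intercept, slope = params
    return np.sum(np.abs(anomalies - (intercept + slope * (years - 1980))))

result = minimize(total_abs_deviation, x0=[0.0, 0.0], method="Nelder-Mead")
slope_per_year = result.x[1]
print(f"L1 (robust) trend: {slope_per_year * 100:.2f} degC/century")
```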
How much of the difference between the sets is due to boundary layer physics versus surface/elevated measurements is of real interest to me. I have seen your papers and have found them very interesting in this context (especially since I occasionally am involved in the collection of boundary layer data myself). I’ll sit down and give it another read tonight.

mpaul
February 28, 2011 4:35 pm

steven mosher says:
February 28, 2011 at 1:47 pm
“It would be a good thing if Pielke and others explained what they agreed about.”
Mosh, how about this:
(1) It’s gotten warmer since the last mini ice age
(2) Dumping large amounts of CO2 and other by-products of fossil fuel combustion into the atmosphere is hazardous to human life and to the common welfare
(3) We should be aggressively pursuing alternative sources of cheap clean energy as a national priority
(4) Climate science has been corrupted by politics and some prominent climate scientists are willing to falsify results and cherry pick data in order to achieve power, fame and career advancement.
(5) Prominent climate scientists have systematically overstated confidence in results and the degree of scientific consensus
(6) The climate science community is unwilling and unable to enforce a meaningful code of professional conduct.
(7) Global mean surface temperature is meaningless and attempts at measuring it are illusory.
(8) Modeling of the climate with any degree of predictive power (beyond a few weeks) is well beyond the current state of technology
(9) Current estimates of climate sensitivity are unsupported by direct evidence
(10) We have not been able to characterize natural variability in the climate and as such we are unable to determine the significance of the anecdotal warming

Philip Mulholland
February 28, 2011 4:56 pm

vukcevic says:
February 28, 2011 at 10:51 am

In not too distant future graphics of the above type may have Geomagnetic field added, or at least I hope….

I agree, let’s hope it also includes:-
Chemical weathering of igneous rocks, in particular those containing plagioclase feldspars and the pedological processes that create mature regoliths which release calcium and magnesium cations into the hydrosphere.
The organic and inorganic sequestering of oxidised sulphur and carbon, occurring as water soluble anions, leading to an increase in the lithic reservoir of sulphate and carbonate rocks.
Adjustments in the areal extent of the hydrosphere to take account of tectonic and isostatic changes in relief and coastal processes of sedimentary erosion and accretion.
Changes in ocean basin morphology associated with continental drift, deep ocean crustal sinking, the growth of sedimentary fans, the development of submarine ridges by tectonic accretion, volcanic extrusion of submarine lavas and the impact of all of these on deep water mobility.
Changes in the organisation of deep marine currents in response to alterations in the rate of formation of polar cold dense bottom water that is intimately associated with the growth and destruction of continental icecaps. The competing formation of warm dense tropical bottom water associated with the growth and destruction of carbonate ramps and reefs in epeiric seas that are sensitive to gross sea level changes, particularly those sea level falls associated with the growth of said continental icecaps.
And so on, and so on…
One of the Apollo astronauts, on his return to earth, was asked what is the biggest difference between the earth and the moon? He said “on earth everything moves”.

Bernd Felsche
February 28, 2011 5:43 pm

“climate forcing”
Surely a less-wrong term for that is “perturbation”.
Terminology consistent with control systems theory and Engineering.

Al Tekhasski
February 28, 2011 6:10 pm

Carrick,
“based on a correlation length of 500-km … So, either there’s something wrong with the sampling theorem, or there’s something wrong with the statement that the correlation length for a 1-month band-limited signal is 500-km.”
No, there is a third possibility, which is incorrect interpretation of the data. The correlation is a broad function that accounts for many time scales. The correlation length (peak position) you mention is formed and dominated by SHORT-TERM seasonal variations. AGW people love to point out that summers are warmer than winters. That’s why stations at 500km – 1200km distance “correlate well”. However, we are trying to talk here about century-long CLIMATE trends, and the observational evidence (see GISS stations) shows that the long-term correlation breaks down even at 50km distance. Sorry to point this out. Please don’t repeat the mistake made in the article of Hansen and Lebedeff (1987).
BTW, I have made this argument on several occasions. Apparently it was not scientific enough to your AGW taste and was ignored. I feel that this time will be no different.
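A minimal synthetic sketch of the distinction being drawn here, with every number assumed purely for illustration (this is not station data): two nearby series that share an identical seasonal cycle correlate almost perfectly month to month even when their century-scale trends have opposite signs.
```python
import numpy as np

rng = np.random.default_rng(42)
months = np.arange(90 * 12)                         # 90 years of monthly data
seasonal = 12.0 * np.sin(2 * np.pi * months / 12)   # shared seasonal cycle (assumed amplitude)

# Two hypothetical neighbouring stations: same weather and seasons, opposite century trends
trend_a = +0.007 * months / 12                      # ~ +0.7 C per century
trend_b = -0.007 * months / 12                      # ~ -0.7 C per century
station_a = seasonal + trend_a + rng.normal(0, 1.0, months.size)
station_b = seasonal + trend_b + rng.normal(0, 1.0, months.size)

# Raw monthly series correlate almost perfectly (the seasonal cycle dominates)
r_raw = np.corrcoef(station_a, station_b)[0, 1]

# Least-squares linear trends (deg C per century) disagree in sign
slope_a = np.polyfit(months / 12, station_a, 1)[0] * 100
slope_b = np.polyfit(months / 12, station_b, 1)[0] * 100

print(f"raw monthly correlation: {r_raw:.3f}")
print(f"trend A: {slope_a:+.2f} C/century, trend B: {slope_b:+.2f} C/century")
```
The raw correlation comes out close to 1 while the fitted trends differ in sign, which is the sense in which a high correlation coefficient on monthly data says little about agreement between long-term trends.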

izen
February 28, 2011 6:16 pm

mpaul says:
February 28, 2011 at 4:35 pm
“Mosh, how about this:”
If I may give my take…
The following numbered statements are your suggestions, the indented statements are my replies.
(1) Its gotten warmer since the last mini ice age
—-As expressed as a global average this is true. Regionally it’s less clear-cut; locally it varies. But the rise in sea level after ~6000 years of stasis and evidence from borehole data indicate the recent warming is a robust signal.
(2) Dumping large amounts of CO2 and other by-products of fossil fuel combustion into the atmosphere is hazardous to human life and to the common welfare
—-The degree of hazard it represents has more to do with how robust human agricultural systems and societies are to the changes that increasing CO2 will inflict.
You can’t change just one thing in a complex interactive system, so raising CO2 will cause other changes. Even if we knew EXACTLY what changes it would cause, whether such changes are hazardous is dependent on societal adaptability.
Only if you regard any and all change as hazardous is it valid to ignore the magnitude or the societal response.
(3) We should be aggressively pursuing alternative sources of cheap clean energy as a national priority
—-Obviously.
(4) Climate science has been corrupted by politics and some prominent climate scientists are willing to falsify results and cherry pick data in order to achieve power, fame and career advancement.
—-Despite the frequency of this claim there is little or no good evidence for its accuracy.
Politics has distorted Climate science for its own ends. Sometimes to raise ‘alarmist’ scenarios and sometimes to eschew it.
(5) Prominent climate scientists have systematically overstated confidence in results and the degree of scientific consensus
—-Overstated confidence is occasional rather than systematic, and the degree of scientific consensus has been understated. The degree of confidence with which the AGW theory is rejected vastly exceeds the obverse, and media ‘balance’ tends to imply it’s a fifty-fifty split, or perhaps ten to one, while the reality is over 98 to 1.
(6) The climate science community is unwilling and unable to enforce a meaningful code of professional conduct.
—-No more or less than every other field of science. Other fields of research have equally patchy, or worse records of unprofessional behavior.
(7) Global mean surface temperature is meaningless and attempts at measuring it are illusory.
—-Global mean surface temperature is a very rough metric of global heat content. Until satellite observations no attempt was made to measure it. It was derived from local measurements taken to measure the local weather. This makes the surface temperature record less than ideal as a source for calculating global mean temperature.
(8) Modeling of the climate with any degree of predictive power (beyond a few weeks) is well beyond the current state of technology
—-Wrong.
You can’t predict the weather, but models would predict that January of 2019 in Northern Europe will be colder than August 2019. I would suggest this is a robust result from modeling.
(9) Current estimates of climate sensitivity are unsupported by direct evidence
—-The qualifier ‘direct’ is the problem here. How direct do you consider the evidence from volcanic climate change like El Chichón/Pinatubo or glacial cycles?
(10) We have not been able to characterize natural variability in the climate and as such we are unable to determine the significance of the anecdotal warming
—-We CAN characterize natural variability but with varying degrees of accuracy and resolution over different timescales.
The significance of recent warming is that it is certainly exceptional compared to natural variation of the last 500 years. If you expand the timescale the uncertainty increases, there COULD be comparable events a few thousand years ago, but you have to go back to the end of the last ice-age and the Holocene optimum around 8000 years ago to get any good evidence of similar warming. Expand the timescale further and resolution drops way down, but there are credible candidates for natural variability greater than the present warming.
The trouble is that present technological advanced societies are a recent development, within the last few centuries, so multi-millennial periods are of rather less relevance. Agriculturally based society is only ~8000 years old so again deep time variability is of less significance. Our present society is adapted for the stable climate of the last few centuries, and the warming most likely to derive from doubling CO2 will exceed the natural variability seen over the same time period. How significant that is depends on how robust present complex technological society is in the face of variability outside the natural range already experienced.

D. J. Hawkins
February 28, 2011 6:22 pm

Carrick says:
February 28, 2011 at 3:55 pm
Alex, you and I aren’t going to see eye to eye on this, if for no other reason than you refuse to use an analytic approach to buttress your arguments, and you prefer to substitute ugly personal attacks for reasoned argument. Sorry but I just find that very boring.
My comments about the number of locations needed based on a correlation length of 500-km and a (band-limited) sampling period of 1 month/sample follow straight from the sampling theorem. So, either there’s something wrong with the sampling theorem, or there’s something wrong with the statement that the correlation length for a 1-month band-limited signal is 500-km.

I’m somewhat new to the general discussion re AGW, so I’m curious as to how this characteristic length of 500km was derived. And I assume it was tested and verified? I am skeptical at first glance. 500km from my home takes me north to about Montreal or south to North Carolina and west almost into Ohio. The idea that temperature measurements at each of these points are sufficient to describe the entire temperature field between them strikes me as…ambitious.
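For what it’s worth, the arithmetic behind the sampling-theorem claim is short enough to write out. The load-bearing assumption is treating the 500 km correlation length as an adequate uniform station spacing over the whole globe; nothing below derives or validates that number.
```python
import math

# Assumption (not a measurement): one station per 500 km x 500 km cell is "enough".
EARTH_RADIUS_KM = 6371.0
correlation_length_km = 500.0

earth_surface_km2 = 4 * math.pi * EARTH_RADIUS_KM ** 2   # ~5.1e8 km^2
cell_area_km2 = correlation_length_km ** 2               # area covered by one station

stations_needed = earth_surface_km2 / cell_area_km2      # comes out near 2000
land_fraction = 0.29                                     # rough land fraction of the surface

print(f"Earth surface area: {earth_surface_km2:.3e} km^2")
print(f"Stations for global coverage at 500 km spacing: {stations_needed:.0f}")
print(f"Of which over land (~29%): {stations_needed * land_fraction:.0f}")
```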

Smokey
February 28, 2011 7:30 pm

Izen gives the latest alarmist talking points in response to mpaul’s list. Off the top of my head I can see some problems with Izen’s responses:
#1: “…the rise in sea level after ~6000 years of stasis and evidence from borehole data indicate the recent warming is a robust signal.” The sea level has been rising since the end of the last global glaciation, although the rise has moderated. Look closely and you will see the rise – and the sharper rise since the LIA. [And -1 for using “robust.”☺]
#2: The endlessly repeated prediction was for runaway global warming to begin by now as a result of increasing CO2. That has not happened, so the goal posts were moved to “climate change.” But the only folks who believe the climate doesn’t change are the true believers in CAGW and Mann’s debunked Hokey Stick. The assumption is that change is bad. But there is no evidence of any global harm from more CO2. The only quantifiable result is increased agricultural productivity. [Another -1 for “debunked” again.]
#4: Anyone who seriously believes that climate science has not been corrupted must be living on another planet. The sidebar has books giving detailed evidence of corruption by the ruling clique of climate charlatans.
#5: “The degree of confidence with which the AGW theory is rejected vastly exceeds the obverse, and media ‘balance’ tends to imply its a fifty:fifty split, or perhaps ten-to-one while the reality is over 98-1.”
Hogwash. First, either Izen doesn’t understand the scientific method, or he’s craftily trying to promote at best a hypothesis to a theory. It is not a theory because it cannot make accurate predictions. And he trots out that debunked 98%-of-climate-scientists-believe horse manure. You couldn’t get 98% of a group to agree that the Pope is Catholic. It was a badly worded push-poll; talk about pseudo-science.
#6: The climate science community is corrupted by money. Izen says every other branch of science is equally corrupt, “no more or less.” Shall we begin by discussing mathematics?
#8: “You cant predict the weather, but models would predict that January of 2019 in Northern Europe will be colder than August 2019. I would suggest this is a robust result from modeling.”
Don’t be silly. The difference between January and August is seasonal. You don’t need a model to understand that. [Another -1 for a third “robust.”]
#9: I’ll take Dr Lindzen’s educated estimate of ≈1°C per 2xCO2 over the agenda-driven UN/IPCC. And Lindzen’s estimate is higher than those of other well-known climatologists.
#10 is complete opinion, and it flies in the face of numerous peer reviewed geological studies going back decades. Izen says: “Our present society is adapted for the stable climate of the last few centuries, and the warming most likely to derive from doubling CO2 will exceed the natural variability seen over the same time period. How significant that is depends on how robust present complex technological society is in the face of variability outside the natural range already experienced.”
Complete circular argument: “…warming will most likely…”, followed by the conclusion, which assumes a priori that the supposed changes will be outside natural variability. [Another “robust.” -1. Total: -4. 60%. Fail.]
The blogs of the alarmists who spout this clown college nonsense didn’t make the “Best Science” finals for a good reason. CO2=CAGW has been downgraded from a hypothesis to a conjecture because it fails the most basic tests of the scientific method.
I wouldn’t care so much, but I remember Izen pontificating about how he comes here just to have some fun with skeptics. Having fun now, Izen?

Carrick
February 28, 2011 7:45 pm

DJ Hawkins:

I am skeptical at first glance. 500km from my home takes me north to about Montreal or south to North Carolina and west almost into Ohio.

It’s the measured average correlation length. I wouldn’t be surprised if it were an elongated oval (greater correlation east to west than north to south). Certainly that becomes true as you shorten the averaging interval.
But keep in mind that they are anomalizing (subtracting the mean value for that location) before computing the correlation.

steven mosher
February 28, 2011 7:56 pm

mpaul
As I suggested, people should start by answering Zeke’s questions to the best of their ability, even if only to say they disagree with all of it. It’s impossible to have the debate you want to have without an understanding of where the points of common agreement are:
Like so:
(1) Its gotten warmer since the last mini ice age:
>95% certain
(2) Dumping large amounts of CO2 and other by-products of fossil fuel combustion into the atmosphere is hazardous to human life and to the common welfare
Poorly articulated. Quantify “large” so that we know what we are talking about.
Your idea of large may not be my idea of large. Numbers make for clarity.
(3) We should be aggressively pursuing alternative sources of cheap clean energy as a national priority
Yes, but define aggressively.
(4) Climate science has been corrupted by politics and some prominent climate scientists are willing to falsify results and cherry pick data in order to achieve power, fame and career advancement.
Overgeneralized motive hunting. I would stick to particular cases and then
realize that I cannot know with any certainty what lies in man’s heart. I’ve seen
no evidence of willful falsification of data. I’ve seen questionable methods.
(5) Prominent climate scientists have systematically overstated confidence in results and the degree of scientific consensus
Again, I would stick to the cases I have studied. There appears to be some
overconfidence.
(6) The climate science community is unwilling and unable to enforce a meaningful code of professional conduct.
Not sure about unwilling; unable? Hard to judge without evidence.
(7) Global mean surface temperature is meaningless and attempts at measuring it are illusory.
False. It’s a proxy that has some uses. OHC is obviously better.
(8) Modeling of the climate with any degree of predictive power (beyond a few weeks) is well beyond the current state of technology
False. Ill-specified. The term you are looking for is actually “skill”; skill is a technical measure. GCMs have skill. This is typically measured as skill over a naive forecast.
(9) Current estimates of climate sensitivity are unsupported by direct evidence
False. There are multiple lines of evidence supporting the range of estimates: direct measurement, analysis of observational data, paleo data, and constraints imposed by physical models. For example, the paleo history clearly rules out some very high estimates and some very low estimates. We will never (in our lifetimes) have a single number for this. It will always be a range.
(10) We have not been able to characterize natural variability in the climate and as such we are unable to determine the significance of the anecdotal warming
Well, that means that the “null” cannot be specified and cannot be falsified, and is therefore not a null. I’d agree with Peter Webster that natural variability (especially the coupling of oscillating modes) has not been adequately studied. Part of the reason I’m a lukewarmer.

Carrick
February 28, 2011 8:00 pm

Al Tek:

No, there is a third possibility, which is incorrect interpretation of the data

Data are data, and it’s possible to compute the spatial correlation length on its own merits. Whatever that number is tells you something about the data. “Interpretation” is something you do in art class.
There is a systematic basis for approaching the data, and you’re avoiding it by trying to focus on rhetoric instead.

However, we are trying to talk here about century-long CLIMATE trends, and the observational evidence (see GISS stations) shows that the long-term correlation breaks down even at 50km distance.

Which sites are you talking about? On average, all of the stations at the same latitude follow the same trend, and the trend increases in a systematic fashion with latitude.
See e.g. here. This was obtained without any assumption of spatial correlation. Just an average of stations in 5° latitudinal bands.

BTW, I have made this argument on several occasions. Apparently it was not scientific enough to your AGW taste and was ignored. I feel that this time will be no different.

You’ve made arguments, but you’ve never followed them up with proof.
The data’s all there for immediate download, and the main studies are all peer reviewed and available outside of paywalls. GISTEMP’s algorithm and code are fully public for your and anybody else’s review.
If you want to get technical, you have to run the numbers, not just talk. Just like many of us have actually done. You claim some skill at this; looks like it’s your turn to put up or shut up.

steven mosher
February 28, 2011 8:32 pm

Mpaul.
You might want to consider the oddity of these two:
(1) Its gotten warmer since the last mini ice age
(7) Global mean surface temperature is meaningless and attempts at measuring it are illusory.
#######
Lots of people who don’t believe in a global temperature index simultaneously believe that “it” was colder in the LIA. Odd, but they do.

D. J. Hawkins
February 28, 2011 9:07 pm

Carrick says:
February 28, 2011 at 7:45 pm
DJ Hawkins:
I am skeptical at first glance. 500km from my home takes me north to about Montreal or south to North Carolina and west almost into Ohio.
It’s the measured average correlation length. I wouldn’t be surprised if it were an elongated oval (greater correlation east to west than north to south). Certainly that becomes true as you shorten the averaging interval.
But keep in mind that they are anomalizing (subtracting the mean value for that location) before computing the correlation.

Okay, pretend I don’t know anything about anomalizing and constructing the correlation (not a stretch) and tell me how the temperatures at Bloomingdale NJ and Montreal QC are a good description of the temperature field that includes the Hudson Valley, Appalachian mountains, and Lake Champlain. And who proved (or made a brilliant argument) that the characteristic length of 500km was appropriate? Use small words so I can follow. Or a link or two; I won’t consider it cheating if you think someone’s already done a bang-up explanation elsewhere.

mpaul
February 28, 2011 9:09 pm

mpaul “(3) We should be aggressively pursuing alternative sources of cheap clean energy as a national priority”
Mosher “Yes, but define aggressively.”
Manhattan Project to build a commercially viable thorium reactor
mpaul “(4) Climate science has been corrupted by politics and some prominent climate scientists are willing to falsify results and cherry pick data in order to achieve power, fame and career advancement.”
Mosher: “Overgeneralized motive hunting. I would stick to particular cases and then
realize that I cannot know with any certainty what lies in man’s heart. I’ve seen
no evidence of willful falsification of data. I’ve seen questionable methods.”
“Censored” directory; R**2 testimony
mpaul “(5) Prominent climate scientists have systematically overstated confidence in results and the degree of scientific consensus”
Mosher “Again, I would stick to the cases I have studied. There appears to be some
overconfidence.”
Climate Change Assessments Review of the Processes and Procedures of the IPCC, InterAcademy Council, 30 August 2010: “… authors reported high confidence in some statements for which there is little evidence. Furthermore, by making vague statements that were difficult to refute, authors were able to attach ‘high confidence’ to the statements. The Working Group II Summary for Policymakers contains many such statements that are not supported sufficiently in the literature…”
mpaul: “(8) Modeling of the climate with any degree of predictive power (beyond a few weeks) is well beyond the current state of technology”
Mosher “False. Ill specified. the term you are looking for is actually “skill” skill is a technical measure. GCMs have skill. This is typically measured as skill over a naive forecast.
No. “Skill” is a term used by academics. I’m an applied scientist. I’m using the term predictive power in a precise sense. To quote von Neumann: “If you allow me four free parameters I can build a mathematical model that describes exactly everything that an elephant can do. If you allow me a fifth free parameter, the model I build will forecast that the elephant will fly.”
mpaul “(9) Current estimates of climate sensitivity are unsupported by direct evidence”
Mosher “False. There are multiple lines of evidence supporting the range of estimates.”
Name one.

Al Tekhasski
February 28, 2011 9:14 pm

Carrick wrote:
“Data are data … Whatever that number is tells you something about the data. “Interpretation” is something you do in art class.”
Funny. You do realize that if a “number tells you something”, it is an interpretation, don’t you? But I never attended any “art class” and my English is second hand, so I will take your word on this 🙂
Carrick:
“You’ve made arguments, but you’ve never followed them up with proof.”
Really? How about these data references?
http://wattsupwiththat.com/2010/09/22/arctic-isolated-versus-urban-stations-show-differing-trends/#comment-489483
http://ourchangingclimate.wordpress.com/2010/03/08/is-the-increase-in-global-average-temperature-just-a-random-walk/#comment-1627
http://ourchangingclimate.wordpress.com/2010/03/08/is-the-increase-in-global-average-temperature-just-a-random-walk/#comment-1733
http://noconsensus.wordpress.com/2010/09/02/in-search-of-cooling-trends/#comment-35823
Looks like your turn is long overdue.

February 28, 2011 9:22 pm

Steven Mosher,
“It” refers to current temperatures. That’s not, as you say, ‘odd.’ It is a fact that the LIA was colder than today.
And I’ll respond to one of your comments:
“We have not been able to characterize natural variability in the climate and as such we are unable to determine the significance of the anecdotal warming”
“Well, that means that the ‘null’ cannot be specified and cannot be falsified and
is not therefore a Null.”
Both comments are incorrect. The climate null hypothesis is ‘the statistical hypothesis that states that there are no differences between observed and expected data.’
The ‘expected data’ of the alternative hypothesis is that runaway global warming will exceed the parameters of the past ten millennia [the Holocene]. That is not even close to happening. Far from it.
Falsifying the null is easy: just show that current temperatures exceed the parameters of the last ten thousand years. Clearly, they do not. Thus, the alternative CO2=CAGW hypothesis is falsified.

mpaul
February 28, 2011 9:22 pm

steven mosher says:
February 28, 2011 at 8:32 pm
“Mpaul.
You might want to consider the oddity of these two:
(1) Its gotten warmer since the last mini ice age
(7) Global mean surface temperature is meaningless and attempts at measuring it are illusory.”
———————————–
There’s little doubt in my mind that the earth has warmed a bit since the last mini ice age. Global mean surface temperature (anomaly) is an index that attempts to quantify the temperature anomaly at the “surface” of the planet yet is so poorly defined as to be meaningless. Think about it this way: what specification would you need to design a system to measure global mean surface temperature anomaly? Do you think you have such clarity in the term today?

Al Tekhasski
February 28, 2011 9:27 pm

Carrick wrote: “Just an average of stations in 5° latitudinal bands.”
You seem to miss a simple point: if you don’t have sufficient sampling density, you don’t (and can’t) know what you are missing. Data collected without regard for the theoretical sampling requirements are garbage. The only way to have confidence in a series of spatio-temporal data is to start with a substantially high sampling density, then gather convincing evidence that further increases (and decreases) in sampling density don’t change the result significantly, and then back off to the smallest reasonable density. Whoever does not understand this simple procedure is practicing garbage science, and all results must be safely discarded and conclusions ignored.
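The densify-until-it-converges procedure described above can at least be sketched; the synthetic field below is an arbitrary stand-in, chosen only so that it has structure on several scales.
```python
import numpy as np

rng = np.random.default_rng(1)

def synthetic_field(x, y):
    """A rough 2-D 'temperature' field on the unit square (purely illustrative)."""
    return (np.sin(6 * np.pi * x) * np.cos(4 * np.pi * y)
            + 0.5 * np.sin(23 * np.pi * x * y)
            + 0.1 * x)

def sampled_mean(n_points):
    """Area mean estimated from n randomly placed 'stations'."""
    x = rng.uniform(0, 1, n_points)
    y = rng.uniform(0, 1, n_points)
    return synthetic_field(x, y).mean()

# Convergence test: keep increasing the sampling density until the
# estimated mean stops changing appreciably.
for n in [10, 40, 160, 640, 2560, 10240]:
    print(f"{n:6d} stations -> estimated mean {sampled_mean(n):+.4f}")
```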

Carrick
February 28, 2011 9:32 pm

For people who didn’t read it, Willis reposted Hansen’s original peer-reviewed results for correlation vs distance:
figure here.
You can see that 500-km is a pretty conservative number globally.
I agree with Al Tek that it would be interesting to break this up e.g. by season. I also think East-West versus North-South would be interesting (and I plan to run this case when I get a shot).
One thing I would predict (meaning I haven’t tested this personally) is that when you look at stations along the coastline, you get a loss of coherence with the interior… the so-called “coastal boundary layer” plays a role. There is roughly 350,000 km of coastline around the world (easy to remember: it’s roughly the distance from the Earth to the Moon), which means about 3% of the Earth’s surface is affected by the coastal boundary layer.
To give an idea what this means for the global mean temperature reconstructions: for this to affect the accuracy of the global mean temperature trend by 1% would require more than a 30% error in the estimates along the coastline.
Last I checked, the “admitted error” in the global mean temperature trend is about 10%. So you’d need a 300% error in coastlines for that to be equal in magnitude. That doesn’t strike me as plausible personally.
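A back-of-the-envelope version of that coastline estimate, with the width of the coastal strip supplied as an assumed parameter (it is not stated above):
```python
# Rough check of the coastal area fraction; the strip width is an assumption.
coastline_km = 350_000.0            # figure quoted in the comment
earth_surface_km2 = 5.1e8           # total surface area of the Earth
assumed_strip_width_km = 40.0       # assumed width of the coastal boundary layer

coastal_area_km2 = coastline_km * assumed_strip_width_km
coastal_fraction = coastal_area_km2 / earth_surface_km2
print(f"coastal fraction of surface: {coastal_fraction:.1%}")   # roughly 3%

# A bias confined to that strip is diluted by the same fraction in the global mean:
# e.g. a 33% bias in the coastal trend shifts the global trend by ~0.03 * 0.33 ~ 1%.
coastal_trend_bias = 0.33
print(f"implied effect on the global trend: {coastal_fraction * coastal_trend_bias:.1%}")
```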
Now where did I put that packet of Zingers….

Carrick
February 28, 2011 9:36 pm

I’ll also note that I’ve posted my own analyses, as well as those of others. Al Tek has yet to post a single plot of his own. This comment says all it needs to say.

Larry in Texas
February 28, 2011 9:51 pm

“This is not about talking a poll and deciding the truth. It’s about seeing what agreement there is and what disagreement.”
This is precisely why no policy makers should be involved in this type of discussion, why the IPCC’s role is overblown and improper, and why all the politics that has surrounded the science development should stop. Until you guys who are scientists can really figure out what is going on (and I believe all of you know a lot less than you think you know – how can a science that has only about 50 years of satisfactory, reliable measurements be convinced it knows everything already?), policy makers are wasting their time. UNless, of course (note the pun there), your agenda is to control and rule the world.
We humans seem to really think with all of our proxies, surface temperature measurements, satellites, etc., that we know something substantial about the climate of the earth, when we aren’t even close. Scientists should not think themselves so wise that they are prepared to make recommendations concerning what, if anything, to do about CO2. The post by Roger Pielke, Sr. illustrates this nicely.

Al Tekhasski
February 28, 2011 10:15 pm

To Carrick: I’ve seen your analysis, and already commented on it. Earlier I looked at real (GISS) data, and found that several adjacent pairs of stations have opposite “climatological” trends. This observation alone establishes that, for the purpose of testing TRENDS and their average, the sampling density is insufficient in accord with established mathematical discipline. This observation invalidates the “correlation” analysis, and I don’t need to present any alternative mathturbation, because the data are flawed from the very beginning, insufficient. Any plot, mine or anyone else’s, will be garbage. What is so difficult to understand here?
One more time: no matter how you massage insufficient data, you cannot get more information out of this data set than is already there. Homogenize it, grid it, select subsets, it does not matter, you will not have any new information. That’s why all your “analyses” show about the same result, which is garbage predefined by the under-sampled, wrongly-spaced initial station set. The only way to increase information from the field is to increase the sampling density of the data acquisition system. When you do this and get a similar result to your current mathturbations, then you might be up to something.

izen
February 28, 2011 10:31 pm

Smokey says:
February 28, 2011 at 7:30 pm
“Izen gives the latest alarmist talking points in response to mpaul’s list. Off the top of my head I can see some problems with Izen’s responses:
#1: “…the rise in sea level after ~6000 years of stasis and evidence from borehole data indicate the recent warming is a robust signal.” The sea level has been rising since the end of the last global glaciacion, although the rise has moderated. Look closely-[LINK]- and you will see the rise – and the sharper rise since the LIA. [And -1 for using “robust.”☺]”
Perhaps you should respond from somewhere else than the top of your head…
The graph you link to shows the sea level rise at the end of the last ice age… and shows that it has altered little in the last ~6000 years as I stated. The rise over the last century is exceptional within that period of a stable Holocene climate.
“I wouldn’t care so much, but I remember Izen pontificating about how he comes here just to have some fun with skeptics. Having fun now, Izen?”
Lots of robust fun. Especially when skeptics post replies that confirm the points I make.
And can be tweaked by their dislike of ‘robust’ opposition. -grin-

Al Tekhasski
February 28, 2011 10:35 pm

Larry asks: “how can a science that has only about 50 years of satisfactory, reliable measurements … “
You are overly optimistic here. The point I am arguing here is that current measurements are not reliable and do not sufficiently represent the object in either space or time, and therefore are far from satisfactory for the purpose of characterizing climate and its change.

steven mosher
February 28, 2011 11:21 pm

Bruce,
What I’m saying is that you cannot derive a sensitivity from a time series as you have attempted to. Again, if you want to see how it might be done, see Schwartz’s work and some of Lucia’s work. There are other examples as well. If you like, I can collect a bunch of papers.

DRE
February 28, 2011 11:28 pm

“Average Global Temperature” is completely meaningless. What you want to know is the amount of heat in the system.
Which has more heat, a cubic meter of air at -100 degrees Fahrenheit or a cubic meter of water at +31 degrees Fahrenheit?

Mark T
February 28, 2011 11:30 pm

Carrick says:
February 28, 2011 at 2:54 pm

“Global mean temperature” is not only indirect, it’s also wrong. Even if we covered the earth with a grid of more than a billion temperature sensors of previously unmatched quality, the arithmetic mean of those readings would be a bad measure of “global warming”, since +10 C in the arctic corresponds to much less incremental atmospheric heat content than +10C in the tropics.

Sorry but this description of how global mean temperature is calculated is wrong, and I’m pretty sure Roger would not endorse this explanation. Rather than launch into a tirade or a sermon about how to calculate it correctly, I encourage you to read some of the posts of Zeke on Lucia’s blog.
The short answer to why you don’t need one billion sensors comes from the sampling theory: This theory is the basis for digitization of audio and video signals.

Wow, did you really miss Espen’s point THAT badly? It has nothing to do with sampling theory and he stated it pretty plainly. Temperature is an INTENSIVE VARIABLE and cannot be averaged as a normal variable, e.g., length. In other words, it doesn’t matter if you have a billion sensors or only two, the arithmetic mean is a meaningless number. Basic thermo 101.
http://en.wikipedia.org/wiki/Intensive_variable

If there are problems, they are from other sources (at least post 1950).

No, temperatures of gases that do not have an identical composition cannot be “averaged,” period. That’s the only problem.
Mark

DRE
February 28, 2011 11:33 pm

Or better yet: which has more energy, a cubic meter of air, or a cubic meter of air 10 meters above it with essentially the same temperature and pressure as the first cubic meter of air?

Espen
March 1, 2011 1:09 am

Carrick writes: Sorry but this description of how global mean temperature is calculated is wrong, and I’m pretty sure Roger would not endorse this explanation.
You completely missed my point. I see that Roger Pielke has elaborated my point already, but let me add: Mean surface temperature can be a highly misleading measure of the heat content of the near-surface atmosphere, since the enthalpy change when cold air is heated 10 C is much smaller than the enthalpy change when warm air is heated 10 C. Suppose, for instance, that 1 million square kilometers of Arctic Canada has a temperature anomaly of +10 C in January. Suppose that at the same time, the rest of the world has an anomaly of 0 C, except for 1 million square kilometers of Amazonas, which has an anomaly of -10 C, so the two anomalous areas cancel each other. But in reality, the total heat content of the near-surface atmosphere has dropped significantly compared to a situation with 0C anomaly everywhere.
But even if we had a good measure in Joules of the heat anomaly of the atmosphere, it wouldn’t really be a reliable measure of warming: As Dr. Pielke writes above: Global Warming is an increase in the heat (in Joules) contained within the climate system. The majority of this accumulation of heat occurs in the upper 700m of the oceans.
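One way to put numbers on that asymmetry is to hold relative humidity fixed while the air warms, so that the latent heat of the extra water vapour dominates; for dry air alone the heat capacity is nearly the same at both temperatures. A rough sketch, where the fixed 70% relative humidity and the Magnus approximation for saturation vapour pressure are the assumptions:
```python
import math

CP_DRY = 1005.0      # J/(kg K), specific heat of dry air
LV = 2.5e6           # J/kg, latent heat of vaporisation
P_HPA = 1013.25      # surface pressure, hPa

def sat_vapour_pressure_hpa(t_c):
    """Magnus approximation for saturation vapour pressure (hPa), t in deg C."""
    return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))

def specific_humidity(t_c, rh):
    """Approximate specific humidity (kg/kg) at relative humidity rh (0-1)."""
    e = rh * sat_vapour_pressure_hpa(t_c)
    return 0.622 * e / (P_HPA - 0.378 * e)

def moist_enthalpy(t_c, rh):
    """Approximate moist enthalpy per kg of air (J/kg), constant-RH assumption."""
    return CP_DRY * t_c + LV * specific_humidity(t_c, rh)

RH = 0.7  # assumed relative humidity, held fixed before and after the warming
cold_change = moist_enthalpy(-20, RH) - moist_enthalpy(-30, RH)   # +10 C on cold air
warm_change = moist_enthalpy(30, RH) - moist_enthalpy(20, RH)     # +10 C on warm air

print(f"enthalpy change, -30 -> -20 C: {cold_change / 1000:.1f} kJ/kg")
print(f"enthalpy change, +20 -> +30 C: {warm_change / 1000:.1f} kJ/kg")
```
With these assumptions, warming the warm, moist air by 10 C takes roughly three times the energy of warming the very cold air by the same 10 C.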

John Marshall
March 1, 2011 1:47 am

It also depends on the global system being in equilibrium with regard to temperature. It never is!
There is also the anomaly problem. To state that there is an anomaly one has to know that there is a temperature difference from what is ‘normal’. We do not know what is normal and on a planet with a chaotic climate system we never will know.

izen
March 1, 2011 2:47 am

@-Espen says:
March 1, 2011 at 1:09 am
“But even if we had a good measure in Joules of the heat anomaly of the atmosphere, it wouldn’t really be a reliable measure of warming: As Dr. Pielke writes above: Global Warming is an increase in the heat (in Joules) contained within the climate system. The majority of this accumulation of heat occurs in the upper 700m of the oceans.”
Which is why the fevered arguments about atmospheric/surface temperature trends and variability, whether for the present or from paleoclimate proxy records, are all a bit of a red herring.
Measuring the Joule content of the top 700m of the oceans, and its change, is difficult, but there IS a good proxy indicator of it.
Sea level rise is at least in part due to thermal expansion. Subtract the contribution from the melting land ice and you have a good indicator of the rising energy content of the oceans for the TOTAL volume, not just the top 700m, which is less than half the total.
The melting of land ice is also a result of extra Joules in the climate system, so the sea level rise from that does give some measure of increased energy content as well.
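To give a feel for the conversion being pointed at, here is a crude thermosteric calculation. Every number in it (expansion coefficient, ocean area, the assumed steric share of the rise) is a round illustrative value, and the expansion coefficient in particular varies strongly with temperature and depth, so treat the output as an order-of-magnitude figure only.
```python
# Rough conversion from steric sea level rise to ocean heat uptake.
# All values are assumed round numbers for illustration only.
OCEAN_AREA_M2 = 3.6e14        # ~3.6e8 km^2 of ocean surface
RHO_SEAWATER = 1025.0         # kg/m^3
CP_SEAWATER = 4000.0          # J/(kg K)
ALPHA = 2.0e-4                # 1/K, thermal expansion coefficient (varies a lot)

# Warming a column of depth h by dT lifts the surface by dh = alpha * h * dT,
# while the heat added per unit area is rho * cp * h * dT = rho * cp * dh / alpha.
def joules_for_steric_rise(dh_m):
    heat_per_area = RHO_SEAWATER * CP_SEAWATER * dh_m / ALPHA   # J/m^2
    return heat_per_area * OCEAN_AREA_M2                        # J, whole ocean

steric_rise_m_per_yr = 1.0e-3   # assume ~1 mm/yr of the observed rise is thermosteric
print(f"implied ocean heat uptake: {joules_for_steric_rise(steric_rise_m_per_yr):.2e} J per year")
```
That comes out at a few times 10^21 J per year, which is at least the right order of magnitude compared with published estimates of ocean heat uptake over recent decades.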

Smokey
March 1, 2011 4:18 am

Izen says:
“The graph you link to shows the sea level rise at the end of the last ice age… and shows that it has altered little in the last ~6000 years as I stated.”
“Altered little” was not what Izen stated. The goal posts have been moved again. Izen’s assertion was that the sea level is in “stasis.” [stasis, n: a period or state of inactivity or equilibrium.]
If it were not for moving the goal posts, the alarmist crowd would have nothing much to say.
Still having fun with your CAGW scam, Izen?

steveta_uk
March 1, 2011 4:49 am

Mark T, density is an intensive variable. By your argument, it is impossible to define the density of a composite such as concrete. Don’t see it myself.

Vince Causey
March 1, 2011 6:25 am

Izen says:
“The graph you link to shows the sea level rise at the end of the last ice age… and shows that it has altered little in the last ~6000 years as I stated.”
Are you sure about that? I believe the rate of SLR is about 2 to 3 mm pa – say 3mm. That’s 3m per millennium or 30m in the last 10,000 years. But I am sure sea levels have risen more than that. For example, the North Sea, with its mean depth of 90 metres, was completely dry during the last ice age. Doesn’t that mean that sea levels have historically risen more than 3mm pa in the past?
Just wondering.

Carrick
March 1, 2011 7:50 am

MarkT

Wow, did you really miss Espen’s point THAT badly? It has nothing to do with sampling theory and he stated it pretty plainly. Temperature is an INTENSIVE VARIABLE and cannot be averaged as a normal variable, e.g., length. In other words, it doesn’t matter if you have a billion sensors or only two, the arithmetic mean is a meaningless number. Basic thermo 101.

Temperature measures a variable that affects humans directly; ocean temperatures do not. A 40°C day is going to affect you differently than a 20°C day regardless of what the ocean 4000-km away did. You guys don’t even know why we are interested in temperature? Talk about silly! It’s not “just thermodynamics”.
Espen claims that we need a billion thermometers, which is a risible argument, and that is beside the point???
Good grief, sorry I forgot my mind reading cap or my secret decoder ring so I would know to skip over Espen’s zingers.
Espen, see also my comment to MarkT, I think this serves to answer you both. Yes, ocean heat content is interesting. Though if we’re going on about Roger’s issues with “indirect measurements”, there are few climate metrics I can think of that are more indirect or harder to measure than ocean heat content.
Keep the arguments consistent, and when you check today’s ocean heat index to decide what to wear that day, or use it to figure out what will grow in your area, then you will have made a point (but that would only mean you need immediate psychiatric evaluation).
Personally if I wanted to pick another metric besides temperature to track, it would be TOA radiation balance. But guess which is still most important for measuring impact on people living in the surface atmospheric layer? (The whales no doubt appreciate your concern.)

Carrick
March 1, 2011 7:57 am

Mark T:

No, temperatures of gases that do not have an identical composition cannot be “averaged,” period. That’s the only problem.

This statement exposes just how much thermo 101 you don’t understand. Gas mixtures are not a problem.

Carrick
March 1, 2011 8:16 am

Al Tek:

One more time: no matter how you massage insufficient data, you cannot get more information out of this data set than is already there. Homogenize it, grid it, select subsets, it does not matter, you will not have any new information. That’s why all your “analyses” show about the same result, which is garbage predefined by the under-sampled, wrongly-spaced initial station set. The only way to increase information from the field is to increase the sampling density of the data acquisition system. When you do this and get a similar result to your current mathturbations, then you might be up to something.

Still up to the usual personal attacks I see. I take that as a substitute for ability to reason.
But again, as I predicted, there’s no way we’ll ever see eye to eye on this.
You are making a series of statements of faith. If they were statements of fact they could be established by analysis of the data.
Analysis of the data set shows up many interesting things.
For one, comparing the different sets obtained either by surface measurement or satellite shows a striking similarity in patterns.
Secondly, the latitudinal variation in measured temperature over time shows a distinct pattern of Arctic amplification.
Third, the measured data show a striking correlation with distance. Using 500-km for the correlation distance as input to the sampling theorem is probably overly conservative, rather than not nearly conservative enough.
If you knew half as much as you claim to know, you would understand what the consequences of undersampling the data would be, and what effect that would have on the measurement set. One of the easiest ways to look for undersampling is to look at the frequency domain. For data that follow an approximate 1/f power law, aliasing will show up as a high-frequency plateau.
GISTEMP PSD. If there are aliasing issues contaminating the time-series, it is only for periods of less than a year. (Note: I don’t think GISTEMP is reliable for periods of less than a year for other reasons, so this is no shock to me.)
Mind you I’m not claiming the current data set is perfect, and am even cheered by the Berkeley project, because I too think it’s needed. But your criticisms fall far from the mark.
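The aliasing signature described above is easy to reproduce on synthetic data: build an approximately 1/f series, throw samples away without an anti-alias filter, and the spectrum of the decimated series flattens toward its new Nyquist frequency. This is a generic signal-processing sketch, not an analysis of GISTEMP.
```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
n = 2 ** 16

# Build an approximately 1/f (power) noise series in the frequency domain.
freqs = np.fft.rfftfreq(n, d=1.0)
amplitude = np.zeros_like(freqs)
amplitude[1:] = freqs[1:] ** -0.5              # power ~ 1/f  ->  amplitude ~ f^(-1/2)
phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
x = np.fft.irfft(amplitude * np.exp(1j * phases), n)

# Naive decimation (keep every 16th sample, no anti-alias filter) folds
# high-frequency power back into the band: aliasing.
decim = 16
x_coarse = x[::decim]

f_fine, p_fine = signal.welch(x, fs=1.0, nperseg=4096)
f_coarse, p_coarse = signal.welch(x_coarse, fs=1.0 / decim, nperseg=1024)

def high_band_slope(f, p):
    """Log-log spectral slope over the top two octaves below Nyquist."""
    band = (f > 0.25 * f.max()) & (f < f.max())
    return np.polyfit(np.log10(f[band]), np.log10(p[band]), 1)[0]

# The decimated series shows a noticeably shallower (flatter) slope near its
# Nyquist frequency than the original does: the aliasing signature.
print(f"slope near Nyquist, original : {high_band_slope(f_fine, p_fine):+.2f}")
print(f"slope near Nyquist, decimated: {high_band_slope(f_coarse, p_coarse):+.2f}")
```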

Spen
March 1, 2011 8:46 am

I read the link above to Judith Curry’s discussion on spatio-temporal chaos. It is certainly challenging and reminds me of Prime Minister Palmerston’s description of the intractable Schleswig-Holstein question in the 19th century. He said only two people apart from himself had understood it. One was dead and the other was in an asylum.
So without implying that I am in an asylum or within the 1% who understand the theory I would ask this question. Global climate history over geological time has indeed varied but always within a relatively small range. This evidence points to the existence of boundary conditions, the principle of which surely would be in conflict with this chaos theory.

D. J. Hawkins
March 1, 2011 10:22 am

steveta_uk says:
March 1, 2011 at 4:49 am
Mark T, density is an intensive variable. By your argument, it is impossible to define the density of a composite such as concrete. Don’t see it myself.

Not true. Density is expressed as mass per unit volume. Temperature is not “per unit” anything. The plasma in Princeton University’s fusion reactor reached millions of degrees. If you stuck your hand in it you wouldn’t even have noticed (ignoring the very low pressure conditions leading to “vac bite”) because the plasma density was very low. Total heat content about that of a hot cup of coffee.
Now, you might make a case that the conditions on the face of the planet are such that there is not a lot (for engineers “a lot” usually means an order of magnitude or so) of difference from place to place so that very broadly air anywhere at 70F is the same as air anywhere else at 70F and in that sense averaging temperature might mean something to a rough order of magnitude (ROM). However, the devil is in the details and if you’re looking for sub 1F accuracy in your trends and analysis then air anywhere is not the same as anywhere else and you must look at enthalpy instead. And if you think the temperature record is a mess, you don’t want to contemplate the humidity record.

Bruce of Newcastle
March 1, 2011 10:56 am

steven mosher says:
February 28, 2011 at 2:52 pm
Here’s a nice little presentation that will give you some ideas.. using Ocean heat content: http://www.newton.ac.uk/programmes/CLP/seminars/120812001.html
—————————————————————-
steven mosher says:
February 28, 2011 at 11:21 pm
Bruce,
What I’m saying is that you cannot derive a sensitivity from a time series as you have attempted to.
—————————————————————-
Steven,
Thanks for replying. I’ll leave off the question of deriving 2XCO2 from time series (even though it is a reasonable question), because your link to Magne Aldrin’s presentation was bugging me.
After some thought, what I think is that Dr Aldrin is statistically overestimating CO2 sensitivity. This is because (see slide 13, p 14 of the PDF) he seems to be underestimating solar sensitivity.
I say this because the correlation of solar cycle length to temperature suggests solar sensitivity is quite high. If, as seems reasonable, you can postulate that this correlation is reflective of total solar sensitivity (since something to do with the Sun’s behaviour is causing the correlation), then SCL encompasses what Mr Rumsfeld might call the ‘known knowns’ of solar sensitivity plus the ‘unknown unknowns’ of solar sensitivity.
What I suspect is happening is Dr Aldrin is using a measure of solar forcing as an input to the model which reflects only the ‘known knowns’. Because of this the solar ‘unknown unknowns’ would contaminate the calculated CO2 sensitivity and make it too large. To misquote, saying CO2 is “the only thing we can think of” means a GCM approach will tend to estimate CO2 sensitivity on the high side if other sensitivities are underestimated or discounted. I can’t be certain of this without looking at how the code handles the solar radiative forcing input data, but it looks likely to me that it is underweight.
If you graph SCL vs temperature this shows a significant empirical correlation. I suspect this is comprised of the direct component (known known) as Dr Aldrin has in slide 13, and an unidentified component(s), possibly a positive feedback, that people such as Drs Svensmark and Rao and others have been linking to cosmic rays, UV etc. I don’t know enough to tell whether they are right, but this is about lifting the lid on the ‘unknown unknowns’.
This is the intrinsic problem with the modelling approach – if you inadvertently leave out a significant variable, the coefficient on the variable you are trying to measure will tend to be an over-estimate. I know this from multiple regression – if you leave a statistically significant variable out, your parameter vector will be too large since the regression is trying to minimise the residuals, particularly with noisy data. You learn to be wary of this. In this case both pCO2 and solar rose in the time series, especially in the last 50 years or so, so it is not surprising if some solar-derived variance is misassigned to CO2. Dr Aldrin’s approach is effectively a form of multiple regression.
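That omitted-variable effect is easy to reproduce on made-up numbers: when two drivers trend upward together and one is dropped from the fit, the surviving coefficient absorbs part of the omitted driver’s contribution. Nothing below uses, or stands in for, any actual forcing series.
```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(100, dtype=float)

# Two 'drivers' that both rise over the sample period (arbitrary units)
x1 = 0.010 * t + 0.05 * rng.normal(size=t.size)   # stand-in for the variable of interest
x2 = 0.008 * t + 0.05 * rng.normal(size=t.size)   # correlated driver that may be omitted

# True response: both drivers matter equally
y = 1.0 * x1 + 1.0 * x2 + 0.1 * rng.normal(size=t.size)

# Full model: regress y on both drivers (plus intercept)
A_full = np.column_stack([x1, x2, np.ones_like(t)])
coef_full, *_ = np.linalg.lstsq(A_full, y, rcond=None)

# Misspecified model: omit x2
A_omit = np.column_stack([x1, np.ones_like(t)])
coef_omit, *_ = np.linalg.lstsq(A_omit, y, rcond=None)

print(f"coefficient on x1, both drivers included: {coef_full[0]:.2f}  (true value 1.0)")
print(f"coefficient on x1, x2 omitted           : {coef_omit[0]:.2f}  (inflated)")
```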

steven mosher
March 1, 2011 11:19 am

Carrick, you mean if I shout Nyquist you’re not impressed?
Don’t forget the resampling experiment I did where I randomly selected 1 station per grid cell and got the same answer. The other thing Al forgets is that SST, which covers 70% of the globe, is much less variable spatially. Gosh, why would that be?

Al Tekhasski
March 1, 2011 12:03 pm

Carrick: “Third the measured data show a striking correlation with distance. “
Third time: there is nothing striking at all. Look at the details of Hansen-Lebedeff algorithm (section 3):
http://pubs.giss.nasa.gov/docs/1987/1987_Hansen_Lebedeff.pdf
Then look at any pair of temperature data. Inter-annual variation in any temperature series vastly exceeds long climatic trends, always; I hope you will not argue with this fact. Therefore, the correlation coefficient between a pair of time series is dominated by high-amplitude inter-annual variations; that’s why their correlation is “strikingly good”.
However, this has no bearing on the question at hand, about climate (30-100 year) trends. As I submitted, there are many met stations (Crosbyton vs. Lubbock, Ada vs. Pauls Valley, etc.), 30-50 miles apart. I am absolutely positive that their inter-annual variations would correlate to near 100%. It is obvious: they see nearly the same weather, the same seasons, and the same skies. However, their 90-year trends are OPPOSITE. This is a documented hard fact, see GISS.
Now, you invoke the satellite argument. Satellites measure something that is fairly different from a station’s temperature. The information comes from a vast layer several km thick, so it does not have the necessary resolution. It is an integral parameter. Yes, it resembles the ground signal to some degree. However, despite all efforts to fudge the output and account for all corrections, there is still a difference of 0.4-0.6C between satellites and ground records:
http://www.woodfortrees.org/plot/gistemp/from:2006/plot/hadcrut3vgl/from:2006/offset/plot/uah/from:2006/offset
This discrepancy amounts to the entire magnitude of “global warming”. [Note: in your example you have introduced various “offsets” to your time series, up to 0.21C. Why would you do so, to skew results by 30% of the entire “global warming” effect?]
More, even if satellites “measure” the same averages and do it correctly, or if there could be a sufficient number of ground stations, it still does not mean that the globe has a radiative imbalance due to man-made CO2 increase as AGW theory asserts. As I mentioned, it can be shown that globally-averaged temperatures can go up or down while there is no global radiative imbalance at all.
As I see it, you prefer to ignore these inconvenient facts. More, you continue to express nonsense: Berkeley project or not, the information from the current scarce set of stations cannot be fundamentally improved, because the density of the sampling grid is insufficient to represent the complex fractal-style topology of the temperature field and its crude boundary-layer variability. But it is apparent that it is useless to argue this kind of detail with you.
P.S. I’d like to express many thanks to Tamino who re-introduced the term “mathturbation”, which perfectly describes various mangling with temperature data sets including his own meaningless statistical efforts.

Al Tekhasski
March 1, 2011 12:09 pm

Mosher: “Dont forget the resampling experiment I did where I randomly selected 1 station per grid cell and got the same answer. “
Apparently you didn’t grasp the correct meaning of the concept of resampling. You cannot have samples that you don’t have. You need to INCREASE the number of stations, not decrease the already crippled set. And why do you think that your initial grid is random in the first place?

Al Tekhasski
March 1, 2011 12:35 pm

Carrick: “One of the easiest ways to look for undersampling is to look at the frequency domain. For data that follow an approximate 1/f power law, aliasing will show up as a high-frequency plateau.”
No, this would happen only if you undersample just a tiny bit, less than a factor of two. Ground station data suggest, however, that the land surface field is undersampled by a factor of 100. Try to find some decent sampling-aliasing Java applet and educate yourself. The average of your restored signal could be anything, from -1 to +1.
Actually, we have been here before, at Blackboard “physicsts” thread, half a year ago.
http://rankexploits.com/musings/2010/physicists/
I don’t see any progress.

izen
March 1, 2011 12:53 pm

@-Spen says:
March 1, 2011 at 8:46 am
“I read the link above to Judith Curry’s discussion on spatio-temporal chaos. …. Global climate history over geological time has indeed varied but always within a relatively small range. This evidence points to the existence of boundary conditions, the principle of which surely would be in conflict with this chaos theory.”
No, chaos theory recognizes the inherently unpredictable nature of any specific value at a specific time/place evolved from initial conditions.
But it also defines boundary conditions if the curves of the non-linear functions that generate the chaos are definable.
If you play around with chaotic formulae at all you soon find that the numbers produced are unpredictable, but the range or envelope of behavior is constrained by the driving variables. A result may be anywhere on the manifold of a strange attractor, but the shape of the manifold is precisely defined.
(that’s badly expressed; hopefully someone with better math chops than I have in this subject can clarify it!)

izen
March 1, 2011 12:59 pm

@-Vince Causey says:
March 1, 2011 at 6:25 am
” I believe the rate of SLR is about 2 to 3 mm pa – say 3mm. That’s 3m per millenium or 30m in the last 10,000 years. But I am sure sea levels have risen more than that. For example, the North Sea with its mean depth of 90 metres, was completely dry during the last ice age. Doesn’t that mean that sea levels have historically risen more than 3mm pa in the past?
Just wondering.
Yes, during meltwater pulse 1A, as shown on the graph that ‘SMOKEY’ so helpfully provided, there was a rise of about 20m per 1000 years, or around ten times the present rate. But that was during the height of the glacial-interglacial transition when the major northern ice-caps were melting. During the glacial transition there was about 120m of sea level rise over ~8000 years.
Once all those except Greenland and Antarctica had gone, the sea level stabilized around 6000 years ago and has certainly not been rising for the last 6000 years at the rate of a foot per century. There are ROBUST archaeological, geological and direct observational records that constrain any rise over the last 6000 years to much less than the present observed rate of ~3mm per annum.
@-Smokey says:
March 1, 2011 at 4:18 am
“Still having fun with your CAGW scam, Izen?”
If the ‘C’ in CAGW stands for catastrophic then, as I think I have stated before, I am agnostic about whether AGW will be catastrophic, as that has more to do with how ROBUST a society is in the face of change than with the magnitude of the change.
We may agree – at least to some extent – on how much of a ‘scam’ it is to claim that AGW will be catastrophic. I prefer the term ‘political froth’ to scam for such claims for or against the impact of AGW on modern technological societies. Clearly, if we were all still hunter-gatherers, as the human population was during the last (Eemian) interglacial period, any global warming – or the eventual cooling – was not catastrophic, because such societies, small bands without agriculture, are much more robust in the face of a changing climate.
I wonder if you could post the link again to the Post-Glacial Sea Level Rise graph; it does make the point rather well….
Sea level may be a better indicator of the global heat content of the climate than any manipulations of surface temperature data.

Espen
March 1, 2011 1:48 pm

Carrick says:
Espen claims that we need a billion thermometers
My oh my, I never said such a thing. If I say “Even if the moon were indeed made of Wensleydale cheese, it would be too old to taste good”, do you think I claim that “we need a moon made of Wensleydale cheese”?
but that would only mean you need immediate psychiatric evaluation
Wow, now I really have to bow before your intellectual superiority.

steven mosher
March 1, 2011 2:30 pm

Bruce:
http://www.ecd.bnl.gov/steve/pubs.html#pres
Start with this slide set from schwartz. You’ll see a variety of approaches.

steven mosher
March 1, 2011 2:40 pm

Al,
what you fail to realize is that you could drive the sample size down to 60 stations or even fewer, randomly selected from over 40K stations on the land, and still get the same answer. Which means you believe the unsampled areas must somehow be different. On what basis? Well, we can look at the whole world over 30 years.
Find any pockets or eddies where a geographically substantial portion of land exhibits statistically different trends? Nope. No “standing waves” of zero trend or negative trend.
Further, you can sample the whole world (UAH or RSS) and see that it doesn’t differ (in trend, which is what we care about) from a sparsely sampled earth. The reason is simple: spatial correlation. Also, the Hansen study is very much out of date and there are more recent studies that use daily data from many more stations to establish some slightly different correlation figures.
Do you believe in an LIA?
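For readers who want to see what such a resampling experiment looks like mechanically, here is a sketch on a synthetic network. The decisive assumption is baked in: the synthetic stations share one large-scale trend plus independent local noise, which is exactly the premise Al Tekhasski disputes, so this illustrates the procedure rather than settling the argument.
```python
import numpy as np

rng = np.random.default_rng(3)

n_cells = 100            # grid cells covering the domain
stations_per_cell = 40   # dense hypothetical network
years = np.arange(1980, 2011, dtype=float)

# Synthetic anomalies: a common large-scale trend plus independent station noise.
common_trend = 0.02 * (years - years[0])     # 0.2 C/decade, assumed
anomalies = (common_trend
             + rng.normal(0, 0.3, (n_cells, stations_per_cell, years.size)))

def network_trend(data):
    """Average the stations, then fit a linear trend (C/decade)."""
    mean_series = data.reshape(-1, years.size).mean(axis=0)
    return np.polyfit(years, mean_series, 1)[0] * 10

full_trend = network_trend(anomalies)

# Resampling experiment: keep only one randomly chosen station per cell.
subsample_trends = []
for _ in range(200):
    picks = rng.integers(0, stations_per_cell, n_cells)
    subset = anomalies[np.arange(n_cells), picks, :]
    subsample_trends.append(np.polyfit(years, subset.mean(axis=0), 1)[0] * 10)

print(f"full-network trend : {full_trend:.3f} C/decade")
print(f"1-per-cell trends  : {np.mean(subsample_trends):.3f} "
      f"+/- {np.std(subsample_trends):.3f} C/decade")
```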

steven mosher
March 1, 2011 2:43 pm

carrick,
Espen and Al don’t believe in an MWP or an LIA.
Too few thermometers to establish an average during those times.

Bruce of Newcastle
March 1, 2011 4:17 pm

steven mosher says:
March 1, 2011 at 2:30 pm
Thanks, Steven, lots to go through. First one I’m fairly randomly looking at is:
Why hasn’t Earth warmed as much as expected? Schwartz, S. E., Sept. 22, 2010
I note two empirical options he lists are:
“Instrumental record ΔTemperature/(Forcing – Flux)
Satellite measmt.: [d(Forcing – Flux)/dTemperature]-1”
The latter is of course the method used by Drs Spencer and Braswell (2010) which I referred to.

sky
March 1, 2011 4:22 pm

Carrick says:
February 28, 2011 at 2:54 pm
“If you ask how many samples you need to fully capture the Earth’s temperature field, based on a correlation length of 500-km (this is a smaller number than is quoted by Hansen’s group for the correlation length), that works out to around 2000 instruments world wide. By comparison, there are nearly 10,000 fixed instruments located on about 30% of the surface of the Earth from land-based systems alone. The oceans are less well covered, but the correlation lengths go way up.”
Even if we accept the “average temperature” as a meaningful metric, the “correlation length” does NOT define the spatial sampling rate that is required for alias-free capture of the temperature field. Although I’d be delighted to see century-long records from even as few as 2000 thermometers uniformly distributed world-wide in locations unaffected by UHI and/or land use changes, we have nothing remotely resembling that in the GHCN data base. On the contrary, there are vast stretches of the continents where no credible “rural” record of adequate duration is to be found for much more than 1000km. Only the USA and Australia are reasonably adequately sampled.
The zero-lag spatial correlation does not even begin to address the issue of low-frequency coherence between spatially separated points. While that coherence may be quite high in satellite data that sample over a swath tens of kilometers wide, it far too frequently fades into insignificance when temperatures inside an instrument shelter are measured. That’s what makes the “trends” of surface station records so unstable and inconsistent a metric. The upshot is that we really have no accurate grasp of what the “global average temperature” has done in the last century.

March 1, 2011 5:56 pm

Izen,
Now you’re capitalizing “robust.” Time for counseling.
Here’s the graph you wanted: click

Al Tekhasski
March 1, 2011 7:52 pm

Steven Mosher,
It is really amusing that trends do not differ after your subset selection. My concern is that the original historical set is skewed (by definition) in the first place, and is not "random" with regard to surface topology. Therefore, your "random selection" from a non-random set is not really random relative to the original field.
Also, my other concern is with stations that have a 90+ year-long "negative warming" trend. You have identified about 500 such stations yourself, if I am not mistaken.
http://stevemosher.wordpress.com/2010/09/29/needs-chec/
The fact that your distribution of trends has a bell-shaped curve speaks towards some sort of normal statistics, true. But your bell is asymmetrical. How do you know that, if you had the proper number of stations, the bell curve would not be centered at zero trend?
However, my deepest concern is that these downtrends have no coherent explanation from the CO2-induced radiation imbalance theory of AGW. The idea fails miserably when many side-by-side pairs of stations show nearly monotonic but distinctly opposite centennial trends. If one has to explain the "negative warming" around a certain station by various excuses like land or water regime changes etc., the "positively warming" trend of the next closest neighbor must be subject to the same kinds of factors, which leaves the man-made warming completely unsupported and out of the picture.
So, you say, “spatial correlation”. I say “baloney” – my examples show that there is no spatial correlation for climate trends in station data even at several miles.

Espen
March 1, 2011 11:33 pm

steven mosher says:
March 1, 2011 at 2:43 pm

Espen and Al don't believe in an MWP or an LIA.
Too few thermometers to establish an average during those times.

Please don't use Carrick's misinterpretation of what I wrote to make silly psychic readings, will you?

Carrick
March 1, 2011 11:34 pm

Espen, you said:

Even if we covered the earth with a grid of more than a billion temperature sensors

What is this supposed to imply? I left my secret decoder ring in the lab. You brought it up, not me. If you didn't mean it, or it wasn't relevant, you shouldn't have said it.
I also take it from your sarcastic response that you look at today’s ocean heat index forecast to decide what to wear to work today. 😉
sky:

Even if we accept the “average temperature” as a meaningful metric, the “correlation length” does NOT define the spatial sampling rate that is required for alias-free capture of the temperature field.

Not sampling rate, but spatial sampling, and yes it does. Or, more precisely, the correlation length plus the sampling theorem does.
I agree there are (relatively) big holes in some regions of the world, but believe it or not, the uncertainty from these can be modeled too, and they don't overturn the overall trends (the NCDC reconstruction includes estimates of the contributions to the uncertainty from undersampled regions; you might want to check that out before assuming everybody in the climate field is a snake oil salesman). Moreover, as I've pointed out repeatedly, the reasonable concordance of satellite with surface-based measurements sets an upper limit on how large these various effects could be.
[Then again, I fully support the Berkeley effort and think it is overdue. But there is a difference between the accuracy needed for coarse applications like estimates of climate sensitivity, and what a finer resolution, higher fidelity reconstruction might tell us.]
Al Tek:

No, this would happen only if you undersample just a tiny bit, less than a factor of two. Ground station data suggest, however, that the land surface field is undersampled by a factor of 100. Try to find some decent sampling-aliasing Java applet and educate yourself. The average of your restored signal could be anything, from -1 to +1.

Severely undersampled data gives a flat spectrum, not a 1/f spectrum; you should know that. Try stopping with the petty insults and start reasoning for a change.

So, you say, “spatial correlation”. I say “baloney” – my examples show that there is no spatial correlation for climate trends in station data even at several miles.

All this based on cherry-picking a few station locations, without providing any details of how you did the analysis, what time periods the trends were calculated over, or anything.
We won't see eye to eye, because your beliefs are articles of faith, and data contrary to your beliefs simply get dismissed instead of being given the proper weight they deserve.
Bottom line is you are flat wrong and the data show that.

Espen
March 1, 2011 11:37 pm

Moderators, please delete the last line of my previous post – I intended to delete that question since too few understand what it means.

[done]

Carrick
March 1, 2011 11:42 pm

Steven Mosher:

Carrick, you mean if I shout Nyquist you're not impressed?
Don't forget the resampling experiment I did where I randomly selected 1 station per
grid cell and got the same answer. The other thing Al forgets is that SST, which covers 70% of the globe, is much less variable spatially. Gosh, why would that be?

No, and I’m not impressed if somebody shouts “Nyquist-Shannon-Kotelnikov-Whittaker-Schlongenknocker Theorem” either. 😉
I didn’t emphasize the greater correlation in the ocean data (again we’re talking about 1-month averaged temperatures, not hourly). But it was contained here.
It must be a queer accident that so many different ways of looking at the data are so self-consistent given that Al Tek thinks we need…
I remember the reconstruction; perhaps you could link it for the crowd and let them make their own determination? (This may be a case of "the plot thickened as the crowd thinned," but we may have a few onlookers left who have an open mind and would find your work interesting.)

Espen
March 2, 2011 1:08 am

Carrick says:
March 1, 2011 at 11:34 pm
Espen, you said:
Even if we covered the earth with a grid of more than a billion temperature sensors
What is this supposed to imply?

You are unbelievably persistent! Don’t tell me you haven’t seen conditional sentences with purely hypothetical antecedents before.
The meaning is simply that the precision of the measurements is irrelevant; the real problem is that the mean global temperature anomaly is a bad measure of global warming. Again: a +10 C anomaly in Arctic Canada in January represents only a fraction of the excess heat of a +10 C anomaly over an equivalent area in Amazonas.
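
A back-of-the-envelope Python sketch of Espen's point, comparing the moist enthalpy change implied by the same +10 C anomaly in cold, dry air and in warm, humid air. The Magnus approximation, the constant 80% relative humidity, the 1000 hPa surface pressure, and the two example temperatures are all illustrative assumptions, not values taken from the thread:

    import math

    def q_sat(T_c, rh, p_hpa=1000.0):
        # Magnus approximation for saturation vapor pressure (hPa),
        # converted to specific humidity (kg water vapor per kg air).
        e_s = 6.112 * math.exp(17.67 * T_c / (T_c + 243.5))
        return 0.622 * rh * e_s / p_hpa

    def moist_enthalpy(T_c, rh):
        cp, Lv = 1004.0, 2.5e6          # J/(kg K), J/kg
        return cp * (T_c + 273.15) + Lv * q_sat(T_c, rh)

    # Heat per kg of surface air implied by a +10 C anomaly at constant relative humidity:
    for name, T in (("Arctic winter, -30 C", -30.0), ("Amazon, +28 C", 28.0)):
        dh = moist_enthalpy(T + 10.0, 0.8) - moist_enthalpy(T, 0.8)
        print(f"{name}: about {dh / 1000:.0f} kJ/kg for a +10 C anomaly")

With these assumptions the tropical case comes out several times larger per kilogram of air, which is the sense in which equal temperature anomalies do not represent equal amounts of heat.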

sky
March 2, 2011 12:14 pm

Carrick says:
March 1, 2011 at 11:34 pm
I strongly disagree with your conclusions. Zero-lag spatial correlation is not the same as low-frequency coherence, whose much-too-frequent absence is what produces highly inconsistent multidecadal "trends" at neighboring stations. Aside from unrecognized offsets introduced into the record by station moves and instrumentation changes, the primary culprit is UHI and land-use changes, which can be equally strong in corrupting "rural" records. You seem not to recognize this practical problem, nor the analytically indisputable requirement of sampling at half the shortest wavelength of consequence in the spatial field in order to avoid aliasing. With cities occupying hundreds of square kilometers, that shortest wavelength is on the order of 10 km. There simply is no adequate, world-wide, century-long database for those of us interested in climatology rather than urbanology. And when it comes to secular trends, rather than the multidecadal swings with which they're often confused, satellite data is much too short in duration. Spatial correlation of records, which is strongly influenced by subdecadal variations, is not the determinant.
Nothing that NCDC has done in estimating the effects of world-wide spatial undersampling comes remotely close to coming to grips with these issues. Along with others, they take data from a largely urban or otherwise corrupted data base as physically indicative of the globe. The vaunted trend “concordance” with satellite data is moot at best and is starting to show ever-increasing divergence, especially when considered on a regional level.
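
To illustrate the aliasing requirement sky invokes, a toy one-dimensional Python sketch (the 10 km wavelength and 251 km station spacing are illustrative choices, not a claim about any actual network): a short-wavelength feature sampled too coarsely reappears as a spurious long-wavelength signal.

    import numpy as np

    wavelength = 10.0                    # km: scale of an urban-sized feature
    dx = 251.0                           # km: coarse "station" spacing

    x = np.arange(0.0, 20000.0, dx)      # sample locations along a transect
    samples = np.sin(2 * np.pi * x / wavelength)

    # The Nyquist criterion only resolves wavelengths of at least 2*dx = 502 km,
    # so the 10 km wave folds back to a long apparent wavelength.
    k = np.fft.rfftfreq(len(x), d=dx)    # spatial frequency, cycles per km
    peak = np.argmax(np.abs(np.fft.rfft(samples))[1:]) + 1
    print("apparent wavelength ~", round(1.0 / k[peak]), "km")   # ~2510 km, not 10 km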

Al Tekhasski
March 2, 2011 12:55 pm

Carrick wrote:
“All this based on cherry-picking a few station locations, without providing any details of how you did the analysis, what time periods the trends were calculated over, or anything.
We won’t see eye to eye, because your beliefs are articles of faith, and data contrary to your beliefs simply get dismissed instead of being given the proper weight they deserve.”

I think you are expressing your own (and wrong) philosophy of research. You have expressed a belief that weather is 1/f noise, that it is spatially correlated (including 100-year climate trends), and that your sensors must show a warming trend because you believe that man-made CO2 exerts radiative "pressure". It is you who dismiss data that contradict your belief.
You call my examples "cherry picking". Yes, I specifically selected these examples, because it takes only one counterexample to dismiss your belief system. After a short look I found a dozen. Some practitioners of climate data mathturbation declare that my examples are "statistically insignificant". Obviously they fail to realize that weather is not noise, and 100 years of consistently declining data at 500 stations are not a statistical aberration. Even with the relatively poor accuracy of station thermometers, there is no way to reverse the trends at these stations, although keepers of the data keep making continuous attempts to "re-analyze" and "correct" the data sets to squeeze them into their belief.
You say, "without providing any details". This is false. I provided several links to posts of mine, to avoid repetition. If you follow my links, you will find references to the exact GISS stations I am talking about, and can pull up the data charts. You are just trying to obfuscate the subject and hide behind your laziness. But since you value your time so much, I will re-post pointers to just one pair of stations, Pauls_Valley_4wsw and Ada, with some "mathturbation". These stations are 55 km apart. The time span is 1907 to 2009.
Pauls_Valley_4wsw:
http://data.giss.nasa.gov/cgi-bin/gistemp/gistemp_station.py?id=425746490030&data_set=1&num_neighbors=1
Trend equation: y = +0.0071x – 13.98, or warming 0.7C/100y
Ada:
http://data.giss.nasa.gov/cgi-bin/gistemp/gistemp_station.py?id=425746490040&data_set=1&num_neighbors=1
Trend equation: y = -0.002x + 3.87, or cooling of 0.2C/100y
It is interesting that if I partition the Ada data into three climatologically significant (30-year) time blocks, 1907-1936, 1943-1972, 1980-2009, and average the corresponding data, I get a nearly perfect regression line y = -0.0015x + 2.96 with R² = 0.9985.
To summarize: The fact that many close-by pairs of stations show opposite centennial trends is totally inconsistent with the concept of an atmospheric radiative imbalance from a man-made "excess of CO2".
The fact that there are many close-by pairs of stations with opposite centennial trends means that you are looking for your warming signal in the wrong measuring metric.
These are my facts. What are yours? Those fuzzy clouds of seasonal correlations from the logically deficient Hansen-Lebedeff study? Do you know that Hansen averages everything in a 200x200 km box? This requires an assumption of uniform change (plus "noise") in instrument readings, which is based on the AGW belief. My facts show that this assumption is wrong. Do they carry any weight? I think the weight is devastating, but of course you will disagree and try to find some other goofy excuse to dismiss the facts.
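
For anyone who wants to reproduce this kind of pairwise comparison, a minimal Python sketch. The file names are hypothetical exports of the annual means from the GISS station pages linked above (columns year and temp); the fit itself is plain ordinary least squares, which may differ in detail from whatever the commenter used:

    import numpy as np
    import pandas as pd

    def centennial_trend(csv_path):
        # Ordinary least-squares slope of annual-mean temperature against year.
        df = pd.read_csv(csv_path).dropna()            # columns: year, temp
        slope, _ = np.polyfit(df["year"], df["temp"], 1)
        return 100.0 * slope                           # deg C per century

    for name, path in (("Pauls_Valley_4wsw", "pauls_valley_annual.csv"),
                       ("Ada", "ada_annual.csv")):
        print(f"{name}: {centennial_trend(path):+.2f} C/century")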

KR
March 2, 2011 2:14 pm

Redefining Climate Sensitivity? Why not introduce a _new_ term encompassing climate changes due to overall temperature shifts, rather than attempting to redefine a term used throughout climate science?
This, in my personal opinion (yours may differ), is a clear attempt to move the goalposts (http://www.don-lindsay-archive.org/skeptic/arguments.html#goalposts); usually a sign that an argument on the original terms has been lost.

eadler
March 3, 2011 7:04 am

Pielke’s idea of using the ocean heat content as a metric is nothing new, and no revelation to the world’s climate scientists.
http://www.realclimate.org/index.php/archives/2005/05/planetary-energy-imbalance/
It is of course harder to make measurements of ocean heat content than to process the existing temperature data of the world’s weather stations.
Historically, the global average surface temperature and the ocean heat content have gone pretty much in the same direction.
http://i38.tinypic.com/zxjy14.png
http://data.giss.nasa.gov/gistemp/graphs/Fig.A2.gif
The ocean heat content database is newer, with data only since 1955.
Also, the ocean heat content database is more problematic than the surface temperature database.
http://pielkeclimatesci.wordpress.com/2010/01/04/guest-weblog-by-leonard-ornstein-on-ocean-heat-content/

eadler
March 3, 2011 7:27 pm

Bruce of Newcastle says:
February 28, 2011 at 2:34 pm
Apologies to Zeke but 2XCO2 has been empirically measured by Dr Spencer and others and found to be about 0.4-0.6 C.
You did not read your link carefully. Spencer did not actually claim to measure the long-term climate sensitivity. His measurements, which are disputed, are only for short-term phenomena.
http://www.drroyspencer.com/2010/08/our-jgr-paper-on-feedbacks-is-published/
Unfortunately, there is no way I have found to demonstrate that this strongly negative feedback is actually occurring on the long time scales involved in anthropogenic global warming.

I’ve cross checked this a couple of ways, one using SST’s and another by difference after solar effects are controlled for, and a value around 0.6 C seems to be the number. The feedback must therefore be negative not positive. None of this is modelling, it is a straight analysis of recorded and easily available data.

If you have really cross checked this by using data, you are smarter than Roy Spencer. Maybe you can get a paper published.
It may be the problem is climate scientists seeming to ignore the effects of the sun, even though these are easy to see even if you plot HadCRUT vs SCL (or etc) yourself.

Climate scientists don't ignore the effect of the sun. Its irradiance peaked in 1950.

sky
March 4, 2011 11:19 am

eadler says:
March 3, 2011 at 7:04 am
“Historically, the global average surface temperature and the ocean heat content have gone pretty much in the same direction.”
One of the things that makes climate science highly tenuous is the manufacture of time series from scraps of data obtained by different instruments over short time-intervals at ever-shifting locations sparsely scattered around the globe. What makes it disreputable is the presentation of such data sausages as the "observed" global time-history of the physical variable, rather than very crude and highly incomplete estimates, whose uncertainty can exceed its range of variability.
This is very much the case with NODC's OHC series going back to the 1950's that you point to. There is not a single location in the world where a research vessel or buoy has kept station for all those years, making at least four bathythermographic measurements a day. I doubt that the renowned oceanographic institutions at Woods Hole and Southampton obtained such comprehensive coverage even in their own back yard. I know that Scripps didn't. And there are vast stretches of the Pacific and the Southern Ocean for which you will find no BT data whatsoever until the advent of the Argo program. Far more so than with historical SST data, the geographic coverage simply isn't there.
It is only the unwary and the inexperienced who can buy into claims of reliable knowledge of climate variability in the absence of adequate bona fide measurements.