Lies, Damn Lies, and Anoma-Lies

Anomalies are an unsuitable measure of global temperature trends

Guest post by David M. Hoffer

An anomaly is simply a value that is arrived at by comparing the current measurement to some average measurement.  So, if the average temperature over the last 30 years is 15 degrees C, and this year’s average is 16 degrees, that gives us an anomaly of one degree.  Of what value are anomalies? Are they a suitable method for discussing temperature data as it applies to the climate debate?

On the surface, anomalies seem to have some use.  But the answer to the second question is rather simple.

No.

If the whole earth were at a single uniform temperature, we’d have no need of anomalies.  But the fact is that temperatures don’t vary all that much in the tropics, while variations in the high temperate zones are frequently as much as 80 degrees over the course of a year.  How does one compare the temperatures of, say, Khartoum, which on a monthly basis ranges from an average of 25 degrees to 35 degrees C, to, say, Winnipeg, which might range from -40 in the winter to +40 in the summer?

Enter anomalies.  By establishing a base line average, usually over 30 years, it is possible to see how much temperatures have changed in (for example) winter in Winnipeg, Canada, versus summer in Khartoum.  On the surface, this makes sense.  But does the physics itself support this method of comparison?

It absolutely does NOT.

The theory of CO2’s direct effects on earth’s surface temperature is not terribly difficult to understand.  For the purposes of this discussion, let us ignore the details of the exact physical mechanisms as well as the order and magnitude of feedback responses.  Let us instead assume that the IPCC and other warmist literature is correct on that matter, and then see if it is logical to analyze that theory via anomaly data.

The “consensus” literature proposes that direct effects of CO2 result in a downward energy flux of 3.7 watts/m2 for a doubling of CO2.  Let’s accept that.  Then they propose that this in turn results in a temperature increase of one degree.  That proposal cannot be supported.

Let us start with the one degree calculation itself.  How do we convert watts/m2 into degrees?

The answer can be found in any textbook that deals with radiative physics.  The derivation of the formula requires some in-depth understanding of the matter, and for those who are interested, Wikipedia has as good an explanation as we need:

http://en.wikipedia.org/wiki/Stefan%E2%80%93Boltzmann_law

For the purposes of this discussion however, all we need to understand is the formula itself, which is:

P = 5.67*10^-8 * T^4 (with P in w/m2 and T in kelvin)

It took landmark nineteenth-century physics to come up with that formula, but all we need to use it is a calculator.

For the mathematically inclined, the problem ought to be immediately obvious.  There is no linear relationship between w/m2 and temperature; power varies with T raised to the power of 4.  That brings up an obvious question: at what temperature does the doubling of CO2 cause a rise in temperature of one degree?  If we use the accepted average temperature of earth’s surface, +15 degrees C (288 K), simply applying the formula suggests that it is NOT at the average surface temperature of earth:

For T = 288K

P = 5.67*10^-8*288^4 = 390.1

For T = 289K (plus one degree)

P = 5.67*10^-8*289^4 = 395.5

That’s a difference of 5.4 w/m2, not 3.7 w/m2!
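These figures are easy to verify; a few lines of Python (a quick check, nothing more) reproduce the 390.1 and 395.5 values and the 5.4 w/m2 gap:

```python
# Stefan-Boltzmann law: P = sigma * T^4, with T in kelvin and P in w/m2.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, w/m2/K^4

def radiated_power(t_kelvin):
    """Black-body radiant emittance (w/m2) at temperature t_kelvin."""
    return SIGMA * t_kelvin ** 4

p_288 = radiated_power(288)  # average surface temperature, +15 C
p_289 = radiated_power(289)  # one degree warmer

print(round(p_288, 1))           # 390.1
print(round(p_289, 1))           # 395.5
print(round(p_289 - p_288, 1))   # 5.4 -- not 3.7
```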

So, how does the IPCC justify their claim?  As seen from space, the earth’s temperature is not defined at earth surface, nor can it be defined at the TOA (Top of Atmosphere).  Photons escaping from earth to space can originate at any altitude, and it is the average of these that defines the “effective black body temperature of earth” which turns out to be about -20 C (253 K), much colder than average temperatures at earth surface.  If we plug that value into the equation we get:

253K  = 232.3 w/m2

254K  = 236.0 w/m2

236.0 – 232.3 = 3.7

There’s the elusive 3.7 w/m2 = 1 degree!  It has nothing to do with surface temperatures!  But if we take this analysis a step further, it gets even worse.  The purpose of temperature anomalies in the first place was supposedly to compare temperature changes at different temperature ranges.  As we can see from the analysis above, since w/m2 means very different things at different temperature ranges, this method is completely useless for understanding changes in earth’s energy balance due to doubling of CO2.
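We can also run the question backwards: a brute-force scan over baseline temperatures (a sketch, not any official IPCC calculation) confirms that a one-degree rise costs 3.7 w/m2 only near 253 K, not at 288 K:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, w/m2/K^4

def flux_step(t_kelvin):
    """Extra radiated power (w/m2) for a one-kelvin rise from t_kelvin."""
    return SIGMA * ((t_kelvin + 1) ** 4 - t_kelvin ** 4)

# Scan baselines and find where a 1 K rise costs closest to 3.7 w/m2.
best = min(range(200, 320), key=lambda t: abs(flux_step(t) - 3.7))
print(best, round(flux_step(best), 1))  # 253 3.7
```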

To illustrate the point further, at any given time some parts of earth are actually in cooling trends while others are in warming trends.  By averaging temperature anomalies across the globe, the IPCC and “consensus” science have concluded that there is an overall positive warming trend.  The following is a simple example of how easily anomaly data can report not only a misleading result but, worse, in some cases a result the OPPOSITE of what is happening from an energy balance perspective.  To illustrate, let’s take four different temperatures and consider their value when converted to w/m2 as calculated by the Stefan-Boltzmann law:

-38 C = 235K = 172.9 w/m2

-40 C = 233K = 167.1 w/m2

+35 C = 308K = 510.3 w/m2

+34 C = 307K = 503.7 w/m2

Now let us suppose that we have two equal areas, one of which has an anomaly of +2 due to warming from -40 C to -38 C.  The other area at the same time posts an anomaly of -1 due to cooling from +35 to +34.

-38 C anomaly of +2 degrees = +5.8 w/m2

+35 C anomaly of -1 degree = -6.6 w/m2

“averaged” temperature anomaly = +0.5 degrees

“averaged” w/m2 anomaly = -0.4 w/m2

The temperature went up but the energy balance went down?  The fact is that because temperature and power do not vary directly with one another, averaging anomaly data from dramatically different temperature ranges provides a meaningless result.
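Recomputing the two-region example directly from the Stefan-Boltzmann law (converting Celsius to kelvin with the same 273 offset used for the cold rows above) shows the sign flip without any hand-copied numbers:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, w/m2/K^4

def power(t_celsius):
    """Radiant emittance (w/m2) at a given Celsius temperature."""
    return SIGMA * (t_celsius + 273.0) ** 4

# Region A warms from -40 C to -38 C (anomaly +2 degrees);
# region B cools from +35 C to +34 C (anomaly -1 degree).
flux_a = power(-38) - power(-40)   # positive: region A radiates more
flux_b = power(34) - power(35)     # negative: region B radiates less

temp_anomaly = (2 + (-1)) / 2      # average of the two anomalies
flux_anomaly = (flux_a + flux_b) / 2

print(round(temp_anomaly, 1))      # 0.5  -- "warming"
print(round(flux_anomaly, 1))      # -0.4 -- energy balance went down
```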

Long story short, if the goal of measuring temperature anomalies is to try and quantify the effects of CO2 doubling on earth’s energy balance at surface, anomalies from winter in Winnipeg and summer in Khartoum simply are not comparable.  Trying to average them and draw conclusions about CO2’s effects in w/m2 simply makes no sense and produces a global anomaly that is meaningless.

Max
August 26, 2012 7:40 pm

“An anomaly is simply a value that is arrived at by comparing the current measurement to some average measurement.”
I would agree, to a degree, because comparing to an average is a weak method, but more generally an anomaly is an abnormal value when compared to a history of values. The definitions of “abnormal” are many, and may include comparing to a distribution of values (for example, n-sigma from the mean), or the distance from n nearest neighbors (where n may be 1 or the entire sample set), or a violation of a rule, or frequency (near the mean, but does not happen often, as in the case of multi-modal distributions), or a normal range, etc. Anomalies may be computed in one dimension or many, and to truly determine an anomaly one may need to look at it in many ways, not just “distance” from “average”.

August 26, 2012 7:42 pm

Greenhouse warming theory is like a Russian matryoshka doll; there is always a deeper fraud inside.

August 26, 2012 7:44 pm

Speaking of which, when did the word “average” get replaced with “normal” by our weather overlords?

August 26, 2012 7:52 pm

Excellent analysis, David. This corresponds with my understanding as well.

ShrNfr
August 26, 2012 7:53 pm

The rest of this derives from the assumption that the anomalies live in L2 and parametric statistics can describe them. Weather and climate are highly coupled non-linear processes. Such processes are by their nature chaotic and live in a dimension less than L2. Second moments just plain do not exist. Parametric correlations do not exist. You can do rank correlations and that sort of non-parametric measure, but that is about all you can say with certainty. The tails of the distribution are way too fat. The hundred year flood comes way too often. To get hysterical about those hundred year floods is an admission that the person who does so does not know statistics.

jocky scot
August 26, 2012 8:03 pm

What if the average is itself an anomaly?

geran
August 26, 2012 8:32 pm

wot jocky scot said…
And that begs the question: Is an anomaly of an anomaly an anomaly?

August 26, 2012 8:36 pm

Happens often when you manipulate intensive variables in isolation.

Roger Carr
August 26, 2012 8:43 pm

I take some heart from your essay, David… and I expect others will, too… those who have asked from time to time on WUWT? why anomalies are used at all.
     The explanations have always been profound, but have always left me with a deeper furrow on my brow than before the explanation was given… that and a sense that I must be rather dense.
     Perhaps I am brighter than I thought?
     Whichever; I now feel I am at least in good company.

intrepid_wanders
August 26, 2012 8:51 pm

A while back, I think it was on the Blackboard, I had expressed my “hair on the back of neck up” reaction when people talk anomalies. Anomalous to what, and when, is my first question. Secondly, why is it that ALL anomalies are bad? History is chock-full of anomalies.
Anyhow, Zeke H. explained to me, in many, many, many words how an anomaly is a normalization of a data-set, and I fell asleep and gave up (I am not sure in which order).
This post will be in my bookmarks. Thank you David!

Justthinkin
August 26, 2012 9:00 pm

“The temperature went up but the energy balance went down?”
FOOLS. Don’t you know that the IPCC, the Useless Nations, NASA, NOAA, Mann, etc., and the eco-cultists have re-written the laws of thermodynamics? Sheesh. Get with the pogro….er…..program.

Werner Brozek
August 26, 2012 9:05 pm

And where the simple anomaly number really makes a difference is in the Arctic. Suppose for example that the Arctic warmed by 2 C over the last 60 years. It would make a huge difference if the low and dry temperature of 245 K warmed by 2 C and the high and more moist temperature of 275 K also warmed by 2 C, or if the lower, drier temperature warmed by 4 C and the high temperature was unchanged. I believe it is the latter that is closer to the truth. And if the warming of cold and dry air is driving global anomalies, we must take these with a grain of salt.
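Werner’s point is straightforward to quantify with the Stefan-Boltzmann law: the same 2 C rise sheds noticeably less extra power from cold air at 245 K than from air at 275 K (an illustrative sketch, not Werner’s own calculation):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, w/m2/K^4

def flux_change(t_kelvin, delta):
    """Extra radiated power (w/m2) when t_kelvin warms by delta kelvin."""
    return SIGMA * ((t_kelvin + delta) ** 4 - t_kelvin ** 4)

print(round(flux_change(245, 2), 1))  # 6.8 -- cold, dry Arctic air
print(round(flux_change(275, 2), 1))  # 9.5 -- warmer, moister air
```

The same 2 C anomaly represents very different changes in the energy balance.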

rogerknights
August 26, 2012 9:56 pm

jocky scot says:
August 26, 2012 at 8:03 pm
What if the average is itself an anomaly?

That reminds me of this quotation:

“And yet, what if the average itself were wrong?… Is it not plausible, and even likely, that most of us have the wrong kind of brain wave?”
–Anthony Standen, Science is a Sacred Cow, pp. 205-06

Amino Acids in Meteorites
August 26, 2012 10:00 pm

I get the point of the post, and I agree that the method is lacking.
My comment won’t exactly be on point. But, is it really settled that co2 increases will cause temperature increases? I don’t think it is.
Reginald Newell, MIT, NASA, IAMAP, co2 and possible cooling
1 minute video

Amino Acids in Meteorites
August 26, 2012 10:05 pm

“Trying to average them and draw conclusions about CO2′s effects in w/m2 simply makes no sense and produces a global anomaly that is meaningless.”
Agreed. Saying adding co2 to the atmosphere will cause warming and then seeing manmade co2 is on the rise and temperature is on the rise is not enough to prove that proposed guess. The price of milk went up at the same time. So can I also conclude the price of milk controls temperature?

george e smith
August 26, 2012 10:19 pm

Short and sweet, David.
Just think how the discrepancies multiply if you consider that the actual range of temperatures on earth’s surface is much greater than the numbers you chose for your example locations.
Vostok Station has been measured officially below -128 deg F, with anecdotal evidence of nearby higher-altitude points as low as -130 deg F or -90 deg C (183 K), while at the exact same time tropical desert locations have air temperatures over 135 deg F, and surface Temperatures of maybe 140 F or +60 deg C (333 K).
The S-B Total radiant emittance then covers a range of more than 11:1, and Temperatures near those extremes can and do exist at exactly the same moment, since the dark of the Antarctic winter night is the northern summertime. Even more dramatic is the result if you calculate the peak of the spectral radiant emittance over that temperature range, which is where the bulk of the radiant energy is emitted.
That goes as T^5, not T^4, which gives almost a 20:1 ratio. And at those highest Temperatures, the wavelength of the peak emission has moved from around 10.1 microns at 288 K to around 8.75 microns, which is further inside the “atmospheric window”, making CO2 even less important.
There is also another issue. The disciples of the greenhouse gremlin, are willing to swear on a stack of bibles, that earth’s atmosphere DOES NOT radiate ANY “black body” radiation, since gases cannot do so; only solids (rocks) and liquids (oceans) do. Well clouds are water or ice, so clearly clouds can and must emit thermal continuum radiation depending on the cloud Temperature per the S-B law. Otherwise only the surface can emit thermal radiation, so the non cloud part of the atmosphere can only emit the LWIR spectra of the various GHGs, which is NOT spectrally dependent on Temperature to a first order.
So earth’s external emission spectrum, should reflect the earth surface Temperatures, with their higher value and shorter wavelength range; not some 253 K signature spectrum.
Well of course, the non-cloud atmosphere does radiate a thermal continuum spectrum, characteristic of its Temperature, and quite independent of its composition; but its density is so low that even the total atmospheric optical absorption density doesn’t come close to the total absorption required of a “black” body. The deep oceans absorb 97% in about 700 metres, with around 3% surface Fresnel reflection, so they make a pretty good imitation of a black body radiator.
So when James Hansen re-adjusts the historic baseline Temperatures for “discrepancies”, as he seems fond of doing, does he apply the exact same fudge factor to the actual historic Temperature readings taken back then? That would seem to be essential, or else there would be a fictitious, discrepancy-generated anomaly change every time he “discrepancizes” the baseline reference.

Geoff Sherrington
August 26, 2012 10:23 pm

From exploration geochemistry perspectives (my background), an anomaly is a value significantly different to those historically or geographically around it, expressed in the same, correct units and with due consideration to noise and distribution of values.
Whereas the occasional anomaly in climate science time series tends to be smoothed in long time series, it can be a pity to downplay the anomaly (expressed this way), because it is information-rich. The “What” in “What made it different?” can unravel puzzles.
The use of “anomaly” to denote a property like temperature that has been elevated to an artificial baseline is relatively new to me, but I don’t like it. Once the statistician starts to truncate number sets, many types of possible analysis are subsequently invalid. So, David, the arguments you present are both correct and needed. The examples are quite well chosen, thank you.
I have some residual confusion about the mechanics of calculating anomalies in the climate change sense. If there is a single site with a single thermometer, it is easy to select and to adopt a 30-year reference period. (It certainly should not be called a normal; it is an artifice.) The math to reduce the set to ‘anomalies’ is then trivial – until someone makes an adjustment. Pardon my ignorance, but for an area with many sites, is the ‘normal’ the average of all of the sites taken over that time, a constant that is then subtracted one-by-one from each site; or is each site first converted to anomaly form, then the composite calculated by averaging the residuals?
In any event, any change to the number of observations in the reference term, as happens frequently, will produce a new normal and a new anomaly string. Given that people like Ross McKitrick have published about the change in the number of sites used in the CONUS, has the ‘normal’ been recalculated day after day as the number of observations changes, or are we free of errors from this type of mechanism?
What is the situation on normals when, like Australia had earlier this year, a completely different, revised time series named Acorn? If it becomes the official, accepted version, does this mean that uses such as proxy calibration have to be recalculated because the time-temperature response in the calibration period has been changed?

August 26, 2012 10:46 pm

Reblogged this on Climate Ponderings and commented:
” averaging anomaly data from dramatically different temperature ranges provides a meaningless result.”

AJB
August 26, 2012 11:07 pm

You knits deviation with mean power to a tee. Fore!

August 26, 2012 11:26 pm

Absolutely spot on – I’ve been aware of this irreconcilable “anomaly” for several years. Average temperature does not give average radiation, and vice-versa. The inherent error over the surface of the Earth may well be equal (or greater) in magnitude to the “warming effect” of a doubling of CO2.
However, Kiehl & Trenberth 2009 didn’t calculate outgoing surface radiation from an average surface temperature, as is often supposed. They used a method of gridded temperature data over the surface to calculate outgoing radiation for each grid, averaging the results (though that has its own problems, too detailed to go into here).

Baa Humbug
August 27, 2012 12:16 am

-38 C = 235K = 172.9 w/m2
-40 C = 233K = 167.1 w/m2
+35 C = 318K = 579.8 w/m2
+34 C = 317K = 587.1 w/m2

The 579.8 w/m2 and 587.1 w/m2 in the above are backassward.

jorgekafkazar
August 27, 2012 12:36 am

Werner Brozek says: “…It would make a huge difference if the low and dry temperature of 245 K warmed by 2 C and the high and more moist temperature of 275 K also warmed by 2 C, or if the lower, dryer temperature warmed by 4 C and the high temperature was unchanged….”
Right. Temperatures don’t tell the whole story. Net heat flux is what counts, and that requires knowing what the humidity is in every cell. The models don’t simulate real clouds rigorously, so they don’t do net heat flux well, either. “Global temperature” is an almost meaningless concept.

Gary Hladik
August 27, 2012 12:37 am

Roger Pielke Sr. has another critique of the average global surface temperature anomaly as a measure of warming:
http://pielkeclimatesci.wordpress.com/2012/05/07/a-summary-of-why-the-global-average-surface-temperature-is-a-poor-metric-to-diagnose-global-warming/
I assume David M. Hoffer’s argument would also apply to the UAH and RSS atmospheric temperature anomalies?

David
August 27, 2012 12:59 am

The “consensus” literature proposes that direct effects of CO2 result in a downward energy flux of 3.7 watts/m2 for a doubling of CO2.
So, how does the IPCC justify their claim? As seen from space, the earth’s temperature is not defined at earth surface, nor can it be defined at the TOA (Top of Atmosphere). Photons escaping from earth to space can originate at any altitude, and it is the average of these that defines the “effective black body temperature of earth” which turns out to be about -20 C (253 K), much colder than average temperatures at earth surface. If we plug that value into the equation we get:
253K = 232.3 w/m2
254K = 236.0 w/m2
236.0 – 232.3 = 3.7
——————————————————————————-
So the direct effect (before feedbacks) of doubling CO2 raises the earth’s surface T how much less than 1 degree?
Great article BTW. Also it should be pointed out the CO2 in the atmosphere does, as the IPCC shows, increase the residence time of LWIR energy. However it DECREASES the residence time of conducted energy, which must stay in the atmosphere longer if it cannot radiate away. Our atmosphere is full of both conducted and radiated energy; additional CO2 accelerates the escape to space of conducted atmospheric energy.

Kasuha
August 27, 2012 1:03 am

For measuring the effect of greenhouse gases on the temperature, the Stefan-Boltzmann equation is irrelevant. Greenhouse gases act as an insulator in the atmosphere, and to measure their effect we need to measure the thermal conductivity of the atmosphere. It’s no simple matter, but the question is: if we do it right, how different a result from the average anomaly will we get?
For start we’d need to calculate differences between lower troposphere and stratosphere temperatures, and note that stratosphere is actually cooling over past decades which is enhancing the effect rather than diminishing it.

CEH
August 27, 2012 1:18 am

+34 C = 317K = 587.1 w/m2 is not correct
+34 C = 317K = 572.5 w/m2

stephen richards
August 27, 2012 1:23 am

“I would agree, to a degree, because comparing to an average is a weak method, but more generally an anomaly is an abnormal value when compared to a history of values.”
“something that deviates from the normal” – Collins English Dictionary. How do you know what the “normal” is? How do you measure it? What are historical values? I asked at RC some years ago and got a Gavin reply. It was that reply that persuaded me of their utter stupidity.

Peter Miller
August 27, 2012 1:26 am

At least we now know the ‘science’ is settled.

Peter Dunford
August 27, 2012 1:49 am

I think the main implication of this piece is that Climate Scientists are going to need MUCH bigger super computers.

wayne
August 27, 2012 2:32 am

You’ve got it, David. What you are basically saying is that climate “science” is not using proper science methods at all; they are using improper statistical tricks … and you are correct, but most here already know that well by now. To me all global figures tossed about above 0.3 – 0.4°C since 1900 are just a “scientific” illusion; it was solar based, and anyone viewing SOHO regularly throughout the 90’s and into the very early 2000’s realizes this, and the tick upward occurred starting during the 1930’s.

Michael Lowe
August 27, 2012 2:43 am

In Richard Feynman’s book he gives the following story:
“Anyhow, I’m looking at all these books, all these books, and none of them has said anything about using arithmetic in science. ..
Finally I come to a book that says, “Mathematics is used in science in many ways. We will give you an example from astronomy, which is the science of stars.” I turn the page, and it says, “Red stars have a temperature of four thousand degrees, yellow stars have a temperature of five thousand degrees . . .” — so far, so good. It continues: “Green stars have a temperature of seven thousand degrees, blue stars have a temperature of ten thousand degrees, and violet stars have a temperature of . . . (some big number).” There are no green or violet stars, but the figures for the others are roughly correct. … Then comes the list of problems. It says, “John and his father go out to look at the stars. John sees two blue stars and a red star. His father sees a green star, a violet star, and two yellow stars. What is the total temperature of the stars seen by John and his father?” — and I would explode in horror.”
Are anomalies the same as this but with minus signs instead of plus signs?

John Finn
August 27, 2012 2:43 am

For T = 289K (plus one degree)
P = 5.67*10^-8*289^4 = 395.5
That’s a difference of 5.4 w/m2, not 3.7 w/m2!
So, how does the IPCC justify their claim? As seen from space, the earth’s temperature is not defined at earth surface, nor can it be defined at the TOA (Top of Atmosphere). Photons escaping from earth to space can originate at any altitude, and it is the average of these that defines the “effective black body temperature of earth” which turns out to be about -20 C (253 K), much colder than average temperatures at earth surface. If we plug that value into the equation we get:
253K = 232.3 w/m2
254K = 236.0 w/m2
236.0 – 232.3 = 3.7
There’s the elusive 3.7 w/m2 = 1 degree!

Yes – the 3.7 w/m2 for a doubling is the forcing at the TOA. The forcing at the surface will be greater. Remember (using averages again) the earth emits about 390 w/m2 from the surface – but only about 240 w/m2 from the TOA, i.e. the equivalent of the incoming solar energy. There is, therefore, a factor of 390/240 or ~1.6 in the “surface to TOA” ratio. If the flow of outgoing energy is reduced by 3.7 w/m2 (at TOA) due to an increase in greenhouse gases then the surface temperature will need to increase by about 1 deg C (assuming no feedback) in order that equilibrium is re-established (i.e. incoming solar energy = outgoing LW energy)
A simple energy balance model demonstrates the figures.
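John Finn’s figures can be checked with a zero-dimensional balance sketch (using his round numbers of 390 and 240 w/m2; the 288 K surface value comes from the article above, and no feedbacks are included):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, w/m2/K^4

T_SURF = 288.0                       # mean surface temperature, K
P_SURF = SIGMA * T_SURF ** 4         # ~390 w/m2 emitted by the surface
P_TOA = 240.0                        # ~outgoing LW at top of atmosphere
ratio = P_SURF / P_TOA               # ~1.6 "surface to TOA" factor

forcing_toa = 3.7                    # w/m2 reduction at TOA for 2xCO2
extra_surface = forcing_toa * ratio  # surface emission must rise this much

# Solve sigma * T^4 = P_SURF + extra_surface for the new surface temperature.
t_new = ((P_SURF + extra_surface) / SIGMA) ** 0.25
print(round(t_new - T_SURF, 2))      # ~1.1 degrees, close to the quoted 1 C
```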

Entropic man
August 27, 2012 2:49 am

Having defined your problem with anomalies, please describe “a suitable method for discussing temperature data as it applies to the climate debate?”

old construction worker
August 27, 2012 2:57 am

“The data doesn’t matter. We’re not basing our recommendations
on the data. We’re basing them on the climate models.”
– Prof. Chris Folland,
Hadley Centre for Climate Prediction and Research

John Finn
August 27, 2012 3:03 am

Long story short, if the goal of measuring temperature anomalies is to try and quantify the effects of CO2 doubling on earth’s energy balance at surface, anomalies from winter in Winnipeg and summer in Khartoum simply are not comparable. Trying to average them and draw conclusions about CO2′s effects in w/m2 simply makes no sense and produces a global anomaly that is meaningless.

I’m not sure I agree. While there is significant regional variation, global temperature variation remains remarkably small. For example, only about half a degree covers temperature readings across all years for Aug 24th (i.e. the most recent UAH CH 5 readings). The same is true for all other days in the year. UAH (and RSS, GISS and Hadley) appear to be using sufficiently robust statistical sampling methods such that the anomalies are meaningful.

PhD student
August 27, 2012 3:54 am

Using anomalies in describing the time series requires stationarity. Does the temperature record pass a unit root test? I would be very surprised if it did.

August 27, 2012 3:57 am

Entropic man says:
August 27, 2012 at 2:49 am
“It feels cold today.”
“I believe this summer has been fine, overall.”
Touchy, feely stuff like that?
Followed by, “Send me your first born so I may save your grandchildren. Or else.”

TomVonk
August 27, 2012 4:06 am

For those mathematically inclined (college maths) this phenomenon is just a translation of a rather trivial truth known for two thousand years: “The average of a power is NOT a power of the average”
Example, let us take 1 and 2. The average is 1.5.
Now let us take a power law like a 4th power.
Then average ^4 = (1.5)^4 = 5.06 (Formula 1)
While the average of the 4th powers of these numbers is (1^4 + 2^4) /2 = 8.5 (Formula 2)
So there is a huge difference between the 4th power of an average and an average of 4th powers.
Getting back to physics it means that the Earth does NOT radiate at its average temperature (Formula 1) as it is usually computed but its radiation is the sum of places which are at different temperatures (Formula 2). The difference between these 2 values is of course important.
The only mathematical complication to get from this simple example to the Earth is to replace the arithmetic averaging by a surface integral of temperatures (1/S Integral(T.dS) ) but the result is the same.
The global average is irrelevant and the Earth radiation is very different from the “simple” k.(Global average)^4. But because temperature averages don’t give correct answers about energy flows, anomalies (which are just differences to an average) don’t give correct answers either.
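TomVonk’s 1-and-2 example in runnable form (the numbers match his Formula 1 and Formula 2):

```python
# Average of 4th powers vs 4th power of the average, per the comment above.
vals = [1.0, 2.0]
avg = sum(vals) / len(vals)                            # 1.5
power_of_avg = avg ** 4                                # Formula 1
avg_of_powers = sum(v ** 4 for v in vals) / len(vals)  # Formula 2
print(power_of_avg, avg_of_powers)  # 5.0625 8.5
```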

TomVonk
August 27, 2012 4:22 am

Having defined your problem with anomalies, please describe “a suitable method for discussing temperature data as it applies to the climate debate?”
I will not talk for the author of the post.
But the only “suitable” way to deal with the dependencies between the temperatures and the energy flows is to take the Temperature field as it is in reality with all its spatial dependencies.
In other words you can obtain correct flows only by first computing then integrating T^4(x,y,z) where x, y and z are the spatial variables.
As soon as you begin with spatial averagings of temperatures (e.g making disappear the x,y and z) of any kind BEFORE you compute the flows you fall in the problem I described above and your answers about energy flows will be wrong.
The maximal “wrongness” will be realized when the spatial averaging of temperatures takes place over the whole globe, which is called the global temperature average.

TimC
August 27, 2012 4:40 am

To measure total downward energy flux from the temperature anomalies the P=5.67*10^-8 * T^4 formula can be differentiated (as dP/dT = 4 * 5.67*10^-8 * T^3) to produce a “scaling set” for the energy fluxes – so 1°C up or down from a -10C mean (climatology) is scaled 4.13w/m2, at 0C mean the scaling is 4.62w/m2, at 10C it is 5.15w/m2, etc – the scaled fluxes can then be integrated spatially. Doesn’t this deal with the problem – and doesn’t it practically happen anyway?
More fundamentally, I agree that global annual-average surface temperature is not a very good metric by which to measure global warming or cooling – but what is to replace it?
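TimC’s scaling set can be tabulated directly from the derivative (a sketch; using a 273.15 offset reproduces his 4.13, 4.62, and 5.15 figures):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, w/m2/K^4

def scaling(t_celsius):
    """dP/dT = 4*sigma*T^3: w/m2 per degree at a given mean temperature."""
    t_kelvin = t_celsius + 273.15
    return 4 * SIGMA * t_kelvin ** 3

for c in (-10, 0, 10):
    print(c, round(scaling(c), 2))  # 4.13, 4.62, 5.15 w/m2 per degree
```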

Bill Illis
August 27, 2012 4:44 am

You can also think of it in terms of 1 Watt/m2 = 1 Joule/second/m2
Now we have “time” in the equation. We have 1 joule of energy moving through an infinitely thin area of a metre by a metre each Second.
And 1 joule is the equivalent of 3,018,412,315,122,250,000 solar photons and 15,092,061,575,611,200,000 Earth temperature long-wave photons (per second).
Now we have to start thinking about energy accumulation/loss per second.
On average, the Earth warms up or cools down by only 0.0075 joules/m2/second over any period of time (Day, Night, or Seasonal change throughout the year). At the height of the day, 960 joules/m2/second are coming in from the Sun, but the air temperature is only recording a change in energy of 0.008 joules/m2/second.
So, at the height of noon-day Sun, 2,897,675,822,517,360,000,000 solar photons are coming in per second but only 24,147,298,520,978,000 worth of those solar photons are accumulating in the air temperature. 99.9992% of the energy in the solar photons is going somewhere else per second. Either building up in the land, soil, vegetation or they are being re-emitted back to the upper atmosphere or space – basically as fast as they are coming in.
This is, of course, an infinitely thin layer or area of 1 m2. Now we have to start thinking of the volume (rather than area) of air that is being recorded by a thermometer. Now it gets even more bizarre.
Just a different take on it. Not something a climate model thinks about.

BigWaveDave
August 27, 2012 5:07 am

Entropic man asks [for someone to] please describe “a suitable method for discussing temperature data as it applies to the climate debate[?]”
Sure. A suitable method for discussing temperature data as it applies to the climate debate would start with the recognition that most, especially average, temperature data is worthless.

EternalOptimist
August 27, 2012 5:31 am

I am no scientist. What struck me about this (excellent) piece is that there has been no warming for the last 13+ years.
Even using their shonky anomaly methods, they still cannot rustle up any warming. It makes me wonder what the true situation is

Entropic man
August 27, 2012 5:37 am

Probably better to stick to simple measurements.
http://tamino.files.wordpress.com/2012/08/piomas1.gif

August 27, 2012 5:37 am

Entropic:

Having defined your problem with anomalies, please describe “a suitable method for discussing temperature data as it applies to the climate debate?”

It’s really very simple, you have an offset to absolute temperature that only depends on the mean annual cycle of the period that you baselined over. So, assuming e.g. a monthly series, just add that back in at the end to get the annualized version of the series.
Anomalization/deannualization of time series is common in many fields, e.g., most famously econometrics. Anomalization is just a mechanism for removing the short-period fluctuations in order to reveal the otherwise visually obscured longer-term forcings, and a convenient technique for reconstruction.
I decided not to comment on the quality of the paper other than to say I don’t agree with the general assessment made in the comments.

Roger Carr
August 27, 2012 5:43 am

And… a brilliant headline!
               Lies, Damn Lies, and Anoma-Lies

Ed_B
August 27, 2012 6:00 am

give “a suitable method for discussing temperature data as it applies to the climate debate?”
answer:
Use only ocean heat content data.

Jeff Norman
August 27, 2012 6:07 am

For there to be a net increase of 3.7 W/m^2 at the Earth’s surface, there has to be a net decrease of 3.7 W/m^2 into space (assuming a constant influx (that isn’t actually constant)). Kasuha is correct, David M. Hoffer’s analysis and conclusion is wrong.

Ed_B
August 27, 2012 6:10 am

Uh, no… that model is derived from a static view of the earth’s atmosphere. I nearly fell off my chair laughing as I watched a U of Chicago lecture on “radiative physics” and the derivation of that result. I could not believe that the professor actually ignored thunderstorms lifting heat above most of the CO2 “blanket”, not to mention hurricane Isaac, which can lift all the man made heat for the last twenty years and dump it above the CO2 “blanket”. The atmosphere ain’t static!
/HT to Willis for his thermostat hypothesis paper. I suspect he is right with 0.3C or so AGW.

more soylent green!
August 27, 2012 6:18 am

The climate and weather numbers are always hashed, sliced, diced and cherry picked in order to paint the most alarming picture possible. It’s been that way for decades now. Obfuscation seems to be an important secondary goal as well.

Maus
August 27, 2012 6:24 am

John Finn: “A simple energy balance model demonstrates the figures.”
Indeed. For example we have the following from http://answers.yahoo.com/question/index?qid=20090123131816AAeXRwL
“I have seen asphalt on the apron of a race track get to 138 degrees
Source(s):
Race engineer using an infrared gun at Bristol Motor Speedway in 2008”
Which, if you care to do the math, gives an emission of 689 W/m^2. That this is vastly higher than the estimated 390 W/m^2 is obviously a consequence of all the Mad-Max petrol burners making lap times.
This is, of course, farcical. But then we note that Bristol, TN has a latitude, and the Earth has an axial tilt and albedo. Assume that the axial tilt of the Earth was pointed directly at the sun at the time of the measurement, and that the measurement was taken at solar noon. Then if we use the Bond albedo we find that the inbound radiation is 922 W/m^2. And we might wish to thank our lucky stars that we have all that evil CO2 blocking inbound radiation. As otherwise the given black-body temperature would be 183 degrees (Fahrenheit, obviously).
This is a little bit stupid of course. And mostly for the reason that we’re using Bond albedo. Which defaults to the geometric albedo as corrected for scattering and other issues across the illuminated hemisphere of the sphere. This then is corrected by all other manner and means based on actual measurement. The problem is that, for this example, we really want the geometric albedo of the Earth on a day of a given cloudiness over a square meter of asphalt.
And the reason we want this is that the Bond albedo is farcically high and inappropriate for something like a simple radiation balance model for a patch of land in Tennessee. But if we simply take the other boundary condition and treat only the albedo as being that of asphalt, at 0.1, then we find the inbound insolation on the apron is 1196.94 W/m^2. Which dictates that we ought to see a temperature of 226 degrees.
So our simple radiation balance model can do no more than state that GHGs and geometry are responsible for reducing the insolation received by the apron somewhere in the range of 25 to 42%. Making any statement about predicted temperature is a nonsense for something so trivial as a bit of tarmac, as we have a predicted range of 43 degrees; the least point of that prediction being still 45 degrees higher than empirical measurement. And the actual temperature being 79 degrees warmer than the 59 degree temperature we would predict based on the wondrous notion of 390 W/m^2.
So yes, depending on your assumptions we can make statements about a 3.7 W/m^2 forcing within a window of modelling errors spanning a 806 W/m^2 range of uncertainty in a simple energy balance model.
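[Editor's note: the 689 W/m^2 figure for the 138-degree-Fahrenheit asphalt reading does check out under the Stefan-Boltzmann law. A quick sketch, assuming emissivity of 1 as in the comment:]

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def f_to_kelvin(t_f):
    """Convert degrees Fahrenheit to Kelvin."""
    return (t_f - 32.0) / 1.8 + 273.15

def emission(t_kelvin, emissivity=1.0):
    """Surface emission per the Stefan-Boltzmann law."""
    return emissivity * SIGMA * t_kelvin ** 4

t_track = f_to_kelvin(138.0)  # the quoted Bristol Motor Speedway reading
print(f"{t_track:.1f} K emits {emission(t_track):.0f} W/m2")  # ~689 W/m2
```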

Dr. Lurtz
August 27, 2012 6:29 am

CO2 -> Control -> Money
SUN -> NO Control -> No Money
Rain is Weather. Sun is Climate.
Unfortunately, both of the “words” CO2 and Sun have three letters. No wonder politicians get confused!!

davidmhoffer
August 27, 2012 6:37 am

Baa Humbug;
+35 C = 318K = 579.8 w/m2
+34 C = 317K = 587.1 w/m2
The 579.8 w/m2 and 587.1 w/m2 in the above are backassward.
>>>>>>>>>>>>>>>>>>>>
CEH;
+34 C = 317K = 587.1 w/m2 is not correct
+34 C = 317K = 572.5 w/m2
>>>>>>>>>>>>>>>>>>>>>>>
Gents, you are both correct. I used temps of +44 and +45, and transposed the results to boot, good catch both of you.
That said, the point doesn’t change.
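[Editor's note: for readers following the corrections, the corrected figures are easy to reproduce. A sketch, rounding Kelvin as C + 273 the way the thread does; per David's correction, the temperatures in question are +44 and +45 C:]

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def sb_flux(t_celsius):
    """Stefan-Boltzmann flux at a Celsius temperature, rounding K = C + 273."""
    return SIGMA * (t_celsius + 273) ** 4

print(f"+44 C -> {sb_flux(44):.1f} W/m2")  # ~572.6
print(f"+45 C -> {sb_flux(45):.1f} W/m2")  # ~579.8
```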

ericgrimsrud
August 27, 2012 6:43 am

This post appears to have a clear objective (to ridicule what the author calls “warmist literature”) but carries with it absolutely no substance that I can detect.
The author holds forth with use of the Stefan-Boltzmann equation, only, – which tells us primarily what the effective temperature of the Earth’s upper atmosphere must be when viewed from space – while the temperature “anoma-lies” (as the author so cleverly calls anomalies) concern surface temperatures – which, of course, are effectively hidden from space view by the greenhouse effect.
So Kasuh seems to also have this post figured out when he correctly says:
“For measuring the effect of greenhouse gases on the temperature, Stefan-Boltzmann equation is irrelevant. Greenhouse gases act as an insulator in the atmosphere and to measure their effect, we need to measure thermal conductivity of the atmosphere.”
I have seen similar “work” by DavidMHoffer on two previous threads. He seems to put “stuff” out there that might possibly look impressive to the lay public – but is often way off the main point and seems to be intended only to confuse rather than explain. In this case, the big question concerning the surface temperatures of the Earth is the GH effect. The bit he has provided here about the S-B equation and the effective T as viewed from space has been known for what, about 150 years?, but has little to do with surface temperatures of our planet.
If there is a scientific point that DH is trying to make here concerning surface temperature anomalies, perhaps he could make it clearer? As DH will very gladly tell you, I am not the brightest bulb on the tree and certainly am not bright enough to see any relevant or significant scientific point whatsoever in this post.

davidmhoffer
August 27, 2012 6:59 am

John Finn;
Yes – the 3.7 w/m2 for a doubling is the forcing at the TOA.
>>>>>>>>>>>>>>>>>>>>>>>
No, it is not. I refer you to the definition in IPCC AR4 WG1 Chapter 2:
http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch2s2-2.html
Which reads:
“The definition of RF from the TAR and earlier IPCC assessment reports is retained. Ramaswamy et al. (2001) define it as ‘the change in net (down minus up) irradiance (solar plus longwave; in W m–2) at the tropopause after allowing for stratospheric temperatures to readjust to radiative equilibrium, but with surface and tropospheric temperatures and state held fixed at the unperturbed values’.”
So for starters, the definition is NOT at TOA, it is at the tropopause, and it is the sum total of change in net (down minus up). You then have to contemplate what they mean by “net”. The supposed 3.7 w/m2 doesn’t happen at TOA or at any other single point in the atmosphere, it is a value derived from the change in energy flux from top to bottom of the tropospheric air column. No single altitude would yield that number, only a tiny fraction of it.

davidmhoffer
August 27, 2012 7:06 am

Jeff Norman says:
August 27, 2012 at 6:07 am
For there to be a net increase of 3.7 W/m^2 at the Earth’s surface, there has to be a net decrease of 3.7 W/m^2 into space (assuming a constant influx (that isn’t actually constant)). Kasuha is correct, David M. Hoffer’s analysis and conclusion is wrong.
>>>>>>>>>>>>>>>>>>>>>>>
As I said in the article itself, my point was not to debate the physical processes themselves, but to demonstrate that temperature anomalies as measured at the surface are unsuitable for tracking those processes. That said, you may be interested to know that doubling of CO2, at equilibrium, changes neither the amount of energy absorbed from the sun, nor does it change the amount of energy radiated by earth back out to space. What it changes is the average altitude at which energy is radiated to space. This, combined with the lapse rate, changes the temperature gradient from earth surface to TOA.

Gary Pearse
August 27, 2012 7:12 am

In an earlier WUWT post, it was noted (I think by Dr. Brown of Duke U) that, since the T term is to the 4th power, taking an average of day and night temps is invalid. The night temp changes yield a different power per degree than daytime temps, so working with their averages is just wrong. I recall he was explaining why the relation gives the wrong answer for the moon’s temps.

August 27, 2012 7:25 am

The apparent difficulty raised by Mr. Hoffer’s head posting may be resolved as follows.
To determine temperature change, the radiative forcing per CO2 doubling of 3.71 Watts per square meter is multiplied by some climate-sensitivity parameter.
In the absence of temperature feedbacks, or where they sum to zero, that parameter is – to a first approximation – the first differential of the fundamental equation of radiative transfer at the characteristic-emission altitude, which is that altitude (varying inversely with latitude) at which incoming and outgoing fluxes of radiation are in balance.
The incoming flux is measured by satellites at 1362 Watts per square meter, which is divided by 4 to allow for the ratio of (the area of the disk the Earth presents to the Sun) to the Earth’s spherical surface area. Thus, the downward flux at the mid-troposphere is about 340.5 Watts per square meter, which is multiplied by (1 minus the Earth’s reflectance or albedo of 0.3) to give 238.35 Watts per square meter.
Plugging this value into the Stefan-Boltzmann equation, assuming emissivity is unity, gives the characteristic-emission temperature of 254.6 K.
To a first approximation, then, the zero-feedback or “Planck” climate-sensitivity parameter is T / (4F) = 254.63 / (4 x 238.35) = 0.267 Kelvin per Watt per square meter.
However, as the head posting rightly points out, it is necessary to make adjustments to allow for the effect of the fourth-power Stefan-Boltzmann equation on the non-uniform latitudinal distribution of surface temperatures by latitude.
In fact, the models relied upon by the IPCC do make the appropriate adjustment, and the IPCC’s estimate of the Planck parameter is 0.313 Kelvin per Watt per square meter, a little over one-sixth greater than the first approximation. The IPCC’s estimate will be found in a footnote on p. 631 of the Fourth Assessment Report (2007), where its reciprocal is expressed as 3.2 Watts per square meter per Kelvin.
I have verified the IPCC’s value using 30 years of mid-troposphere temperature-anomaly data kindly supplied by John Christy of the University of Alabama at Huntsville.
The value of the Planck parameter is crucial, because not only the direct warming of about 1.16 K caused by a CO2 doubling but also (and separately) the value of the overall feedback gain factor is dependent upon it.
Therefore, I have additionally determined various values of the Planck parameter, which varies over time depending upon the magnitude of the temperature feedbacks that it triggers. Here is a summary of these values:
Planck or instantaneous parameter: 0.3 Kelvin per Watt per square meter (determined as above).
Bicentennial parameter: 0.5 Kelvin per Watt per square meter (this value is deduced on each of the six SRES emissions scenarios by close inspection of Fig. 10.26 on p. 803 of IPCC (2007)).
Equilibrium parameter, when the climate has finished responding to a given forcing (typically after 1000-3000 years): 0.9 Kelvin per Watt per square meter (the IPCC’s current 3.26 K multi-model mean sensitivity to a CO2 doubling, divided by the 3.71 Watts per square meter CO2 radiative forcing).
From these values, one may deduce that an appropriate climate-sensitivity parameter for a CO2-mitigation policy designed to operate over ten years is about 0.33 Kelvin per Watt per square meter; a 50-year parameter is about 0.36; and a centennial-scale parameter is about 0.4 Kelvin per Watt per square meter. From Table 10.26, taken with Table SPM.3, it is possible to estimate that the IPCC’s estimate of the centennial-scale climate-sensitivity parameter is approximately 0.435 Kelvin per Watt per square meter.
However, if – as Drs. Lindzen, Choi, Spencer, Braswell, Shaviv, etc., etc. have found – the temperature feedbacks acting in response to a forcing are net-negative, then the appropriate sensitivity parameter will be 0.2.
So, to second-guess the IPCC and make your own forecasts of CO2-driven warming, just pick your parameter and multiply it by 5.35 times the natural logarithm of the proportionate change in CO2 concentration.
An example. The IPCC’s central estimate of CO2 concentration in 2100, taken as the mean of all six emissions scenarios shown in Fig. 10.26, is 713 ppmv (though at present the concentration is rising very much more slowly than necessary to reach that value). And today’s concentration is 393 ppmv. So, assuming the net-negative feedbacks posited by Lindzen & Choi (2009, 2011) and by Spencer & Braswell (2011, 2011), the CO2-driven warming of the 21st century will be 0.2[5.35 ln (713 / 393)] = 0.6 K. Even if one adds a bit for other greenhouse gases, we shall not cause much more than 1 K warming this century.
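[Editor's note: the closing recipe above is easy to reproduce. A sketch, with the sensitivity parameter and CO2 concentrations exactly as given in the comment:]

```python
import math

def co2_warming(sensitivity_param, c_final, c_initial):
    """Warming (K) = lambda * 5.35 * ln(C/C0), per the recipe in the comment."""
    return sensitivity_param * 5.35 * math.log(c_final / c_initial)

# Net-negative-feedback parameter (0.2 K per W/m2), 393 -> 713 ppmv:
print(f"{co2_warming(0.2, 713, 393):.2f} K")  # ~0.64, i.e. about 0.6 K
```

Swapping in the other parameters listed (0.3 instantaneous, 0.5 bicentennial, 0.9 equilibrium) scales the answer proportionally.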

u.k.(us)
August 27, 2012 7:25 am

Very well written post (wish I was smart enough to absorb it).
You even drew Kate into the discussion 🙂

davidmhoffer
August 27, 2012 7:31 am

I don’t want to get into a debate about the physical processes in this thread, my purpose was only to show that the use of anomaly data is not suitable for tracking them. But since the issue had arisen, I’ll refer readers to this link also:
http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch2s2-8.html
Which clearly states:
““It should be noted that a perturbation to the surface energy budget involves sensible and latent heat fluxes besides solar and longwave irradiance; therefore, it can quantitatively be very different from the RF, which is calculated at the tropopause, and thus is not representative of the energy balance perturbation to the surface-troposphere (climate) system.”
RF in the above quote stands for Radiative Forcing. In brief, even the “consensus” literature of the IPCC makes it clear that a simple extrapolation of radiative forcing to surface temperatures is not possible, and for a multitude of reasons. Despite plainly saying that radiative forcing and surface temperatures cannot be directly linked, the literature goes on to present surface temperatures as if they are.

Owen
August 27, 2012 7:54 am

Great article. I learned a lot. And I enjoy the comments section as well. The back and forth exchange of ideas, all in a civilized, adult manner is wonderful. Watts Up With That? is the best climate science site on the internet. Everyone who has contributed should be proud of themselves. I look forward to learning much more about climate science from people like David and others who don’t have to do this, but do it anyways. Thanks !

Jeremy
August 27, 2012 7:55 am

A bit confusing, but I fully agree that attempting to monitor an average global surface temperature is indeed a completely meaningless concept. It tells us nothing about the radiative balance. Monitoring of surface stations is a complete and utter waste of time for global radiative balance. Surface stations are useful ONLY for monitoring local weather conditions. The only thing I can see as being useful for radiative balance is a satellite!
Global Warming science based on the analysis of surface station data is not science it is FRAUD. Real scientists would never endorse such an approach. The ONLY way to explain this is that there is money and jobs to be had by playing with this data and drawing conclusions that serve a political agenda.(Alternatively, one must accept the unlikely scenario whereby all true scientists have exited from atmospheric sciences and that this entire academic domain has become dominated by geography majors.)

michael hart
August 27, 2012 7:59 am

How many apples in a barrel of grapes?…… Exactly…. there are some “averages” which are meaningless. It is possible for some “average” temperature to be going up while, at the same time, the earth could be losing net heat. [and vice versa]
Essex, McKitrick & Andresen
http://www.uoguelph.ca/~rmckitri/research/globaltemp/GlobTemp.JNET.pdf
gives a good explanation [the first part of the paper can be read separately from the second, more mathematical, part.]
NB typo in the penultimate paragraph above “dirfectly”

davidmhoffer
August 27, 2012 8:07 am

Monckton of Brenchley;
In fact, the models relied upon by the IPCC do make the appropriate adjustment,
>>>>>>>>>>>>>>>>>>>>>
There’s little in your comment that I would disagree with. My point in this essay however is not in regard to the models, for that is a different discussion. The question at hand is the value of surface temperature anomaly data to confirm the warming that the models claim should be happening. Use of anomaly data simply averaged without regard to the 4th power relationship and depicted as a general rise in global temperatures by HadCrut and GISS inadvertently biases the calculated average such that cold temperatures (high latitudes, winter seasons, night time lows) are over represented and warm temperatures (low latitudes, summer seasons, day time highs) are under represented.

Jeff Norman
August 27, 2012 8:18 am

David M. Hoffer,
If “that” was all you wanted to say about surface temperature anomalies, then you could have simply said the heat content of the Earth’s atmosphere is a function of the energy fluxes and the chemical makeup of the Earth’s atmosphere. This affects the heat content of the atmosphere near the surface which, when combined with the chemical makeup of the atmosphere near the Earth’s surface, can be used to estimate temperatures. Which is all trivial compared to the heat content of the oceans, as someone indicated above.
Sorry, I agree with ericgrimsrud above.

CEH
August 27, 2012 8:20 am

David M Hoffer says: “That said, the point doesn’t change”
That is correct.

Stephen Wilde
August 27, 2012 8:28 am

“doubling of CO2, at equilibrium, changes neither the amount of energy absorbed from the sun, nor does it change the amount of energy radiated by earth back out to space. What it changes is the average altitude at which energy is radiated to space. This, combined with the lapse rate, changes the temperature gradient from earth surface to TOA.”
Correct in my view. With varying effects on the temperature gradient through the various atmospheric layers.
The change in atmospheric heights results in an air circulation response but an infinitesimal change compared to the solar induced (modulated by the oceans) natural changes such as from MWP to LIA to date.
The atmosphere always reconfigures itself to ensure that energy in equals energy out if anything other than surface pressure and insolation seek to disturb the energy budget.

Scarface
August 27, 2012 9:02 am

Peter Miller says: August 27, 2012 at 1:26 am
“At least we now know the ‘science’ is settled.”
Well, the more I read and learn about it, I would dare to say that the science is septic.

August 27, 2012 9:03 am

Thank you, David Hoffer, for a very clear essay on how tenuous is the link between temperature anomaly analysis and the reality of heat. Indeed, to do the science properly, it is important to measure anomalies of heat over the earth, not temperature. It is clear that the average[(K*T^4)] is not equal to K*(average[T])^4.
Not only does temperature vary by latitude from equator to pole, it varies by time of day and night at every location. Even taking the simplest average of (Tmax+Tmin)/2 by location is committing the error described by David.
But how big is the error? Take Denver, predicted 8/27/12: Tmax = 93F = 34C, Tmin = 68F = 20C. If you average temps, you will get 300.10 deg K. But if you convert to heat via Boltzmann, (503.99, 419.73) you get an average of 461.36 W/m2 and an equivalent temperature of 300.34 deg K. So there is only a difference of a quarter of one degree Kelvin between the average of the temps and the temp of the average heat. This is a pretty standard difference over time at any one location, at least on a seasonal basis. So maybe we can live with the average of temps, at least within each location.
Where I think there is an underappreciated element of error is that we forget that Tave at each location is never a measured value. It is a calculated “average” of the min and max values. Forget about TOB issues. The simple fact that you “average” 34 and 20 to get Tave = 27 C implies that the uncertainty of that average is up to 7 deg K.
Take 30 of those daily Tave estimates to get a Tave for a month, and the uncertainty on that monthly Tave is 1.3 deg K. That is quite an error bar; the 80% confidence on the Tave is plus or minus 2.0 deg K, with its own weak simplifying assumptions about the shape of the curve between the min and max non-randomly sampled points.
Now, you attempt to string 30 years * 12 months/yr of Tave looking for trend y=mx+b, hoping to find m ≠ 0 at statistical significance for each location. Sometimes you find it, sometimes it is insignificant. But when you add a 1.3 deg K uncertainty to each Tave point, even over 30 years, it will be very difficult to disprove a slope m = 0.0 deg K/decade at statistical significance. Even when you can disprove 0.0, the uncertainty on m will still be uselessly large.
The statistical abuses do not end there. When the modelers grid their temperature data by month and grid cell, each and every data point contains uncertainty of the Tave derived from the min-max original data points. The resulting weighted gridded Tave must contain an uncertainty derived from the control points and with added uncertainty from the weighting method. Every adjustment to a data point adds to uncertainty because there is uncertainty in the adjustment and statistical variances at least add (unless you have SOLID evidence that sources of error negatively correlate in nature).
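[Editor's note: the Denver averaging check near the top of this comment can be reproduced as follows. A sketch using the quoted forecast in Celsius, which is why the figures land a few hundredths of a degree away from the comment's Fahrenheit-derived values; the quarter-degree conclusion is unchanged.]

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def to_flux(t_k):
    """Temperature (K) to Stefan-Boltzmann flux (W/m^2)."""
    return SIGMA * t_k ** 4

def to_temp(flux):
    """Flux (W/m^2) back to equivalent temperature (K)."""
    return (flux / SIGMA) ** 0.25

tmax, tmin = 34 + 273.15, 20 + 273.15  # the quoted Denver forecast
avg_of_temps = (tmax + tmin) / 2
temp_of_avg_flux = to_temp((to_flux(tmax) + to_flux(tmin)) / 2)
print(f"avg of temps: {avg_of_temps:.2f} K, "
      f"temp of avg flux: {temp_of_avg_flux:.2f} K")
```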

MikeB
August 27, 2012 9:16 am

David,
Don’t get too hung about whether ‘radiative forcing’ applies at the Top of the Atmosphere (TOA) or at the Tropopause. If the stratosphere is allowed to respond to this forcing (as stated in your link) then these become the same thing. This was made clear in the first IPCC assessment report
“The definition of radiative forcing requires some clarification. Strictly speaking, it is defined as the change in net downward radiative flux at the tropopause, so that for an instantaneous doubling of CO2 this is approximately 4 W/m^2 and constitutes the radiative heating of the surface-troposphere system. If the stratosphere is allowed to respond to this forcing, while the climate parameters of the surface-troposphere system are held fixed, then this 4 W/m^2 flux change also applies at the top of the atmosphere. It is in this context that radiative forcing is used in this section.”
http://www.ipcc.ch/ipccreports/far/wg_I/ipcc_far_wg_I_chapter_03.pdf (Page 78)
So both you and John Finn are correct, for the purpose of calculating the surface temperature response.

davidmhoffer
August 27, 2012 9:21 am

MikeB;
So both you and John Finn are correct, for the purpose of calculating the surface temperature response.
>>>>>>>>>>>>>>>>>>
Technically neither of us is correct, because the IPCC not only admits that RF and surface temperature response cannot be directly linked (see my comment upthread) but they actually consider multiple models of surface temperature response as depicted here:
http://www.ipcc.ch/publications_and_data/ar4/wg1/en/figure-2-2.html

August 27, 2012 9:23 am

@Monckton: when the climate has finished responding to a given forcing (typically after 1000-3000 years):
Whoa! Where did “1000-3000 years” to equilibrate come from?
When we have a KE=8 volcanic eruption, we get “a YEAR without summer”. A year — not a millennium. On a planet with day-night temperature swings, with seasonal cycles, any instantaneous forcing is quickly blended into the atmosphere. The evidence from volcanoes is that the relaxation from the impulse is on the order of a couple of years, not millennia. Speculation that forcings are carried into the deep ocean for 1000 years are just that: speculations of disparate advocates trying to square the circle of their own making.

Bryan
August 27, 2012 9:53 am

Use actual values from a very large series of readings (let’s say N).
Add the actual values and find their average (Nav) by dividing by N
Raise the individual values to the power of four and add them together.
Divide by N
Take this new value and extract the 0.25 root to get a new average (Nr)
You will find Nr much higher than Nav.
I would suggest that Nr is the more physically realistic value
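[Editor's note: Bryan's procedure, sketched in Python. The sample readings are made up for illustration; the relationship Nr >= Nav holds for any set of readings, by the power-mean inequality, with equality only when all readings are identical.]

```python
def arithmetic_mean(temps_k):
    """Bryan's Nav: add the values and divide by N."""
    return sum(temps_k) / len(temps_k)

def quartic_mean(temps_k):
    """Bryan's Nr: raise to the 4th power, average, then take the 0.25 root."""
    return (sum(t ** 4 for t in temps_k) / len(temps_k)) ** 0.25

# Hypothetical readings spanning -40 C to +40 C (in Kelvin):
readings = [233.15, 263.15, 288.15, 303.15, 313.15]
nav, nr = arithmetic_mean(readings), quartic_mean(readings)
print(f"Nav = {nav:.2f} K, Nr = {nr:.2f} K")
assert nr >= nav  # guaranteed by the power-mean inequality
```

The wider the spread of readings, the larger the gap between Nr and Nav, which is exactly the point of the head post.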

David Ball
August 27, 2012 10:00 am

For the lay people out there, it is impossible to know if something is an anomaly if you have no understanding of what is “normal”. Hubert Lamb was trying to establish what is “normal” for the earth, thereby having an understanding of “unusual” or “man-made”. The baseline has NOT been established (despite claims to the contrary), so to claim “anomaly” is rubbish.
Well written article Mr. Hoffer. Glad you chose Winnipeg. Spent the majority of my 50 years there. You can still enjoy life at 40 below, but it is MUCH easier at 40 above.

Werner Brozek
August 27, 2012 10:12 am

TomVonk says:
August 27, 2012 at 4:06 am
So there is a huge difference between the 4th power of an average and and average of 4th powers.

This is true but NOT very obvious. Even grade 12 physics students may not appreciate the difference and get an answer wrong because of it. I would like to illustrate this fact in a different way by answering the following question: How much does the kinetic energy of a 2 kg mass increase when the speed goes from 1 m/s to 3 m/s? The formula for kinetic energy is E = (1/2)mv^2. So the correct answer to this question is change in E = (1/2)(2)(3)^2 - (1/2)(2)(1)^2 = 9 - 1 = 8 J. The wrong answer is to assume the change in speed was 2 m/s, so the change in energy is E = (1/2)(2)(2)^2 = 4 J.
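[Editor's note: Werner's example in two lines of Python, a sketch of the same nonlinearity point.]

```python
def kinetic_energy(m, v):
    """E = (1/2) m v^2, in joules for kg and m/s."""
    return 0.5 * m * v ** 2

# Correct: the difference of the energies, not the energy of the speed difference.
correct = kinetic_energy(2, 3) - kinetic_energy(2, 1)  # 9 - 1 = 8 J
wrong = kinetic_energy(2, 3 - 1)                       # 4 J
print(correct, wrong)  # 8.0 4.0
```

The same trap applies to T^4: the fourth power of an average is not the average of fourth powers.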

Werner Brozek
August 27, 2012 10:19 am

stephen richards says:
August 27, 2012 at 1:23 am
How do you know what the “normal” is?

You raise an excellent question. What is to stop any one from deciding that the average of the last 30 years was normal and that all anomalies from 1850 to 1980 were below normal? The facts would not change, but there would be a huge difference as to how things are perceived!

Gail Combs
August 27, 2012 10:21 am

Scarface says: @ August 27, 2012 at 9:02 am
…Well, the more I read and learn about it, I would dare to say that the science is septic.
___________________________________
That is the most precise summing up of the state of climate science I have seen so far. (I am glad I had just finished off my glass of iced tea before reading that given I am still laughing)

brad tittle
August 27, 2012 10:34 am

Thank you — This will fall on deaf ears again though.

JJ
August 27, 2012 10:36 am

David Ball says:
For the lay people out there, it is impossible to know if something is an anomaly if you have no understanding of what is “normal”.

You misunderstand the concept. The use of anomalies as critiqued by David is not intended to measure ‘man made’ or as an objective measure of ‘unusual’ or as deviation from an ideal. It is just a method of expressing measurements as differences from an arbitrary baseline, so that change can be assessed across sites that have different baseline values. It is not normative.
As David points out, there can be problems with the use of temperature anomalies for various purposes, but there is nothing intrinsically wrong with expressing temperatures as anomalies from an arbitrary baseline.

Steve C
August 27, 2012 11:58 am

JJ says that “The use of anomalies … is just a method of expressing measurements as differences from an arbitrary baseline … and … is not normative.”
I disagree. To anyone who speaks English well enough to have met or used the word ‘anomaly’, there is always the connotation in its meaning that something ‘anomalous’ is wrong. If you are talking about ‘differences from an arbitrary baseline’ in any field of normal science, you will use a word like … well, ‘difference’, or ‘variation’ or whatever, a word which does not suggest wrongness, just … variation. Only in climate science is it felt necessary to apply the subliminal brainwashing effect of “blah, blah, man-made greenhouse gases, blah, blah, anomaly blah blah”. The repeated juxtaposition of anything man-made and anomaly is just programming the public to feel that (whatever it is they’re blaming this time out) is causing things to be wrong.
No. They haven’t even shown yet that mankind has had any measurable effect on global climate (inasmuch as that phrase has any meaning), still less that anything is wrong. Thus far, all these “anomalies” with which we are beaten are merely modest differences from an arbitrary baseline – and nowhere near going outside the normal variability of Earth’s climatic parameters. There are NO ‘anomalies’, within the correct linguistic use of that word. Just Newspeak.
Nice post, David. Thanks.

Entropic man
August 27, 2012 12:23 pm

Stephen Rasey says:
August 27, 2012 at 9:23 am
@Monckton: when the climate has finished responding to a given forcing (typically after 1000-3000 years):
Whoa! Where did “1000-3000 years” to equilibrate come from?
When we have a KE=8 volcanic eruption, we get “a YEAR without summer”. A year — not a millennium. On a planet with day-night temperature swings, with seasonal cycles, any instantaneous forcing is quickly blended into the atmosphere. The evidence from volcanoes is that the relaxation from the impulse is on the order of a couple of years, not millennia. Speculation that forcings are carried into the deep ocean for 1000 years is just that: speculation by disparate advocates trying to square the circle of their own making.
—————————
A volcano provides an impulse which propagates across the planet in a few weeks and then persists until the material injected into the atmosphere is removed, usually within a year or so. There is not time for a large heat loss from the ocean, so the effect of the eruption shows mainly in air temperatures and damps out quickly.
The forcings under discussion are sustained over decades. Since water has a high heat capacity, only a sustained increase in air and land surface temperature will produce a significant and long-term change in sea temperatures.

woodNfish
August 27, 2012 12:33 pm

There is no such thing as a “global temperature”. The entire concept is bogus nonsense.

August 27, 2012 1:07 pm

@Entropic Man
Perhaps we should be clearer about what we call a forcing. In evaluating a system response, you should consider the output from an impulse. A volcano is a clear impulse at a climate scale and the data supports a response and relaxation time of just a couple of years.
The addition of a marginal 10 GT of CO2 into the atmosphere in a summer is another forcing. Since 450 GT are cycled per year through the biosphere, I would argue that the system response is also likely quick.
But I disagree that an extra 10 GT/year of CO2 is ONE forcing. That is a step change input, not an impulse. It is equivalent to repeated, continuous forcings. If you argue that this step change is one forcing, then you must hold that a volcano is two forcings, one positive, one negative, an absurd concept.
So it seems that “forcings” is a term used for impulses (i.e. volcanoes), balances (solar constant, CO2 in atmosphere), rates (x GT CO2/year added to atmosphere), and accelerations (Y GT CO2/year^2, CO2 balance in atmosphere under non-zero rates of addition). To call them all forcings seems a bit imprecise.
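[Editor's note: the impulse-versus-step distinction drawn here can be sketched with a toy model. Nothing below comes from the thread itself; the first-order "lumped heat capacity" equation, the parameter values, and the forcing shapes are all illustrative assumptions, chosen only to contrast the two input types.]

```python
# Illustrative sketch: a first-order lumped model, C*dT/dt = F(t) - lam*T,
# integrated with a simple Euler step. All parameter values are arbitrary,
# picked only to contrast an impulse forcing with a sustained step forcing.

def respond(forcing, years=50, C=8.0, lam=1.0, dt=0.01):
    """Return the temperature trajectory for a forcing function F(t)."""
    steps = int(years / dt)
    T = 0.0
    out = []
    for i in range(steps):
        t = i * dt
        T += dt * (forcing(t) - lam * T) / C
        out.append(T)
    return out

impulse = lambda t: 10.0 if t < 1.0 else 0.0   # volcano-like: one "year" only
step    = lambda t: 1.0                        # sustained, e.g. added CO2

Ti = respond(impulse)
Ts = respond(step)

# The impulse response relaxes back toward zero with time constant C/lam,
# while the step response settles at its equilibrium value F/lam = 1.0.
print(round(Ti[-1], 3), round(Ts[-1], 3))
```

In this toy picture the volcano-like impulse produces a spike that decays away within a few time constants, while the sustained input keeps pushing until the system equilibrates, which is the distinction the comment is after.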

Stu N
August 27, 2012 1:38 pm

@ Stephen Rasey “Perhaps we should be clearer about what we call a forcing. In evaluating a system response, you should consider the output from an impulse”
No, you should consider the output from a step change. That’s a big difference! The equilibrium response to a forcing is only achieved if the forcing is maintained until the system is no longer changing in response. Once again, Wikipedia to the rescue: http://en.wikipedia.org/wiki/Climate_sensitivity#Equilibrium_and_transient_climate_sensitivity
The climate system does not have time to equilibrate to the impulse from a large volcanic eruption because the material causing the reduction in energy absorbed at the surface is mostly gone within a couple of years.

Theo Goodwin
August 27, 2012 2:51 pm

Excellent article. I hope to see more articles from you. I have followed your posts for some years and you have an excellent understanding of the climate debates and of scientific method.

August 27, 2012 3:06 pm

N: The equilibrium response to a forcing is only achieved if the forcing is maintained until the system is no longer changing in response.
That’s not right. If it is a true step function, the “forcing” is applied at t >= t(a); forever! If you stop the forcing at t = t(b), then you are in fact applying a second negative step function superimposed on the first. That would be a box-car input.
http://en.wikipedia.org/wiki/Impulse_response.

Entropic man
August 27, 2012 3:44 pm

Stephen Rasey says:
August 27, 2012 at 1:07 pm
@Entropic Man
Perhaps we should be clearer about what we call a forcing. In evaluating a system response, you should consider the output from an impulse. A volcano is a clear impulse at a climate scale and the data supports a response and relaxation time of just a couple of years.
The addition of a marginal 10 GT of CO2 into the atmosphere in a summer is another forcing. Since 450 GT are cycled per year through the biosphere, I would argue that the system response is also likely quick.
—————————
We may be playing with semantics here. I attempted to distinguish between a short term change like the release of gas and ash from a volcano and the longer term changes associated with a change such as increased CO2.
The volcano generates a short term decrease in surface insolation, detectable for a year or two as a downward blip in the air and surface temperature record which quickly damps out. The high thermal inertia of the oceans means that they are hardly affected.
A change such as doubling CO2 over 100 years produces a gradual increase in energy flow, with each increment of CO2 producing a small increment of air and land temperature, with little or no lag. In the short term this will be hard to distinguish from the normal short term variation, but it becomes apparent over decades. Sea temperature will lag considerably behind, acting to slow the overall rate of change and leading to a delay of up to 100 years before the system reaches equilibrium for the new CO2 concentration.
You do see differences in CO2 in a season. The Hawaiian data shows a seasonal minimum in CO2 during the Northern Hemisphere summer, though its effects on seasonal temperatures in the two hemispheres are hard to pick out of the noise.

Greg House
August 27, 2012 4:00 pm

TimC says:
August 27, 2012 at 4:40 am:
More fundamentally, I agree that global annual-average surface temperature is not a very good metric by which to measure global warming or cooling – but what is to replace it?
============================================
It is not necessary to replace it with another bad method; it would be enough if “climate scientists” admitted it publicly. And I would not care if they then lost their climate jobs.

Richdo
August 27, 2012 4:46 pm

“As I said in the article itself, my point was not to debate the physical processes themselves, but to demonstrate that temperature anomalies as measured at the surface are unsuitable for tracking those processes.”
Well put David. Thanks for the interesting article. It’s nice to see a guest post from you, I always enjoy your comments.
Rich Dommer

August 27, 2012 4:48 pm

Mr. Rasey asks where the value 1000-3000 years to restore equilibrium after a forcing comes from. It is to be found in a 2009 paper by Susan Solomon, the lead author of the IPCC’s 2007 “science” report. Dr. Solomon had not thought through the implications of so long a time to equilibrium: what it means is that the headline equilibrium warming of 3.26 K per CO2 doubling (even if correct) will not be reached for at least 1000 years, with only half of it occurring this century.
Mr. Hoffer says surface temperature anomalies are unsuitable: however, the IPCC’s adjustment of the value of the Planck parameter allows them to be used with little error.
The real problem with the IPCC’s projections lies in its assumption of strongly net-positive temperature feedbacks that are not warranted either by theory or by any empirical method. Warming since the first IPCC report in 1990 has occurred at half of the IPCC’s then central estimate. Though a decade is not a long time in climate politics, it is now appropriate to question the reliability of the IPCC’s central estimates of future global warming.

August 27, 2012 5:01 pm

@Entropic
I remind you the issue was Monckton’s statement: when the climate has finished responding to a given forcing (typically after 1000-3000 years):
That implies to me that the system response to an impulse is 1000-3000 years. Volcano data implies a much faster response. Now, if you want a 200-year string of gigantic input forcings, well, I grant you it might be 1000 years before the system settles down, especially since it is a non-linear biologic system.
But “finished responding to a given forcing”? 1000 years seems to me to be a little long for “a given forcing.” That was my point. Making the forcings huge doesn’t invalidate it.

Entropic man
August 27, 2012 5:16 pm

Stephen Rasey says:
August 27, 2012 at 5:01 pm
1000 years seems to me to be a little long for “a given forcing.” That was my point. Making the forcings huge doesn’t invalidate it.
—————————-
We probably won’t see this settled until the data comes in. See you in a hundred years, or a thousand. Loser buys the drinks.

Greg House
August 27, 2012 5:40 pm

Monckton of Brenchley says:
August 27, 2012 at 4:48 pm:
“The real problem with the IPCC’s projections lies in its assumption of strongly net-positive temperature feedbacks that are not warranted either by theory or by any empirical method.”
=======================================================
Simple logic requires the supporters of the “greenhouse gases warming” concept to agree with the notion of “net-positive temperature feedbacks”, because it is already a part of the concept.
As a second step they need to see that the “net-positive temperature feedbacks” would lead to a pretty much endless increase in temperatures and are therefore absurd.
Then they have no other choice than to conclude that the premise (greenhouse gases warming) is false. This method is called “reductio ad absurdum”.
And please, Christopher, do not rush to write a reply, take some time and think over it.

ericgrimsrud
August 27, 2012 5:48 pm

[Snip. Not referring to WUWT readers in derogatory terms like “the peanut gallery” would assist in getting your comments approved. ~dbs, mod.]

Spector
August 27, 2012 6:10 pm

This sounds like the case of confusing average temperature and average energy. It is similar to confusing average voltage with average energy in a situation where all voltages are positive.
You may have seen a rating that says 110 Vrms. RMS is short for [square] Root of the Mean [average] Square[d values]. This usage is based on the fact that energy flow (power) is proportional to the square of the voltage if the resistance remains constant, so the square root of the average (usually over time) of the squares of the observed voltages gives a single constant voltage that would dissipate the same energy as the actual values did. In this case negative values are allowed for the voltages. This term is often used to specify the direct current (DC) voltage that is equivalent to an applied alternating current (AC) voltage.
Most radiation calculations are based on the local equilibrium requirement that net energy flow coming in must be equal to the net energy flow going out. As energy flowing out is proportional to the fourth power of the temperature, the sum of all energy flows is proportional to the sum of all temperatures raised to the fourth power. If one takes the average of these values over the surface of the Earth, one gets a nominal average energy flow or power per unit surface area required to sustain equilibrium.
An inverse calculation from this energy flow gives the equivalent *uniform* absolute surface temperature required for equilibrium. This result is, of course, not an arithmetic mean temperature. It is the fourth root of the average of all absolute temperatures after each has been raised to the fourth power. This might be called a ‘QRMQ’ (Quad-Root of the Mean Quad power) average. You probably would not want to plan your daily activity on such an average, nor, for that matter, would you want to do this on the basis of an overall simple arithmetic average.
I doubt if you will ever see a published paper that will have numbers like 271 Kqrmq, but that is what is meant by overall absolute temperatures derived from average radiated power.
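[Editor's note: the distinction between the arithmetic mean and the "QRMQ" average can be shown numerically. The sample temperatures below are invented for illustration; "QRMQ" follows the comment's own coinage.]

```python
# Sketch of the two averages Spector describes, using a hypothetical
# set of absolute surface temperatures (invented for illustration).
temps_K = [220.0, 250.0, 288.0, 300.0, 310.0]

arith = sum(temps_K) / len(temps_K)
qrmq = (sum(T**4 for T in temps_K) / len(temps_K)) ** 0.25

# For any non-uniform temperature field the QRMQ average exceeds the
# arithmetic mean (power-mean inequality), which is why an average of
# radiated power does not invert to the arithmetic-mean temperature.
print(round(arith, 1), round(qrmq, 1))  # about 273.6 vs about 279.4
```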

davidmhoffer
August 27, 2012 6:19 pm

Monckton of Brenchley,
I don’t see that the manner in which the IPCC adjusts anomaly data excuses the manner in which HadCrut and GISS use it to calculate global average temps, but I am interested to know more as to exactly what they are doing vis a vis Planck’s constant. Have you got a link or other pointer to the details?

ericgrimsrud
August 27, 2012 6:39 pm

As Tim C said,
“I agree that global annual-average surface temperature is not a very good metric by which to measure global warming or cooling – but what is to replace it?”
Our “defined” method of measuring our “global average temperature” is useful because it provides a single number that reflects temperature CHANGES that occur on Earth with time. As long as the defined method (whatever we choose to use) includes a very large number of surface measurements recorded throughout the world, and the measurement sites are selected to be distant from obvious local sources of heating or cooling, the so called “temperature change” thereby obtained is very likely to be useful in monitoring climate changes.
By injecting fundamental physical equations, such as the Stephan-Boltzman equation, into this discussion appears to be meaningless and inappropriate to me.
Again, these discussions seem to be designed more to “impress” the lay readers of this website than to clarify. Again an explanation to the contrary would be welcomed by those of us who do not see a legitimate point in this post.

davidmhoffer
August 27, 2012 6:56 pm

By injecting fundamental physical equations, such as the Stephan-Boltzman equation, into this discussion appears to be meaningless and inappropriate to me.
>>>>>>>>>>>>>>>>>>>>>>
Fundamental physics is meaningless and inappropriate in the climate debate? Seriously?
LOL
BTW, it is Stefan-Boltzmann.

Amino Acids in Meteorites
August 27, 2012 7:02 pm

Michael Lowe
OMG, that Feynman story is so funny!

Amino Acids in Meteorites
August 27, 2012 7:06 pm

Bill Illis
Very nice comment! Worthy of a guest post!!!
http://wattsupwiththat.com/2012/08/26/lies-damn-lies-and-anoma-lies/#comment-1065802

ericgrimsrud
August 27, 2012 7:27 pm

What I meant, of course, was that the S-B equation has nothing to do with the extra surface level warming – of about 30C (global average) – that is caused by the greenhouse effect.

davidmhoffer
August 27, 2012 7:37 pm

What I meant, of course, was that the S-B equation has nothing to do with the extra surface level warming – of about 30C (global average) – that is caused by the greenhouse effect.
>>>>>>>>>>>>>>>>>>>>>>
I see. Everything in the universe obeys SB Law except the greenhouse effect. Got it.

Sleepalot
August 27, 2012 8:25 pm

That 33K for the “greenhouse effect” comes from the same kind of mis-application of the SB equation.
A blackbody lit on one side by 1000 W/m^2 and dark on the other has temperatures of 364 K and 0 K which gives an average of 182 K.
A blackbody lit on both sides by 500 W/m^2 has a temperature of 306 K on each side and an average of 153 K.
QED.

Greg House
August 27, 2012 9:24 pm

Sleepalot says:
August 27, 2012 at 8:25 pm:
“A blackbody lit on both sides by 500 W/m^2 has a temperature of 306 K on each side and an average of 153 K.”
============================================
(306 + 306)/2 = …

Spector
August 27, 2012 9:41 pm

The following chart shows the calculated effect of increasing the CO2 content from 300 to 600 PPM on energy flowing out of the troposphere in a static environment without feedback effects. This was calculated by MODTRAN, a program developed by the Air Force to test their equipment.
http://en.wikipedia.org/wiki/File:ModtranRadiativeForcingDoubleCO2.png
As can be seen, this huge increase in CO2 just produces a slight widening of the CO2 ‘hole,’ which is like a one-foot diameter tree in the middle of a ten-foot wide stream. It can be seen that the CO2 band is almost fully saturated and masks most of the effect of the CO2. This shows the raw effect without feedbacks. The IPCC has postulated that the atmosphere has a dangerously high positive feedback factor that doubles or triples the raw effect. Others (Dr. John Christy, for example) have argued that feedbacks, if any, are more likely to be negative.
Some economists are predicting that high extraction costs for increasingly more difficult to obtain carbon fuels will begin to curtail their use after a generation or two, so we can expect at most, one full anthropogenic doubling of CO2 in the atmosphere.
The Fate of All Carbon
by David Archibald
http://wattsupwiththat.com/2011/11/13/the-fate-of-all-carbon/
In the past, the climate has gone through wide temperature shifts without human intervention and there is every reason to expect that it will continue to do this in the future.

Sleepalot
August 27, 2012 10:27 pm

Yep, well spotted. So it’s 306 K. The point is that it’s different from 182 K, so it shows the method is invalid.
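[Editor's note: for readers following the arithmetic, a short sketch using the standard Stefan-Boltzmann inversion, T = (F/sigma)^(1/4), reproduces both figures, including the corrected 306 K.]

```python
# Sketch reproducing the blackbody arithmetic in this exchange, with the
# corrected 306 K figure. Stefan-Boltzmann: T = (F / SIGMA) ** 0.25.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def bb_temp(flux):
    """Equilibrium blackbody temperature for an absorbed flux in W/m^2."""
    return (flux / SIGMA) ** 0.25

# One side lit with 1000 W/m^2, the other side dark (0 K):
lit = bb_temp(1000.0)            # ~364 K
avg_one_sided = (lit + 0.0) / 2  # ~182 K

# Both sides lit with 500 W/m^2 each:
both = bb_temp(500.0)            # ~306 K everywhere

print(round(lit), round(avg_one_sided), round(both))
```

Same total absorbed power in both cases, yet the arithmetic-mean temperatures differ by well over 100 K, which is the point of the example.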

Al Tekhasski
August 27, 2012 10:45 pm

Dr. Eric Grimsrud wrote:
“Our “defined” method of measuring our “global average temperature” is useful because it provides a single number that reflects temperature CHANGES that occur on Earth with time. As long as the defined method (whatever we choose to use) includes a very large number of surface measurements recorded throughout the world and the measurement sites are selected to be distant from obvious local sources of heating or cooling, the so called “temperature change” thereby obtained is very likely to be useful.”
It was demonstrated to you that the average “temperature” can go up while the energy balance goes down. The idea of this construction does not change if you are using only two areas or a “large number of surface” areas (unless you invoke additional unproven assumptions about the statistics of the temperature field). It is just as easy to construct a case where the “global average temperature” goes up while the energy balance also goes up. Upon a little thought it must be obvious that there could be cases when the energy balance stays unchanged, but the “global average temperature” goes either up or down. It can also be that the energy content does change, but the “global average temperature” stays constant. As one can see, the “global temperature change” is not a “proxy” for change in the planetary energy balance. Then how can it possibly be useful, Dr. Grimsrud?

TomVonk
August 28, 2012 1:43 am

Our “defined” method of measuring our “global average temperature” is useful because it provides a single number that reflects temperature CHANGES that occur on Earth with time.
This is of course an absolutely bogus statement, because a global average reflects nothing of the sort.
Even worse, as Al Tekhasski rightly says above, variations in a global temperature average do not even correctly reflect the SIGN of the change in energy content.
This is mathematically so trivial that one should not even have to mention it.
Here is why the global temperature average is irrelevant to ANY energy and flow consideration.
The dynamics for the surface temperatures are defined by the heat equation with imposed flows on the free surface.
Solving this equation gives a unique temperature and a unique energy flow for every point of the surface.
The key concept to understand that there is a one to one correspondence between the temperature surface field and the energy flow field.
Now what happens if one looks only on the temperature average?
There is no more a one to one correspondence between this one temperature average number and the energy flow.
It has been destroyed by the averaging.
What we have instead is a one to many correspondence, actually a one to infinitely many.
That means for a given global temperature average there is an infinity of possible flows so that it is not possible to even know the sign of internal energy variation knowing the sign of average temperature variation.
That is why looking at temperature averages is not only useless but misleading, because people might be led to believe that an increasing average implies increasing internal energy (or increasing flows), which is an absolutely wrong inference.
Besides, that is what the data shows too. The “global” warming is anything but global. Actually about one third of the sites see decreasing temperatures while two thirds see increasing temperatures. The average trivially increases, but it still doesn’t say anything about energy or flows unless one looks locally at every single point of the globe, where the one to one correspondence is restored again.

John Marshall
August 28, 2012 3:20 am

Is Boltzmann’s Law the correct law to use?
I do not think so, because the important bit is the statement “at equilibrium”: at equilibrium we get a black body response, but the planet is never at equilibrium, which is one reason we get weather.
The effect of CO2 in atmospheric heating, as proposed by the GHG theory, is understandable but wrong, since the laws of thermodynamics must not be violated. It is also an observed fact that the mid troposphere heat anomaly proposed by the GHG theory has never been found despite daily checks with radiosondes. The radiation “window” should close with rising atmospheric CO2, another bit of GHG theory, to raise temperatures at the surface. Satellite observation shows no such change.
The planet has no meaningful average temperature, so anomalies are of no real use. Also, temperature is a poor metric for climate, since it takes no account of heat content, which depends on the water content of the air mass under study. It is heat that drives atmospheric systems and processes.

Bob Layson
August 28, 2012 4:21 am

Off this topic but always relevant to websites such as this. Any warming due to industrialisation, and there may have been none, has to be shown to have made the world less habitable to people now living in it – and living lives dependent upon industrialisation – if said warming is to be a justification for alarm and for moves to shackle growth by banning low-cost energy production. Since industrialisation has led to more people living healthier and longer lives, plainly the world has not become a more difficult place for humans equipped as we now are to live in. Habitability is not somthing Nature given but somthing won by Mankind.

Bob Layson
August 28, 2012 4:25 am

Make that ‘something’.

Stu N
August 28, 2012 5:25 am

@ Stephen Rasey
I said: “The equilibrium response to a forcing is only achieved if the forcing is maintained until the system is no longer changing in response.”
You replied: “That’s not right. If it is a true step function, the “forcing” is applied at t >= t(a); forever!”
That’s not right? My statement is self-defining! Anyone else agree with Stephen?
Anyway, yes, to consider the equilibrium response the forcing is applied forever – or at least as long as it takes for the system to reach equilibrium. That is the only way the true equilibrium response of the climate system could be reached. Now, in real life, the climate is always undergoing a transient response to a whole bunch of forcings. In your volcano example, you only see the transient response to a forcing that is itself a function of time, peaking a few weeks after the eruption and becoming negligible a few years later.
“If you stop the forcing at t = t(b), then you are in fact applying a second negative step function super imposed on the first. That would be a box-car input.”
Yep, but I never specified whether or when the original forcing would stop. It’s irrelevant once equilibrium has been reached anyway, because you’ve got the value you’re seeking – the equilibrium response.

August 28, 2012 6:14 am

Reblogged this on The GOLDEN RULE and commented:
Reblogging, as I have said before, is a form of laziness, from an editorial point of view. Yet, when another blog/person publishes a post that I agree with and says basically what I would like to say myself, it makes sense to do it the easy way and simply promote the said post by reblogging.
An attempt will be made to come back to this post with comments and links to posts and comments I have made in the past. Much of what David says is in tune with previous comments on this blog. It is nice to get some supporting, scientific input.

ericgrimsrud
August 28, 2012 7:25 am

To TomVonk,
In response to my comment:
Our “defined” method of measuring our “global average temperature” is useful because it provides a single number that reflects temperature CHANGES that occur on Earth with time.
You then said:
“This is of course absolute bogus statement because a global average doesn’t reflect anything such”
I don’t get your point. Can we assume that a true global average surface temperature would be the average of temperatures measured at every square inch of the planet? Would that not be the very definition of an average global temperature?
Now of course, we can’t do that continuously every second around the clock at every square inch of our planet’s surface, can we? So we do the next best thing – we make such measurements at as many points on the planet’s surface as we can – and hope that those measurements provide a valid measurement of the “true” global average.
Now, if our current measurement system does not provide a measurement of that true value, it would be simply because we do not have enough measurement sites in place, right? So the question is simply: do we have enough measurement stations in place that we are getting a useful measure of the global average?
Thus, I do not see the point of your remarks that followed in your comment of 1:43 am.

davidmhoffer
August 28, 2012 8:14 am

Can we assume that a true global average surface temperature would be the average of temperatures measured at every square inch of the planet?
>>>>>>>>>>>>>>>>>>>>>>>>>>
For the purposes of understanding energy balance, no we cannot. Different temperature distributions could result in exactly the same “average temperature” while delivering very different power fluxes. To understand what is happening from an energy balance perspective, one would have to first convert each point in time temperature reading to w/m2, then average and trend that. Attempting to trend either temperature or temperature anomalies as a proxy for energy balance is not mathematically sound.

Greg House
August 28, 2012 8:31 am

ericgrimsrud says:
August 28, 2012 at 7:25 am:
“Now of course, we can’t do that continuously every second around the clock at every square inch of our planet’s surface, can we? So we do the next best thing – we make such measurements at as many points on the planet’s surface as we can – and hope that those measurements provide a valid measurement of the “true” global average.
Now, if our current measurement system does not provide a measurement of that true value, it would be simply because we do not have enough measurement sites in place, right?”
===============================================
No, it is not right. You need to have a representative sample. There is a sample, but there is no proof that this sample is representative for the whole world.

ericgrimsrud
August 28, 2012 8:41 am

Sorry, but I still do have the impression that a measurement of the average global temperature and changes in it over time is of considerable value and even of primary importance.
For example, the fact that the average global surface temperature of Venus is near 400C and that of Earth is near 15C tells me a great deal about existing conditions on these two planetary surfaces. I understand that Venus once had Earth-like temperatures. If scientists existed there at that time and had monitored the average surface temperatures of their planet, I suspect that those measurements would have been useful to them for seeing what was happening to their planet – even though those measurements might not have directly revealed the “energy balance” of Venus.
I am getting the feeling that a purpose of this thread is to diminish the perceived importance of simple, credible and easily understood surface temperature measurements – in favor of more obtuse measurements along with heavy doses of theory. Sorry, but when I think I might be running a fever, I will get out my thermometer first – before heading for the journals of medicine.

August 28, 2012 8:51 am

ericgrimsrud says:
“Again an explanation to the contrary would be welcomed by those of us who do not see a legitimate point in this post.”
Sir, the temperature of a location tells only a portion of the story about the actual energy content in the volume of air at that location. Please look up enthalpy. Once you understand that, you will grasp why many persons think finding the average temperature of the Earth is a fool’s errand.

Greg House
August 28, 2012 9:04 am

ericgrimsrud says:
August 28, 2012 at 8:41 am:
“Sorry, but I still do have the impression that a measurement of the average global temperature and changes in it over time is of considerable value and even of primary importance.”
===============================================
But look what you wrote 1 posting ago: “So we do the next best thing – we make such measurements at as many points on the planet’s surface as we can – and hope that those measurements provide a valid measurement of the “true” global average.”
HOPE! You can hope whatever you wish, but selling a hope as a scientific fact is a scientific/political fraud.

ericgrimsrud
August 28, 2012 10:19 am

Greg, Good to hear from you again. Am glad that you think there is more than mere “hope” associated with our AGW problem. I am not so optimistic, but don’t let me weigh you down. Please do continue to look for absolute certainty and I sincerely “hope” you find it before it’s too late (which it might already be). Eric

Werner Brozek
August 28, 2012 10:52 am

ericgrimsrud says:
August 28, 2012 at 8:41 am
Sorry, but I still do have the impression that a measurement of the average global temperature and changes in it over time is of considerable value and even of primary importance.

In physics, I find it is often easy to take an extreme case and see the truth there. Then apply that truth generally. So let us assume we have two thermometers, one of which is 100 m down in the Pacific ocean and the other is 100 m up in the air at any location. Now let us assume the two thermometers read 24 C at a certain time. So the average is 24 C. Now let us assume that exactly one year later, the same thermometers were read and the one in the Pacific read 22 C and the one in air read 28 C. The average would now be 25 C. Now considering that the heat capacity of all the water on Earth is about 1000 times that of air, can you really say the thermal energy of the Earth went up and we need to take drastic steps to reduce it? Or is it more appropriate to take a weighted average of 22 in water and 28 in air to arrive at a weighted average of 22.03 or whatever it turns out to be?
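[Editor's note: the two-thermometer arithmetic can be checked directly. A sketch, using the comment's rough 1000:1 ocean-to-air heat-capacity ratio; the weighted value works out near 22.006, close to the rough figure given above.]

```python
# Sketch of the two-thermometer example: one reading 100 m down in the
# ocean, one 100 m up in the air. The 1000:1 weight is the comment's
# stated rough ratio of water to air heat capacity.
w_water, w_air = 1000.0, 1.0

def weighted_avg(t_water, t_air):
    return (w_water * t_water + w_air * t_air) / (w_water + w_air)

naive_before = (24.0 + 24.0) / 2   # 24.0
naive_after  = (22.0 + 28.0) / 2   # 25.0 -- naive average suggests warming

weighted_after = weighted_avg(22.0, 28.0)
print(round(weighted_after, 3))    # dominated by the ocean reading
```

The naive average rises by a full degree while the heat-capacity-weighted average actually falls, which is the point of the extreme case.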

ericgrimsrud
August 28, 2012 11:25 am

Werner,
That would depend on what you were trying to measure – temperature or heat content.
If one was interested in knowing what the average global surface temperature was, one would try to measure the surface temperatures of all places on the Earth’s surface (including the oceans) and take the average of all of those measurements, of course.
And, of course, if one was interested in knowing where the heat of the Earth is stored, one would take the temperature of all matter on the Earth, and multiply that number by the heat capacity and total mass of that form of matter.
And further, if one also wanted to know something about energy flow between all of these places as well as flow out into the universe, one would have to consider all of the known methods of energy flow, including conduction, convection, and radiation.
But what’s the point of all of this? Sure heat content and energy transport are important and interesting subjects, but I thought we were talking merely about temperature.

Greg House
August 28, 2012 11:52 am

ericgrimsrud says:
August 28, 2012 at 10:19 am:
“Greg, Good to hear from you again. Am glad that you think there is more that mere “hope” associated with our AGW problem.”
=============================================
What “AGW problem”, Eric? You have just devalued the “measurements” of “GW”, in your own words: “So we do the next best thing – we make such measurements at as many points on the planet’s surface as we can – and hope that those measurements provide a valid measurement of the “true” global average.”
And it is not the “best” thing, Eric, in science it is the worst thing.

davidmhoffer
August 28, 2012 11:55 am

but I thought we were talking merely about temperature.
>>>>>>>>>>>>>>>>
We are talking about temperature as a proxy for energy balance.
Two points each at 300K have an average temperature of 300K and an equilibrium energy flux of 459.27 w/m2. Points with temps of:
300 and 300 => average temp = 300K, average w/m2 = 459.27
295 and 305 => average temp = 300K, average w/m2 = 460.04
290 and 310 => average temp = 300K, average w/m2 = 462.33
Three different scenarios that are the exact same average temperature yet have three different average power fluxes. It is not possible to rely on temperature averages alone and come up with a meaningful number that helps us understand energy balance. The only way this can be done is to convert ACTUAL temperatures (not anomalies) into w/m2, and average and trend that result.
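The three pairs above can be checked directly; the quoted figures imply the rounded Stefan-Boltzmann constant 5.67e-8 W/m2/K4:

```python
# Same average temperature, different average emission, per T^4.
SIGMA = 5.67e-8  # W/m^2/K^4 (rounded value implied by the figures above)

def avg_flux(t1, t2):
    """Average blackbody emission of two points at temperatures t1, t2 (kelvin)."""
    return SIGMA * (t1**4 + t2**4) / 2

for pair in [(300, 300), (295, 305), (290, 310)]:
    print(pair, round(avg_flux(*pair), 2))  # 459.27, 460.04, 462.33
```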

BillD
August 28, 2012 1:26 pm

Anomalies are just the best, least biased, most objective and most comparable method for averaging temperature over time and space, whether it be for a state, country or planet. The goal of using anomalies is to compare changes, not to say how or why they occurred. So, I don’t understand how a critique of anomalies is related to CO2 (or possibly, sun spots). If we are going to talk about climate, it’s important to be able to quantify whether factors such as temperature and precipitation are increasing or decreasing over a region or even the whole planet. Rather than giving up on studying climate, we use anomalies. If you really believe that calculating anomalies are a problem, then describe an alternative.

David Ball
August 28, 2012 4:03 pm

BillD says:
August 28, 2012 at 1:26 pm
It the baseline that is the problem, not the anomalies. I tried to simplify it so that everyone could understand what is meant by anomaly ( in climate science ), but apparently it has served to confuse.

dallas
August 28, 2012 5:23 pm

Lord Monckton,
I am not that sure that the IPCC adjustments for anomalies are adequate. If the AQUA sea surface data is correct, the average true temperature of ~70% of the Earth’s surface is approximately 21.1 C (294.25 K), with an S-B equivalent radiant energy of 425 W/m2. The revised surface energy estimate is 396 W/m2. That would require a fairly large adjustment for the use of anomalies on land, especially at a higher average elevation.
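A quick check of the S-B arithmetic quoted above (sigma = 5.67e-8 assumed):

```python
SIGMA = 5.67e-8  # W/m^2/K^4

def sb_flux(t_kelvin):
    """Blackbody emission at temperature t_kelvin."""
    return SIGMA * t_kelvin**4

# 21.1 C ~ 294.25 K does indeed correspond to roughly 425 W/m2:
print(round(sb_flux(294.25), 1))
```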
Since the models started diverging circa 1995 when the northern oceans reach a relative equilibrium with the southern oceans, that could explain a large portion of the error. It is at least worth looking into more closely.
And 1951-1980 choice of baseline appears to be the worst possible now that there is satellite data to use.
http://redneckphysics.blogspot.com/2012/08/degrees-of-confusion-another-modest.html

davidmhoffer
August 28, 2012 5:27 pm

BillD;
So, i don’t understand how a critique of anomalies is related to CO2
>>>>>>>>>>>>>>>>>>>>>>>>
CO2’s effects are measured in w/m2, not degrees. There needs to be a mechanism to convert between the two in order to measure actual effects versus theorized effects. The point of this thread is that there is no straightforward conversion between the two.
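One way to see why there is no single conversion factor: linearizing Stefan-Boltzmann gives dT ≈ dF / (4·sigma·T^3), so the same flux change produces different temperature changes at different baseline temperatures. A sketch (the 3.7 w/m2 figure is from the head post; the baseline temperatures are my own illustrative choices):

```python
SIGMA = 5.67e-8  # W/m^2/K^4

def dT_for_forcing(dF, t_kelvin):
    # Linearized Stefan-Boltzmann: dF = 4 * sigma * T^3 * dT
    return dF / (4 * SIGMA * t_kelvin**3)

# The same 3.7 W/m2 maps to different temperature changes:
for T in (255, 288, 300):
    print(T, "K ->", round(dT_for_forcing(3.7, T), 2), "K")
```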

David Ball
August 28, 2012 5:29 pm

It “is” the baseline, of course.

ericgrimsrud
August 28, 2012 5:51 pm

BillD nailed this thread when, on
August 28, 2012 at 1:26 pm, he said:
“Anomalies are just the best, least biased, most objective and most comparable method for averaging temperature over time and space, whether it be for a state, country or planet. The goal of using anomalies is to compare changes, not to say how or why they occurred.”
A little bit of common sense sure does go a long way and I hope that we continue to report good old temperatures at places X, Y, and Z – as well as average temperatures at places X, Y and Z – as well as record temperatures at points X, Y, and Z – as well as all of the above averaged over the entire planet, if possible.

Gail Combs
August 28, 2012 6:15 pm

ericgrimsrud says:
August 28, 2012 at 8:41 am
Sorry, but I still do have the impression that a measurement of the average global temperature and changes in it over time is of considerable value and even of primary importance….
I am getting the feeling that a purpose of this thread is to diminish the perceived importance of simple, credible and easily understood surface temperature measurements – in favor of more obtuse measurements along with heavy doses of theory. Sorry, but when I think I might be running a fever, I will get out my thermometer first – before heading for the journals of medicine.
____________________________
I thought you were a chemist and therefore should understand enthalpy and the concept of measuring the correct parameter.
Here is an easy real life example.
1.) If I want to figure out the maximum weight a horse/pony can carry do I just measure the height? (easy)
2.) Do I look at the circumference of the cannon bone, the length of the back, weight and the age of the animal? (correct)
I have animals that are tall and have long weak backs that can not carry near the weight my short coupled, big boned stocky Shetland can. Given the ponies with weak backs will bite, balk or buck the incorrect analysis has fast feedback. Too bad “Climate Scientists” don’t get bitten in the rear when they make stupid mistakes.
(I am a chemist who finally quit in disgust after being asked to falsify data one too many times.)

Werner Brozek
August 28, 2012 7:04 pm

BillD says:
August 28, 2012 at 1:26 pm
If you really believe that calculating anomalies are a problem, then describe an alternative.

My understanding is that there are two huge problems with anomalies:
1. The humidity has a huge effect on the total heat but is not accounted for, and
2. There is a difference in the amount of heat it takes to heat something from -40 to -39 than from +40 to +41.
One way to get around the first problem is to just use the sea surface temperatures and not land temperatures. In case you are wondering what the sea surface temperatures show, it is that there has been no warming since January 1997, exactly what RSS shows. See:
http://www.woodfortrees.org/plot/rss/from:1997/plot/rss/from:1997/trend/plot/hadsst2gl/from:1997/plot/hadsst2gl/from:1997/trend
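Werner’s second point can be illustrated radiatively (a sketch, taking -40 C and +40 C as roughly 233 K and 313 K, sigma = 5.67e-8): the extra emission implied by a one-degree rise at the cold end is well under half that at the warm end.

```python
SIGMA = 5.67e-8  # W/m^2/K^4

def extra_flux_for_one_degree(t_kelvin):
    """Increase in blackbody emission when t_kelvin rises by one degree."""
    return SIGMA * ((t_kelvin + 1)**4 - t_kelvin**4)

cold = extra_flux_for_one_degree(233.15)  # about -40 C: ~2.9 W/m2
hot = extra_flux_for_one_degree(313.15)   # about +40 C: ~7.0 W/m2
```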

Gail Combs
August 28, 2012 7:07 pm

As I said in the article itself, my point was not to debate the physical processes themselves, but to demonstrate that temperature anomalies as measured at the surface are unsuitable for tracking those processes.

I just want to say thank you David. I am amazed that your point about temperature being an unsuitable parameter is actually contested. It makes me think the calculations have been done correctly and they show information the IPCC would rather the public at large does not know, otherwise why fight the point?

davidmhoffer
August 28, 2012 7:28 pm

A little bit of common sense sure does go a long way and I hope that we continue to report good old temperatures at places X, Y, and Z – as well as average temperatures at places X, Y and Z – as well as record temperatures at points X, Y, and Z – as well as all of the above averaged over the entire planet, if possible.
>>>>>>>>>>>>>>>>>>>>>
Who the heck said ANYTHING about not recording them? The entire thread is about using the information derived from them correctly.

davidmhoffer
August 28, 2012 10:16 pm

Gail Combs;
It makes me think the calculations have been done correctly and they show information the IPCC would rather the public at large does not know, otherwise why fight the point?
>>>>>>>>>>>>
Well, I think that for the models themselves, they probably are handled correctly. Monckton of Brenchley certainly seems to think so. I’ll be looking into it myself to see exactly how they handle it, but if Monckton said the sky had turned purple with fluorescent polka dots, I’d go outside and look for myself before disputing him, so I’m taking his word for it at the moment. Temperature trends like those presented by NASA/GISS and HadCrut on the other hand are simple area-weighted averages. They don’t take into account SB Law nor humidity, so are of little value in my mind to quantify energy balance. Nor do they take into account heat of transition, so all that ice that is melting in the arctic and being created in the antarctic also skews the numbers and isn’t picked up in any way by simple temperature trending.
That said, I don’t see a deliberate attempt to manipulate the numbers in this regard. I’m a big fan of never attributing to malice what can be explained by incompetence.

Steve Richards
August 29, 2012 1:55 am

@ericgrimsrud,
Yes, a single figure for the average temperature of the surface land mass is simple and easy to use, but, while the precise details of how the planet warms and cools are being worked out, it does not help in understanding these mechanisms if the main measure, average temperature, is wholly unsuitable for the obvious reasons given earlier.
The crux is: define the power flow to/from the earth, and what influences it. Is it CO2, other gases, clouds driven by particles or what?
There are 1001 parameters that influence power flows to/from the earth. Each one needs to be understood.
Power flows are the key to understanding the mechanisms of earth climate.
Average temperatures are a poor proxy for power flow…
This thread has become significant.

TomVonk
August 29, 2012 2:54 am

To Ericgrimsrud
I am not sure that you really understood what is criticised here. In the original post but mostly in the comments.
Of course no sane person denies the EXISTENCE of an average. Of any average.
You have N numbers? Just add them up and divide the result by N. Here you go. Works for people’s weights, tomato diameters, temperatures. Any list of numbers you want. Trivial and uninteresting in most cases.
The point is this your statement :
Sorry, but I still do have the impression that a measurement of the average global temperature and changes in it over time is of considerable value and even of primary importance.
And as it has been abundantly demonstrated here, including in my post, this statement is obviously wrong.
So again I will give a list of questions on which the global temperature average and its variations does NOT answer, can NOT answer and even worse, can mislead to give wrong answers.
– The value of the internal energy and the sign of its variation
– The value of radiation flow and the sign of its variation. Not even its average.
– The value of the convective heat transfer and the sign of its variation. Not even its average
– The value of humidity and the sign of its variation. Not even its average. Here, just for fun, the following statement may be true: “The average temperature increased and the average humidity decreased”.
– And more.
Science is science because it can lead to correct predictions, preferably about the future.
Because the global temperature average and its variations cannot answer the questions above, it can lead to absolutely NO prediction. Amusingly not even to its own.
So how do you call a parameter which is useless, misleading and irrelevant to anything that is important ?
Well about every scientist will call it garbage.
You apparently call it a parameter “of primary importance”, which is exactly the opposite and of course obviously wrong.
And the reason why it is garbage is precisely because a given global temperature average corresponds to an infinity of dynamical states and to an infinity of possible evolutions that mutually contradict themselves both in value and in sign.
The only possible use of a global average is a tautological inference: “If the global temperature average increased/decreased then T2 is greater/smaller than T1”.
I hope you agree that this inference is trivial, quite stupid and with no added value.

sergeiMK
August 29, 2012 4:59 am

dallas says: August 28, 2012 at 5:23 pm
And 1951-1980 choice of baseline appears to be the worst possible now that there is satellite data to use.
Please check this plot using numerous different baseline dates. It should be obvious that there is no change in slope (or wiggles). The only change is in offset from zero for each baseline.
This makes no difference when considering how much the world has cooled/warmed from 1965 to 2012. All those different baselines will come up with the same result.
http://tinyurl.com/buk8e24
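sergeiMK’s point is straightforward to verify: subtracting a different baseline mean shifts the whole anomaly series by a constant, which cannot change a least-squares slope. A toy series (invented numbers for illustration):

```python
def slope(ys):
    """Least-squares slope of ys against x = 0, 1, 2, ..."""
    n = len(ys)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

temps = [14.2, 14.5, 14.1, 14.8, 14.9, 15.1]
anom_a = [t - sum(temps[0:3]) / 3 for t in temps]  # baseline: first three values
anom_b = [t - sum(temps[3:6]) / 3 for t in temps]  # baseline: last three values
# Different offsets, identical trend.
```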

August 29, 2012 6:09 am

sergeiMK, “This makes no difference when considering how much the world has cooled/warmed from 1965 to 2012. All those different baselines will come up with the same result.”
True, there would be no difference in the amount the world has warmed or cooled by changing the baseline of the already averaged GMT, but it would make a difference as to what is suspected of causing the warming or cooling when the individual regions are compared with different base lines as I did in the link. For example in the 1940s, there is an abrupt drop in temperature in the tropics then a steady decline in temperature in the northern hemisphere. Is that consistent with man made aerosol cooling? Since the amplitude of the NH swing is so much larger than the rest of the world, selecting a baseline period where the NH is in a valley would show a larger increase in temperature than would selecting a neutral portion of the NH swing. The 1951 to 1980 baseline selection indicates more warming than a 1931-1960 baseline.
So I would think it would be worth the effort to determine what should be normal before declaring how abnormal things are.

BillD
August 29, 2012 6:13 am

I think that changes in temperature and precipitation are important to know and that anomalies are the best way of comparing and quantifying these changes. As a scientist, I say that the “baseline” is arbitrary, but that it should not be frequently changed, since that makes comparisons confusing. It’s also better if everyone uses the same baseline, so that we are not confused by anomalies that differ because the baseline is different. Measuring changes in radiative forcing (Watts/m2), water temperature and humidity can all help in our understanding of climate. However, we don’t need to answer all questions at once. Climate and weather are complicated so we will never find a single number that explains everything.

August 29, 2012 7:39 am

BillD, Using a consistent baseline would be nice. Since the satellite period limits what can be used as a baseline, what do you do?
http://redneckphysics.blogspot.com/2012/08/baseline-impact.html
It does matter as far as visual impact goes, and the further the anomaly values are from the baseline, the greater the potential error.
Then the downside of a consistent baseline would be limiting the fun of cherry picking 🙂

ericgrimsrud
August 29, 2012 7:56 am

OK, so would those of you who say that average global surface temperatures are meaningless and that the anomalies thereby observed are nothing more than “anoma-lies” please suggest a measure that would replace those simple and readily understood temperature measurements?
Now let me also guess here what that suggestion is likely to be – if anything meaningful can be envisioned. It will surely be something very complex that the public (as well as many scientists) would not be able to understand, and would be subject to determination by a combination of measurements and theory. Because of its uncertainty and large error bars, it too could then be considered by the AGW contrarians to be of little practical value!! Mission accomplished – we then would no longer have any measurement that reliably tells us whether the Earth’s temperature is changing!!! And all this is possible if the contrarians can just get those damn temperature measurements off the table.
Excuse me once more, but there seems to be something fishy behind this post.

Smokey
August 29, 2012 8:14 am

ericgrimsrud says:
“…contrarians …contrarians…”
Exactly what is that label supposed to mean? The fact is that everything we observe today is fully explained by natural climate variability. That is the default position; the null hypothesis.
Attempting to impute extraneous variables as the cause of entirely natural fluctuations violates Occam’s Razor. To throw an unnecessary variable like “carbon” into the explanation of natural variability only muddies the waters, and leads to a wrong conclusion. Thus, it is you who is a ‘contrarian’. You believe you see things that are just not there. You are only fooling yourself – the easiest person in the world to fool.

Venter
August 29, 2012 8:18 am

Smokey
Eric Grimsrud is not fooling himself. He is fully aware of what he is doing. He and his band of fanatics are specifically out to fool others with this AGW scam, especially innocent people, with deliberate intent. Don’t assume that these people are innocent and are honest.

David Ball
August 29, 2012 8:19 am

ericgrimsrud says:
August 29, 2012 at 7:56 am
It is fishy because you are clearly trying to misunderstand. Judging from your posts, I cannot figure out if it is on purpose, or through ineptitude. After reading all your posts, I have come to the conclusion it is the latter, ….

David Ball
August 29, 2012 8:27 am

“A little bit of common sense sure does go a long way and I hope that we continue to report good old temperatures at places X, Y, and Z – as well as average temperatures at places X, Y and Z – as well as record temperatures at points X, Y, and Z – as well as all of the above averaged over the entire planet, if possible.”
Have you even read Anthony’s paper? Are you also aware that there are approximately 4000 fewer monitoring stations than there were 40 years ago? Check out The Chiefio – E.M. Smith. Your knowledge about this issue is sorely lacking.

David Ball
August 29, 2012 8:35 am

Venter says:
August 29, 2012 at 8:18 am
I agree wholeheartedly. However, there are thousands of “lurkers” who frequent this site. I post stuff for people like ericgrimsrud, knowing full well that he will NOT read it, but many lurkers who are interested in learning ALL about this subject, will. I believe David M Hoffer posts a lot for this very reason. It is why Anthony has ALL the weather and sun data on his sidebar. Even some that may conflict with our presumptions. THAT is science. We have the courage to face and seek out ALL the information. What ericgrimsrud is doing is politics, not science.

davidmhoffer
August 29, 2012 8:58 am

ericgrimsrud;
Excuse me once more, but there seems to be something fishy behind this post.
>>>>>>>>>>>>>>>>>>>>>>>>>
That’s the best you can do Eric? It seems fishy? You cannot dispute the factual information provided, so you call it fishy? Sit back Eric and think for a moment as to what that says, not about the science being discussed in this thread, but what it says about you. As for your question as to what should “replace” temperature, it should be obvious from the extensive discussion that temperature CANNOT be replaced, but that there are appropriate ways of handling and interpreting that temperature data. There have been several explanations in this discussion. What value is there in asking me to provide an alternative and belittling it before I provide it when perfectly viable alternatives have already been posted in this thread and which you are free to comment on? Why is it that in each and every thread in which we converse, it ends this same way, with you throwing around comments like “it seems fishy” and failing to actually engage in the discussion of facts and science itself? Do you have a PhD in Chemistry or not? If so, then I suggest you start acting like it.
ALL
I’m unlikely to respond further to Eric Grimsrud until he grows up and engages in the discussion like the adult with a PhD in Chemistry that he claims to be. Over the course of the last few weeks and across several threads, he has accused Anthony of taking money from “Big Oil”, he has accused me of taking money from Anthony for the purpose of poisoning the science, he has engaged in repeated ad hominem attacks including calling both richardscourtney and me “feces” and is a self proclaimed sock puppet for the Union of Concerned Scientists.

Werner Brozek
August 29, 2012 10:33 am

ericgrimsrud says:
August 29, 2012 at 7:56 am
Excuse me once more, but there seems to be something fishy behind this post.

If the earth had warmed up from something like 15 C to the temperature of Venus, then I would agree with you. However it is presumed that a warming of only 0.7 C occurred in the last 150 years. And when you factor in how much of this 0.7 C is due to faulty weather stations, the UHI effect, thermometers disappearing, and the warming of extremely dry and cold air in the north polar region and how GISS handles it, then it is a very legitimate question to ask how much energy the earth really gained over the last 150 years.

Greg House
August 29, 2012 5:05 pm

ericgrimsrud says:
August 29, 2012 at 7:56 am:
“OK, so would those of you who say that average global surface temperatures are meaningless and that the anomalies thereby observed are nothing more than “anoma-lies” please suggest a measure that would replace those simple and readily understood temperature measurements?”
===================================================
Eric, there is apparently no reliable scientific method to find out whether the so called “global temperature” changes or not.
Do you really suggest we should use something unreliable and sell the result as a scientific fact? Don’t you think it would be a fraud?

TomVonk
August 30, 2012 3:44 am

To ericgrimsrud
Mission accomplished – we then would no longer have any measurement that reliably tells us whether the Earth’s temperature is changing !!!
You apparently decided to stubbornly misunderstand, so that one wonders why you even posted here in the first place.
Nobody needs global temperature averages to know that temperatures change. They trivially do, have done so for 4 billion years and will do so for the next 4 billion years and beyond.
Perhaps somebody finds it fun to put a number on it. You could compute a geometrical average too, why not?
The point what everybody is trying to explain to you is that this information about a global average is irrelevant aka useless.
I notice that you carefully avoided engaging with my detailed explanation of why it was useless.
It is useless because it explains nothing and predicts nothing.
It cannot even be used to falsify theories, because an infinity of theories – all but one of which will be wrong – will show the same averages.
So while it may have a value to satisfy the curiosity of somebody who wonders whether temperatures change or not and is not bright enough to already know that they do, it is of no value whatsoever to a scientist who asks questions about the dynamics of the system.
The parameter which encodes the relevant information is the whole temperature field – its average is irrelevant to everything.
For this reason it is of course important to measure temperatures and their distribution but it is absolutely not interesting to compute their arithmetical (or geometrical for that matter) averages. They are just mathematically accurate but physically meaningless numbers.

richardscourtney
August 30, 2012 8:16 am

David:
Thank you for your fine article.
I am now back in contact with the web and I write to point out something which directly pertains to your article. I have been saying this for years in many places (including on WUWT and in my review comments for the IPCC AR4). It is:
The observation of global temperature changes over the twentieth century could be entirely an effect of a change to the rate of horizontal heat transfer across the surface of the globe.
This possibility is because, as you point out, radiative flux is proportional to T^4. And, for radiative balance, the Earth must radiate energy to space at the same rate as it obtains energy from space (i.e. from the Sun). But the Earth receives most energy (per unit area) near the equator and emits least energy (per unit area) near the poles. Heat is transferred by oceans (and air) polewards from the tropics. Indeed, the tropics are net absorbers of radiation and the polar regions are net emitters of radiation.
Hence, a change to horizontal heat transfer (e.g. by oceans) across the globe will alter both the high and the low surface temperatures over the globe’s surface. These changes to local temperatures will provide a change to global temperature when the total radiative flux is a constant because radiative emission is proportional to T^4.
This possibility is important because it negates the ‘argument from ignorance’ which is used to support AGW. That argument says the rise in observed global temperature cannot be explained without including effects of increased GHGs (notably CO2) in the atmosphere. But it can: a change to the rate of heat transfer across the globe is an alternative explanation. And nobody knows what alters oceanic flows (e.g. ENSO behaviour cannot be predicted although the development of an initiated ENSO event can be predicted).
Richard
PS My use of ENSO as an illustration was not intended to invite discussion of epicycles.
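Richard’s mechanism can be shown with toy numbers (288 K and 303 K are my own illustrative values, not from his comment): hold the total T^4 emission of two equal areas fixed and make the temperature distribution more uneven, and the arithmetic-mean temperature falls even though the energy budget is unchanged.

```python
def mean_temp_for_fixed_emission(t_hot):
    # Two equal areas whose total T^4 emission equals that of a uniform
    # 288 K surface. Given the hot area's temperature, solve for the cold
    # area's temperature and return the arithmetic mean.
    total_t4 = 2 * 288.0**4
    t_cold = (total_t4 - t_hot**4) ** 0.25
    return (t_hot + t_cold) / 2

uniform = mean_temp_for_fixed_emission(288.0)  # 288.0: uniform case
uneven = mean_temp_for_fixed_emission(303.0)   # ~286.6: same emission, lower mean
```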

davidmhoffer
August 30, 2012 9:58 am

richardscourtney;
Interesting comment. I came to very nearly identical conclusions via a somewhat different line of reasoning. The problem is disentangling two different processes.
We know from ERBE data and other sources that the arctic regions are net emitters of radiation and the tropics are net absorbers. For this to be true, energy must be transferred from tropics to arctic regions via air and water currents. We can actually measure those too, so no doubt that this is what happens.
So now the conundrum. When we observe a change in temperature in the arctic region, how do we quantify how much is due to direct effects of CO2 increases, and how much is due to changes in the amount of energy being transported from tropics to arctic regions?
The more I dig into the physics at this point, the worse it gets. For starters, the putative 3.7 w/m2 from CO2 doubling isn’t uniform. CO2 only has 150 w/m2 of upward bound LW to work with in the arctic regions, it has a whopping 450 w/m2 to work with in the tropics. So, the distribution of the 3.7 w/m2 direct effects of CO2 cannot be uniform. From there the math gets ugly. From an energy retention perspective, the change due to CO2 doubling in the tropics must be much higher than it is in the arctic regions, but because of the relationship to T^4, the temperature change is lower. Talk about counter intuitive! But worse still, the movement of energy from tropics to arctic regions must necessarily be driven by temperature differential which creates the high/low pressure cells and convective cells and so on that move air and water which in turn move energy around. So if the temperature differential between tropics and arctic zones gets reduced, then we’d expect LESS energy to be moved? Again, counter intuitive, because although less air and water gets moved, the energy density is higher…. at which point my head starts to hurt….

richardscourtney
August 30, 2012 10:54 am

David:
Thank you for the reply to me which you provide at August 30, 2012 at 9:58 am.
As you say, the issues are difficult to isolate but – with respect – that is my point.
For sake of illustration, consider that the tropics were to cool by 0.1 deg.C. Then the rest of the world must warm to maintain radiative output – and by more than 0.1 deg.C, because the tropics are the hottest region and radiative output is proportional to T^4.
Please note that I am not claiming the tropics have cooled. Indeed, the tropical oceans are near their limit of maximum temperature of ~31 deg.C. This limit seems to be a result of increased evaporation and cloud cover with increased thermal input (from any source). Therefore, the existence of this limit implies that variations in tropical temperatures are improbable.
However, the same effect would occur if the region of maximum temperature were to vary in area.
Importantly, there is no way to know if this variation happened over the last century. But it may have happened. And if it did then it may have happened such as to provide all the observed global warming over the last century.
Therefore, it is not true that increased atmospheric GHG concentrations are required to explain global warming over the last century. It would be equally true to assert that altered temperature distribution is required to explain global warming over the last century. In fact, neither statement is true: both effects – or neither effect – may have contributed to the observed global warming.
In my opinion, the major importance of your post is that it draws attention to the fallacious nature of the assertion that increased atmospheric GHG concentrations are required to explain global warming over the last century. Whenever that assertion is made it can be refuted by pointing out the other – also unlikely – possibility.
Richard
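The 0.1 deg.C illustration can be made concrete (tropics at 303 K and rest of world at 288 K over equal areas are my toy values): cooling the tropics by 0.1 K while holding total T^4 emission constant forces the rest of the world to warm by somewhat more than 0.1 K, because emission scales with T^4.

```python
def rest_temp_after_tropics_cooling(t_tropics=303.0, t_rest=288.0, cooling=0.1):
    # Two equal areas. Cool the tropics and solve for the rest-of-world
    # temperature that keeps total T^4 emission unchanged.
    total_t4 = t_tropics**4 + t_rest**4
    return (total_t4 - (t_tropics - cooling)**4) ** 0.25

new_rest = rest_temp_after_tropics_cooling()
warming = new_rest - 288.0  # a bit more than the 0.1 K of tropical cooling
```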

sergeiMK
August 30, 2012 2:02 pm

Data has been replotted for base periods from 1931 to 1995. The results are plotted here:
http://climateandstuff.blogspot.co.uk/2012/08/the-effect-on-slope-using-base-period.html
Using a base period of
1931 0.087K per decade
and 1961 0.077K per decade
So 1961 to 1991 seems a fair period to normalise to.

Brian H
August 31, 2012 4:32 pm

Richard;
A horizontal change in distribution of humidity would accomplish the same. Maybe moreso!

D.Mayer
September 6, 2012 2:53 am

I am a lurker and I agree with ericgrimsrud, this article is fishy. And I believe it is justified to say that about an article which starts in the very beginning with a rather strong accusation of “lie”. Under this premise, there is no need to get personal towards Eric, David (August 29, 2012 at 8:58 am), since you set the tone yourself already.
So, again, why is this article fishy, also in my view: Because David denies a well developed method any possible merits. And David offers no better alternative. So, behind all the scientific elaborations, he is on the simple rhetorical trip of replacing something with nothing. As far as I am aware, scientists are well aware that they have to work a lot with assumptions and uncertainties and that they nevertheless have to try to link factors based on this. This does not make them liars, and I believe this is still better than trying to assume – nothing.

Horst G Ludwig
September 10, 2012 9:51 am

So what’s the point? To turn down the climate warning or to establish a new philosophy yourself? What’s it going to be with good old human observation keeping book by decades by now and testifying numberless climate changes without counting photons? Count bacterial and viral migrations if you really want to know about truth. There is no global math of any value or meaning but general.
Climate is of local importance in the continental part of the globe and the chain reaction to good or bad attached. Guess what, human kind is pretty much pending on this. And tell the US citizen. Half mid east burned down this year, sub soil dry as powder already. Oh yes, the number of photons on global output…. Seriously, we got to learn to be responsible while talking!