Anomalies are an unsuitable measure of global temperature trends
Guest post by David M. Hoffer
An anomaly is simply a value that is arrived at by comparing the current measurement to some average measurement. So, if the average temperature over the last 30 years is 15 degrees C, and this year’s average is 16 degrees, that gives us an anomaly of one degree. Of what value are anomalies? Are they a suitable method for discussing temperature data as it applies to the climate debate?
On the surface, anomalies seem to have some use. But the answer to the second question is rather simple.
No.
If the whole earth were at a single uniform temperature, we’d have no need of anomalies. But the fact is that temperatures don’t vary all that much in the tropics, while variations in the high temperate zones are frequently as much as 80 degrees over the course of a year. How does one compare the temperatures of, say, Khartoum, which on a monthly basis ranges from an average of 25 degrees to 35 degrees C, to, say, Winnipeg, which might range from -40 in the winter to +40 in the summer?
Enter anomalies. By establishing a baseline average, usually over 30 years, it is possible to see how much temperatures have changed in (for example) winter in Winnipeg, Canada versus summer in Khartoum. On the surface, this makes sense. But does the physics itself support this method of comparison?
It absolutely does NOT.
The theory of CO2’s direct effects on earth’s surface temperature is not terribly difficult to understand. For the purposes of this discussion, let us ignore the details of the exact physical mechanisms as well as the order and magnitude of feedback responses. Let us instead assume that the IPCC and other warmist literature is correct on that matter, and then see if it is logical to analyze that theory via anomaly data.
The “consensus” literature proposes that direct effects of CO2 result in a downward energy flux of 3.7 watts/m2 for a doubling of CO2. Let’s accept that. Then they propose that this in turn results in a temperature increase of one degree. That proposal cannot be supported.
Let us start with the one degree calculation itself. How do we convert watts/m2 into degrees?
The answer can be found in any textbook that deals with radiative physics. The derivation of the formula requires some in-depth understanding of the matter, and for those who are interested, Wikipedia has as good an explanation as we need:
http://en.wikipedia.org/wiki/Stefan%E2%80%93Boltzmann_law
For the purposes of this discussion however, all we need to understand is the formula itself, which is:
P = 5.67*10^-8 * T^4 (P in w/m2, T in degrees K)
It took ground-breaking work in physics to come up with that formula, but all we need to use it is a calculator.
For the mathematically inclined, the problem ought to be immediately obvious. There is no linear relationship between w/m2 and temperature; power varies with T raised to the fourth power. That brings up an obvious question: at what temperature does a doubling of CO2 cause a rise in temperature of one degree? If we take the accepted average surface temperature of earth, +15 degrees C (288 degrees K), simply applying the formula shows that it is NOT at the average surface temperature of earth:
For T = 288K
P = 5.67*10^-8*288^4 = 390.1
For T = 289K (plus one degree)
P = 5.67*10^-8*289^4 = 395.5
That’s a difference of 5.4 w/m2, not 3.7 w/m2!
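For anyone who wants to check the arithmetic, here is a minimal Python sketch of the same calculation (nothing assumed beyond the constant and temperatures quoted above):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, w/m2 per K^4

def flux(t_kelvin):
    """Black-body radiated power in w/m2 at temperature t_kelvin."""
    return SIGMA * t_kelvin ** 4

# A one-degree rise at earth's average surface temperature (288 K)
print(round(flux(288), 1))              # 390.1
print(round(flux(289), 1))              # 395.5
print(round(flux(289) - flux(288), 1))  # 5.4 -- not 3.7
```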
So, how does the IPCC justify their claim? As seen from space, the earth’s temperature is not defined at earth surface, nor can it be defined at the TOA (Top of Atmosphere). Photons escaping from earth to space can originate at any altitude, and it is the average of these that defines the “effective black body temperature of earth” which turns out to be about -20 C (253 K), much colder than average temperatures at earth surface. If we plug that value into the equation we get:
253K = 232.3 w/m2
254K = 236.0 w/m2
236.0 – 232.3 = 3.7
There’s the elusive 3.7 w/m2 = 1 degree! It has nothing to do with surface temperatures! But if we take this analysis a step further, it gets even worse. The purpose of temperature anomalies in the first place was supposedly to compare temperature changes across different temperature ranges. As we can see from the analysis above, since w/m2 means very different things at different temperature ranges, this method is completely useless for understanding changes in earth’s energy balance due to a doubling of CO2.
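As a cross-check, a few more lines can search for the temperature at which a one-degree rise corresponds to 3.7 w/m2; the search lands on the effective black-body temperature, not the surface temperature:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant

def flux(t):
    return SIGMA * t ** 4

# Whole-degree temperature where a 1-degree rise adds closest to 3.7 w/m2
best = min(range(200, 320), key=lambda t: abs(flux(t + 1) - flux(t) - 3.7))
print(best)                                   # 253, i.e. about -20 C
print(round(flux(best + 1) - flux(best), 2))  # 3.69 w/m2
```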
To illustrate the point further, at any given time some parts of earth are actually in cooling trends while others are in warming trends. By averaging temperature anomalies across the globe, the IPCC and “consensus” science have concluded that there is an overall positive warming trend. The following is a simple example of how easily anomaly data can report not only a misleading result but, worse, in some cases a result the OPPOSITE of what is happening from an energy balance perspective. To illustrate, let’s take four different temperatures and consider their values when converted to w/m2 via the Stefan-Boltzmann law:
-38 C = 235K = 172.9 w/m2
-40 C = 233K = 167.1 w/m2
+35 C = 308K = 510.3 w/m2
+34 C = 307K = 503.7 w/m2
Now let us suppose that we have two equal areas, one of which has an anomaly of +2 due to warming from -40 C to -38 C. The other area at the same time posts an anomaly of -1 due to cooling from +35 to +34.
-38 C anomaly of +2 degrees = +5.8 w/m2
+35 C anomaly of -1 degree = -6.6 w/m2
“averaged” temperature anomaly = +0.5 degrees
“averaged” w/m2 anomaly = -0.4 w/m2
The temperature went up but the energy balance went down? The fact is that because temperature and power do not vary directly with one another, averaging anomaly data from dramatically different temperature ranges provides a meaningless result.
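To make the bookkeeping explicit, here is a minimal sketch of that two-area example (equal areas assumed, temperatures as above):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant

def flux(t):
    return SIGMA * t ** 4

# (before, after) in kelvins: one area warms from -40 C to -38 C,
# the other cools from +35 C to +34 C
areas = [(233, 235), (308, 307)]

temp_anomaly = sum(after - before for before, after in areas) / 2
flux_anomaly = sum(flux(after) - flux(before) for before, after in areas) / 2

print(temp_anomaly)            # 0.5  -- "warming", in degrees
print(round(flux_anomaly, 1))  # -0.4 -- less energy radiated, in w/m2
```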
Long story short, if the goal of measuring temperature anomalies is to try to quantify the effects of CO2 doubling on earth’s energy balance at the surface, anomalies from winter in Winnipeg and summer in Khartoum simply are not comparable. Trying to average them and draw conclusions about CO2’s effects in w/m2 simply makes no sense and produces a global anomaly that is meaningless.
“An anomaly is simply a value that is arrived at by comparing the current measurement to some average measurement.”
I would agree, to a degree, because comparing to an average is a weak method, but more generally an anomaly is an abnormal value when compared to a history of values. The definitions of “abnormal” are many, and may include comparing to a distribution of values (for example, n-sigma from the mean), or the distance from n-nearest neighbors (where n may be 1 or the entire sample set), or a violation of a rule, or frequency-based measures (near the mean, but does not happen often, as in the case of multi-modal distributions), or a normal range, etc. Anomalies may be computed in one dimension or many, and to truly determine an anomaly one may need to look at it in many ways, not just “distance” from “average”.
Greenhouse warming theory is like a Russian matryoshka doll; there is always a deeper fraud inside.
Speaking of which, when did the word “average” get replaced with “normal” by our weather overlords?
Excellent analysis, David. This corresponds with my understanding as well.
The rest of this derives from the assumption that the anomalies live in L2 and parametric statistics can describe them. Weather and climate are highly coupled non-linear processes. Such processes are by their nature chaotic and live in a dimension less than L2. Second moments just plain do not exist. Parametric correlations do not exist. You can do rank correlations and that sort of non-parametric measure, but that is about all you can say with certainty. The tails of the distribution are way too fat. The hundred year flood comes way too often. To get hysterical about those hundred year floods is an admission that the person who does so does not know statistics.
What if the average is itself an anomaly?
wot jocky scot said…
And that begs the question: Is an anomaly of an anomaly an anomaly?
Happens often when you manipulate intensive variables in isolation.
I take some heart from your essay, David… and I expect others will, too… those who have asked from time to time on WUWT? why anomalies are used at all.
The explanations have always been profound, but have always left me with a deeper furrow on my brow than before the explanation was given… that and a sense that I must be rather dense.
Perhaps I am brighter than I thought?
Whichever; I now feel I am at least in good company.
A while back, I think it was on the Blackboard, I expressed how the hair on the back of my neck goes up when people talk anomalies. Anomalous to what and when is my first question. Secondly, why is it that ALL anomalies are bad? History is chock-full of anomalies.
Anyhow, Zeke H. explained to me, in many, many, many words how an anomaly is a normalization of a data-set, and I fell asleep and gave up (I am not sure in which order).
This post will be in my bookmarks. Thank you David!
“The temperature went up but the energy balance went down?”
FOOLS. Don’t you know that the IPCC, the Useless Nations, NASA, NOAA, Mann, etc., and the eco-cultists have re-written the laws of thermodynamics? Sheesh. Get with the pogro….er…..program.
And where the simple anomaly number really makes a difference is in the Arctic. Suppose for example that the Arctic warmed by 2 C over the last 60 years. It would make a huge difference if the low and dry temperature of 245 K warmed by 2 C and the high and more moist temperature of 275 K also warmed by 2 C, or if the lower, drier temperature warmed by 4 C and the high temperature was unchanged. I believe it is the latter that is closer to the truth. And if the warming of cold and dry air is driving global anomalies, we must take these with a grain of salt.
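A quick sketch of that example under the post’s black-body assumption (a simplification, since real emission also depends on emissivity and humidity):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant

def flux(t):
    return SIGMA * t ** 4

# The same 2 C of warming means very different flux changes
print(round(flux(247) - flux(245), 1))  # ~6.8 w/m2 at the cold, dry end
print(round(flux(277) - flux(275), 1))  # ~9.5 w/m2 at the warmer, moister end
```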
That reminds me of this quotation:
I get the point of the post, and I agree that the method is lacking.
My comment won’t exactly be on point. But, is it really settled that co2 increases will cause temperature increases? I don’t think it is.
Reginald Newell, MIT, NASA, IAMAP, co2 and possible cooling
1 minute video
“Trying to average them and draw conclusions about CO2′s effects in w/m2 simply makes no sense and produces a global anomaly that is meaningless.”
Agreed. Saying adding co2 to the atmosphere will cause warming, and then seeing manmade co2 is on the rise and temperature is on the rise, is not enough to prove that proposed guess. The price of milk went up at the same time. So can I also conclude the price of milk controls temperature?
Short and sweet, David.
Just think how the discrepancies multiply if you consider that the actual range of temperatures on earth’s surface is much greater than the numbers you chose for your example locations.
Vostok Station has been measured officially below -128 deg F, with anecdotal evidence of nearby higher-altitude points as low as -130 deg F or -90 deg C (183 K), while at the exact same time, tropical desert locations have air temperatures over 135 deg F, and surface temperatures of maybe 140 deg F or +60 deg C (333 K).
The S-B total radiant emittance then covers a range of more than 11:1, and temperatures near those extremes can and do exist at exactly the same moment, since the dark of the Antarctic winter night is the northern summertime. Even more dramatic is the result if you calculate the peak of the spectral radiant emittance over that temperature range, which is where the bulk of the radiant energy is emitted.
That goes as T^5, not T^4, which gives almost a 20:1 ratio. And at those highest temperatures, the wavelength of the peak emission has moved from around 10.1 microns at 288 K to around 8.75 microns, which is further inside the “atmospheric window”, making CO2 even less important.
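Those ratios are easy to verify with a few lines (using the Wien displacement constant of roughly 2898 micron-kelvins; the T^5 scaling of the spectral peak follows from Planck’s law):

```python
# Black-body emission ratios between the extremes quoted above
t_cold, t_hot = 183.0, 333.0  # kelvins

print(round((t_hot / t_cold) ** 4, 1))  # 11.0 -- total emittance ratio, T^4
print(round((t_hot / t_cold) ** 5, 1))  # 20.0 -- peak spectral emittance, T^5

# Wien displacement law: wavelength of peak emission, in microns
WIEN = 2898.0  # micron-kelvins
print(round(WIEN / 288, 2))  # ~10.06 microns at 288 K
print(round(WIEN / 333, 2))  # ~8.7 microns at 333 K
```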
There is also another issue. The disciples of the greenhouse gremlin are willing to swear on a stack of bibles that earth’s atmosphere DOES NOT radiate ANY “black body” radiation, since gases cannot do so; only solids (rocks) and liquids (oceans) do. Well, clouds are water or ice, so clearly clouds can and must emit thermal continuum radiation depending on the cloud temperature per the S-B law. Otherwise only the surface can emit thermal radiation, so the non-cloud part of the atmosphere can only emit the LWIR spectra of the various GHGs, which is NOT spectrally dependent on temperature to first order.
So earth’s external emission spectrum should reflect the earth surface temperatures, with their higher values and shorter wavelength range, not some 253 K signature spectrum.
Well of course, the non-cloud atmosphere does radiate a thermal continuum spectrum, characteristic of its temperature and quite independent of its composition; but its density is so low that even the total atmospheric optical absorption density doesn’t come close to the total absorption required of a “black” body. The deep oceans absorb 97% in about 700 metres, with around 3% surface Fresnel reflection, so they make a pretty good imitation of a black body radiator.
So when James Hansen re-adjusts the historic baseline temperatures for “discrepancies”, as he seems fond of doing, does he apply the exact same fudge factor to the actual historic temperature readings taken back then? That would seem to be essential, or else a fictitious, discrepancy-generated anomaly change would appear every time he discrepancizes the baseline reference.
From an exploration geochemistry perspective (my background), an anomaly is a value significantly different to those historically or geographically around it, expressed in the same, correct units and with due consideration to noise and distribution of values.
Whereas the occasional anomaly in a climate science time series tends to be smoothed away in long series, it can be a pity to downplay the anomaly (expressed this way), because it is information-rich. The “what” in “What made it different?” can unravel puzzles.
The use of “anomaly” to denote a property like temperature that has been elevated to an artificial baseline is relatively new to me, but I don’t like it. Once the statistician starts to truncate number sets, many types of possible analysis are subsequently invalid. So, David, the arguments you present are both correct and needed. The examples are quite well chosen, thank you.
I have some residual confusion about the mechanics of calculating anomalies in the climate change sense. If there is a single site with a single thermometer, it is easy to select and to adopt a 30-year reference period. (it certainly should not be called a normal. It is an artifice). The math to reduce the set to ‘anomalies’ is then trivial – until someone makes an adjustment. Pardon my ignorance, but for an area with many sites, is the ‘normal’ the average of all of the sites taken over that time, a constant that is then subtracted one-by-one from each site; or is each site first converted to anomaly form, then the composite calculated by averaging the residuals?
In any event, any change to the number of observations in the reference term, as happens frequently, will produce a new normal and a new anomaly string. Given that people like Ross McKitrick have published about the change in the number of sites used in the CONUS, has the “normal” been recalculated day after day as the number of observations changes, or are we free of errors from this type of mechanism?
What is the situation on normals when, as Australia had earlier this year, a completely different, revised time series (named Acorn) is adopted? If it becomes the official, accepted version, does this mean that uses such as proxy calibration have to be recalculated, because the time-temperature response in the calibration period has been changed?
Reblogged this on Climate Ponderings and commented:
“averaging anomaly data from dramatically different temperature ranges provides a meaningless result.”
You knit deviation with mean power to a tee. Fore!
Absolutely spot on – I’ve been aware of this irreconcilable “anomaly” for several years. Average temperature does not give average radiation, and vice-versa. The inherent error over the surface of the Earth may well be equal (or greater) in magnitude to the “warming effect” of a doubling of CO2.
However, Kiehl & Trenberth 2009 didn’t calculate outgoing surface radiation from an average surface temperature, as is often supposed. They used gridded temperature data over the surface to calculate outgoing radiation for each grid cell, averaging the results (though that has its own problems, too detailed to go into here).
The 579.8 w/m2 and 587.1 w/m2 in the above are backassward.
Werner Brozek says: “…It would make a huge difference if the low and dry temperature of 245 K warmed by 2 C and the high and more moist temperature of 275 K also warmed by 2 C, or if the lower, drier temperature warmed by 4 C and the high temperature was unchanged….”
Right. Temperatures don’t tell the whole story. Net heat flux is what counts, and that requires knowing what the humidity is in every cell. The models don’t simulate real clouds rigorously, so they don’t do net heat flux well, either. “Global temperature” is an almost meaningless concept.
Roger Pielke Sr. has another critique of the average global surface temperature anomaly as a measure of warming:
http://pielkeclimatesci.wordpress.com/2012/05/07/a-summary-of-why-the-global-average-surface-temperature-is-a-poor-metric-to-diagnose-global-warming/
I assume David M. Hoffer’s argument would also apply to the UAH and RSS atmospheric temperature anomalies?
The “consensus” literature proposes that direct effects of CO2 result in a downward energy flux of 3.7 watts/m2 for a doubling of CO2.
So, how does the IPCC justify their claim? As seen from space, the earth’s temperature is not defined at earth surface, nor can it be defined at the TOA (Top of Atmosphere). Photons escaping from earth to space can originate at any altitude, and it is the average of these that defines the “effective black body temperature of earth” which turns out to be about -20 C (253 K), much colder than average temperatures at earth surface. If we plug that value into the equation we get:
253K = 232.3 w/m2
254K = 236.0 w/m2
236.0 – 232.3 = 3.7
——————————————————————————-
So the direct effect (before feedbacks) of doubling CO2 raises the earth’s surface T how much less than 1 degree?
Great article, BTW. Also, it should be pointed out that CO2 in the atmosphere does, as the IPCC shows, increase the residence time of LWIR energy. However, it DECREASES the residence time of conducted energy, which must stay in the atmosphere longer if it cannot radiate away. Our atmosphere is full of both conducted and radiated energy. Additional CO2 accelerates the escape to space of conducted atmospheric energy.
For measuring the effect of greenhouse gases on temperature, the Stefan-Boltzmann equation is irrelevant. Greenhouse gases act as an insulator in the atmosphere, and to measure their effect we need to measure the thermal conductivity of the atmosphere. It’s no simple matter, but the question is: if we do it right, how different will the results be from the average anomaly?
For a start, we’d need to calculate differences between lower troposphere and stratosphere temperatures, and note that the stratosphere has actually been cooling over the past decades, which is enhancing the effect rather than diminishing it.