GISS (Goddard Institute for Space Studies) Surface Temperature Analysis (GISSTemp) released their monthly global temperature anomaly data for September 2008. The following is the monthly global ∆T from January to September 2008:
Year    J    F    M    A    M    J    J    A    S
2007   85   61   59   64   55   53   53   55   50
2008   14   25   62   36   40   29   53   50   49
(anomalies in hundredths of a degree C)
Here is a plot of the GISSTemp monthly anomaly since January 1979 (keeping in line with the time period displayed for UAH). I have added a simple 12-month moving average displayed in red.
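The 12-month moving average in red is just a trailing mean of the previous twelve monthly anomalies. A minimal Python sketch (the data here are the 2008 values from the table above, and the short 3-month window is purely for illustration):

```python
def moving_average(series, window=12):
    """Trailing moving average; returns None until the window fills."""
    out = []
    for i in range(len(series)):
        if i + 1 < window:
            out.append(None)  # not enough history yet
        else:
            out.append(sum(series[i + 1 - window:i + 1]) / window)
    return out

# 2008 Jan..Sep anomalies, in hundredths of a degree C
anoms = [14, 25, 62, 36, 40, 29, 53, 50, 49]
print(moving_average(anoms, window=3))  # 3-month window for illustration
```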
For those astute readers of this blog, you will note how the addition of September data warmed our summer months:
GISS 2008    J    F    M    A    M    J    J    A    S
As of 8/08  14   25   60   42   40   28   50   39   ..
As of 9/08  14   25   62   36   40   29   53   50   49
In other words, when GISS closed the books on August, the summer (JJA) average was 0.39 °C. Upon closing the books on September, the summer average increased to 0.44 °C.
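For anyone checking the arithmetic, the JJA averages quoted here fall straight out of the table (values in hundredths of a degree C, converted back to degrees at the end):

```python
# June/July/August anomalies, hundredths of a degree C, from the table above
as_of_aug = {"Jun": 28, "Jul": 50, "Aug": 39}  # as of the 8/08 release
as_of_sep = {"Jun": 29, "Jul": 53, "Aug": 50}  # after the 9/08 revision

def jja_mean(months):
    """Summer (JJA) average, rounded to hundredths and returned in degrees C."""
    return round(sum(months.values()) / 3) / 100.0

print(jja_mean(as_of_aug))  # 0.39
print(jja_mean(as_of_sep))  # 0.44
```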

Keith, I guess Al Gore never played hockey. Not only is he fat, but when I was young we used to bend our sticks (left or right, depending on which side you held your stick) so we could grab the puck more easily. This temperature hockey stick is so bent now that it will snap.
Michael J. Bentley (20:50:34) :
You didn’t read the book on Challenger that I did. I’ve lost it and can’t even remember the title, but based on what I read, Dr. Fletcher, the head of NASA at the time, should at least have been charged with lying to Congress about the shuttle when it was proposed, and should have been charged with negligent homicide for the shuttle’s design and construction.
Vicky Pope, head of climate change for government at the Met Office’s Hadley Centre, said last year that a tipping point (a 2-degree rise in temperature) could soon be reached that would lead to the melting of the Greenland ice sheet!
She then added that this might take 3000 years!! to occur.
http://global-warming.accuweather.com/2008/10/drastic_action_now_needed_with.html#comments
Strange that Hansen talks about centuries for this to happen, but I suppose 3000 years is only 30 centuries.
“Mary” (04:15:40) :
Since you refer specifically to the Antarctic, I wonder why you didn’t link to the Southern Hemisphere charts.
http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/current.365.south.jpg
http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/current.area.south.jpg
http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/current.anom.south.jpg
Yessir, mighty impressive trends there!
Barry,
You’re right – I haven’t read the book, I only have what the press put out, and some info from my own “unidentified highly placed sources”. Therefore I only have part of the story.
The issue that makes me angry is that after causing the “event” NASA hasn’t removed the cause: simple arrogance. “We’re the rocket scientists, and you’re not.” This CO2-caused global warming fiasco, at least NASA’s part in it, stems from the same mindset that sent the Challenger to its doom, and brought the Columbia to its fiery end.
Sadly the NASA noise is drowning out legitimate voices doing good science that presently are on both sides of the fence.
The UN’s issue, I think, stems from an entirely different mindset altogether.
Just maybe those who are saying the economic downturn will “help the environment” are correct. By slowing some of the overblown and downright dangerous fooling with the climate, the ol’ mudball might just be able to prove it’s in control, not NASA, not the UN, and certainly not us’ns.
Well, being just a plain physicist/mathematician with no credentials in the climate discipline, I can afford to look at all this stuff (the parts I understand) and not pay a lot of attention to the traditions. Hopefully this doesn’t tread on too many toes.
But thinking of GISS, or UAH, or RSS, or HadCRUT, which you presented a nice compact dissertation on not so long ago, Anthony, I have a hard time seeing what all the fuss is about.
Somewhere I read that back in the 1850-1900 time frame there were precisely 12 measuring stations in the Arctic, that being above latitude +60. This number gradually grew over the years to a number around 86, if I remember correctly, and then declined to around 72, which I took a wild guess might have been a result of the implosion of the Soviet Union.
So throughout all this messing around with the instrumentation, I guess these global temperature watchers maintained the illusion that they still knew what the global temperature was relative to what it was in 1850 or thereabouts. The fact that they call them anomalies suggests they aren’t supposed to be different from zero, and nobody knows what zero really is, that being the unknown average of some baseline time frame.
So I don’t have the foggiest idea just exactly what James Hansen does for his annual budget, in order to come up with that one magical number that the lay press tells the public is the actual mean temperature of the earth.
Well we know it isn’t that, because the actual mean temperature of the earth would immediately vaporize every living thing on this planet.
So maybe it’s the average surface temperature of the planet, that being the most apparent thing you could actually see.
But then it isn’t that either; because some of those barn owl boxes that Anthony has shown us in funny places aren’t even on the ground.
So how exactly would you go about measuring the true average temperature of the earth’s surface, assuming that there even is some scientific significance to such a number (which there isn’t, except to compare it with previous calculations of the same number, which also didn’t mean anything)? I’ll explain why it doesn’t mean anything later.
So if I were to ask a bunch of 8th grade science students how to measure the average temperature of the earth’s surface; they wouldn’t have any problem coming up with a method. You simply measure the temperature at every point on the surface continuously, and then you integrate all that over time (say one year) and all over the earth surface. Well you can’t actually measure the temperature of every point, because that takes an infinite number of thermometers, so you can only sample it here and there.
You could start with the fundamental MKS units of the metre and the second, and simply put a thermometer in the middle of each square metre plot and read it once per second. Then, after you have a year of data, you simply add all those numbers, divide by the number of seconds in a year, and divide by the surface area of the earth, and you get the average. What could be simpler?
Well that still takes a lot of thermometers. And the sun moves about 463 meters in one second (equator), so you should either measure more often, or put the thermometers further apart. A good compromise would be to use a 1 km square rather than 1 metre, and read it once per second.
Obviously you still can’t do this, but if you did, and added all those temps, dividing by the number of seconds in a year and the number of square km of the earth’s surface, then you truly would have the mean surface temperature of the earth.
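As a rough sanity check on the scale of this scheme, here is the back-of-the-envelope arithmetic for one thermometer per square kilometre read once per second for a year (approximate figures, assuming a sphere of radius 6371 km and a 365.25-day year):

```python
import math

R_EARTH_KM = 6371.0
area_km2 = 4 * math.pi * R_EARTH_KM ** 2  # surface area, ~5.10e+08 km^2
seconds_per_year = 365.25 * 24 * 3600     # ~3.16e+07 s

readings = area_km2 * seconds_per_year    # total readings in one year
print(f"thermometers: {area_km2:.2e}")    # ~5.10e+08
print(f"readings/yr:  {readings:.2e}")    # ~1.61e+16

# The mean is then simply:
#   sum(all readings) / (number of cells * number of seconds)
```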
It would also be much better than GISS (or anybody else); because if you had all that data, you actually could reproduce the complete continuous surface temperature map of the earth for any second of time in a year, or any longer period of time if you had been recording it for the last 100 years or more.
One thing you can say about Hansen’s annual budget of one magic number is that you can never go back and say what the temperature was at any random point on earth at any particular time; you cannot reconstruct the past from any of the so-called “anomaly” methodologies. Even if you kept all the data, from however many (or few) sites where you actually measure it, you couldn’t tell what the temperature was anyplace else, or at any time other than when you measured it.
The problem is that you have violated the most fundamental law of sampled-data systems: the Nyquist criterion.
Nyquist says you can completely reconstruct a continuous function f(x) if and only if f(x) is a “bandlimited” function and you sample f(x) at a rate at least twice the maximum frequency present in the bandlimited signal. So if the bandlimited signal contains no components at any frequency greater than B, then sampling at a rate of 2B is sufficient to enable complete reconstruction of the original continuous function f(x) from the samples.
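A quick numerical illustration of the criterion being satisfied: a sine of frequency f sampled at five samples per cycle (well above the 2f minimum), over a whole number of cycles, so the discrete mean recovers the true zero average of the continuous signal. All values below are invented for the demonstration.

```python
import math

f = 1.0                            # signal frequency, cycles per unit time
fs = 5.0                           # sample rate > 2f: Nyquist satisfied
n_samples = 500                    # exactly 100 full cycles

samples = [math.sin(2 * math.pi * f * n / fs) for n in range(n_samples)]
mean = sum(samples) / n_samples
print(f"{mean:.6f}")               # effectively zero, the true average
```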
If, however, your signal contains a component with a frequency of B+b, and you sample it at a rate of 2B, then the reconstructed function will contain a spurious signal at a frequency of B-b; and that signal is now inside the passband of your desired signal, stretching from zero to frequency B, so no amount of filtering can remove the erroneous “noise”. This aliasing noise will prevent you from accurately reconstructing the original continuous function f(x).
Note that if you violate the Nyquist criterion by a factor of two, so your signal contains components at frequency 2B, the same as your sample rate, you now get aliasing noise at a frequency of B-B, which is zero frequency, and zero frequency is a polite name for the average value of the continuous function f(x).
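That worst case is easy to see numerically: sample a pure sine at exactly its own frequency and every sample lands at the same phase, so the “average” you compute is just the arbitrary phase offset rather than the true zero mean. The phase value below is invented for illustration:

```python
import math

f = 1.0          # signal frequency, cycles per unit time
fs = 1.0         # sample rate equal to f: Nyquist violated by a factor of 2
phase = 1.2      # arbitrary phase offset, chosen for illustration

samples = [math.sin(2 * math.pi * f * (n / fs) + phase) for n in range(1000)]
sampled_mean = sum(samples) / len(samples)

# Every sample equals sin(phase), so the computed mean is sin(1.2) ~ 0.932,
# not the true average (0.0) of the continuous sine.
print(round(sampled_mean, 3))
```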
So now you have the essence of the problem with GISS or anyone else. They don’t read their thermometers once a second or anything like it; maybe max and min for the day, which is twice a day, and you can easily show that is not often enough to avoid aliasing in the temporal variable. Then the temperature changes much faster spatially than the gaps between these owl boxes, so in terms of the spatial variable of the continuous function F(s,t), you have gross violation of both the spatial and temporal Nyquist criteria, and there is no way to reconstruct that function so that you can properly determine even the average value.
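A toy example of the twice-a-day problem: for an asymmetric diurnal cycle, the (max + min)/2 estimate disagrees with the true daily mean. The cycle shape below is invented purely for illustration, not any station’s actual behaviour.

```python
import math

def temp(hour):
    """Invented asymmetric daily cycle: sharp afternoon peak, long cool night."""
    return 10.0 + 8.0 * math.sin(math.pi * hour / 24.0) ** 4

hours = [h / 10.0 for h in range(240)]          # 0.1-hour resolution over a day
temps = [temp(h) for h in hours]

true_mean = sum(temps) / len(temps)             # dense-sampling estimate: 13.0
minmax_mean = (max(temps) + min(temps)) / 2.0   # max/min estimate: 14.0

print(round(true_mean, 2), round(minmax_mean, 2))  # the two disagree by 1 C
```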
What GISS (or anybody else) gives you is the numerical result of performing some algorithm on some set of raw data, and comparing that result with the result of the same process performed at some other time, like last year, on presumably the data from the same system that was appropriate for last year. So you have a number which is meaningless except in the context of the previous computations of the same number using the same algorithm; and those don’t mean anything either, let alone the mean surface temperature of the earth.
I have to go and eat, so I’ll tell you tomorrow, why the number doesn’t mean anything, even if you measure it correctly (which you can’t).
George E. Smith:
Looks like enough material for a post to me. I was 4.0 in this data comm/transmission stuff and can say your analysis more than refreshed.
“Just maybe those who are saying the economic downturn will “help the environment” are correct. By slowing some of the overblown and downright dangerous fooling with the climate, the ol’ mudball might just be able to prove it’s in control, not NASA, not the UN, and certainly not us’ns.”
No, it won’t. As corporate and private grant money dries up due to the economic crisis, the shrill cries of impending doom will become louder, more frequent, and more scary. Only the most critical-sounding scenarios will continue to get funding.
“Gary Gulrud (06:48:48) :
George E. Smith:
Looks like enough material for a post to me. I was 4.0 in this data comm/transmission stuff and can say your analysis more than refreshed. ”
Thanks for the ratification Gary.
I don’t know beans about why the US climate is the way it is, and Africa’s the way it is, so I rely on the real climatologists, like Singer, Lindzen, Spencer, Christy et al; but when it comes to whether we are warming up or cooling down planet-wise, and who’s in control of that, my thermodynamics, optics, and other aspects of physics/math, including sampled-data system theory, are more than sufficient for me to comment intelligently (most of the time). And like a lot of other scientists, I take umbrage at these newspaper “science writers” who think that you can’t contribute if you don’t have a PhD in “Climate Science” (whatever that is). They don’t understand that basic physical principles operate in diverse fields from astronomy to semiconductors or particle physics. I could get a PhD in ice cream making, but it never seemed useful to any of my employers; in some fields it’s mandatory.
All I care about is that they get the science correct, and nobody seems to mention that, the way things are going, the whole of future energy policy is predicated on the erroneous belief that CO2 is the devil incarnate. If that is not true (and I firmly believe it is not true), then suddenly the USA has more stored chemical energy than anybody else, if we just want to go get it (responsibly, of course).
The interesting thing about this ice business is that the first polar-orbit satellites went up circa 1979 to give us our first real look at polar ice extent. And it was that 1975/6 period when the earlier fears of a coming ice age were rampant. So those early 1979 looks at the Arctic sea ice saw that ice at its most advanced stage in recent years, so everybody compares the 2007 meltback not with the normal ice, but with the most advanced ice since the IGY in 1957/8.
But it is nice to see that all the kayak adventure expeditions to the North Pole had to be scrapped this year. I hate to wish a deep freeze on everybody, but, taking a leaf out of James Hansen’s book, if that is what it takes to wake people up to the global warming scam and get things back on a scientific footing, then so be it.
George
Looking at either the GISStemp or UAH anomalies, one has to conclude that the data from both is either extremely noisy, or else not noisy but real.
In which case the five-year running average concept makes no sense to me. You are just throwing away real data. I should add that I discarded the noisy option. So all those folks who say that the 2008 plunge doesn’t mean anything are implying that the numbers aren’t real.
I believe they are real numbers for whatever algorithmic ritual Hansen’s team and UAH go through (same for RSS and HadCRUT); but with those very rapid changes, I think it is fair to say they are certainly not the average temperature of the earth’s surface, which simply cannot change that fast.
If they were truly the average temperature of the earth’s surface, then you would have to say that any of the values is a real number, and if the 2008 plunge says the temperature went back to its 1900 or 1850 value or whatever, that is a real effect, and whatever happens next will start from the latest low value. It isn’t going to somehow know about the five-year or ten-year or whatever longer filtering you want to do and try to return to that; it is going to move continuously from wherever it is right now.
So the 1998 el nino peak, was a real uptick in whatever those algorithms compute from whatever data they input; but I don’t believe that the average surface temperature of the globe did anything like that.
On any midsummer day (north), the local temperature on the globe could be anywhere from as low as -90 C (Vostok station) to maybe +60 C in a northern tropical desert (surface temps), and actually everywhere between those extremes somewhere on the earth (due to that marvellous argument in Galileo’s “Dialogue on the Two World Systems”).
You could sprinkle thermometers around your house (in the freezer, the oven, the attic, the TV set, etc.) and daily calculate your own house global temperature. Well, you’d get some occasional “heat island” phenomena if your wife is baking a cake (or you are) when it is time to read the thermometers; but it will all average out over time. Of course it won’t mean anything, because all of those locations have different thermal processes going on that don’t simply relate to the local temperature; so averaging them makes no sense at all; they are supposed to be different.
Same goes for GISStemp, of course; it doesn’t mean anything either, except in relation to all the previous GISStemp calculations and all those yet to come. But it keeps a lot of scientists busy all round the world tending the thermometer in their local bat box or barn owl box.