NASA GISS a temperature outlier again – this time for the southern hemisphere

Bob Tisdale shows us that GISS is once again “way out there” in 2009 compared to other global temperature data sets. It is not surprising; we’ve come to expect it.

Was 2009 The Warmest Year On Record In The Southern Hemisphere?

Guest Post by Bob Tisdale

After reading Roger Pielke Sr’s post Reality Check On Science Magazine’s Claim That 2009 Was The Hottest Year on Record in Southern Hemisphere, I plotted the annual GISTEMP Southern Hemisphere Land+Sea Surface Temperature anomalies from 1982 to 2009, Figure 1, and the annual UAH MSU TLT anomalies for the same period, Figure 2. There’s nothing surprising in those graphs, given Pielke Sr’s post: GISTEMP shows record 2009 combined surface temperatures for the Southern Hemisphere, while the 2009 TLT anomalies are far from record levels.

http://i50.tinypic.com/alq6wy.png

Figure 2

The annual NCDC Land+Sea Surface Temperature anomalies from 1982 to 2009, Figure 3, also do not show record levels in 2009; note, though, that the NCDC does not infill with 1200 km smoothing the way GISS does.

http://i45.tinypic.com/2h2ghdy.png

Figure 3

GISS has used OI.v2 SST data since 1982. Figure 4 is an annual graph of SST anomalies for the Southern Hemisphere, and it illustrates that 2009 was not a record year for SST anomalies. That leaves the GISS land surface temperature anomaly data as the culprit.

http://i50.tinypic.com/2eceu74.png

Figure 4

Hadley Centre data is still not available for December, and they’ve been running late recently. The NCDC and GISS data at the KNMI Climate Explorer should be updated within the next few days, so we’ll be able to do some comparisons and try to determine which of the continents is responsible for the new record GISS Southern Hemisphere temperatures.

SOURCES

OI.v2 SST anomaly data is available through the NOAA NOMADS website:

http://nomad3.ncep.noaa.gov/cgi-bin/pdisp_sst.sh?lite

The GISTEMP Southern Hemisphere Land Plus Sea Surface Temperature data is available from GISS:

http://data.giss.nasa.gov/gistemp/tabledata/SH.Ts+dSST.txt

The NCDC Southern Hemisphere Land Plus Sea Surface Temperature data is available here:

ftp://ftp.ncdc.noaa.gov/pub/data/anomalies/monthly.land_ocean.90S.00N.df_1901-2000mean.dat

The UAH MSU TLT anomaly data was retrieved from the KNMI Climate Explorer:

http://climexp.knmi.nl/selectfield_obs.cgi?someone@somewhere
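The ranking behind claims like “warmest year on record” is easy to reproduce once the data is in hand. A minimal sketch follows; the `demo` values and the simple year-to-months layout are placeholders, not real GISTEMP numbers, and parsing the actual file formats linked above is left out:

```python
# Sketch: rank years by annual-mean anomaly to check whether the
# final year is a record. The monthly values below are illustrative.
def annual_means(monthly):
    """monthly: dict mapping year -> list of 12 monthly anomalies (deg C)."""
    return {yr: sum(vals) / len(vals) for yr, vals in monthly.items()}

def record_year(monthly):
    """Return the year with the highest annual-mean anomaly."""
    means = annual_means(monthly)
    return max(means, key=means.get)

demo = {
    2007: [0.40] * 12,
    2008: [0.30] * 12,
    2009: [0.55] * 12,  # hypothetical record year
}
print(record_year(demo))  # -> 2009
```

The same ranking applied to each data set separately is what exposes the kind of disagreement discussed in the post.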

Posted by Bob Tisdale at 9:06 PM

153 Comments
stumpy
January 20, 2010 6:55 pm

After enduring a very cold autumn (coldest May on record), cold winter, cold spring and a non-summer in NZ, I will be particularly miffed if they declare it was the “warmest year on record”!
Figures from NIWA for NZ show it was cooler than the mean, so it shouldn’t be NZ doing it.

BradH
January 20, 2010 6:56 pm

Hansen et al. have absolutely no shame (nor credibility). How many times does Peter have to cry “Wolf!” before people stop paying attention?

ian middleton
January 20, 2010 6:57 pm

It’s Darwin airport, it’s always Darwin airport. betcha.

old construction worker
January 20, 2010 6:57 pm

On a geological time frame, it’s thirty years of weather.

R Shearer
January 20, 2010 6:59 pm

Who’s in charge of the GISTEMP, Dr. Bernie Madoff I presume?

boballab
January 20, 2010 6:59 pm

I would hazard a guess and say it’s Africa since just about the entire middle section of that continent is devoid of data.

I made the Animation in this video by making Anomaly maps from the GISS site. They are Jan-Dec, 1951-80 Normals, but instead of using the 1200km “smooth” it’s set to the 250km option. Once you do that you see that the SH is really lacking in readings: Africa has almost none, South America has a blank spot, and if you do a polar view at the 250km scale Antarctica is very sparse in data.
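The 250 km vs 1200 km point above boils down to a radius test: a grid point only counts as “covered” if some station lies within the chosen smoothing radius. A minimal sketch, with made-up coordinates (the haversine great-circle formula and the 6371 km mean Earth radius are standard; the station list and test point are purely illustrative):

```python
import math

def gc_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km (haversine, mean Earth radius)."""
    R = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def covered(point, stations, radius_km):
    """True if any station lies within radius_km of point."""
    return any(gc_km(*point, *s) <= radius_km for s in stations)

# One hypothetical station; a point roughly 1100 km away is "covered"
# at the 1200 km radius but not at the 250 km radius.
stations = [(-12.4, 130.9)]   # roughly Darwin's coordinates, for illustration
point = (-22.0, 133.0)        # interior point ~1100 km to the south
print(covered(point, stations, 1200))  # True
print(covered(point, stations, 250))   # False
```

This is why a sparse station network looks fully “covered” at 1200 km but shows huge blanks at 250 km.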

Peter of Sydney
January 20, 2010 7:02 pm

Is it just me? All these rises and falls in temperatures of mere fractions of a degree. So what? They mean nothing. They certainly don’t mean man is causing global warming. The changes are too small to qualify and quantify the effects of man-made greenhouse gases and pollution. AGW alarmists in particular are treating everyone as fools to believe otherwise. If the temperature had risen by, say, 4 C over the past 100 years then I’d be listening to AGW alarmists. But something like a 0.6 C rise? That’s nothing in the overall scheme of things. Common sense please!

tokyoboy
January 20, 2010 7:08 pm

Anthony, about when will the surfacestations summary come out?
Though I know you’ve been busy since the outbreak of Climategate and other stuff, many folks probably await it.

January 20, 2010 7:27 pm

Bob Tisdale – Are we seeing the end of El Nino this southern summer?
The latest report from BoM seems to be suggesting it.
http://www.bom.gov.au/climate/enso/
I note that Jan 2010 is looking to be very hot according to the satellite readings.
Maybe Feb also…

R Shearer
January 20, 2010 7:27 pm

I have to agree with Peter of Sydney, and in actuality, that 0.6 C is probably 0.3 +/- 0.4C. The signal of the measurement is just insignificant compared to its noise.

Another Ford
January 20, 2010 7:31 pm

Although NCDC does not show a record, isn’t its ending anomaly higher? Is that due to a different baseline?

Neville
January 20, 2010 7:36 pm

Probably boring as hell, but how much of the +0.6C to +0.7C is attributable to the recovery from the Little Ice Age?
If we say the LIA ended in 1850 then the temp either goes up or we face a further reduction in temp.
Obviously the temp increased slightly over the next 150 years, but what is a reasonable assessment of the LIA temp decline from 1350 to 1850?
Most of the info I have seen puts the average decline at -1.0C over that 500 year period.
Therefore, if we have recovered say 0.7C in the 150+ years since, what is the panic about?

Roger Carr
January 20, 2010 7:41 pm

Peter of Sydney (19:02:12) : Is it just me? All these rises and falls in temperatures of mere fractions of a degree. So what?… If the temperature had risen by say 4 C over the past 100 years…
You can add me in, Peter; and I would expect millions of rational beings the world around.

ajstrata
January 20, 2010 7:47 pm

Hey folks, just finished a post which I believe provides proof (or at least a clear way to prove) the AGW theories are mathematically invalid.
http://strata-sphere.com/blog/index.php/archives/12246
Comments welcomed!

E.M.Smith
Editor
January 20, 2010 7:48 pm

The NCDC and GISS data at the KNMI Climate Explorer should be updated within the next few days, so we’ll be able to do some comparisons and try to determine which of the continents is responsible for the new record GISS Southern Hemisphere temperatures.
I think I smell an assumption here… As of now, a large percentage of Pacific basin thermometers are at airports. So take an island with a patch of tarmac and a bunch of jet engines. No UHI “correction” will be done, since there are not enough “nearby” stations (GIStemp just passes the data through if there are not enough “nearby”). So that Airport Heat Island will now be used to fill in a 2400 km diameter circle of ocean with “abnormal” heat anomalies…
Compare a 2400 km diameter circle with, oh, Australia.
Now count the number of islands in the sun …
Then there is always The Bolivia Effect…

January 20, 2010 7:50 pm

I’d hate to see these people attempt to drive. Their reactions to the trees and rocks moving, let alone other vehicles, must be frightening. What happens when a bug hits the windshield???
http://www.flickr.com/photos/22364639@N07/3075281852/
http://www.flickr.com/photos/23094963@N08/2298533435/
http://image.motorcyclecruiser.com/f/8827875/WS-Lead-xl.jpg
http://image.motorcyclecruiser.com/f/8827933/WS-bug-tite-lg.jpg
Sorry it wasn’t a movie… 🙂
Amateur vote: Darwin Station. (I read that great article last month about how the hockey puck’s hockey stick was replicated, by a step-by-step process.)

Antonio San
January 20, 2010 7:51 pm

OT: In France a story is developing about skyfall.fr, a sceptics’ French-language site that was hosted on the “Free” server. Skyfall has been unavailable for a few days now, and a letter has been sent to the CEO of Iliad, the parent company of “Free”, to find out what the problem is.
http://www.objectifliberte.fr/2010/01/indisponibilite-du-site-skyfal-mail-ouvert-a-xavier-niel-president-du-groupe-iliad.html
To be continued…

sky
January 20, 2010 7:53 pm

boballab (18:59:57):
Good Work! Yours is a very important video, for it shows not only the very sparse global coverage in the early years, but also the massive drop-out stations post 198–not only in Africa. This glaring lack of a uniform geographic basis is what allows the construction of a “global anomaly series” to suit one’s taste.

sky
January 20, 2010 7:55 pm

OOPS! That should read: “drop-out of stations post1980…”

January 20, 2010 7:57 pm

The Australian Bureau of Meteorology “high quality” data for land temperatures in Western Australia are relevant since this jurisdiction, at 2.5 million square kilometres, comprises a reasonable chunk of the southern hemisphere.
Chart of BoM data from 1979 to 2009
http://www.waclimate.net/1979-2009.html
Comparing 1979-1989 with 1999-2009, the average mean temperature across Western Australia increased by 0.197C over the 31 year period. Comparing 1990-1999 with 2000-2009, the average mean temperature increased by 0.013C – if the “high quality” data is considered accurate.

January 20, 2010 7:59 pm

The only way we are going to get a global 6deg C rise in temperatures by 2100 is if we cover almost all of the land in cities, so it is all covered in an 8 degree UHI effect to bring up the overall average.

January 20, 2010 8:00 pm

Or, as they have done, move all of the thermometers into cities, which seems cheaper and easier.

j.pickens
January 20, 2010 8:02 pm

ajstrata,
Run that posting through a spell checker, and a style checker, and get back to me.
Day-2-day
Fickled
There instead of Their
identofy

p.g.sharrow "PG"
January 20, 2010 8:02 pm

I wonder where the 0.1C added at the start of 2007 came from? Before and after that date the trace appears to be constant.

Dave F
January 20, 2010 8:03 pm

Peter of Sydney (19:02:12) :
A little restraint is certainly in order for some. Even with the 0.6C rise, it does not mean that it is 0.6C warmer everywhere. It does not even mean that it is, on average, 0.6C warmer everywhere. It could be 4.6C warmer in one place and 0.4C colder in four others, and still average out to 0.6C. At least, insofar as I understand the way they average.

Roger Carr
January 20, 2010 8:08 pm

Pursuing the fractional degree rises and falls a moment: I guess this is why a tipping point had to be introduced, else no one would have even listened.

January 20, 2010 8:08 pm

ajstrata
AJSTRATA:
Darn you! You’ve just taken away my next project. What you did is essentially what I was going to embark on. If you would like 190 years’ worth of Minneapolis/St. Paul temp data (Highs/Lows), let me know; I have it. You can contact Anthony for my Email.
The one thing I’ve really been TICKED OFF about is the LACK of statistical analysis, which is DAMNING of the various “PhDs” who are doing this work.
When you look at the GISS data, if one does a “Standard Deviation” one finds that the “curve fit” of the “trend” is statistically meaningless. Or that’s what I believe by INSPECTION. I have not done the analysis.
Do you have a “csv” file of the GISS data… can you do a quick check? I’m sure I’m right. I think you and I are taking the same approach. And, yes, it makes MINCE MEAT out of the AGW’ers’ arguments that they are seeing ANYTHING of merit in their limited temperature data sets.
Max

January 20, 2010 8:11 pm

Just a thought: to get an ideal climate in less than 20 years, you could build an impulse sprinkler into the base pole of the temp station, plant fast-growing trees in the circular drip line of the sweep, and turn it on every day for 3 hours at midday. What a wonderful world we could report on to the public (can you say future immigration problem?).

January 20, 2010 8:18 pm

ajstrata (19:47:43) :
Very interesting. I hope you get some responses from those more in the know than I!

vigilantfish
January 20, 2010 8:33 pm

ajstrata:
Nice graphical illustration of temperature variability on one day in a homogeneous region smaller than the gridded areas used by GISS to record data. I was actually somewhat surprised by the range of high temperatures. Another nail in the IPCC coffin? Agree with j.pickens, though; some spell-checking and proof-reading is in order. The “there” on the second-to-last line is particularly jarring. (I’m hoping meanwhile that I’ve avoided a grammar gremlin in this post – amazing how mistakes slip past one.)

Doug
January 20, 2010 8:39 pm

@ ajstrata (19:47:43) :
Very helpful post AJ. I think your analysis of the surface stations ties in well with what E.M. Smith has been working on and sharing with us. I find it hard to believe anyone would accept the premise that you can have a few thermometers scattered around the globe and then use that small number of readings to extrapolate a temperature profile for the entire globe. It’s not intuitive, and as you’ve shown, the math doesn’t work well either.

rbateman
January 20, 2010 8:45 pm

If GISS keeps up this parade of ‘outer limits’ reporting, there’s going to be a day of reckoning for wayward spenders of taxpayer monies. They are probably already there or darned close to it.

brc
January 20, 2010 8:50 pm

The southern hemisphere is sparsely populated (compared to the NH) and about 50% ocean. I wonder how accurate the data really is.

January 20, 2010 8:51 pm

BernieL: You asked, “Are we seeing the end of El Nino this southern summer?”
NINO3.4 SST anomalies have dropped the last few weeks.
http://i48.tinypic.com/5d2xzr.jpg
It’s the right time of the year for it to peak. Assuming it doesn’t turn into a multiyear El Nino, the next questions are how quickly will SST anomalies drop, and how significant will the La Nina be?

yonason
January 20, 2010 8:56 pm

HORSE HOCKEY!!!
From Wolfram Alpha…
Data from YBAS (Alice Springs Airport) in central Australia from 1945 to present show a decline in temps, with a best linear fit to the data of…
-0.0024 ± 0.0263 deg F/yr
http://www.wolframalpha.com/input/?i=+AUSTRALIA+TEMPERATURE
Data from YPDN (Darwin International Airport) for Darwin Australia for the same time frame yields the following best linear fit…
-9.9×10^-5 ± 0.008968 deg F/yr
http://www.wolframalpha.com/input/?i=darwin+AUSTRALIA+TEMPERATURE
I’m really getting tired of being preached to by a bunch of idiots and/or liars.
They are obsessed with their high-tech tools, which either don’t work the way they are supposed to or which they don’t know how to use properly, yet they are incapable of even reading a simple #@$% thermometer.
The data that falsifies their claims is there. Why won’t someone in the field please collate and post it, before they “lose” it, too?

January 20, 2010 8:57 pm

Neville
If we say the LIA ended in 1850 then the temp either goes up or we face a further reduction in temp
1850? I keep seeing that date crop up and am not certain why. The low that preceded the current warming trend was 1830 on almost every reconstruction, and even the cherry-picked ones from the IPCC spaghetti graph of 9 reconstructions show it. Further, that low came after a very short cooling period, which was preceded by a low around 1600 or so, which the IPCC graph also shows. Regardless of how warm the reconstruction highs are compared to the current measured highs, the trend is obvious… warming since 1600. The IPCC says so and I choose to believe them:
http://en.wikipedia.org/wiki/File:1000_Year_Temperature_Comparison.png

J.Peden
January 20, 2010 9:24 pm

But something like 0.6 C rise? That’s nothing in the overall scheme of things. Common sense please!
0.6 would be almost too little to reassure me that a cooling trend is not in the offing. I’d rather see the temp. increase vs stay the same. Not that I trust even the 0.6.

Baa Humbug
January 20, 2010 9:25 pm

Can any data from these people be believed?
November 22, 1996: email 0848679780
“Remember all the fun we had last year over 1995 global temperatures, with the early release of information (via Australia), “inventing” the December monthly value, letters to Nature, etc., etc.?
I think we should have a cunning plan about what to do this year, simply to avoid a lot of wasted time.
We feed this selectively to Nick Nuttall (of the United Nations Environment Program) (who has had this in the past and seems now to expect special treatment) so that he can write an article for the silly season. We could also give this to Neville Nicholls (climate scientist at the Bureau of Meteorology Research Centre in Melbourne, Australia)?
I know it sound a bit cloak-and-dagger but it’s just meant to save time in the long run”.
Any data that has been anywhere within 1000 miles of these crooks must be ditched in the bin. Yes, they have been “inventing” data, torturing data, and misrepresenting and misinterpreting data for years. They are useless for any meaningful scientific analysis.

Baa Humbug
January 20, 2010 9:31 pm

Regardless of whether they say “2009 was the warmest on record for Australia” or “such and such was 2nd warmest” etc., they have proven themselves to be untrustworthy with important data like global temp readings.
Only if the data used by good blogs like WUWT is untouched, raw, unscrewed and virginal, can we begin to believe what that data may be telling us. Otherwise bin it.

Douglas Field
January 20, 2010 9:31 pm

Jimmy Haigh (20:18:06) :
ajstrata (19:47:43) :
Very interesting. I hope
I’m with you Jimmy. It is very logical and easy for laymen (like me) to understand.
I have always thought that the data was too variable from place to place, and that the margin of error in reading the data over the years, let alone recording it, was far too great for the AGW folk to arrive at such certain conclusions, or for such a small annual increment to have any credibility.

Terry Jackson
January 20, 2010 9:46 pm

Bolivia.

Daniel H
January 20, 2010 10:03 pm

If it’s too cold they hide the decline. If it’s not hot enough they fudge the ascent.
That reminds me. In a previous thread someone was saying something about clowns. Maybe GISS should adopt the following lyrics as their official climate change theme song:
Don’t you love a farce? My fault, I fear
I thought that you’d want what I want, sorry my dear
But where are the clowns? Send in the clowns
Don’t bother they’re here

gerard
January 20, 2010 10:13 pm

SPAusnet, a Victoria (Australia) electricity retailer, proposes to increase peak-time electricity to 42c/kWh (presently 8c/kWh). This is a warning of things to come and, if (when) it happens, will be responsible for the deaths of thousands of the aged and of the babies of poor people who will not be able to afford to run air conditioning to cool their homes during heatwaves. I am also concerned about the stoic nature of the aged, who, even if they can afford it, will not use it because they think they are saving the planet. This is more about electricity supply than climate change nonsense, but the two are linked if we are to rely in future on wind power rather than coal (which we have a plentiful supply of) or nuclear power (which is political suicide in Aus).
When are the wheels going to fall off this global warming craziness?

Baa Humbug
January 20, 2010 10:22 pm

January 6, 2005: email 1105019698
David Parker of the UK Met Office to Neil Plummer, Senior Climatologist at the National Climate Centre of the Bureau of Meteorology, Melbourne, Australia:
“There is a preference in the atmospheric observations chapter of the IPCC 4AR to stay with the 1961–1990 baseline. This is partly because a change of baseline confuses users, e.g. anomalies will seem less positive than before if we change to a newer baseline, so the impression of global warming will be muted”.
So the anomalies will SEEM LESS POSITIVE, eh?
I was told by bloggers that changing baselines didn’t alter anomalies. Not so, straight from the horse’s mouth. What baseline do we use, and why?
REPLY: From GISTEMP “Temperature anomalies are computed relative to the base period 1951-1980.” which is part of the reason that GISS is an outlier, since they use a baseline that has some of the coldest weather in the 20th century. HadCRUT uses a 1961-1990 baseline. UAH by necessity uses a baseline starting in 1979, when data gathering first started. – Anthony
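The baseline question in the reply above can be illustrated numerically: re-baselining shifts every anomaly by a constant, so the numbers “seem less positive” against a warmer base period, but the trend is unchanged. A minimal sketch with synthetic linear data (not real GISTEMP values):

```python
# Sketch: re-baselining shifts anomalies by a constant offset but does
# not change the trend. Illustrative linear series, not real data.
def anomalies(series, years, base):
    """Anomalies of series (dict year -> temp) relative to the mean over base years."""
    base_mean = sum(series[y] for y in base) / len(base)
    return [series[y] - base_mean for y in years]

def trend(ys):
    """OLS slope of ys against 0..n-1."""
    n = len(ys)
    xm = (n - 1) / 2
    ym = sum(ys) / n
    num = sum((i - xm) * (y - ym) for i, y in enumerate(ys))
    den = sum((i - xm) ** 2 for i in range(n))
    return num / den

years = list(range(1951, 2010))
temps = {y: 14.0 + 0.01 * (y - 1951) for y in years}  # steady 0.01 C/yr warming

a_cold = anomalies(temps, years, range(1951, 1981))  # 1951-80 baseline (GISS-style)
a_warm = anomalies(temps, years, range(1961, 1991))  # 1961-90 baseline (HadCRUT-style)

offset = a_cold[0] - a_warm[0]
print(round(offset, 4))  # -> 0.1 : every anomaly is shifted by the same constant
print(abs(trend(a_cold) - trend(a_warm)) < 1e-12)  # -> True : identical trends
```

So both bloggers and the email are right in a sense: the anomaly *values* change with the baseline, the *trend* does not.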

rbateman
January 20, 2010 10:46 pm

gerard (22:13:05) :
When are the wheels going to fall off this global warming craziness?

They’re coming off as we speak.
Climate Change was the exit strategy for AGW, and we see how well that worked out for them. NOT. They have dug themselves a pit. No exit.
What you hear now are squealings of desperation.

Jeef
January 20, 2010 11:23 pm

“Hadley Centre data is still not available for December, and they’ve been running late recently.”
Phil Jones still suspended? No records of how to manipulate his computer model? Bet the records show it’s cooler there these days!

JC
January 20, 2010 11:33 pm

Can someone explain why NIWA (NZ) has been saying since 1999 that the 1990s were the warmest decade ever in NZ?
This was repeated by the head man at Copenhagen (James Renwick, Principal Scientist), where he explicitly said that the NZ records show the 2000s were the warmest decade, followed by the 1990s, followed by the 1980s.
Yet less than a month later (early Jan 2010) Niwa reports that the warmest decade is the 1980s, followed by the 2000s, followed by the 1970s and trailing the rear are the 1990s.
What happened to the 1990s? Did they offend the King or something?
JC

Peter of Sydney
January 20, 2010 11:35 pm

Unless AGW believers are stupid enough to think that temperatures have remained completely flat up until the industrial age started, they have no case with a rise of only 0.6 C since then.

Norm in Calgary
January 20, 2010 11:44 pm

“it shows not only the very sparse global coverage in the early years, but also the massive drop-out stations post 1980”
Why does everyone say the stations dropped out? They are still there, reporting as usual; it’s just that GISS doesn’t use their data anymore. Of all the things that influence the ‘global warming’ record, it’s the land surface measurements, in themselves open to all kinds of errors, and GISS then massages the data; and we’re supposed to believe them when virtually every adjustment increases global warming.

yonason
January 20, 2010 11:50 pm

Terry Jackson (21:46:03) :
“Bolivia.”
Do you mean, as in Bolivia?

yonason
January 21, 2010 12:12 am

JC (23:33:37) :
“Can someone explain why Niwa (NZ) has been saying since 1999 that the 1990s was the warmest decade ever in NZ.”
Not me, but even though there was a slight burp upwards in about 1990, it didn’t last, and overall temps have been dropping ever so slightly.
New Zealand temperatures from NZWN (Wellington International Airport), reported by Wolfram Alpha, fit this DECREASING linear regression:
-0.019 ± 0.02 deg F/yr
http://www.wolframalpha.com/input/?i=new+zealand+temperature
I think Terry Jackson (21:46:03) is as close as anyone to what the source is: BOLIVIA! Sing it with me now, OH HOW HOT DOES IT GET IN BOLIVIA…?

Ian Cooper
January 21, 2010 12:18 am

Stumpy
you are bang on the money mate. 2009 was not the coldest year in my 52 years in New Zealand, but the proxy data that I collect showed it to be a significant cold year, and a very long way from being the hottest year of the past three decades. The most ground frosts in the 1980-2009 period. Dr Jim Renwick of NIWA announced in 2007 that with rising global temperatures we should see a significant drop in frost numbers, even on the central plateau of the North Island (m.s.l. 600m/2,000ft). On that basis, down here near sea level on the Manawatu plains we shouldn’t even have frosts, and yet we had 36 last winter/spring. Mountain snowfalls in this region also hit a new record for the 1980-2009 period. Locally last Monday was the first day with a maximum above 25C. This is the fifth latest date for such an event in the last 55 years, and based only on T-Max 2009-10 is on track to be the 6th coolest summer here in that period.
So if the S.H. has just experienced the hottest year in the past 30, it was due to somewhere other than N.Z. By comparison with the north the S.H. is mostly ocean. When we look back over the SST Anomaly charts for last year some of the hottest parts in the S.H. were in the emptiest part of the Pacific below French Polynesia where there aren’t even any small islands.
What I really want to know is what the people at GISS take to sleep well at night. As long as it is not toxic, they should share it with the rest of us and keep their graphs to themselves!

January 21, 2010 1:03 am

Just finished reading AJStrata’s clear and clever post. This has filled in the gaps in my suspicions of what is actually wrong with the AGW theories. As a person who admittedly struggles with maths in the abstract, something about the methodology of collecting data from the earth’s surface seemed terribly wrong, apart from choosing to use data from weather stations heated by tour bus exhausts, air conditioner exhausts, etc which is a fraud in itself. When I became aware of the sampling method being progressively rigged to accept mostly warmer sites and leave out the cooler sites, my suspicions increased to near certainty that the entire surface temp-sampling methodology is plainly ridiculous.
Strata’s work, brief as it is, makes it crystal clear that AGW, let alone CAGW is a fraud. The amount they claim the planet has warmed is actually less than the noise in the data!
Even a Sixth Form student from the most dubious sort of college would not be allowed to proceed with a survey as badly designed and structured as this.
As a Kiwi temporarily domiciled in the UK, nobody I correspond with at home would agree that New Zealand has recently been warmer than usual, and one of them remarked a couple of weeks ago in an email that the idea of any global warming, from whatever cause, is a pretty hard sell down there.

Veronica
January 21, 2010 1:26 am

Good to hear some discussion of signal to noise in this thread. Consider hundreds of readings, all subject to equipment and operator error, rounding errors, and to local changes such as the urban heat effect, new buildings, airport runways etc. Then consider that all these are “adjusted”, as Anthony has demonstrated many times in the past, and then they are averaged…
… and not an error bar in sight on any of these “average temperature” plots, no mention of what the standard deviations are, or the P values. My PhD supervisor would have slapped me for producing a graph with no error bars. You would struggle to get a paper published in my field (biological sciences) without a discussion of the P value, i.e. what is the probability that this data was the result of chance?
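The missing-error-bar complaint above can be made concrete: an OLS trend comes with a standard error, and a slope smaller than roughly twice that standard error is statistically indistinguishable from noise. A minimal sketch on a synthetic series (the numbers are illustrative, not station data):

```python
import math

def ols_slope_se(ys):
    """OLS slope of ys vs 0..n-1, plus the slope's standard error."""
    n = len(ys)
    xs = range(n)
    xm, ym = (n - 1) / 2, sum(ys) / n
    sxx = sum((x - xm) ** 2 for x in xs)
    slope = sum((x - xm) * (y - ym) for x, y in zip(xs, ys)) / sxx
    intercept = ym - slope * xm
    # residual sum of squares -> standard error of the slope
    sse = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    se = math.sqrt(sse / (n - 2) / sxx)
    return slope, se

# Illustrative series: small trend plus deterministic "noise"
ys = [0.01 * i + 0.3 * math.sin(1.7 * i) for i in range(30)]
slope, se = ols_slope_se(ys)
significant = abs(slope) > 2 * se  # rough 95% criterion
print(f"{slope:.4f} +/- {se:.4f}, significant: {significant}")
```

Any published trend line could carry exactly this kind of ± figure; when it doesn’t, the reader has no way to judge whether the trend is real.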

David
January 21, 2010 1:39 am

Ian Cooper 00:18:44
You really don’t want to take what they are taking, as it clearly causes significant loss of cognitive ability….

Geoff Sherrington
January 21, 2010 2:03 am

Anthony,
You write –
“UAH by necessity uses a baseline starting in 1979, when data gathering first started.” I hate anomaly data, but that’s a personal preference.
Question: As the global number of reporting stations has dropped, did they also drop out the corresponding stations in the baseline period so that the anomaly changes each time there are more drop-outs? Or are they merely using a constant temperature subtraction year after year? The latter would be so, so wrong for the march of the thermometers.

Geoff Sherrington
January 21, 2010 2:05 am

Ah heck, I’m wrong again. I mean the land based global stations, not UAH satellite. Used the wrong part of your quote. Sorry, Geoff.

inversesquare
January 21, 2010 2:27 am

Veronica (01:26:14) :
Good to hear some discussion of signal to noise in this thread. When you consider, hundreds of readings all subject to equipment and operator error, rounding errors, and to local changes such as urban heat effect, new buildings, airport runways etc. Then you consider that all these are “adjusted” as Anthony has demonstrated many times in the past, then they are averaged…
Agreed!
People are freaking out day to day about the shocking headlines they read in the papers. No one realizes that the actual data these headlines are based on is so far within the margin of error that it’s just as likely to be noise as a detected temp change.
We had one such headline from NIWA, ‘warmest decade on record’, a few weeks back. It turned up on the front page of all the main media outlets… None mentioned that NIWA was talking about less than ONE TENTH of a degree C.

Oefinell
January 21, 2010 2:45 am

ajstrata (19:47)
Interesting post, but misguided. Your blog correctly states that it is not possible to get an accurate global surface temperature from current data. In fact that is well understood by climate scientists. That is why they do not measure the absolute temperature at all; they measure the temperature change, i.e. the anomaly.
Provided the methodology is consistent from month to month, you can accurately see how temperatures have changed over time. You do not need to calculate an absolute temperature at all.
There are lots of further questions over how their anomaly estimates are calculated (data adjustments etc.), but they are not addressed in your blog post.
Your final idea is a good one though: comparing surface temperature measurements with the satellite data. There are lots of papers on this subject if you care to Google a bit. In fact all the data is publicly available, and it is very simple to calculate an anomaly series from the satellite data and compare it with one from the GISS data, since 1979 at least.
They actually correlate pretty well; the satellite trend suggests slightly slower warming than the surface stations, but not so much that it invalidates the GISS methodology.
Graphs and links to data here:
http://www.climate4you.com/GlobalTemperatures.htm#Comparing%20surface%20and%20sattellite%20temperature%20estimates
I think the sceptic community needs to get over all this controversy with the surface station data. There are other much bigger and more important questions, in particular whether the climate sensitivities included in the IPCC models are correct. These are largely calculated from paleoclimate studies, so the data from surface stations is irrelevant.
A recent study by some eminent climate scientists led by Steven E. Schwartz, including one from the NASA team, highlight the problems titled:
“Why Hasn’t Earth Warmed as Much as Expected?”
(good question!)
http://ams.allenpress.com/perlserv/?request=get-abstract&doi=10.1175%2F2009JCLI3461.1
If you dig around you can find links to the full report – one in a comment here if I recall correctly.
This paper is remarkably clear of AGW spin and points out the urgency of finding more accurate estimates for the effect of aerosols (air pollution), which the modellers usually blame for the global cooling of 1940–1970. If the effect of aerosols turns out to be too insignificant to account for that period of cooling, then, the paper states, the climate sensitivity in the models must be too high.
Of course this is only one among a ton of papers on the subject, however it is refreshing that some climate scientists are prepared to stick their necks out and ask the question the rest of us have been asking ourselves for some time!
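The surface-vs-satellite comparison suggested above is, at its simplest, a correlation between two anomaly series. A minimal sketch with synthetic series standing in for the real GISS and UAH data (which would need to be downloaded and aligned first; the numbers here are hypothetical):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    xm, ym = sum(xs) / n, sum(ys) / n
    cov = sum((x - xm) * (y - ym) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - xm) ** 2 for x in xs))
    sy = math.sqrt(sum((y - ym) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical surface and satellite anomaly series: the "satellite"
# series tracks the surface one with a slightly smaller trend.
surface = [0.02 * i + 0.1 * math.sin(i) for i in range(30)]
satellite = [0.018 * i + 0.1 * math.sin(i + 0.3) for i in range(30)]
r = pearson(surface, satellite)
print(round(r, 3))  # close to 1 for strongly co-varying series
```

A high correlation with a slightly smaller satellite trend is exactly the pattern described above: broad agreement, with the interesting information in the residual difference.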

January 21, 2010 3:10 am

brc: You wrote, “The southern hemisphere is sparsely populated (compared to the NH) and about 50% ocean.”
Oceans occupy about 70% of the global surface area and about 80% of the Southern Hemisphere.

artwest
January 21, 2010 3:21 am

Peter of Sydney (19:02:12) : Is it just me? All these rises and falls in temperatures of mere fractions of a degree. So what?… If the temperature had risen by say 4 C over the past 100 years…
——————–
I agree. It would be interesting to see a poll asking the general public, unprompted, by how much the planet has supposedly warmed in the last century.
I suspect that the average response would be several whole degrees, assuming (quite reasonably) that fractions of a degree wouldn’t merit all the panic.
I don’t think that even someone who is fairly well-informed about world news generally but who hasn’t taken a particular interest in the temperature trends has a clue, let alone anyone else.

John Finn
January 21, 2010 3:23 am

RE: GISS/UAH SH ‘discrepancy’
Are you sure that the ENSO lag is not playing some part in this – i.e. that satellite readings don’t respond as quickly to El Nino development?
The surface temperatures started to rise much earlier in the year. UAH readings for the 6 months since July have been at record levels – higher than the same period in 1998. The SH surface (GISS) temperatures seem to have moderated while it’s not clear that has happened with the satellite temperatures. I think we need to look at the Jan/Feb numbers before jumping to conclusions.
This issue came up some months ago (June?) when UAH was recording near-zero anomalies while GISS and Hadcrut anomalies were rising. I reckon there’s a good chance that by May or June, UAH will have recorded its warmest-ever 12-month period (SH and NH).

Tenuc
January 21, 2010 3:31 am

ajstrata (19:47:43) :
“Hey folks, just finished a post which I believe provides proof (or at least a clear way to prove) the AGW theories are mathematically invalid.
http://strata-sphere.com/blog/index.php/archives/12246
Comments welcomed!

Good piece of work which illustrates the problems caused by our deterministically chaotic climate, even for such an apparently simple thing as measuring GMST.
Too few thermometers and too many assumptions = information not fit for purpose.
Your article needs a wider circulation.

January 21, 2010 3:36 am

boballab: You wrote, “I would hazard a guess and say it’s Africa since just about the entire middle section of that continent is devoid of data.” And you included a link to your video:

I would tend to agree over the past 30 years. I compared GISS land surface temperature anomalies to UAH MSU TLT anomalies back in June:
http://bobtisdale.blogspot.com/2009/06/part-2-of-comparison-of-gistemp-and-uah.html
Of the areas I was able to check, Central and Southern Africa had the greatest difference between the GISTEMP and UAH MSU linear trends:
http://i40.tinypic.com/1hb5sm.png
But in that comparison I extended the data across the equator, so it’s not exclusively Southern Hemisphere.
PS: Thanks for the link to the video.

January 21, 2010 3:49 am

John Finn: You asked, “Are you sure that the ENSO lag is not playing some part in this, i.e. satellite readings don’t respond as quickly to El Nino development,” then clarified that satellite data meant TLT.
Note the difference between the GISTEMP and NCDC data.
Also, this was a quick post, and with the differences in base years, I didn’t create any comparison graphs. There appear to be a few “shifts” in the GISTEMP data that don’t show in the other datasets. I’ll be looking at them in more detail in a follow-up post when KNMI updates.
Regards

R.S.Brown
January 21, 2010 3:50 am

You can check out the animated global weather patterns for the past ten days here:
http://www.ssec.wisc.edu/data/comp/cmoll/cmoll.html
Four frames = one 24-hour period.
This gives land and sea temperatures. Note how rapidly the daytime temperatures change as the sun warms the land. This is especially obvious across Australia, the southern part of the Indian subcontinent, Africa and South America.

January 21, 2010 5:00 am

I also thought that last winter here in South Africa was a bit longer and colder than usual, e.g. compared to the last five years. Most recently we are experiencing a lot of cloud cover and cooler weather.
Just for interest’s sake: what is the margin of error for reading temperature – does anyone know? And how did that margin of error develop over the century?

kzb
January 21, 2010 5:05 am

Ajstrata, I’m afraid there are some serious flaws in your article. For example,
“…The standard deviation in percent temperature change was 38.5%!… ”
No! To do this correctly you have to first convert to the absolute temperature scale, on which zero is -273.15 degrees C. This would make your percent SD much smaller.
Also, I am not at all sure about your statistical reasoning. In principle, it is perfectly valid to use the average temp for a cell, as long as the same method is used to find the average for each cell.
This is what statistics is all about: picking out trends and patterns from noisy data. That’s its whole purpose in life.
It’s true that the noisier the data, the lower the confidence level of the calculated trend. But nevertheless a trend line CAN be calculated, and a confidence band put on it.
It’s also true in these records, as you point out, that the SD of an individual point is greater than the effect supposedly being measured. However, if there are a sufficient number of points (cells), the statistical calcs take this into account automatically. They can even weight points inversely relative to their uncertainty.
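To illustrate the point about fitting a trend through noisy data, here is a toy sketch in Python/NumPy. All the numbers are made up, and the noise standard deviation is deliberately much larger than the year-to-year signal:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy series: a 0.015 deg/yr trend buried in noise whose standard
# deviation (0.5 deg) dwarfs the year-to-year signal.
years = np.arange(1980, 2010)
true_trend = 0.015
temps = true_trend * (years - years[0]) + rng.normal(0.0, 0.5, years.size)

# Ordinary least-squares fit: slope plus its standard error.
n = years.size
x = years - years.mean()
slope = np.sum(x * temps) / np.sum(x**2)
resid = temps - temps.mean() - slope * x
se = np.sqrt(np.sum(resid**2) / (n - 2) / np.sum(x**2))

print(f"fitted trend: {slope:.4f} +/- {2 * se:.4f} deg/yr (2-sigma band)")
```

Even with individual years scattering by half a degree, thirty points pin the slope down to roughly a hundredth of a degree per year – which is exactly kzb’s point about confidence bands.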

Pascvaks
January 21, 2010 5:12 am

For those interested in a proxy record of “global” temperatures from mid-1979 up until a couple of weeks ago, the following link should give a fair picture of Earth’s heating and cooling status. It even has an anomaly graph :-) Again, I said proxy (we’re not talking degrees *F or *C). Simple minds love simple answers –
http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/global.daily.ice.area.withtrend.jpg

Frank K.
January 21, 2010 5:20 am

Oefinell (02:45:04) :
“That is why they do not measure the absolute temperature at all, they measure the temperature change or variance.”
“Provided the methodology is consistent from month to month then you can accurately see how temperatures have changed over time. You do not need to calculate an absolute temperature at all.”
A few questions:
(1) What were all of those mercury thermometers of yore measuring? How were they calibrated? Were they all calibrated regularly and consistently (especially those highly accurate bucket thermometers used by the scientifically-trained sailing crews upon which we base a large portion of the historical ocean surface temperature data)?
(2) Does the absolute temperature matter? Thermodynamically, of course, it does (e.g. in evaluating equations of state, thermophysical properties of gases and liquids, freezing/melt points, etc. etc.). But is climate science somehow exempt from this?
(3) Is the metric of a “world averaged temperature anomaly” thermodynamically meaningful? That is, does it accurately imply anything about the heat content of the air-ocean-land system?

Jared
January 21, 2010 5:30 am

GISS is a joke.
Look how bad they messed up Charlotte, NC temperature for July 2009.
http://data.giss.nasa.gov/work/gistemp/STATIONS//tmp.425723140001.1.1/station.txt
^^^ Their June-July-August average of 25.7 means they “gave” July a temperature of 26.5
Actual temperature in Charlotte for July was 24.98
So 24.98 became 26.5 according to GISS. Good job GISS, you do us Americans proud.
If you can mess up Charlotte, NC that bad then what are you doing with temps out in the middle of the ocean that none of us can confirm?

January 21, 2010 5:38 am

@ ajstrata –
Good job.
btw, Firefox browser has a built-in spell checker, which is wy my spelling is always perfect.
Mike
humble grammar n*zi

Pascvaks
January 21, 2010 5:40 am

Once again from the simple answers department, for those interested in a proxy record of “Southern” temperature anomalies from 1979 up until a couple of weeks ago:
http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/current.anom.south.jpg

Danimals
January 21, 2010 5:55 am

I haven’t read through all the comments (in a hurry this morning), so I apologize if this was already covered. A question for:
boballab (18:59:57) :
I would hazard a guess and say it’s Africa since just about the entire middle section of that continent is devoid of data.

Thanks for the work you did! It seems the dearth of data for the middle section of Africa is in the recent years; some prior decades code in the yellow range, which was about the same as large parts of the rest of the earth. What would cause the “dropout” of sensors for Africa in recent years?
Dan
New Jersey, USA

hunter
January 21, 2010 6:04 am

I am beginning to wonder whether the entire concept of relying on monthly or daily anomalies to determine if the world temperature is heating or cooling is a useful tool.
Hansen’s mental contortion to claim that we have just finished the hottest decade on record seems contrived, at best.

January 21, 2010 6:15 am

hunter: You wrote, “Hansen’s mental contortion to claim that we have just finished the hottest decade on record seems contrived, at best.”
Especially when you consider that the majority of the difference between the 1990s and the 2000s was caused by the aftereffects of the 1997/98 El Nino (and 1998/99/00/01 La Nina). Refer to:
http://bobtisdale.blogspot.com/2009/11/global-temperatures-this-decade-will-be.html
Regards

January 21, 2010 6:24 am

Pascvaks: Thanks for your numerous comments that include links to graphs of sea ice as a proxy for global or hemispheric temperature…
BUT…
Sea ice is a poor proxy for global and hemispheric temperature. It is impacted by many other variables: changes in ocean currents, surface wind direction, polar atmospheric pressure, etc.

Pascvaks
January 21, 2010 6:27 am

Bob Tisdale : Do the graphs that I’ve posted at
(05:12:10) and (05:40:28) accurately reflect the Global and Southern ice record? Are they a reasonable proxy for Global and Southern temperatures?

Pamela Gray
January 21, 2010 6:37 am

hunter, it is not. The proper (or the best that we can do) data would be a running 3-month average to coincide with other indices reported likewise, such as SST anomalies in the ENSO report used to determine El Nino/La Nina events. Even solar data would benefit from this type of calculation. The universe cares not one iota which month this is, yet we get our knickers in a twist if this January is warmer or colder than last January. Which reminds me, comparing today’s Arctic ice cover with last year’s cover on this date, or any comparison date by date, is just absurd.
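Pamela’s suggested 3-month running average – the same smoothing used for ENSO indices such as the ONI – is simple to compute. A Python sketch with made-up monthly anomalies:

```python
import numpy as np

# Twelve monthly anomalies (made-up values, deg C).
monthly = np.array([0.2, 0.4, 0.1, -0.1, 0.0, 0.3,
                    0.5, 0.6, 0.4, 0.2, 0.1, 0.3])

# Centered 3-month running mean: each value averages a month with
# its two neighbours, so twelve months yield ten smoothed values.
running = np.convolve(monthly, np.ones(3) / 3, mode="valid")

print(np.round(running, 3))
```

The running mean damps single-month excursions, so a warm January no longer dominates a year-on-year comparison.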

January 21, 2010 6:57 am

John Finn (03:23:59) :
RE: GISS/UAH SH ‘discrepancy’
Are you sure that the ENSO lag is not playing some part in this, i.e. satellite readings don’t respond as quickly to El Nino development.
The surface temperatures started to rise much earlier in the year. UAH readings for the 6 months since July have been at record levels – higher than the same period in 1998. The SH surface (GISS) temperatures seem to have moderated while it’s not clear that has happened with the satellite temperatures. I think we need to look at the Jan/Feb numbers before jumping to conclusions.

If you look at UAH data for Ch 5 you’ll see that it’s been at or above the 20 yr record levels for the whole month of Jan. (and substantially above last year)
This issue came up some months ago (June?) when UAH was recording near zero anomalies while GISS and Hadcrut anomalies were rising. I reckon there’s a good chance that by May or June, UAH will have recorded its warmest ever 12-month period (SH and NH).
UAH has had a problem with anomalies ever since they switched to the Aqua satellite; it’s always low in the summer now. I think it’s due to the change from a drift-corrected dataset (on which the mean for calculating the anomaly is based) to using a non-drifting satellite. It’s notable that RSS didn’t shift to Aqua and doesn’t show the seasonal variation.

James Chamberlain
January 21, 2010 7:08 am

As a lifelong scientist who has looked at a lot of data, charts, spectra, etc., I just find it amazing that the warmists look at any of these charts and see anything but noise.

Harold Vance
January 21, 2010 7:13 am

The ever-changing methods of adjustments and the dying of thermometers enable GISS to create a continuous bull market for global warming. Ultimately, imho, the point here is to create higher highs and an ongoing bull market. They understand that the PR is more important than the science in terms of accrual of power and money.
As boballab has clearly demonstrated, the center of Africa is butt naked. It is a huge continent with sparse coverage. In the 21st century no less.

January 21, 2010 7:16 am

Pascvaks: You asked, “Do the graphs that I’ve posted at
(05:12:10) and (05:40:28) accurately reflect the Global and Southern ice record?”
I have no reason to dispute the accuracy of the data used to create the graphs.
You asked, “Are they a reasonable proxy for Global and Southern temperatures?”
As I replied above, no.
Sea ice is a poor proxy for global and hemispheric temperature. Sea ice is impacted by many other variables in addition to temperature: changes in ocean currents, surface wind direction, polar atmospheric pressure, etc.
Regards

January 21, 2010 7:17 am

Peter of Sydney–Thanks for emphasizing that the entire AGW debate is about 0.6 C. Much of the debate can be likened to early theologians arguing about how many angels can dance on the head of a pin. To believe that AGW is real, one would have to believe that the temperature measurements and the climate models are exceptionally precise. If you look at the purported science behind AGW, one would have to be BRAINDEAD not to be a skeptic. Please see my latest post at http://socratesparadox.com/?p=121#more-121 for further discussion.

Pascvaks
January 21, 2010 7:22 am

Ref – Bob Tisdale (06:24:15) :
“Sea ice is a poor proxy for global and hemispheric temperature.”
_________________
Thanks!
I keep thinking that we’ve over-defined some things; that we’re discussing string-theory weather. Like taking apart a clock and looking at all the pieces, we can’t tell the time. It seemed that sea ice cover was a pretty effective way to tell whether we were actually cooling or warming from a macro point of view. Thanks again!

Oliver Ramsay
January 21, 2010 7:27 am

Oefinell (02:45:04) :
ajstrata (19:47)
Interesting post but misguided. Your blog correctly states that it is not possible to get an accurate global surface temperature from current data. In fact that is well understood by Climate Scientists. That is why they do not measure the absolute temperature at all, they measure the temperature change or variance.
————–
So, what you’re saying is that when Alexei pulls on his parka and goes out to the airport stevensonsky screensky, he just looks at the top of the thermometer, not the whole thing?

boballab
January 21, 2010 7:34 am

Danimals (05:55:15) :

What would cause the “dropout” of sensors for Africa in recent years??
Dan
New Jersey, USA

My personal opinion for right now, with just a cursory glance and going by memory: the answer is found in history. During the 80s and 90s you had a lot of turmoil in Central Africa, with governments coming and going. Things like the genocide in Rwanda, Zimbabwe’s problems and famine across the region (remember, that is what got Bush I into Somalia in the 90s) would in my mind take precedence over the need for reading thermometers.
You will also find the same pattern in the Middle East. There is a blank spot in southern Saudi Arabia, and with good reason: it’s a desert with very little there, so no thermometers. However, if you stop or slow down the progression of the animation you will see that gray blob spread to the Iraq/Iran area. Again, that happens at a time when there were wars fought in those areas, and it spreads outwards from there through and into Afghanistan.

John Finn
January 21, 2010 7:38 am

Bob Tisdale (03:49:45) :
John Finn: You asked, “Are you sure that the ENSO lag is not playing some part in this, i.e. satellite readings don’t respond as quickly to El Nino development,” then clarified that satellite data meant TLT.
Note the difference between the GISTEMP and NCDC data.

Bob
Fair point about NCDC, but there’s an issue about the GISS 1998 anomaly. It’s always seemed, relatively speaking, much lower than the other datasets.

Richard M
January 21, 2010 8:09 am

Pascvaks, I suspect Bob is right on a yearly basis; however, at some timescale sea ice may very well be a good proxy for temperature. It might take another 50 years before we understand this well enough to say anything concrete.

Tilo Reber
January 21, 2010 8:09 am

Bob,
The GISS divergence is no longer a mystery. Let me bring things up to date a little. Since around 1998, GISS has been significantly diverging from HadCRUT, UAH and RSS. The reason given by Real Climate was that GISS infills at the poles while the others do not represent the poles, and Real Climate considered this a reason to regard the GISS data set as superior. Of course, GISS is the only data set that shows warming since 1998; HadCRUT, UAH and RSS do not.
I made the argument that it could not be the poles, because it would take a huge amount of heating at the poles to produce such a large global divergence. Well, I was wrong: it was the poles. But I was not wrong about the huge amount of warming it would take at the poles to cause the divergence. Four days ago Jim Hansen did a guest post on RC. What he had done was take the GISS gridded map and remove from it those areas not covered by HadCRUT, which were mostly polar. When he then did a global average on the resulting map, the answer was very close to HadCRUT. So far so good: the divergence was in fact due to the poles.
But the problem is that the infill GISS is doing at the poles is completely irrational. For example, there were some polar cells that exist in HadCRUT and show cooling. These cells were converted to maximum hot by the GISS algorithms; for some of them, the difference between the HadCRUT value and the GISS value was 6.7 C or greater – an outrageous difference. And if you compare the total anomaly for HadCRUT’s most northern row (a row with 30% coverage) to the same row in GISS, the GISS anomaly was more than twice as large on a per-cell basis. The GISS infill algorithms are pure nonsense. I have been pointing this out in the comments on Hansen’s latest post, and I have had no response from Hansen or Gavin.
And there has been no meaningful response from the RC sycophants. Here are the Hansen charts that I am referring to:
http://www.realclimate.org/images/Hansen09_fig3.jpg
And here is one of the comments that I made on the Hansen thread.
“I’ve been talking about the qualitative problems that I noticed in the GISS charts in figure 3 above. So I thought I would try to take a rough shot at quantifying the problem as well. I used the HadCRUT 2005 chart and the GISS 2005 chart to make comparisons, and I wanted to compare the HadCRUT gridcell row that is furthest north to the GISS row in the same position. First I counted the number of gridcells in a row: there are 72. Then I counted the number of HadCRUT cells with data in that row: there are 24, which means the topmost HadCRUT row has 30% coverage. So I added up all of the covered gridcell anomaly values in the row; the total was 43.8. Dividing by 24, I got an average covered-gridcell value for the HadCRUT row of 1.85 C. The GISS row obviously had 100% coverage, using interpolation and extrapolation. When I added all of the gridcell anomaly values together for the GISS row, I came up with 300; dividing by 72 gave an average anomaly value of 4.17 C. So the anomaly for the top row of GISS is 2.25 times as large as that of HadCRUT.
It seems to me that this reflects very badly on the GISS interpolation/extrapolation algorithm. The other problem is that there are 6 cells in that top row for which HadCRUT has negative values; GISS turns them all to the maximum positive value. The difference is 6.7 C or greater per cell for those 6 cells.
I can only conclude from this that the GISS divergence from the HadCRUT data is an artifact of the GISS processing algorithms and not a reflection of actual temperature variance at the poles.”
As I point out, the GISS interpolation extrapolation algorithms are just nuts.
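The back-of-the-envelope arithmetic above can be reproduced directly. The inputs are the figures quoted in the comment, not independently verified here; the unrounded ratio comes out near 2.28 (the quoted 2.25 uses the rounded 1.85):

```python
# Reproducing the back-of-envelope arithmetic from the comment above;
# the inputs are the figures quoted there, not independently verified.
cells_per_row = 72

# HadCRUT northernmost row: 24 of 72 cells have data, summing to 43.8.
hadcrut_cells = 24
hadcrut_sum = 43.8
hadcrut_avg = hadcrut_sum / hadcrut_cells   # per covered cell

# GISS same row: all 72 cells filled by interpolation, summing to 300.
giss_sum = 300.0
giss_avg = giss_sum / cells_per_row

print(f"HadCRUT avg anomaly: {hadcrut_avg:.3f} C")   # 1.825
print(f"GISS avg anomaly:    {giss_avg:.3f} C")      # 4.167
print(f"ratio: {giss_avg / hadcrut_avg:.2f}")        # 2.28
```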

January 21, 2010 8:11 am

Just found a relatively old paper by Dr Vincent Gray at nzclimatescience.net with a really broad approach to why surface temperature recording is totally unreliable – his main point being that the equipment providing the current database is constructed from ad-hocery and little else. His premise gives a similar outcome to Strata’s: an impossible signal-to-noise ratio due to a hopelessly inadequate earth-station network and many other factors.
Loved the video clip of the Yorkie EuroMP having his say on AGW!

Richard M
January 21, 2010 8:15 am

davidmhoffer (20:57:24) :, I believe 1850 is when temps appear to cross what is considered the median temp for the last 2000 years. Naturally, temps rose from the depths of the LIA to reach this median value. Therefore, temps have been rising for longer than 160 years.
If we consider the LIA and MWP to be around 400 years in length that means temps should start dropping about 200 years after crossing the median line. That would mean another 40 years of warming.
BTW, I don’t believe anything in nature is this exact. These numbers are only used as a projection based on my simple model. 😉

Richard M
January 21, 2010 8:16 am

For people who use IE, the Google toolbar has a spell-checking capability. It works better than nothing, but I still screw up quite often – no context-checking capability.

PJP
January 21, 2010 8:19 am

AjStrata: You were obviously in a hurry to get this on the web – you do need to look at the spelling for typos (not important to the message) and also at the grammar in places (which is important to the message – there is at least one paragraph which makes no sense at all).
Anyway, comments on the actual content:
The discussion of how much variability there is in temperature over the day is somewhat irrelevant, since we know this and so do the AGW “scientists”. The readings used are supposedly taken at the same time every day to remove such fluctuations (more on this later).
As for the noise masking any underlying temperature signal: there are methods that can be used to extract a useful signal from noise which is many times stronger than that signal – for example, the signals from Voyager, many hundreds of millions of miles from Earth, now outside the solar system itself.
However, I have seen no sign that any of these techniques are being used, they are using simple low pass filters assuming (with some reason) that the noise is all high frequency compared to the very low frequency (decades or centuries) temperature change signal.
There is no way you will ever see such a signal with a few months worth of data and using very basic statistical techniques, so I don’t find your results very surprising.
But even the techniques used by CRU and others (low-pass filters) are not really good enough. They assume that there is no low-frequency noise (at the decade/century scale). In fact, there probably is: the signal caused by changes in the Sun’s output, the signal caused by the Earth’s non-circular orbit, the signal caused by the precession of the Earth in its orbit, changes in cosmic-ray intensity, and probably a few I haven’t thought of. These signals need to be measured and accounted for, and I see no evidence of that.
Measuring temperature at the same time of day helps to eliminate some of the daily temperature fluctuation, and probably works tolerably well at a specific location. However, I have issues with the idea of comparing across locations unless the following (at least) are accounted for:
* Time of day needs to be UTC/GMT based, not clock time, if DST is in use.
* Time of day should be sidereal time because timezones are all over the place wrt “solar” time.
* Because of seasonal variations the only times compared should be those taken at the same time, at the same location, on the same date (i.e. 365 per year), to try to eliminate orbital effects.
Trying to do averaging of temperatures is a clumsy and amateurish approach.
Sit and think about it (which you appear to have done) and it becomes obvious that the required infrastructure is not in place and the accumulated data probably so full of artificial and natural signals as to make any sort of analysis at the level being attempted futile. If these people had any intellectual honesty they would say so.
As to this underlying temperature change, I think that most people agree that it is real. The “global” temperature does change, we see evidence all around us and in the human historical record.
The argument is whether this change is directly tied to CO2 concentrations in the atmosphere. There is no direct proof of this, so the AGW proponents use the fact that CO2 has an absorption band in the infrared, and attempt to fit the observed CO2 concentration change to the observed temperature change as “proof”. It’s not proof, of course, even if they could derive a temperature signal accurate enough to compare to the CO2 concentration.
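The point above about pulling a slow signal out of high-frequency noise can be sketched with the crudest low-pass filter, a moving average. This is a Python toy with made-up magnitudes; as the comment notes, it only works if the noise really has no power at the low frequencies:

```python
import numpy as np

rng = np.random.default_rng(1)

# 50 years of monthly data: a slow 0.01 deg/yr drift under heavy
# high-frequency noise (made-up magnitudes).
months = np.arange(600)
signal = 0.01 * months / 12.0
noisy = signal + rng.normal(0.0, 0.5, months.size)

# Crudest possible low-pass filter: a 121-month centered moving average.
win = 121
smooth = np.convolve(noisy, np.ones(win) / win, mode="valid")
centers = months[win // 2 : win // 2 + smooth.size]

err_raw = np.abs(noisy - signal).mean()
err_smooth = np.abs(smooth - 0.01 * centers / 12.0).mean()
print(f"mean error raw: {err_raw:.3f}  smoothed: {err_smooth:.3f}")
```

The smoothed series tracks the drift an order of magnitude better than the raw data – but only because the simulated noise is white; any low-frequency component (solar, orbital) would pass straight through the filter, which is exactly the objection raised above.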

Richard M
January 21, 2010 8:23 am

Phil., I want to congratulate you for pointing out the slight seasonal bias that appears in the UAH numbers. I remember you also pointed this out in June when the anomaly was zero. I was wondering if any of the AGW supporters would point this out at this time.

Gail Combs
January 21, 2010 8:34 am

ajstrata (19:47:43) :
“Hey folks, just finished a post which I believe provides proof (or at least a clear way to prove) the AGW theories are mathematically invalid….”

Interesting analysis. I was always bothered by claims such as “this year is 0.1 C hotter than last year,” given the inaccuracy of the measurements, the sparsity of the data and the non-uniformity of temperature over distance.
So how many BILLIONS have we wasted on this boondoggle?

Richard M
January 21, 2010 8:36 am

Baa Humbug (22:22:47) :
I was told by bloggers that changing baselines didn’t alter anomalies. Not so, coming from the horse’s mouth. What baseline do we use, and why?
Changing baselines will change the absolute value of the anomaly but not the pattern. You will see all the values shifted upward or downward by a constant value. So, using a baseline from a cool period, as GISS does, will shift the entire anomaly record higher. And, of course, if this baseline contains more thermometers in colder locations, as E.M.Smith discovered, the result will go even higher.
In addition, if the 1930s values contain those colder thermometers it will also seem colder in comparison to the 2000s.
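The baseline point is easy to demonstrate: switching baselines shifts every anomaly by the same constant and leaves the pattern (and any trend) untouched. A minimal Python sketch with made-up numbers:

```python
import numpy as np

# Eight made-up annual means, deg C.
temps = np.array([14.0, 14.1, 13.9, 14.2, 14.4, 14.3, 14.6, 14.5])

base_cool = temps[:4].mean()   # baseline over the cooler early years
base_warm = temps[4:].mean()   # baseline over the warmer later years

anom_cool = temps - base_cool  # anomalies against the cool baseline
anom_warm = temps - base_warm  # anomalies against the warm baseline

# The two anomaly series differ by a constant everywhere, so the
# pattern (and any trend) is identical; only the offset changes.
print(anom_cool - anom_warm)
```

So a cool-period baseline makes every anomaly read higher, as the comment says, without changing the shape of the record at all.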

A C Osborn
January 21, 2010 8:38 am

RE
Oefinell (02:45:04) :
ajstrata (19:47)
Interesting post but misguided. Your blog correctly states that it is not possible to get an accurate global surface temperature from current data. In fact that is well understood by Climate Scientists. That is why they do not measure the absolute temperature at all, they measure the temperature change or variance.
Oefinell can you explain to me how they calculate the variance without using the Absolute Temperature?
What Temperature do they use in the variance calculation?
Or do they just make it up?

January 21, 2010 8:44 am

The use of a baseline also obscures the big picture. It is useful for exposing trends because it exaggerates them, but we tend to forget that they are, in fact, now exaggerated. If the graphs that show a rise of a degree or so against a baseline of zero were all done in degrees K (the actual measure of absolute temperature, instead of the Celsius scale relative to the freezing point of water), we would see a scale of about 300 degrees, with Earth’s average temperature hanging at about (I’m guessing now) 280 and rising slightly to 281. It would look like a flat line, in other words – nothing to get excited about. Of course, the comfort zone for us humans would be an equally narrow range, but you see my point.
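The arithmetic behind that flat line is one division (assuming a rough 14 C global mean, an illustrative figure only):

```python
# A 1-degree rise viewed against the absolute (Kelvin) scale,
# assuming a rough global mean of 14 C (an illustrative figure).
mean_temp_c = 14.0
mean_temp_k = mean_temp_c + 273.15   # ~287 K
rise = 1.0

print(f"{rise / mean_temp_k:.2%} of absolute temperature")  # ~0.35%
```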

ajstrata
January 21, 2010 8:50 am

j.pickens,
I have a day job, I am sure you can muddle through as is.

Gail Combs
January 21, 2010 8:50 am

davidmhoffer (20:57:24) :
“… which the IPCC graph also shows, regardless of how warm the reconstruction highs are compared to the current measured highs, the trend is obvious…. warming since 1600, the IPCC says so and I choose to believe them:
http://en.wikipedia.org/wiki/File:1000_Year_Temperature_Comparison.png

You forgot the

John from MN
January 21, 2010 8:51 am

Ajstrata,
You have some serious problems with your work. First and most glaring is saying that 2 degrees F or 2 degrees C is a larger percentage change from a low-temp day than from a high-temp day. False. Using your flawed theory, your numbers would change based merely on which scale the readings are represented in – C, F, K, etc.
Secondly, if the data are unadulterated by UHI, station placement, etc., a small number of land-based stations could end up being accurate on average. Why? Because you have an equal chance of being high or low, and an average gets rid of the noise, in theory.
The problem, of course, lies in the fact that many stations are adulterated in some form. Even the lay of the land surrounding a station can make a large difference: sitting higher or lower than the surrounding land masses, openness to the breezes that stir the temps, and the direction of the prevailing breezes relative to temperature-influencing areas up- or downwind.
But with all these influences obviously in play, and with the high and low outliers not being thrown out, what it comes down to is: what is the LSD, or noise, in the model? I realize this is what you are trying to do, but your method needs some major cleaning up… FWIW. Sincerely, John
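The first point – that a percent temperature change depends on the scale used – is easy to verify; only the absolute (Kelvin) scale gives a physically meaningful ratio. A quick Python check with a made-up 10 C to 12 C warming:

```python
# The same 2-degree warming expressed as a percent change on three
# scales, for a made-up 10 C -> 12 C pair of readings.
t1_c, t2_c = 10.0, 12.0
t1_f, t2_f = t1_c * 9 / 5 + 32, t2_c * 9 / 5 + 32   # 50 F, 53.6 F

pct_c = (t2_c - t1_c) / t1_c              # Celsius:    20%
pct_f = (t2_f - t1_f) / t1_f              # Fahrenheit: 7.2%
pct_k = (t2_c - t1_c) / (t1_c + 273.15)   # Kelvin:     ~0.7%

print(f"C: {pct_c:.1%}  F: {pct_f:.1%}  K: {pct_k:.1%}")
```

Three different "percent changes" for one physical warming: a percentage computed on a scale with an arbitrary zero is meaningless.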

Gail Combs
January 21, 2010 8:52 am

davidmhoffer (20:57:24) :
“… which the IPCC graph also shows, regardless of how warm the reconstruction highs are compared to the current measured highs, the trend is obvious…. warming since 1600, the IPCC says so and I choose to believe them:
http://en.wikipedia.org/wiki/File:1000_Year_Temperature_Comparison.png

You forgot the [/sarcasm]

ajstrata
January 21, 2010 8:53 am

A C Osborn,
Actually they interpolate changes using guesstimates. You cannot measure 0.1% and claim you have a valid number for the globe (absolutes or variances). And in their effort to cover the globe they neglect to add in the natural error (+/- 2°F or greater) introduced by extrapolating over distances of hundreds of kilometres – making their conclusion meaningless. I.e., we see a 0.8°C increase +/- 10°?
If they worked for NASA launching and designing spacecraft they would be fired.

ajstrata
January 21, 2010 8:54 am

Hmm, it seems people are more focused on spelling than on science?

Tilo Reber
January 21, 2010 9:07 am

John Finn
“Fair point about NCDC, but there’s an issue about the GISS 1998 anomaly. It’s always seemed, relatively speaking, much lower than the other datasets.”
John, as I point out above, the polar infill algorithms that GISS uses are garbage. The fact that GISS is cooler than the other sources in 1998 is also a product of the GISS infill algorithms.

PJP
January 21, 2010 9:13 am

AjStrata: I think people are only worried about spelling because in many cases (I am not saying this is true of you!) poor spelling and grammar are an indication of the general quality of the paper.
What I see there is just being in too much of a hurry — what I see are typos and “flow of consciousness” grammar/structure.
You did the write-up in a hurry, and it shows.

January 21, 2010 9:16 am

John:
a small number of land based stations could end up being accurate on average. Why? Because you have an equal chance of being high or low and an average gets rid of the noise in theory
Your comment on scale affecting variance is right on; these kinds of things should be done in degrees K only. But your comment on averages holds only if the distribution of errors is random and the direction of the errors is also random. Since the preponderance of observation stations are located in proximity to human activity, the preponderance of errors from urban heating all lean in the same direction. The exception is all those Russian stations in urban centers that, in the days of Communism, got subsidies for heating fuel dependent on the winter temperatures and so reported false lows to get more fuel subsidy. The only way station data gets down to a reasonable error rate is if you comb through each and every station, adjust its entire data set for an urban heating index, and THEN analyze.
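The distinction drawn above — random errors average out, one-sided errors don't — is easy to demonstrate numerically. A minimal sketch with made-up numbers (not real station data; the bias shape is only illustrative of an urban-heating-style offset):

```python
import random

random.seed(42)

TRUE_TEMP = 15.0   # hypothetical true regional mean, deg C
N_STATIONS = 1000

# Case 1: purely random errors -- averaging cancels them.
random_errs = [TRUE_TEMP + random.gauss(0.0, 2.0) for _ in range(N_STATIONS)]

# Case 2: one-sided errors (e.g. urban heating) -- every error leans warm,
# so no amount of averaging removes the offset.
biased_errs = [TRUE_TEMP + abs(random.gauss(0.0, 2.0)) for _ in range(N_STATIONS)]

mean_random = sum(random_errs) / N_STATIONS
mean_biased = sum(biased_errs) / N_STATIONS

print(f"true value:         {TRUE_TEMP:.2f}")
print(f"random errors   ->  {mean_random:.2f}")  # close to the true value
print(f"one-sided errors->  {mean_biased:.2f}")  # offset remains
```

The first mean lands near 15.0; the second stays warm by roughly the mean of the one-sided error, no matter how many stations you average.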

January 21, 2010 9:23 am

Gail,
“… which the IPCC graph also shows, regardless of how warm the reconstruction highs are compared to the current measured highs, the trend is obvious…. warming since 1600, the IPCC says so and I choose to believe them:
http://en.wikipedia.org/wiki/File:1000_Year_Temperature_Comparison.png
You forgot the [/sarcasm]
Actually I was mostly serious. Most of the reconstructions show a brief cooling period ending in 1830, but an overall warming trend since about 1600, these included. The only quibble I have with this specific graph is that the tail end is observation data, and the “normalization” between observation data and reconstructed data is done in a fashion that makes the reconstruction look relatively cooler. This is in fact the quibble I have with almost ALL the IPCC statements. They are technically accurate and highly misleading all at the same time.

January 21, 2010 9:36 am

…and if you want to have a little fun with graphs, exaggeration and perception… take the CO2 graph from the IPCC and extend it out to show CO2 steady at 280 ppm prior to 1920 and rising to meet the IPCC measurements starting in 1960. Then scale it so that the 38% rise in CO2 roughly equates to the 1 degree rise in temp over the same time period, then superimpose THAT on the IPCC temp graph starting in 1880, and it should be very simple to see where the impact of the CO2 greenhouse effect kicks in. Too much work? OK, I will do it for you, but you will have to spot the place where CO2 kicks in yourself, because I couldn’t:
http://knowledgedrift.files.wordpress.com/2010/01/temp-vs-c02-long-term1.png

A C Osborn
January 21, 2010 9:44 am

ajstrata (08:54:39) :
Hmm, seems people are more focused on spelling than science??
Not me.
I was asking Oefinell (02:45:04) : how they arrive at a Variance, hopefully they will come back and give me an answer.

Tim Clark
January 21, 2010 9:54 am

Pascvaks (05:40:28) :
Once again from the simple answers department, for those interested in a proxy record of “Southern” temperature anomalies since 1979 up until a couple weeks ago.
http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/current.anom.south.jpg

And:
Pascvaks (07:22:11) :
Seemed that Sea Ice cover was a pretty effective way to tell if we were actually cooling or warming from a Macro point of view. Thanks again!

Assuming your interpretation is that the graph of Southern sea ice shows an uptrend (more ice), it doesn’t seem to agree with GISTEMP.

January 21, 2010 10:28 am

Tilo Reber (08:09:31) : I haven’t thought of the continued warming illustrated by GISTEMP as much of a mystery. The 1200km infilling also adds to the linear trend (when compared to UAH TLT anomalies) for Africa, Asia, and South America. Refer to:
http://bobtisdale.blogspot.com/2009/06/part-2-of-comparison-of-gistemp-and-uah.html
So Hansen was simply illustrating one reason for the divergence, not all.
Regards

Pascvaks
January 21, 2010 10:41 am

Ref – Tim Clark (09:54:10) :
“Assuming your interpretation is that the graph of Southern sea ice shows an uptrend (more ice), it doesn’t seem to agree with GISTEMP.”
________________
Right! While the Southern Ocean temps may be up ‘here and there’, the overall effect on the polar weather seems to say ~what? Seems a good gauge of climate change (North or South or the Planet) is the amount of ice (or not). Just looked like a good proxy for climate change. If Gore & Co. are right, we’ll be sunbathing in Nome soon. If they aren’t, we might be skating in Tahiti soon, or not.

Rod Smith
January 21, 2010 10:44 am

My 2 cents worth:
First, I believe using temperature as a base to predict climate is absolutely absurd. Climate in almost every area of the world is defined by more than temperature. (Think moisture, wind, pressure, etc., for starters.)
Average global temperatures are also patently absurd. Such a global average has absolutely no bearing on the temperature where I live, or my climate.
So if we take poor/inadequate/flawed temperatures, then “correct” them with some (any?) sort of flawed logic, then finally “average” them, what do we have as a result?
Has anyone ever heard of GIGO?
I sometimes wonder how “climate scientists” ever convinced anyone these methods were either realistic or accurate.

Gail Combs
January 21, 2010 10:50 am

Oefinell (02:45:04) :
“…In fact that is well understood by Climate Scientists. That is why they do not measure the absolute temperature at all, they measure the temperature change or variance.
Provided the methodology is consistent from month to month then you can accurately see how temperatures have changed over time. You do not need to calculate an absolute temperature at all….”

The hypothesis is that an increase in CO2 causes an increase in temperature and that CO2 is the prime driving force in the present-day climate. So AJ’s, Anthony’s and Willis’ analyses of the temperature data are very important.
Anthony’s US surface station analysis showed changes in the methods used to collect temperature data, and changes in weather station siting and location, that affect the temperature trend over time. Willis’ Darwin analysis [ http://wattsupwiththat.com/?s=darwin ] showed how a flat temperature trend was adjusted to show a rising trend. Now AJ shows how large the minimum error is in determining the temperature for a “grid”, therefore negating the reasons for adjusting the Darwin data in the first place. On top of that, add in the station drop-out and the Russians’ complaint about the choice of data stations dropped, and I do not care what type of “spin” you put on it: the data is crap, and therefore “the temperature change or variance” is also crap. “Climate scientists” are not comparing the change from the same thermometer at the same location with the same surrounding conditions, but are looking at the variations in artificially “homogenized” and adjusted data.
If you have the time you can look at all of the local graphs of non-urban areas collected by John Daly. Some graphs show an increase in temperature, some a decrease but the thing I noticed was the sixty year sinusoid curve peaking in the thirties indicating the influence of the oceans on the weather. http://www.john-daly.com/stations/stations.htm
I do not see where diverting attention away from temperature to ” ..whether the climate sensitivities included in the IPCC models are correct.” gains us anything.
The argument reminds me of the Delphi technique used by the USDA to gain “Consensus” on animal ID, the technique didn’t work on the farmers either.

Gail Combs
January 21, 2010 11:28 am

Henry Pool (05:00:08) :
“I also thought that last winter here in South Africa was a bit longer and colder than usual, e.g. compared to the last 5 years. Most recently we are experiencing a lot of clouds and cloudiness, and cooler weather….”
I am in central North Carolina, USA. For 2009 the spring was cold, the summer cooler than normal, the fall rainy (therefore warm), and the winter (December) was frigid. The summer high temperatures were at least 4°F lower than normal, and the December temps averaged 2°F below normal.
It is interesting to see where the warmer and where the cooler weather happens.

RobP
January 21, 2010 12:15 pm

I still have a problem with using temperature as the defining concept when we are really talking about energy. It has come up in a few posts that measurements of temperature are affected by air pressure, humidity etc., which touches on some of this, but I still can’t get over the fact that the changes we are measuring in temperature actually represent massive energy flows.
Given that all of the energy ultimately comes from the sun and there is little variation in this (in relation to the monthly changes in temperature) we have to get to the bottom of where the energy is coming from to warm the atmosphere, and going to when it cools.
I note a couple of mentions here that the satellite measurements lag the surface trends, and I am assuming that people mean the tropospheric temperatures gain (or lose) energy by exchange with the surface air. This is the kind of thing I am getting at, but when they are both moving – in absolute terms – in the same direction, where is the energy coming from?
In a previous thread it was suggested the oceans were the only place which could “store” the kind of energy and I asked if there was any measurement of the ocean heat content that could be included in the energy audit. People mentioned the SST records, but this is not a full account because it is surely affected as much (if not more) by humidity and pressure. Water loses a lot of energy by evaporation so the surface will constantly be losing energy in this way.
I am sorry for being a cracked record here, but, to me, the energy accounting approach is the only one which can begin to answer the “why is it warming” question.

Chuckles
January 21, 2010 12:23 pm

Several comments have queried the accuracy/quality of the raw temperature readings ‘at the weather station’. These vary tremendously from ‘unknown’ or ‘made up’ through ‘same as yesterday’ all the way up to ‘very good’.
For USHCN electronic min/max thermometers (MMTS) the basic specs are: inherent accuracy is plus or minus 0.5 degrees Centigrade; electronic display resolution is 0.1 units (degrees Fahrenheit); and readings are captured by the operator to the nearest degree Fahrenheit.
E.g., if 76.6 is displayed, this implies an actual temperature somewhere between 24.3 and 25.3 degrees Centigrade, and it would be recorded as 77 degrees Fahrenheit (25 Centigrade).
The system is calibrated annually.
YMMV.
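Those specs translate into a quick arithmetic check of the worked example above. A minimal sketch (constants are the ones quoted in the comment, not an official instrument spec):

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Centigrade."""
    return (f - 32.0) * 5.0 / 9.0

DISPLAYED_F = 76.6       # value shown on the MMTS display
SENSOR_ACCURACY_C = 0.5  # quoted inherent accuracy, +/- deg C

center_c = f_to_c(DISPLAYED_F)
low_c = center_c - SENSOR_ACCURACY_C
high_c = center_c + SENSOR_ACCURACY_C
recorded_f = round(DISPLAYED_F)  # operator logs the nearest whole deg F

print(f"displayed {DISPLAYED_F} F = {center_c:.1f} C")
print(f"true temperature somewhere in [{low_c:.1f}, {high_c:.1f}] C")
print(f"recorded as {recorded_f} F ({f_to_c(recorded_f):.0f} C)")
```

Running this reproduces the comment's figures: a displayed 76.6 °F is 24.8 °C, the ±0.5 °C sensor band is 24.3–25.3 °C, and the logged value is 77 °F (25 °C).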

John Finn
January 21, 2010 12:28 pm

Tilo Reber (09:07:57) :

John Finn
“Fair point about NCDC, but there’s an issue about the GISS 1998 anomaly. It’s always seemed, relatively speaking, much lower than the other datasets.”


John, as I point out above, the polar infill algorithms that GISS uses are garbage. The fact that GISS is cooler than the other sources in 1998 is also a product of the GISS infill algorithms.
I’m not sure you can say they are garbage with any certainty. That is the easy response – which probably goes down well on WUWT. It’s clear that ENSO fluctuations have a sharper impact on satellite troposphere readings. The UAH anomaly for 1998 was 0.54 deg. If we use the same base period as UAH (1979-1998), the GISS anomaly was 0.33 deg, i.e. 0.2 deg lower. If we compare GISS to Hadley/CRU using the Hadley base period (1961-1990), the GISS anomaly is 0.50 deg; Hadley is 0.54 deg – much closer.
Since ~1992 the trends of GISS, UAH, RSS and Hadcrut have been within about 4 hundredths of a degree of each other.
I’ve just checked the GISS 2009 SH anomaly relative to the satellite base period (1979-1998). It’s +0.366 deg. The UAH anomaly for the last 6 months of 2009 is +0.365 deg. As I wrote in an earlier post: give it a few more months.
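The rebaselining being done in this comparison is mechanical but easy to get wrong. A minimal sketch of shifting an anomaly series onto a common base period (toy numbers, not actual GISS or UAH data):

```python
def rebaseline(anomalies, years, base_start, base_end):
    """Shift a series so its mean over [base_start, base_end] is zero."""
    base = [a for a, y in zip(anomalies, years) if base_start <= y <= base_end]
    offset = sum(base) / len(base)
    return [round(a - offset, 3) for a in anomalies]

# Toy anomaly series (made-up values), one per year 1979-2009,
# expressed against some original base period.
years = list(range(1979, 2010))
toy = [0.02 * (y - 1979) for y in years]

# Re-express on the satellite-era 1979-1998 base used in the comparison above.
rebased = rebaseline(toy, years, 1979, 1998)
print(rebased[0], rebased[-1])
```

Only the offset changes; trends and year-to-year differences are untouched, which is why datasets on different bases can still be compared once they are put on the same one.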

January 21, 2010 12:34 pm

Now AJ shows how large the minimum error is in determining the temperature for a “grid” therefore negating the reasons for adjusting the Darwin data in the first place.
The reason for adjusting the Darwin data was inter alia to account for station moves (the first one was destroyed by bombing shortly after being moved because the original location had become unsuitable).

Gail Combs
January 21, 2010 12:45 pm

Pascvaks (05:40:28) :
Once again from the simple answers department, for those interested in a proxy record of “Southern” temperature anomolies since 1979 up until a couple weeks ago.
http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/current.anom.south.jpg
Interesting. The problem, aside from those suggested by Bob Tisdale, is that it only starts in 1979. This is after the cold of the seventies, so it only shows the rise in temperature (recovery).
Too bad the satellite data record is not long enough yet to confirm or deny the sinusoidal curve seen in this graph from John Daly’s site of Jan Mayen Island. Not all of the records show the variation as clearly as this record does. But then it is a small mostly uninhabited island in the Arctic ocean (the Greenland Sea) so the ocean’s influence would be more clearly seen without influence by man.
http://www.john-daly.com/stations/janmayen.gif

Pofarmer
January 21, 2010 12:48 pm

Has anybody plotted a graph of the divergence of GISS to UAH?

Gail Combs
January 21, 2010 1:26 pm

A C Osborn (08:38:42) :
“Oefinell can you explain to me how they calculate the variance without using the Absolute Temperature?
What Temperature do they use in the variance calculation?
Or do they just make it up?”

They just make it up. That is why they “adjust” the early 20th century temp data down and the late 20th century data up, as they did to the Darwin record. Gives nice variance charts for the MSM to show the unwashed masses. That is why Mann just got rewarded with another million dollar plus grant. Honesty of course gets you a pink slip.

Editor
January 21, 2010 1:28 pm

Gail Combs,
from looking at data in different parts of the world, the sinusoidal curve comes up again and again, doesn’t it? I was shocked by how widespread the cooling is in the 1940-1970s period.
Warming 1880-1939:
http://82.42.138.62/GHCN/images/GISSraw1880to1939map.png
Cooling 1940-1969:
http://82.42.138.62/GHCN/images/GISSraw1940to1969map.png
Warming 1970-2010:
http://82.42.138.62/GHCN/images/GISSraw1970to2010map.png
Color key: http://2.bp.blogspot.com/_XKX5ZAAET-Y/S1Sla-dMmsI/AAAAAAAAAD8/Lhv-rmKqaYY/s1600-h/maplegend.png
Link to full post here.

Gail Combs
January 21, 2010 2:07 pm

davidmhoffer (09:23:40) :
Gail,
“… which the IPCC graph also shows, regardless of how warm the reconstruction highs are compared to the current measured highs, the trend is obvious…. warming since 1600, the IPCC says so and I choose to believe them:
http://en.wikipedia.org/wiki/File:1000_Year_Temperature_Comparison.png
You forgot the [/sarcasm]
“Actually I was mostly serious. Most of the reconstructions show a brief cooling period ending in 1830, but an over all warming trend since about 1600…”
Try this instead: http://www.c3headlines.com/temperature-charts-historical-proxies.html
I would not trust any of the IPCC data since reading the climategate e-mails.
“IPCC assessment reports, and particularly their Summaries for Policymakers (SPM), are noted for their selective use of information and their bias to support the political goal of control of fossil fuels in order to fight an alleged anthropogenic global warming (AGW).” http://www.powerlineblog.com/archives/2010/01/025294.php
Authors whose reports have been included have complained about the twisting of their reports. The Resignation Letter of Chris Landsea from IPCC details some of the problems. http://www.climatechangefacts.info/ClimateChangeDocuments/LandseaResignationLetterFromIPCC.htm

January 21, 2010 2:38 pm

Pofarmer: You asked, “Has anybody plotted a graph of the divergence of GISS to UAH?”
That’ll be part of the next post on this, but I’m waiting for the KNMI update of the temperature datasets for December.

Romanoz
January 21, 2010 2:39 pm

Ajstrata
I have one concern about your article, and that is that you used basic statistics, which make certain assumptions about observations – i.e., that they are independent. This is not true of climatic data, which shows “persistence” – i.e., the temperature/rainfall will not be dramatically different 100 m or even 10 km away.
Whatever one may think about Jones of the CRU, he does understand the need to use spatial statistical tools in analysing climate data. The wiki article on spatial statistics is very good; see especially spatial autocorrelation and spatial interpolation. For those who understand time series analysis, the problems of autocorrelation will be familiar.
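The penalty persistence imposes on basic statistics can be illustrated with the standard AR(1) effective-sample-size adjustment. A minimal sketch on synthetic data (real spatial work would use tools like variograms or Moran's I; this only shows the time-series analogue):

```python
import random

def lag1_autocorr(xs):
    """Lag-1 autocorrelation of a series."""
    n = len(xs)
    mean = sum(xs) / n
    num = sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in xs)
    return num / den

def effective_n(n, r1):
    """Approximate effective sample size for an AR(1)-like series."""
    return n * (1 - r1) / (1 + r1)

random.seed(7)
# Synthetic "persistent" series: each value keeps 80% of its neighbour.
xs, prev = [], 0.0
for _ in range(2000):
    prev = 0.8 * prev + random.gauss(0.0, 1.0)
    xs.append(prev)

r1 = lag1_autocorr(xs)
print(f"lag-1 autocorrelation: {r1:.2f}")
print(f"effective sample size: {effective_n(len(xs), r1):.0f} of {len(xs)}")
```

With persistence around 0.8, two thousand correlated observations carry the statistical weight of only a few hundred independent ones, which is why naive error bars on correlated data come out too narrow.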

January 21, 2010 2:52 pm

John Finn: You wrote, “…but there’s an issue about the GISS 1998 anomaly. It’s always seemed, relatively speaking, much lower than the other datasets.”
TLT anomalies can be more volatile:
http://i50.tinypic.com/208jqyq.png
And HADSST2, part of HADCRUT, has a step up in 1998 that is caused by a change in the SST dataset used by the Hadley Centre, which exaggerates its response to the 1997/98 El Nino. The OI.v2 data in the following graph is what GISS has used since 1982:
http://i47.tinypic.com/2gx2nwm.png
From this post:
http://bobtisdale.blogspot.com/2009/12/met-office-prediction-climate-could.html
Regards

Gail Combs
January 21, 2010 2:54 pm

Phil. (12:34:45) :
“Now AJ shows how large the minimum error is in determining the temperature for a “grid” therefore negating the reasons for adjusting the Darwin data in the first place.
The reason for adjusting the Darwin data was inter alia to account for station moves (the first one was destroyed by bombing shortly after being moved because the original location had become unsuitable).”

The adjustments were minus one degree before 1940 and then stepwise up to over two degrees, giving a total of close to a six-degree-per-century adjustment to an otherwise flat temperature trend. (Gee, makes it look just like Mann’s hockey stick.) I can see a need for a one-time half-degree adjustment up after the station was bombed, and possibly another couple of tenths down after it was destroyed in a cyclone, IF the station was moved a significant distance and/or IF the calibration of the thermometers indicates the need for an adjustment because the thermometer is off. Otherwise you leave the data alone.
To be repeatedly adjusting the data without good solid evidence that an adjustment is absolutely necessary just adds MORE error to the data. Instead information about changes should be noted and if possible parallel readings taken. Since the original locations of the weather stations are often known that still could be done in many cases by going back to the original site and setting up a temporary station for up to a month so a more accurate and SCIENCE BASED adjustment can be calculated. The climate scientists were just too lazy to get off their duffs and do the work.
Reminds me of the “quality control” done on the USA surface station networks. Lazy @#$! government scientists are more interested in attending conferences and writing papers to get more grant money than doing real work aka Steve McIntyre’s “Starbucks Hypothesis” http://wattsupwiththat.com/2009/12/20/steve-mcintyre-on-fox-news-special-tonight-about-climategate/#more-14338

Gail Combs
January 21, 2010 3:19 pm

vjones (13:28:30) :
“Gail Combs,
from looking at data in different parts of the world, the sinusoidal curve comes up again and again, doesn’t it? I was shocked by how widespread the cooling is in the 1940-1970s period. “

Yes, it does seem to be seen all around the world, although not in all the records. That is why looking at the individual records gives a better idea of what the climate has really been doing. Sort of like this winter, where we have abnormally cold weather over North America, Europe and China and normal weather in the southern hemisphere. Mapping those sorts of changes is going to yield a lot more information than some artificial hype of “There’s a 0.6 C increase in world temp / it has to be due to increased CO2”.
There is a heck of a lot of all sorts of data out there stretching back for more than a hundred years. I would not be surprised if Piers Corbyn and Joe Bastardi have put a lot of it together and use it for their predictions. It really makes me angry when I think of the amount of progress the world could have made in climate science if Mann, Jones et al had been honest scientists instead of…. Oops, I’ll get snipped.

January 21, 2010 4:55 pm

Try this instead: http://www.c3headlines.com/temperature-charts-historical-proxies.html
I would not trust any of the IPCC data since reading the climategate e-mails
Tx Gail! what a great link!
They mostly confirm what I have seen in other reconstructions… a nasty cooling period about 1830-ish stuck into a general warming trend since about 1600. Same pattern as the IPCC spaghetti, but showing that it was warmer in the MWP than it is now. I was interested to note, though, that while almost all of them showed the nasty cooling around 1830, a handful did not show the same around 1600.

Can't wait for the peer-reviewed publication on station siting
January 21, 2010 5:30 pm
January 21, 2010 5:33 pm

I am sorry for being a cracked record here, but, to me, the energy accounting approach is the only one which can begin to answer the “why is it warming” question.
I think you are 100% correct. All the temperature record accomplishes is to be a proxy for energy balance. When you try and correlate the models to energy balance there is invariably a hole in the models.
BTW, while you can store energy in the oceans, you can ALSO store “cold” in ice. The transition from ice to water consumes a huge amount of energy. At 1 watt/m2 (the current contribution from man made CO2) it would take about 12 years to warm the top 100 meters of the ocean 1 degree C. If I recall my physics correctly, assuming the ice was already at the melting point, the same would melt about 1.3 meters of ice.

January 21, 2010 5:41 pm

If I recall my physics correctly, assuming the ice was already at the melting point, the same would melt about 1.3 meters of ice.
oops… correction. The heat capacity of water is 4 J/g/degree at zero and the heat of fusion is 80 cal/g… so, assuming I got the heating-up-the-ocean thing right, then the ice thing would be about 5 meters.
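Note the unit mix in this correction: the 80 is cal/g (about 334 J/g), while the 4 is J/g/°C, so the arithmetic only works if everything is kept in joules. A quick check with standard constants, under the same assumptions as the original comment (1 W/m² forcing, top 100 m of ocean, 1 °C rise):

```python
C_WATER = 4186.0      # J/(kg K), specific heat of liquid water
L_FUSION = 334_000.0  # J/kg, latent heat of fusion of ice (80 cal/g)
RHO_WATER = 1000.0    # kg/m^3
RHO_ICE = 917.0       # kg/m^3
FORCING = 1.0         # W/m^2, the assumed man-made contribution

# Energy to warm the 100 m water column under 1 m^2 by 1 deg C.
column_mass = RHO_WATER * 100.0            # kg
energy = column_mass * C_WATER * 1.0       # J

seconds_per_year = 365.25 * 24 * 3600
years = energy / FORCING / seconds_per_year  # time for 1 W/m^2 to supply it
ice_depth = energy / L_FUSION / RHO_ICE      # m of ice the same energy melts

print(f"{years:.1f} years to warm the column by 1 C")
print(f"{ice_depth:.1f} m of ice melted by the same energy")
```

On these numbers the warming time comes out near 13 years, consistent with the "about 12 years" figure, and the ice depth near 1.4 m, so the original ~1.3 m estimate was the closer one; the jump to 5 m comes from dividing joules by calories.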

Geoff Sherrington
January 21, 2010 6:10 pm

Australia has found a permanent answer to its poor climate records.
http://www.satirewire.com/news/jan02/australia.shtml

January 21, 2010 6:52 pm

Can’t wait for the peer-reviewed publication on station siting (17:30:14) :
To the purveyor of nonsense who fails to use his or her real name: What does your link have to do with the topic of this thread? Or are you just trying to hide it from Anthony by throwing it onto a thread that has nothing to do with the Surface Station Project or with the USHCN… or with the Northern Hemisphere, for that matter? What, are you going to go back to Eli’s now and claim you’ve made your point here at WUWT?

Jay Sezbria
January 21, 2010 8:23 pm

Interesting to see that the surface stations project uncovered a cool bias to the temperature record. Great to see all the hard work of the volunteers come to fruition, though it is worrying that the rapid warming trend wasn’t simply an artifact.

January 22, 2010 1:52 am

Jay Sezbria: Had you bothered to check, the paper referred to by “Can’t wait for the peer-reviewed publication on station siting” was discussed in a number of posts here at WUWT six months ago. Just type Menne into the search function:
http://wattsupwiththat.com/?s=Menne
Also, as I asked in my 18:52:34 reply, what does Menne et al have to do with the subject of this post? You’re in the wrong hemisphere.

John Finn
January 22, 2010 4:03 am

Bob Tisdale (14:52:33) :
John Finn: You wrote, “…but there’s an issue about the GISS 1998 anomaly. It’s always seemed, relatively speaking, much lower than the other datasets.”
TLT anomalies can be more volatile:
http://i50.tinypic.com/208jqyq.png

I did acknowledge in my earlier post (John Finn (12:28:17)) that “It’s clear that ENSO fluctuations have a sharper impact on satellite troposphere readings”. But, by the same argument, it’s possible that other factors amplify the surface temperatures (relative to the troposphere).
I’m not out to defend GISS, Hadley or the UK met office (certainly not them) but I do like to “play fair”. I’m not implying that you don’t. In fact every post I’ve seen from you suggests that you are as keen as I am to present the facts as accurately as possible.
I would just like to make this observation, though. If GISS are fudging the data to make it appear that it’s warming more than it actually is, they are not doing a very good job of it. Over the last 20 years the difference in trends between UAH and GISS is wafer thin.

January 22, 2010 6:16 am

Richard M (08:23:14) :
Phil., I want to congratulate you for pointing out the slight seasonable bias that appears in the UAH numbers. I remember you also pointed this out in June when the anomaly was zero. I was wondering if any of the AGW supporters would point this out at this time.

Thank you, it’s still relevant. The annual cycle in recent UAH data gives high anomalies in winter, which has to be borne in mind when considering the indication of 20-yr records in the current month. I don’t see what this has to do with ‘AGW support’ or otherwise.

January 22, 2010 6:32 am

Gail Combs (14:54:22) :
I can see a need for a one time half a degree adjustment up after the station was bombed and possibly another couple of tenths down after it was destroyed in a cyclone, IF the station was moved a significant distance and/or IF the calibration of the thermometers indicate the need for an adjustment because the thermometer is off. Otherwise you leave the data alone.

It was moved a significant distance in ~1941, so no ‘if’ needed.
To be repeatedly adjusting the data without good solid evidence that an adjustment is absolutely necessary just adds MORE error to the data. Instead information about changes should be noted and if possible parallel readings taken. Since the original locations of the weather stations are often known that still could be done in many cases by going back to the original site and setting up a temporary station for up to a month so a more accurate and SCIENCE BASED adjustment can be calculated. The climate scientists were just too lazy to get off their duffs and do the work.
Difficult to do that accurately in this case since the effect of the tree growing over the weather station (which was the prime reason for the move) would be difficult to replicate and the destruction of the buildings by bombing will have changed the microclimate.

Connor
January 22, 2010 2:57 pm

LOLOLOLOLOLOL!
You guys should start another temperature station audit, see if you can cover another COOLING BIAS for NASA LOLOLOLOLOLOL
A spectacular own goal! You lot must feel preeeeeetty stupid! 😀
“…wish to thank Anthony Watts and the many volunteers at surfacestations.org for their considerable efforts in documenting the current site characteristics of USHCN stations.”
http://www1.ncdc.noaa.gov/pub/data/ushcn/v2/monthly/menne-etal2010.pdf

January 22, 2010 3:27 pm

Connor (14:57:39)…
…is such a tool. He can’t see that NOAA is furiously backing and filling because Phil Jones is now blaming them for providing the original raw temperature data to the CRU. Notice the paper was ‘revised’ a few days after Climategate hit? That’s not a coincidence.
Also, notice that raw and urban temperatures match, until they started “adjusting” the data in the ’50’s. Just because a few jamokes put together a paper trying to cover the NOAA’s butt doesn’t mean squat — except to someone who believes that a temperature station sitting on an airport tarmac or next to an A/C exhaust will read the same temperature as a rural station out in a grass field: click
NOAA has been routinely “adjusting” the temperature record to show artificial warming, and now they’ve been caught: click [takes a few seconds to load]
That won’t matter to the True Believers like Connor. Cognitive dissonance has taken hold, and they can’t think straight any more. But the rest of us can see what’s going on. Most all of the inconvenient rural stations have recently been eliminated: click
That leaves the tarmac stations, which – unsurprisingly – show warming. But it isn’t real.

Connor
January 22, 2010 4:10 pm

The paper was published in 2010, genius! Ah, it must suck to be full of so much fail like you guys! Epic lulz! 😀

January 22, 2010 7:52 pm

True, I am a genius by comparison; I can read dates.
Cut ‘n’ pasted from Connor’s link:

Submitted to
Journal of Geophysical Research – Atmospheres
August 27, 2009
Revised December 21, 2009

[Revised right after the Climategate scandal made the news, and 4 months after it had been submitted to the publisher – AKA ‘backing and filling.’]

Reading comprehension, me boy. It matters.

Latinamerican
January 23, 2010 2:38 pm

What does TLT mean? It is very annoying how in the USA you use acronyms and do not have the courtesy to post what those acronyms mean. This blog is excellent, but it is supposed to be a blog where we laymen can see the extreme fallacies and falsehoods that academia, press and bureaucracy systematically feed us. But when you use unexplained acronyms you greatly damage the superb work that you are doing with these posts.
[Maybe this will help: http://www.acronymfinder.com/AFAIK.html ~dbs, mod]

maksimovich
January 24, 2010 12:17 am

Comparing the NZ and GISS 2009 SH temperature anomalies, there seems to be a divergence in the plots, i.e. opposite sign.
With NZ we can see the effects of the ozone hole (and breakdown of polar transport) in Aug; however, GISS shows a downtrend. It might be an interesting exercise to test other mid-latitude sites in the SH.
http://i255.photobucket.com/albums/hh133/mataraka/nzgisst2009.jpg

Connor
January 24, 2010 1:30 am

Maybe if you look hard enough you people will find A YETI!!!! ZOMGLOLWTFBBQ!

yonason
January 24, 2010 7:55 am

Connor (01:30:49) :
“Maybe if you look hard enough you people will find A YETI!!!! “
Well, we have been receiving reports of sightings…

Deech56
January 26, 2010 6:43 pm

RE Smokey (19:52:38) :

True, I am a genius by comparison; I can read dates.
Cut ‘n’ pasted from Connor’s link:
“Submitted to
Journal of Geophysical Research – Atmospheres
August 27, 2009
Revised December 21, 2009
[Revised right after the Climategate scandal made the news, and 4 months after it had been submitted to the publisher – AKA ‘backing and filling.’]”
Reading comprehension, me boy. It matters.

Why the is revision date noteworthy? Menne et al. were probably revising this as soon as they got the review back. It’s more likely that they were working to publish before the surfacestations stuff was submitted, and that the paper was not related to the stolen e-mails (unless people think that the original submission was done in anticipation of having the e-mails stolen – what forecasting!), which didn’t seem to cover the NASA-GISS temperature record. Kind of a shot across the bow, but that’s just my guess.
The quick turnaround suggests that little revision was necessary – certainly not a major re-analysis. But that’s just based on my experience with reviews.

Deech56
January 26, 2010 6:45 pm

OMG – should be “Why is the revision…” Guess one typo will void my post. At least the formatting was just fine.

January 26, 2010 7:15 pm

Deech56:
“Why the is revision…”?????
[Actually, I never noticed it until you pointed it out. But now I’ll never let you forget it. Not in a million years! That’s how it must be. It’s in my job description.]
Is revision date the noteworthy because Connor jumped on my comment: “Notice the paper was ‘revised’ a few days after Climategate hit? That’s not a coincidence.”
Climategate hit around 19 November 2009. [We’ll never know what the actual revision was; I just mentioned it because the timing was noteworthy.] But Connor only looked at the publication date.
[Unlike your labeling of the CRU emails as being “stolen” – the rest is speculation]. So, how do you know the emails were stolen? Have you got facts not in evidence?
Anyway, Connor mistakenly thought he had a gotcha on me with the dates. I just had some fun setting him straight, that’s all. Happens all the time.
Is embarrassment the good for him, no?