Normally we see the HadCRUT monthly temperature data released by about the 20th of each month. It is now November 2nd, and the data has not yet been published. I can’t recall them ever being two weeks late.
HadCRUT (Hadley Centre / Climatic Research Unit Temperature, UK)
The HadCRUT3 anomaly data can be found here
A description of the HadCRUT3 data file columns is here
Perhaps they are a bit flummoxed with recent developments, such as the erasing of “sensitive” temperature data, or maybe they are just busy processing FOIs?
Or maybe it’s the new supercomputer holding up the data?
Maybe the row over one tree has them delayed. Or maybe the 25% funding cut?
Who knows, but it sure is odd to see them so late getting the data out the door.
The Met Office is not in fear of privatisation because it ‘failed’ to pass the Government’s own “Commercial Audit” recently, i.e. it does not generate sufficient income. I expect its staff are secretly quite pleased.
On a local level, the Met Office signally failed to forecast England’s very pleasant six-week Indian Summer; indeed, it spent most of it telling us it was going to end ‘tomorrow’.
They are so obsessed with Climate Change that they have forgotten to do what we pay them for, which is to tell us what the weather is going to be like for the next couple of days.
Pamela Gray
Or I could say it this way:
Maybe they’re still stuck in that car wreck?
Why did you change it? You bothered to read about Anthony’s interests?
😉
I often email the Met Office – just to let them know they’re wrong. He he. Anyway, I contacted them about the September data a couple of weeks ago. This was the reply:
“We have been in contact with our Climate Monitoring and Research Scientist. He has advised we are still waiting for some of the land temperature data to be received. Once we have the complete data sets this will be published. Hopefully the missing data will arrive within the next week.”
Bulldust (18:19:47)
“I’m sorry, Phil. I’m afraid I can’t do that…”
😀
I see nothing odd in Gavin Schmidt pointing out that he is not involved in the production of the GISS temperature record. If I were not involved in a project I would be inclined to point that out also regardless of my level of confidence in the product.
New?
“A substantial proportion of the September CLIMAT monthly land station summary report data that was sent over the GTS (Global Telecommunication System) was obviously incorrect. For the past few weeks we have been liaising with the sources to gain a version that was correct. As this issue affects a substantial portion of the globe we are not in a position to release a ‘global’ estimate. Nor will we do so until we are satisfied that an adequate amount of verified data is present.”
Found at link: http://hadobs.metoffice.com/hadcrut3/index.html
IMHO they’re just double-checking the figures. Publishing results with scientific value is not like producing a TV series or publishing some random news on blogs. There’s hard work involved. In fact, I would tend to trust “slow” indicators more than “rapidly published” ones.
Look, they’ve been spending a lot of time at the vets with the dog. Must have been something he ate.
Well, between Barry Foster (04:53:44) and Pascvaks (06:17:20) there is an obvious contradiction.
Well, the report is late; obviously something is out of the routine. Why would reports from a long running network be delayed? What was ‘obviously wrong’ about what was initially received? Why should it take this long to get ‘correct’ information?
Questions, questions. There is something rotten in HadCru, and I can smell it all the way over here.
===================================
Rob Vermeulen (06:24:44) :
“IMHO they’re just double-checking the figures.”
I suspect that is correct and may be a routine they have become more sensitive to after the problems last year around this time. The word ‘obvious’ in the link from Pascvaks (06:17:20)
suggests that someone is at least thinking about the input carefully rather than glancing at it before processing. At least I hope it does.
So …. this ‘obviously incorrect’ data …. I wonder what it was that was incorrect about it?
I can’t figure out how the amount of sea ice changes the average sea temperature in HadSST2.
If there is less ice and more open sea, wouldn’t that open water be close to freezing temperature, and wouldn’t it pull the ice-free average down? And if the water freezes, wouldn’t the remaining unfrozen water show a higher average temperature?
But I can’t find the gridding calculation for open water. Is there anyone out there who has checked the HadSST2 calculation for open sea?
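A minimal sketch of the effect being asked about, not the actual HadSST2 gridding: a toy grid in which ice-covered cells are excluded from an area-weighted SST average and open-water cells are included. The cell values, the latitudes, and the masking rule are all assumptions for illustration only.

```python
# Toy illustration (NOT the HadSST2 algorithm): how masking out ice-covered
# cells can change a simple area-weighted "sea surface temperature" average.
import numpy as np

# Assumed toy grid of SST values in deg C; cells near the ice edge sit close
# to the freezing point of sea water (about -1.8 C).
sst = np.array([[ 5.0,  4.0,  2.0],
                [ 1.0, -1.0, -1.7],
                [-1.8, -1.8, -1.8]])

# Assumed cosine-of-latitude area weights (one weight per row of the grid).
lats = np.array([30.0, 60.0, 75.0])
weights = np.cos(np.radians(lats))[:, None] * np.ones_like(sst)

def masked_mean(sst, weights, ice_mask):
    """Area-weighted mean over open-water cells only (ice cells excluded)."""
    open_water = ~ice_mask
    return (sst * weights)[open_water].sum() / weights[open_water].sum()

# Case 1: the coldest row is frozen over, so only the warmer cells count.
ice_a = np.array([[False, False, False],
                  [False, False, False],
                  [True,  True,  True ]])

# Case 2: the ice retreats, exposing near-freezing water to the average.
ice_b = np.zeros_like(ice_a)

print("mean with ice cover   :", round(masked_mean(sst, weights, ice_a), 2))
print("mean after ice retreat:", round(masked_mean(sst, weights, ice_b), 2))
# Less ice -> more near-freezing cells included -> lower open-water average,
# which is exactly the effect the comment above is asking about.
```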
“copy this quickly before it’s pulled hahahah
http://www.srh.noaa.gov/srh/jetstream/atmos/ll_gas.htm”
They have already pulled it. Luckily I took a screengrab last night.
Phil (18:30:42) :
I think you may be spot on with your last line.
for anyone not aware of Chiefio’s work on this, visit
http://chiefio.wordpress.com/category/agw-and-gistemp-issues/
Lucky,
Pamela Gray (20:59:44) :
What if all the sensors happened to be located, all 126 or however many they use now,
It’s 136. From:
http://chiefio.wordpress.com/2009/10/24/ghcn-california-on-the-beach-who-needs-snow/
Would You Believe a Little Over 2 Thermometers Per State?
And no, that is not a “Maxwell Smart” imitation.
My “by years” tool told me there were 136 active thermometer records in the U.S.A. in 2008. For the whole thing. Including Alaska and Hawaii. But in fairness, Hawaii got three thermometers, all at airports.
And this brings up a point about the fragile nature of the gauge network. What if we the people, the keepers of the sensors, just decided to get off this silly train?
While I loved your imagery, the reality looks to be far more dull, and far more damaging.
GIStemp gets its “raw” input data from two sources. One is the USHCN – the US thermometers that Anthony et al. are so focused on. These are all still in use and still being used to forecast weather. The data flows, via NOAA, into a publicly available data set. Unfortunately, NOAA changed the data format to something called Version 2, or USHCN.v2, and when they did that, GIStemp declined to do any maintenance programming to download the new USHCN.v2 file with all the new temperature readings in it. So US data ‘cuts off’ from USHCN at that point in time.
The US data also flowed in via the GHCN, the global thermometer network, but after a conversion from F to C and a few other changes in how it is fudged. Fine, you say?
Well, not quite…
It seems that at about the same time, GHCN decided to drop all but 136 US thermometers (leaving California with NONE in the cold snowy parts: we have four, one at SFO and three near the beach in SoCal – LA, San Diego, and Santa Maria). Kind of hard to get anything OTHER than record heat when you do that…
So every single temperature anomaly reported for the USA since 2007 is flat out bogus.
Every
Single
One.
Why? Who knows. You might suspect a similar failure to follow the USHCN data format change; but then you would have to explain the 136 that do get into GHCN somehow…
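For readers who want to reproduce that kind of “by years” count themselves, here is a minimal sketch. It is not E.M. Smith’s actual tool, and it assumes the GHCN v2.mean fixed-width layout (an 11-character station ID whose first three digits are the country code, a 1-digit duplicate flag, a 4-digit year, then twelve 5-character monthly means in tenths of °C, with -9999 meaning missing) and that 425 is the US country code; check both assumptions against the GHCN documentation before trusting the numbers.

```python
# Minimal sketch (not the actual "by years" tool): count how many distinct
# stations report at least one valid monthly value per year in a GHCN v2
# style file.  The fixed-width layout and the US country code (425) are
# assumptions to verify against the GHCN v2 documentation.
from collections import defaultdict

def stations_per_year(path, country_code="425"):
    active = defaultdict(set)          # year -> set of station IDs seen that year
    with open(path) as f:
        for line in f:
            station = line[0:11]       # 3-digit country code + 8-digit station ID
            if not station.startswith(country_code):
                continue
            year = line[12:16]         # position 11 is the duplicate flag
            months = [line[16 + 5*i : 21 + 5*i] for i in range(12)]
            if any(m.strip() not in ("", "-9999") for m in months):
                active[year].add(station)
    return {y: len(s) for y, s in sorted(active.items())}

if __name__ == "__main__":
    counts = stations_per_year("v2.mean")   # assumed local copy of the file
    for year, n in counts.items():
        print(year, n)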
There was a World Meteorology agency of some sort calling for a Global Climate system about 1990 and it’s possible that everyone just hopped on the bandwagon of picking a couple of hundred “good thermometers” for each continent and didn’t bother to think through that this would completely break GIStemp. (And, I suspect, Hadley. They must get their input from somewhere, and if not GHCN, then where? Hmmm?)
I’ve been exploring the degree of damage to the GHCN data set and found great deletions scattered all over the world. Many focus on a date near 1990. The end result of it is little things, like:
Japan has no thermometers above 300 meters. Nope, not a one. It’s as flat as Kansas as far as GHCN is concerned.
93% of the thermometers in the USA are ignored. If you want their data, you have to “go fish” in the USHCN.v2 pool on your own.
Australian thermometers are migrating to the northern beaches.
And in New Zealand, an “Inconvenient Island” (Campbell Island) located down near the polar zone was deleted, giving the Kiwi data set a nice lift and a good shot of temperature tonic. If you take that record out of the past as well (which GHCN does not do…), so that you have a stable composite instrument to measure with (see the sketch a few lines below), New Zealand is not warming. Such power in one little island…
and there is more. But there is also some good news.
When you look at the bits that have not been molested, like Argentina, you find that it isn’t warming. All the parts of North and South America that have not had significant thermometer deletions show stable temperatures.
The Pacific Ocean (minus Australia and New Zealand, that have been molested) shows no warming at all to speak of. Nice stable temps with a bit of a cyclical roll that I think is the PDO flip flop flipping.
I’ve done an analysis of every continent, most major countries by land area, and a couple of minor places; links to them are here:
http://chiefio.wordpress.com/2009/11/03/ghcn-the-global-analysis/
which also has directions on how to download the data and includes, near the bottom, the source code for the software to do a lot of this yourself.
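Not his code, but a minimal sketch of the “stable composite instrument” idea mentioned above: compare an early and a late period using only the stations present in both, so that station additions and deletions cannot masquerade as temperature change. The station names, periods, and values below are invented purely for illustration.

```python
# Sketch of the "stable composite instrument" idea: compare an early and a
# late period using only the stations present in BOTH, so that adding or
# deleting stations (e.g. dropping a cold island) cannot shift the average.
# All data here is invented for illustration.

early = {  # station -> mean temperature over the early period (deg C)
    "A": 11.0, "B": 9.5, "C": 6.0, "ColdIsland": 2.0,
}
late = {   # station -> mean temperature over the late period (deg C)
    "A": 11.2, "B": 9.6, "C": 6.1,          # "ColdIsland" was deleted
}

def naive_change(early, late):
    """Difference of plain averages over whatever stations each period has."""
    return sum(late.values()) / len(late) - sum(early.values()) / len(early)

def stable_change(early, late):
    """Difference of averages over the stations common to both periods."""
    common = early.keys() & late.keys()
    return (sum(late[s] for s in common) / len(common)
            - sum(early[s] for s in common) / len(common))

print("naive change :", round(naive_change(early, late), 2))   # ~ +1.84 C
print("stable change:", round(stable_change(early, late), 2))  # ~ +0.13 C
# The naive version "warms" mostly because the cold station vanished from
# the composite, not because any surviving station actually warmed much.
```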
So what do I think happened with the Hadley data (this time…)? I think it was likely one of two things:
1) It showed the world really IS getting cold and they just could not believe it. (I think this is unlikely, given the “cooked” state of GHCN).
2) It showed things like the bogus 115 year record heat in California and maybe, just maybe, they realized that this was “A Whopper Too Far” to cross…
I’d bet my money on #2. I would speculate they are using GHCN data, and that the lag that most averaging-type adjustments put into a series is only now, from the 2007 changes, wearing off. This would result in unbelievable warming spikes (such as in California) and they are probably trying to “figure out what is wrong”. Well, it only took me about half a year, so give them a while… But make sure they keep the dog on a tether and his bowl full of kibble. We don’t need any more of Hadley being “The Dog’s Lunch”, now do we…
And for those who have found what I’m doing worthy of mention to others: Thanks! It makes it worthwhile…
Now what I’m left to ponder is this:
With GIStemp (marginal to begin with) completely broken now due to the USHCN.v2 breakage and the GHCN “thermometer mass dying”;
With Hadley being “The Dog’s Lunch”;
With satellites being too short a history to use, and calibrated against what again? Oh, perhaps the land series from NASA and… oh deary me, can I get back to you later…
Exactly WHAT temperature series is there to use to make any statements about the climate or current trends of temperatures?
I think I’ll look out the window and ask the TV Weather Man …
Ah, it’s Hansen, in the temperature record, with a memory hole machine. Let the inquest begin.
And thanks, E.M. Very impressive and useful work.
====================================
As Nigel, now Lord, Lawson (former UK Chancellor of the Exchequer = Finance minister) once said, it always takes longer to add them up when the numbers are giving you bad news. Which makes you wonder what the data will tell us when it is released.
MarkE (06:47:40) :
Your Nigel Lawson quote: “…it always takes longer to add them up when the numbers are giving you bad news.”
Which would be the bad news, cooling or warming?
Anthony: HADCRUT3GL for September has been posted:
http://hadobs.metoffice.com/hadcrut3/diagnostics/global/nh+sh/monthly
It’s out now:
2009/09 0.457 0.473 0.442 0.649 0.266 0.457 0.451 0.650 0.265 0.650 0.265
Something seems funny with those numbers at the end. What are the chances of this series at the end happening as a repeating identical pattern?
Pamela Gray (07:55:43) :
Something seems funny with those numbers at the end. What are the chances of this series at the end happening as a repeating identical pattern?
Good, since they’re for September, the ninth number after the date. Patrik must have had problems pasting, or he’s having a little joke at HadCrut’s expense.
With 3 decimals?
0.000
Hmmm, if you lot don’t trust the data anyway, why do you want it so bad?
Patrik – those are not the figures from the source I usually go to: http://www.cru.uea.ac.uk/cru/data/temperature/hadcrut3gl.txt
which doesn’t have the Sept data yet, and the year up to August is given as
2009 0.384 0.364 0.371 0.415 0.407 0.499 0.498 0.532
Where the heck are your figures from?
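For anyone who wants to check the CRU file for themselves, a minimal sketch follows. It assumes the hadcrut3gl.txt layout in which each year occupies a pair of lines, the first holding the year, twelve monthly anomalies, and an annual value, and the second holding coverage figures; that layout (and the absence of header lines) is an assumption to verify against the file’s own documentation before relying on the output.

```python
# Minimal sketch for reading CRU's hadcrut3gl.txt style file.  Assumption:
# each year occupies two lines, the first being
#   YEAR  jan feb ... dec annual   (temperature anomalies, deg C)
# and the second being coverage percentages, which we skip here.
import urllib.request

URL = "http://www.cru.uea.ac.uk/cru/data/temperature/hadcrut3gl.txt"

def monthly_anomalies(text):
    anomalies = {}                       # year -> list of up to 12 monthly values
    lines = text.strip().splitlines()
    for line in lines[0::2]:             # assumed: every other line is anomalies
        parts = line.split()
        year = int(parts[0])
        anomalies[year] = [float(v) for v in parts[1:13]]
    return anomalies

if __name__ == "__main__":
    with urllib.request.urlopen(URL) as resp:
        data = monthly_anomalies(resp.read().decode("ascii"))
    latest = max(data)
    print(latest, data[latest])          # e.g. the latest year and its months so far
```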