CRU produces something useful for a change

World temperature records available via Google Earth

Climate researchers at the University of East Anglia have made the world’s temperature records available via Google Earth.

The Climatic Research Unit Temperature Version 4 (CRUTEM4) land-surface air temperature dataset is one of the most widely used records of the climate system.

The new Google Earth format allows users to scroll around the world, zoom in on 6,000 weather stations, and view monthly, seasonal and annual temperature data more easily than ever before.

Users can drill down to see some 20,000 graphs – some of which show temperature records dating back to 1850.

The move is part of an ongoing effort to make data about past climate and climate change as accessible and transparent as possible.

Dr Tim Osborn from UEA’s Climatic Research Unit said: “The beauty of using Google Earth is that you can instantly see where the weather stations are, zoom in on specific countries, and see station datasets much more clearly.

“The data itself comes from the latest CRUTEM4 figures, which have been freely available on our website and via the Met Office. But we wanted to make this key temperature dataset as interactive and user-friendly as possible.”

The Google Earth interface shows how the globe has been split into 5° latitude and longitude grid boxes. The boxes are about 550km wide along the Equator, narrowing towards the North and South poles. This red and green checkerboard covers most of the Earth and indicates areas of land where station data are available. Clicking on a grid box reveals the area’s annual temperatures, as well as links to more detailed downloadable station data.
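As a sanity check on those figures, the east-west width of a 5° grid box at a given latitude can be computed in a few lines. This is a sketch assuming a spherical Earth of mean radius 6371 km:

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius (spherical approximation)

def gridbox_width_km(lat_deg, box_deg=5.0):
    """East-west width of a grid box spanning box_deg degrees of
    longitude, measured at latitude lat_deg."""
    circumference = 2 * math.pi * EARTH_RADIUS_KM * math.cos(math.radians(lat_deg))
    return circumference * box_deg / 360.0

print(round(gridbox_width_km(0)))   # at the Equator: ~556 km
print(round(gridbox_width_km(60)))  # at 60° north or south: half that, ~278 km
```

This matches the roughly 550 km quoted above and shows why the boxes narrow towards the poles: the width scales with the cosine of latitude.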

But while the new initiative does allow greater accessibility, the research team do expect to find errors.

Dr Osborn said: “This dataset combines monthly records from 6,000 weather stations around the world – some of which date back more than 150 years. That’s a lot of data, so we would expect to see a few errors. We very much encourage people to alert us to any records that seem unusual.

“There are some gaps in the grid – this is because there are no weather stations in remote areas such as the Sahara. Users may also spot that the location of some weather stations is not exact. This is because the information we have about the latitude and longitude of each station is limited to 1 decimal place, so the station markers could be a few kilometres from the actual location.

“This isn’t a problem scientifically because the temperature records do not depend on the precise location of each station. But it is something which will improve over time as more detailed location information becomes available.”
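The scale of that positional uncertainty is easy to quantify: with coordinates stored to 1 decimal place, a station can be off by up to 0.05° on each axis. A minimal sketch, again using a spherical-Earth approximation:

```python
import math

KM_PER_DEG_LAT = 111.2  # approximate km per degree of latitude

def max_position_error_km(lat_deg, precision_deg=0.1):
    """Worst-case distance between a station's true location and its
    rounded coordinates, when lat/lon are stored to precision_deg."""
    half = precision_deg / 2.0                    # max rounding error per axis
    dlat = half * KM_PER_DEG_LAT                  # north-south error, km
    dlon = half * KM_PER_DEG_LAT * math.cos(math.radians(lat_deg))  # east-west error
    return math.hypot(dlat, dlon)

print(round(max_position_error_km(52.0), 1))  # a UK-latitude station: ~6.5 km
```

So "a few kilometres" is exactly right: at mid-latitudes the worst-case marker offset is around 6-7 km.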

This new initiative is described in a new research paper published on February 4 in the journal Earth System Science Data (Osborn T.J. and Jones P.D., 2014: The CRUTEM4 land-surface air temperature dataset: construction, previous versions and dissemination via Google Earth).

For instructions about accessing and using the CRUTEM Google Earth interface (and to find out more about the project) visit http://www.cru.uea.ac.uk/cru/data/crutem/ge/. To view the new Google Earth interface, download Google Earth, then click here: CRUTEM4-2013-03_gridboxes.kml.
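For readers curious about what such a KML file actually contains: a Google Earth overlay is just XML, with each station or grid box represented as a Placemark. The hand-written sketch below is illustrative only; the station name and coordinates are hypothetical, and this is not CRU's actual schema:

```python
import xml.etree.ElementTree as ET

# A minimal KML document with one placemark, loadable in Google Earth.
# The station name and coordinates below are made-up examples.
KML = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>Example station</name>
      <description>Links to station data would go here.</description>
      <Point><coordinates>1.3,52.6,0</coordinates></Point>
    </Placemark>
  </Document>
</kml>
"""

# Check the document is well-formed XML before saving it as a .kml file.
root = ET.fromstring(KML)
with open("example_station.kml", "w") as f:
    f.write(KML)
```

Opening the saved file in Google Earth drops a clickable pin at the given longitude/latitude; CRU's real files add one such entry per grid box and station.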

Steve W

At last we are getting somewhere with transparency. Well done to the CRU!

John Peter

So will it now be possible for independent analysts to ascertain if CRUTEM4 is reliable as an indicator or if “warming” has been added lately by reducing pre satellite era temperatures through “adjustments”?

Are those temperature records raw data or have they been, “Hansen-ed”? Is not putting carefully-selected everythings on Google a good way of turning doubtful computer-generated data into accepted truth to underwrite the CAGW narrative? “A lie will be halfway round the world before the truth can get its boots on”. One of Stalin’s favourite sayings just may be the watchword behind this move. CRU has form in this matter.

bertief

Top job CRU. Kudos to Dr Osborn and the team. It’s great to see this kind of openness. Given the amount of rain the UK has had lately I wonder will there be a project to get precipitation and wind speed data on there too?

Rob

A potentially positive step. I have most of the raw and “corrected” data for the U.S.
I’ll be watching with interest!

Mailman

As a few others have already touched upon, it would be interesting to see if you could run “reports” using unadjusted temperature data, wouldn’t it?
Somehow I doubt this data will be available? Hopefully I’m wrong and unadjusted temp data is available but I suspect it’s not.
Regards
Mailman

Somebody

“the temperature records do not depend on the precise location of each station”
Notice the wording. ‘Temperature records’.
Not temperature itself, which of course does depend on position: they do not have a system at thermodynamic equilibrium, so it does not have the same temperature everywhere. In fact, equilibrium is needed for a temperature to be defined at all…

rtj1211

I have to say that getting the defendants at trial to put their version of events, verbatim, into the judge’s summation to the jury does seem a slightly strange way of proceeding in climate justice.
The data should only be ‘put out there’ after it is accepted that it is raw data. Sanitised data can only be put out there if it includes all the details of how it was sanitised and how that sanitisation has been justified.
Otherwise, you’ve just got ‘digital warming’ gone mad……..

Stephen Richards

Presumably these are the adjusted, sanitised and greenpiss-approved temperatures. What’s the point? Let’s see the RAW data. You know, the stuff they haven’t adulterated in the name of CO2 tax.

wayne

“… the world’s temperature records …”
What the heck does that mean? Will the CRU also provide the adjustments applied to the temperature records per location or cell, and actually make this transparent and “open”? Without letting people see what they have done to the raw temperature data, this is but another huge layer of Global Warming propaganda. No? Well, many understand why… because all of the collective adjustments are upward, inverted, and then applied negatively backwards to make the far-past readings cooler than the thermometers literally read at that historic time and location. Voilà… climatologist-made Global Warming.
If I am wrong on this and CRU used the pre-adjusted records on Google Earth, my apologies in advance, but that would be an incredible first.

Kev-in-Uk

Regarding others comments about data adjustment, I also tend to view with suspicion. If unreasonable data adjustment has taken place, it might be fairly easy to find out (in reasonably developed areas at least). Take your local ‘main’ library for example, it may house weather records, or a copy of them. Hence, a bit like the surfacestations project, a number of volunteers could perhaps search for the ‘written’ information and then compare to the ‘official’ record shown on this dataset?
Much as I am sure that many older written records could have been removed (intentionally or not) – I’m also sure that many will remain forgotten on dusty bookshelves!

D. Cohen

It should provide access to the raw data at the locations — the specified data stations — where it was collected by specified individuals or organizations. Otherwise, not interested. Don’t be fooled by the illusion of having nothing to hide.

Disputin

As I read it, Wayne is being a little unfair (quite understandable, given the history of bad faith from warmisti). It seems they are going to put actual weather station records on. If so, well done indeed CRU!
I’ve long argued that, to see if the world is warming, or cooling, or just buggering about, it is only necessary to look at the trends of individual stations, rather than trying to get an “average temperature”, which is completely meaningless for all the reasons people on here have said. Then any trends can be evaluated, e.g. for urbanisation or other land-use changes, and obvious causes eliminated. Then, should you wish, you can take the average trend.
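The procedure Disputin describes, fit a trend per station and only then average, is easy to sketch. The station series below are synthetic placeholders, purely to show the mechanics:

```python
def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = list(range(1950, 2011))
# Hypothetical stations: a base temperature plus a linear drift (degC/yr).
stations = {
    "A": [10.0 + 0.010 * (y - 1950) for y in years],
    "B": [ 7.5 + 0.020 * (y - 1950) for y in years],
    "C": [12.2 - 0.005 * (y - 1950) for y in years],
}

trends = {name: ols_slope(years, temps) for name, temps in stations.items()}
mean_trend = sum(trends.values()) / len(trends)
print(f"mean of station trends: {mean_trend:.4f} degC/yr")
```

Averaging trends rather than temperatures sidesteps the absolute-level differences between stations entirely, which is one reason the approach appeals.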

Nick Stokes

Mailman says: February 6, 2014 at 12:44 am
“As a few others have already touched upon it would be interesting to see if you could run “reports” using unadjusted temperature data wouldn’t it?
Somehow I doubt this data will be available? Hopefully I’m wrong and unadjusted temp data is available but I suspect it’s not.”

This is gridded data, so the notion of “raw” data doesn’t really apply. It’s locally averaged, and some kind of homogenisation is likely done; it should be.
If you want unadjusted station data, it’s all on the GHCN unadjusted file.
If you want to see that in a GE-like environment, it’s here, month-by-month. It won’t, currently, pop up a graph, but it will produce the monthly numbers.
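To make the gridding step concrete: station temperatures are first converted to anomalies relative to a baseline period, and the anomalies within a grid box are then averaged. The toy sketch below uses made-up numbers and an unweighted cell mean; it is not CRUTEM4's actual pipeline, which also quality-controls and weights stations:

```python
from statistics import mean

def to_anomalies(temps, years, baseline):
    """Express temperatures relative to the station's own mean over
    the baseline years."""
    base = mean(t for y, t in zip(years, temps) if y in baseline)
    return [t - base for t in temps]

years = list(range(1950, 1960))
baseline = set(range(1950, 1955))

# Two hypothetical stations in the same grid box: one warm valley site,
# one cool hilltop site, both drifting gently upward.
valley  = [14.0, 14.2, 13.9, 14.1, 14.3, 14.4, 14.6, 14.5, 14.7, 14.8]
hilltop = [9.0, 9.1, 8.9, 9.0, 9.2, 9.3, 9.4, 9.3, 9.5, 9.6]

a1 = to_anomalies(valley, years, baseline)
a2 = to_anomalies(hilltop, years, baseline)
gridbox = [(x + y) / 2 for x, y in zip(a1, a2)]  # simple unweighted cell mean
print(f"final-year grid-box anomaly: {gridbox[-1]:+.2f} degC")
```

Anomalising first is what lets a 14 °C valley station and a 9 °C hilltop station be combined without the 5-degree level difference swamping the signal.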

Berényi Péter

Here is the paper.

We have now developed a KML interface that enables both the gridded temperature anomalies and the weather station temperatures to be visualized and accessed within Google Earth or other Earth browsers.

Fine. Where is the interface definition?
If it is done properly it should be flexible enough to accommodate to any other dataset, including raw temperature data, wind speed, pressure, precipitation, etc.
Therefore this device needs to be published urgently under GPL in a revision control system, otherwise it is nothing but another useless propaganda tool.

Patrick

“Nick Stokes says:
February 6, 2014 at 1:47 am
This is gridded data, so the notion of “raw” data doesn’t really apply. It’s locally averaged, and some kind of homogenisation is likely done; it should be.”
Rubbish! Stokes stop trying!

William McClenney

Now wait just a multi-decadal minute! Haven’t I seen this P..D…O…. before?????

Patrick

I disagree. It “appears” useful, to the “useful idiots” (politicians and the like). Given the screenshot, how many ground-based thermometers are there in Australia? I understand it is ~180, of which ~112 are used to calculate a national “average”. LOL, it’s total bullcarp!

Old Ranga

Are the figures fudged or unfudged, dare one ask?

Nick Stokes

Patrick says: February 6, 2014 at 2:00 am
“Rubbish! Stokes stop trying!”

OK, Patrick, where would you expect to find raw data for a grid cell?

Nick Stokes says: February 6, 2014 at 1:47 am
“This is gridded data,…”

Although the top level data is gridded, I see you can drill down to get station data.

mark

Newbie questions about the temperature data.
– The data in the dozen or so stations I looked at are all monthly averages. Is that the data that CRUTEM4 uses?
– In this paper http://www.nrcse.washington.edu/NordicNetwork/reports/temp.pdf they show computing averages by the minute vs. min/max and get major differences in standard deviations. Is there a standard algorithm that the various stations use? Or does each station “operator” choose the algorithm they use and provide the daily/monthly average to CRU? Do they report the algorithm they use to CRU?
Sorry if this is basic info – links to educate myself would be great.
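On mark's min/max question: the traditional (Tmin + Tmax)/2 estimate agrees with the mean of frequent samples only when the daily cycle is symmetric. The demonstration below uses a deliberately skewed synthetic cycle; nothing in it is real station data:

```python
import math

# Synthetic hourly temperatures: a daily sine plus a second harmonic,
# chosen so the warm part of the day is not a mirror of the cool part.
hourly = [
    15.0
    + 5.0 * math.sin(2 * math.pi * h / 24)   # basic day-night cycle
    + 1.5 * math.cos(4 * math.pi * h / 24)   # asymmetry term
    for h in range(24)
]

sampled_mean = sum(hourly) / len(hourly)      # mean of all 24 samples
minmax_mean = (min(hourly) + max(hourly)) / 2  # traditional estimate

print(f"mean of 24 hourly samples: {sampled_mean:.2f}")
print(f"(Tmin + Tmax) / 2:         {minmax_mean:.2f}")
```

On this skewed cycle the two estimators differ by well over a degree, which is the point of the paper mark links: which algorithm a station uses, and whether it is documented, genuinely matters when series are compared.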

holts

by using raw data on each grid cell mean!

charles nelson

As the Warmists flood the world with their adjusted, homogenised, gridded data, one is reminded of the switch from the Julian Calendar to the Gregorian Calendar used today…it was only when people began to notice that Christmas was getting closer to the middle of Spring that the revision took place.
Already there is a strong sense that simply by looking out the window the general public are becoming more and more skeptical of Warmist claims.

DDP

“This isn’t a problem scientifically because the temperature records do not depend on the precise location of each station.”
Derp. Lots of small errors combined make big errors.

John Shade

I do not think anything from CRU deserves such automatic trust and admiration, although I admire the generosity of spirit that such responses reveal. My own immediate reaction was less noble. It was along the lines of ‘what are they up to now?’. I would like to see some critical review of this product.

Bloke down the pub

“This isn’t a problem scientifically because the temperature records do not depend on the precise location of each station.”
Now remind me, how do we know if a station’s record is affected by UHI?

johnmarshall

How do we know if this data is altered or raw data? Only raw data will do.

Patrick

“Nick Stokes says:
February 6, 2014 at 2:09 am”
Well, nowhere. That’s the point.

Alan the Brit

I hate to sound like such a Grumpy Old Man, but I think the lines spoken by the late, great Trevor Howard (actor), in a scene from the movie Battle of Britain, may ring true… “the bastards are up to something!” It just sounds a little too good to be true at the moment, for me anyway.

I see that CRU has archived versions back to 1998, if you want to see what changes have been made.

troe

Anthony
You have to post the “Professor living in a dumpster for a year in Austin, TX” story up on Climate Depot. It really can’t wait till Friday.

Patrick says: February 6, 2014 at 2:57 am
“Well, nowhere. That’s the point.”

Well, my point is that grid-averaged data is necessarily processed. You at least need to anomalise to average.
Anyway, their paper is here. They describe homogenisation in Sec 2.2. In fact, it seems they don’t do much now. They did in the early days. So the underlying station data, which they show, is rarely changed.

I notice Skeptical Science now has “Google Earth: how much has global warming raised temperatures near you?” [It hasn’t]
Local station seems to have values from before it came into existence, otherwise the trend looks believable. 1 degree rise in average, airport site over 50+ years, a few prop aircraft to numerous jets, UHI.

wayne

Here is one example of the “adjustments” I am speaking of, in a town in my state, randomly chosen:
Went to http://cdiac.ornl.gov/epubs/ndp/ushcn/usa_monthly.html, scrolled down to map of states to pick mine.
Get the raw monthly minimums for example:
http://cdiac.ornl.gov/cgi-bin/broker?id=343821&_PROGRAM=prog.gplot_meanclim_mon_yr2012.sas&_SERVICE=default&param=TMINRAW&minyear=1892&maxyear=2012
Get the “adjusted” monthly minimums:
http://cdiac.ornl.gov/cgi-bin/broker?id=343821&_PROGRAM=prog.gplot_meanclim_mon_yr2012.sas&_SERVICE=default&param=TMIN&minyear=1892&maxyear=2012
Notice the difference? You should! It seems the 1900 temperature was moved all the way down from 51°F to 47°F, just four degrees, that’s all. The same goes for years close to 1900. THAT is what I mean when I say the adjustments are overwhelmingly upward but are applied inversely, that is downward, to the far-past years. The further back you go, the larger the negative adjustment applied to the ABSOLUTE values! This is not only seen in anomalies.
I rest my case.
Try some towns about your state. It is quite easy.
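For anyone wanting to test wayne's claim systematically rather than eyeballing two plots, the natural check is the trend of (adjusted minus raw). The sketch below uses hypothetical placeholder numbers, not the actual CDIAC station values:

```python
def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = list(range(1900, 2011, 10))  # 1900, 1910, ..., 2010
# Hypothetical decadal mean minimum temperatures (degF) for one station.
raw      = [51.0, 50.8, 51.2, 50.9, 51.1, 51.0, 51.3, 51.2, 51.4, 51.3, 51.5, 51.4]
adjusted = [47.2, 47.5, 48.3, 48.6, 49.3, 49.6, 50.4, 50.7, 51.2, 51.3, 51.5, 51.4]

diff = [a - r for a, r in zip(adjusted, raw)]
trend = ols_slope(years, diff)
# A positive trend in (adjusted - raw) means the adjustments cool the
# past relative to the present, which is the pattern wayne describes.
print(f"trend of (adjusted - raw): {trend:+.4f} degF/yr")
```

Run against the real raw and adjusted downloads for any station, this one number summarises how much of the station's warming trend comes from the adjustments themselves.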

Now any monkey can get raw data on historical temps around the globe, but only a “Climate Scientist” knows how to make the data do what he wants.

Juraj V

I actually like the GISTEMP colored maps, very useful for making a trends since-then, or comparing some period against any selected reference period. Pity that data itself are crap.
http://data.giss.nasa.gov/gistemp/maps/

If you want to compare unadjusted (raw) and adjusted data via Google Earth, KML files have been available since 2010 for GHCNv2 and GHCNv3 (beta)
http://diggingintheclay.wordpress.com/2010/10/06/google-earth-kml-files-spot-the-global-warming/
http://diggingintheclay.wordpress.com/2010/10/08/kml-maps-slideshow/
Note though that these are snapshots of the data in time and have not been updated. It is however instructive to see the trends of the individual stations and their variability.
Concerning gridded data, I am not a fan. I agree some adjustments are necessary (I basically agree with evanmjones’ comments here: http://wattsupwiththat.com/2014/01/29/important-study-on-temperature-adjustments-homogenization-can-lead-to-a-significant-overestimate-of-rising-trends-of-surface-air-temperature/) but as soon as you homogenise to produce gridded data you mix well-sited stations with badly sited ones, and methods to pick out station moves etc. are far from perfect.

RichardLH

Nick Stokes says:
February 6, 2014 at 2:09 am
“OK, Patrick, where would you expect to find raw data for a grid cell?”
Why would you want grid cell, partially interpolated, information in the first place?
That is just an exercise in trying to re-create a 2D field when it would appear that you do not have the Nyquist level of sampling required to do so accurately.
It just, in effect, creates a set of weighting factors that are then applied to the sampling point records themselves.
You could just track the changes in the sampling points directly and achieve a higher overall level of accuracy.

richardscourtney

RichardLH:
At February 6, 2014 at 5:39 am you write

Why would you want grid cell, partially interpolated, information in the first place?
That is just an exercise in trying to re-create a 2D field when it would appear that you do not have the Nyquist level of sampling required to do so accurately.
It just, in effect, creates a set of weighting factors that are then applied to the sampling point records themselves.
You could just track the changes in the sampling points directly and achieve a higher overall level of accuracy.

Yes! Well said!
I have repeated it as emphasis and in hope that this will catch the attention of any who missed it when you wrote it.
Richard

Bill from Nevada

Here is what the writer of the now-legendary file in the Climategate emails, “Harry_Read_Me.txt”, had to say about CRU and the shape of the information they are in control of. If you haven’t really pored through the Climategate emails, go online to one of the many sites where the emails are highlighted and the background explained.
Someone who was a computer modeller doing global climate research put notes into the ‘remarks’ of one of the climate models he was building. In programming, you can insert ‘remarks’ which give details, or important fundamentals, regarding the program; since they are annotated as ‘remarks’, the running program simply ignores those lines. But later on, people working with the program can read them and discover for themselves whatever is documented about places where the program performs well, performs poorly, or really just about anything.
Here is a quick list of some of the most telling things said about CRU and its data manipulation in the Harry_Read_Me.txt file. If you’ve already seen it all, then it’s old news, but every day there’s a whole group of people checking into this for the first time. Obviously the lines lifted from the Harry_Read_Me.txt file below are highlights. I googled the file, opened some tabs, and grabbed some excerpts.
=======
“But what are all those monthly files? DON’T KNOW, UNDOCUMENTED. Wherever I look, there are data files, no info about what they are other than their names. And that’s useless …” (Page 17)
– “It’s botch after botch after botch.” (Page 18)
“Am I the first person to attempt to get the CRU databases in working order?!!” (Page 47)
– “COBAR AIRPORT AWS (data from an Australian weather station) cannot start in 1962, it didn’t open until 1993!” (Page 71)
“What the hell is supposed to happen here? Oh yeah — there is no ‘supposed,’ I can make it up. So I have : – )” (Page 98)
– “You can’t imagine what this has cost me — to actually allow the operator to assign false WMO (World Meteorological Organization) codes!! But what else is there in such situations? Especially when dealing with a ‘Master’ database of dubious provenance …” (Page 98)
– “So with a somewhat cynical shrug, I added the nuclear option — to match every WMO possible, and turn the rest into new stations … In other words what CRU usually do. It will allow bad databases to pass unnoticed, and good databases to become bad …” (Pages 98-9)
– “OH F— THIS. It’s Sunday evening, I’ve worked all weekend, and just when I thought it was done, I’m hitting yet another problem that’s based on the hopeless state of our databases.” (Page 241).
– “This whole project is SUCH A MESS …” (Page 266)

RichardLH

richardscourtney says:
February 6, 2014 at 5:47 am
Thanks. Just a simple engineering point of view 🙂

Greg

“The move is part of an ongoing effort to make data about past climate and climate change as accessible and transparent as possible.”
Where’s the “transparency” in all this? CRUTemX is all based on unverified adjustments to non-existent data.
Has anyone forgotten Prof. Phil Jones’ famous “why should I give you our data when you only want to find something wrong with it”?
Or the “oops, the dog ate it” excuses for not having the original data ANYWHERE?
Or “if I did have to hand over the file I think I’d rather destroy it”?
Or the Information Commissioner’s decision that there probably were grounds for prosecuting a criminal breach of FOIA, except that they were smart enough to procrastinate long enough for the statutory time limit on the offence to run out?
Sorry guys, but this is no more “transparent” than it was last week. They’ve just made their data, which has no grounding in observable records (they’re gone), more readily available so that people can be more easily duped into thinking it is an objective scientific record.

Now we can finally see for ourselves how much the parking lots of the world have warmed.

Greg

Bill from Nevada says:
February 6, 2014 at 5:55 am
Thanks for the helpful tips from “Harry”. We can see why Phil Jones would rather destroy their files than let someone rigorous like Steve McIntyre get a look at them.

Greg

Juraj V says:
February 6, 2014 at 5:21 am
I actually like the GISTEMP colored maps, very useful for making a trends since-then, or comparing some period against any selected reference period. Pity that data itself are crap.
http://data.giss.nasa.gov/gistemp/maps/
===
Indeed, this whole exercise of putting a fancy hi-tech front-end on a corrupt and non-verifiable database is like having jacked-up suspension and a custom paint job on a car with a worn-out Lada engine.
It’s masking the mess that is inside and trying to fool the observer.

Something doesn’t add up. A quick look at the GE data shows warming on the south west coast of BC Canada. The weather records from Environment Canada show no such warming.

RichardLH says:
February 6, 2014 at 5:39 am
You could just track the changes in the sampling points directly and achieve a higher overall level of accuracy.
============
Assuming that accuracy was the objective.

Pamela Gray

Gridded data is convenient. For all the wrong reasons scientifically but for all the right reasons politically. Station data is inconvenient. For all the right reasons scientifically but for all the wrong reasons politically.

JJ

“This isn’t a problem scientifically because the temperature records do not depend on the precise location of each station.”
This is the mindset that ultimately succumbs to the notion that it isn’t a problem scientifically that the temperature records do not depend on the precise temperature of each station.
Of course, they don’t subscribe to the notion that it is a problem scientifically that the things they call “temperature records” are not temperature records. The rest follows.