CRU produces something useful for a change

World temperature records available via Google Earth

Climate researchers at the University of East Anglia have made the world’s temperature records available via Google Earth.

The Climatic Research Unit Temperature Version 4 (CRUTEM4) land-surface air temperature dataset is one of the most widely used records of the climate system.

The new Google Earth format allows users to scroll around the world, zoom in on 6,000 weather stations, and view monthly, seasonal and annual temperature data more easily than ever before.

Users can drill down to see some 20,000 graphs – some of which show temperature records dating back to 1850.

The move is part of an ongoing effort to make data about past climate and climate change as accessible and transparent as possible.

Dr Tim Osborn from UEA’s Climatic Research Unit said: “The beauty of using Google Earth is that you can instantly see where the weather stations are, zoom in on specific countries, and see station datasets much more clearly.

“The data itself comes from the latest CRUTEM4 figures, which have been freely available on our website and via the Met Office. But we wanted to make this key temperature dataset as interactive and user-friendly as possible.”

The Google Earth interface shows how the globe has been split into 5° latitude and longitude grid boxes. The boxes are about 550km wide along the Equator, narrowing towards the North and South poles. This red and green checkerboard covers most of the Earth and indicates areas of land where station data are available. Clicking on a grid box reveals the area’s annual temperatures, as well as links to more detailed downloadable station data.
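The quoted box geometry can be sanity-checked with a few lines (an illustrative sketch; the 111.32 km-per-degree figure is a standard approximation for the Equator and does not appear in the article):

```python
import math

def lon_box_width_km(lat_deg, box_deg=5.0):
    """Approximate east-west width of a box_deg-wide grid box at a given latitude."""
    km_per_deg_lon_at_equator = 111.32  # standard approximation
    return box_deg * km_per_deg_lon_at_equator * math.cos(math.radians(lat_deg))

print(round(lon_box_width_km(0)))   # about 557 km at the Equator
print(round(lon_box_width_km(60)))  # about 278 km at 60 degrees N or S
```

The cosine factor is why the checkerboard boxes visibly narrow toward the poles while staying 5° wide in longitude.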

But while the new initiative does allow greater accessibility, the research team does expect some errors to surface.

Dr Osborn said: “This dataset combines monthly records from 6,000 weather stations around the world – some of which date back more than 150 years. That’s a lot of data, so we would expect to see a few errors. We very much encourage people to alert us to any records that seem unusual.

“There are some gaps in the grid – this is because there are no weather stations in remote areas such as the Sahara. Users may also spot that the location of some weather stations is not exact. This is because the information we have about the latitude and longitude of each station is limited to 1 decimal place, so the station markers could be a few kilometres from the actual location.

“This isn’t a problem scientifically because the temperature records do not depend on the precise location of each station. But it is something which will improve over time as more detailed location information becomes available.”
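The "few kilometres" figure follows directly from the stated coordinate precision: rounding latitude and longitude to the nearest 0.1° puts a marker at most about 8 km from the true location. A quick check (illustrative Python, not from the paper):

```python
import math

def max_position_error_km(lat_deg, precision_deg=0.1):
    """Worst-case distance between a rounded coordinate and the true location,
    when latitude and longitude are each rounded to the nearest precision_deg."""
    km_per_deg = 111.32  # standard approximation for one degree at the Equator
    dlat = 0.5 * precision_deg * km_per_deg                                    # north-south error
    dlon = 0.5 * precision_deg * km_per_deg * math.cos(math.radians(lat_deg))  # east-west error
    return math.hypot(dlat, dlon)

print(round(max_position_error_km(0), 1))  # about 7.9 km at the Equator
```

Away from the Equator the east-west component shrinks with the cosine of latitude, so the worst case is smaller still.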

This new initiative is described in a new research paper published on February 4 in the journal Earth System Science Data (Osborn T.J. and Jones P.D., 2014: The CRUTEM4 land-surface air temperature dataset: construction, previous versions and dissemination via Google Earth).

For instructions about accessing and using the CRUTEM Google Earth interface (and to find out more about the project) visit http://www.cru.uea.ac.uk/cru/data/crutem/ge/. To view the new Google Earth interface, download Google Earth and then open the KML file CRUTEM4-2013-03_gridboxes.kml.


134 Comments
John Shade
February 6, 2014 2:37 am

I do not think anything from CRU deserves such automatic trust and admiration, although I admire the generosity of spirit that such responses reveal. My own immediate reaction was less noble. It was along the lines of ‘what are they up to now?’. I would like to see some critical review of this product.

Bloke down the pub
February 6, 2014 2:41 am

“This isn’t a problem scientifically because the temperature records do not depend on the precise location of each station.”
Now remind me, how do we know if a station’s record is affected by UHI?

johnmarshall
February 6, 2014 2:50 am

How do we know if this data is altered or raw data? Only raw data will do.

Patrick
February 6, 2014 2:57 am

“Nick Stokes says:
February 6, 2014 at 2:09 am”
Well, nowhere. That’s the point.

Alan the Brit
February 6, 2014 3:00 am

I hate to sound like such a Grumpy Old Man, but I think the lines spoken by the late, great Trevor Howard (actor) in a scene from the movie Battle of Britain may ring true: “the bastards are up to something!” It just sounds a little too good to be true at the moment, for me anyway.

Nick Stokes
February 6, 2014 3:04 am

I see that CRU has archived versions back to 1998, if you want to see what changes have been made.

troe
February 6, 2014 3:08 am

Anthony
You have to post the “Professor living in a dumpster for a year in Austin, TX” story up on Climate Depot. It really can’t wait till Friday.

Nick Stokes
February 6, 2014 3:14 am

Patrick says: February 6, 2014 at 2:57 am
“Well, nowhere. That’s the point.”

Well, my point is that grid averaged data is necessarily processed. You at least need to anomalise to average.
Anyway, their paper is here. They describe homogenisation in Sec 2.2. In fact, it seems they don’t do much now. They did in the early days. So the underlying station data, which they show, is rarely changed.
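Stokes’ point that you “at least need to anomalise to average” can be sketched as follows (illustrative Python with invented numbers; this is not CRU’s actual procedure): each station is first expressed relative to its own baseline mean, so that stations with very different absolute climates can be averaged within a grid cell.

```python
def anomalies(series, base_years):
    """Convert a station's absolute temperatures (year -> degC) to anomalies
    relative to that station's own mean over a baseline period."""
    base = [series[y] for y in base_years if y in series]
    baseline = sum(base) / len(base)
    return {y: t - baseline for y, t in series.items()}

def grid_cell_mean(station_series_list, base_years, year):
    """Average the anomalies of all stations in a cell that report in a year."""
    vals = [anomalies(s, base_years)[year]
            for s in station_series_list if year in s]
    return sum(vals) / len(vals)

# Two hypothetical stations: one cool, one warm, both 0.75 degC above baseline.
s1 = {1961: 10.0, 1962: 10.5, 1990: 11.0}
s2 = {1961: 20.0, 1962: 19.5, 1990: 20.5}
print(grid_cell_mean([s1, s2], base_years=[1961, 1962], year=1990))  # 0.75
```

Averaging the raw absolute values instead would let the warm station dominate and would jump whenever a station drops in or out, which is exactly why gridded products are “processed” even before any homogenisation.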

February 6, 2014 3:52 am

I notice Skeptical Science now has “Google Earth: how much has global warming raised temperatures near you?” [It hasn’t]
Local station seems to have values from before it came into existence, otherwise the trend looks believable. 1 degree rise in average, airport site over 50+ years, a few prop aircraft to numerous jets, UHI.

wayne
February 6, 2014 4:17 am

Here is one example of the “adjustments” I am speaking of, in a town in my state, randomly chosen:
Went to http://cdiac.ornl.gov/epubs/ndp/ushcn/usa_monthly.html, scrolled down to map of states to pick mine.
Get the raw monthly minimums for example:
http://cdiac.ornl.gov/cgi-bin/broker?id=343821&_PROGRAM=prog.gplot_meanclim_mon_yr2012.sas&_SERVICE=default&param=TMINRAW&minyear=1892&maxyear=2012
Get the “adjusted” monthly minimums:
http://cdiac.ornl.gov/cgi-bin/broker?id=343821&_PROGRAM=prog.gplot_meanclim_mon_yr2012.sas&_SERVICE=default&param=TMIN&minyear=1892&maxyear=2012
Notice the difference? You should! It seems 1900’s temp was moved all the way down from 51°F to 47°F, just four degrees, that’s all. The same goes for years close to 1900. THAT is what I mean when I said the adjustments are overwhelmingly upward but they are being applied inversely, that is downward, to the far-past years. The further back you go, the larger the negative adjustment applied to the ABSOLUTE values! This is not only seen in anomalies.
I rest my case.
Try some towns about your state. It is quite easy.

February 6, 2014 4:50 am

Now any monkey can get raw data on historical temps around the globe, but only a “Climate Scientist” knows how to make the data do what he wants.

Juraj V
February 6, 2014 5:21 am

I actually like the GISTEMP colored maps, very useful for making trends since a chosen year, or comparing some period against any selected reference period. Pity the data itself is crap.
http://data.giss.nasa.gov/gistemp/maps/

Editor
February 6, 2014 5:21 am

If you want to compare unadjusted (raw) and adjusted data via Google Earth, KML files have been available since 2010 for GHCNv2 and GHCNv3 (beta):
http://diggingintheclay.wordpress.com/2010/10/06/google-earth-kml-files-spot-the-global-warming/
http://diggingintheclay.wordpress.com/2010/10/08/kml-maps-slideshow/
Note though that these are snapshots of the data in time and have not been updated. It is however instructive to see the trends of the individual stations and their variability.
Concerning gridded data, I am not a fan. I agree some adjustments are necessary (I basically agree with evanmjones’ comments here: http://wattsupwiththat.com/2014/01/29/important-study-on-temperature-adjustments-homogenization-can-lead-to-a-significant-overestimate-of-rising-trends-of-surface-air-temperature/), but as soon as you homogenise to produce gridded data you mix well-sited stations with badly sited ones, and methods to pick out station moves etc. are far from perfect.

RichardLH
February 6, 2014 5:39 am

Nick Stokes says:
February 6, 2014 at 2:09 am
“OK, Patrick, where would you expect to find raw data for a grid cell?”
Why would you want grid cell, partially interpolated, information in the first place?
That is just an exercise in trying to re-create a 2D field when it would appear that you do not have the Nyquist level of sampling required to do so accurately.
It just, in effect, creates a set of weighting factors that are then applied to the sampling point records themselves.
You could just track the changes in the sampling points directly and achieve a higher overall level of accuracy.
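RichardLH’s point that gridding “just creates a set of weighting factors that are then applied to the sampling point records” can be shown numerically (a toy sketch with invented values; this is not the CRUTEM4 weighting scheme): a cell average followed by an area-weighted mean of cells is, algebraically, a single weighted sum over the individual stations.

```python
def global_mean_via_cells(cells, cell_weights):
    """Average stations within each cell, then take a weighted mean of the cells."""
    total_w = sum(cell_weights)
    return sum(w * (sum(c) / len(c)) for c, w in zip(cells, cell_weights)) / total_w

def implied_station_weights(cells, cell_weights):
    """The weight each individual station value effectively receives."""
    total_w = sum(cell_weights)
    return [w / (len(c) * total_w) for c, w in zip(cells, cell_weights) for _ in c]

cells = [[0.5, 0.7], [0.2]]   # invented anomalies in two grid cells
weights = [1.0, 0.5]          # e.g. cos(latitude) area weights
station_values = [v for c in cells for v in c]
station_weights = implied_station_weights(cells, weights)
direct = sum(w * v for w, v in zip(station_weights, station_values))
# 'direct' matches global_mean_via_cells(cells, weights): gridding is a weighting.
```

A station sharing a cell with many neighbours is down-weighted, and a lone station carries its cell’s whole weight, which is the sense in which one could instead work with the station weights directly.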

richardscourtney
February 6, 2014 5:47 am

RichardLH:
At February 6, 2014 at 5:39 am you write

Why would you want grid cell, partially interpolated, information in the first place?
That is just an exercise in trying to re-create a 2D field when it would appear that you do not have the Nyquist level of sampling required to do so accurately.
It just, in effect, creates a set of weighting factors that are then applied to the sampling point records themselves.
You could just track the changes in the sampling points directly and achieve a higher overall level of accuracy.

Yes! Well said!
I have repeated it as emphasis and in hope that this will catch the attention of any who missed it when you wrote it.
Richard

Bill from Nevada
February 6, 2014 5:55 am

Here is what the writer of the now-legendary file in the Climategate emails, “Harry_Read_Me.txt”, had to say about CRU and the shape of the information they are in control of. If you haven’t really pored through the Climategate emails, go online to some of the many sites where the emails are highlighted and the background explained.
Someone who was a computer modeler doing global climate research put notes into the “remarks” of one of the climate models he was building. In programming, you can insert remarks (comments) which give details or important fundamentals regarding the program, and since they are annotated as remarks, the program when running simply ignores those lines. But later on, people working with the program can read them and discover for themselves whatever is documented about places where the program performs well, performs poorly, or really just about anything.
Here is a quick list of some of the most telling things said about CRU and its data manipulation in the Harry_Read_Me.txt file. If you’ve already seen it all, then it’s old news. But every day there’s a whole group of people checking into this for the first time. Obviously the lines lifted from the Harry_Read_Me.txt file below are highlights. I googled the file, opened some tabs, and grabbed some excerpts.
=======
– “But what are all those monthly files? DON’T KNOW, UNDOCUMENTED. Wherever I look, there are data files, no info about what they are other than their names. And that’s useless …” (Page 17)
– “It’s botch after botch after botch.” (Page 18)
– “Am I the first person to attempt to get the CRU databases in working order?!!” (Page 47)
– “COBAR AIRPORT AWS (data from an Australian weather station) cannot start in 1962, it didn’t open until 1993!” (Page 71)
– “What the hell is supposed to happen here? Oh yeah — there is no ‘supposed,’ I can make it up. So I have : – )” (Page 98)
– “You can’t imagine what this has cost me — to actually allow the operator to assign false WMO (World Meteorological Organization) codes!! But what else is there in such situations? Especially when dealing with a ‘Master’ database of dubious provenance …” (Page 98)
– “So with a somewhat cynical shrug, I added the nuclear option — to match every WMO possible, and turn the rest into new stations … In other words what CRU usually do. It will allow bad databases to pass unnoticed, and good databases to become bad …” (Pages 98-9)
– “OH F— THIS. It’s Sunday evening, I’ve worked all weekend, and just when I thought it was done, I’m hitting yet another problem that’s based on the hopeless state of our databases.” (Page 241)
– “This whole project is SUCH A MESS …” (Page 266)

RichardLH
February 6, 2014 6:04 am

richardscourtney says:
February 6, 2014 at 5:47 am
Thanks. Just a simple engineering point of view 🙂

Greg
February 6, 2014 6:19 am

“The move is part of an ongoing effort to make data about past climate and climate change as accessible and transparent as possible.”
Where’s the “transparency” in all this? CRUTemX is all based on unverified adjustments to non-existent data.
Has anyone forgotten Prof. Phil Jones’ famous “why should I give you our data when you only want to find something wrong with it”?
Or the “oops, the dog ate it” excuses for not having the original data ANYWHERE?
Or “if I did have to hand over the file I think I’d rather destroy it”?
Or the Information Commissioner’s decision that there probably were grounds for prosecuting a criminal breach of FOIA, except that they were smart enough to procrastinate long enough for the statutory time limit to run out on the offence?
Sorry guys, but this is no more “transparent” than it was last week. They’ve just made their data, which has no grounding in observable records (they’re gone), more readily available so that people can be more easily duped into thinking it is an objective scientific record.

February 6, 2014 6:37 am

Now we can finally see for ourselves how much the parking lots of the world have warmed.

Greg
February 6, 2014 6:41 am

Bill from Nevada says:
February 6, 2014 at 5:55 am
Thanks for the helpful tips from “Harry”. We can see why Phil Jones would rather destroy their files than let someone rigorous like Steve McIntyre get a look at them.

Greg
February 6, 2014 6:48 am

Juraj V says:
February 6, 2014 at 5:21 am
I actually like the GISTEMP colored maps, very useful for making trends since a chosen year, or comparing some period against any selected reference period. Pity the data itself is crap.
http://data.giss.nasa.gov/gistemp/maps/
===
Indeed, this whole exercise of putting a fancy hi-tech front end on a corrupt and non-verifiable database is like having jacked-up suspension and a custom paint job on a car with a worn-out Lada engine.
It’s masking the mess inside and trying to fool the observer.

ferdberple
February 6, 2014 7:05 am

Something doesn’t add up. A quick look at the GE data shows warming on the south west coast of BC Canada. The weather records from Environment Canada show no such warming.

ferdberple
February 6, 2014 7:26 am

RichardLH says:
February 6, 2014 at 5:39 am
You could just track the changes in the sampling points directly and achieve a higher overall level of accuracy.
============
Assuming that accuracy was the objective.

Pamela Gray
February 6, 2014 7:30 am

Gridded data is convenient. For all the wrong reasons scientifically but for all the right reasons politically. Station data is inconvenient. For all the right reasons scientifically but for all the wrong reasons politically.

JJ
February 6, 2014 7:33 am

“This isn’t a problem scientifically because the temperature records do not depend on the precise location of each station.”
This is the mindset that ultimately succumbs to the notion that it isn’t a problem scientifically that the temperature records do not depend on the precise temperature of each station.
Of course, they don’t subscribe to the notion that it is a problem scientifically that the things they call “temperature records” are not temperature records. The rest follows.
