The "great dying of thermometers" – helping GISS find the undead thermometers, complete with code

Guest post by Walter Dnes.

There was a phenomenon known as “the great dying”, in which most of the Canadian stations being used for GISS world monthly temperatures disappeared, at least according to GISS.

In reality, many of these “dead” stations are still there, putting out data every month.  This post is about finding additional Canadian monthly mean temperatures and anomalies from the Environment Canada website.

First, some administrivia…

– I’m a retired Environment Canada employee.  I do not speak for Environment Canada.  This is all done in my capacity as a private citizen.

– The following is all based on publicly available data from the Environment Canada website as of late July, 2011.

– There are two versions of the code, data, and scripts.  Each clisum?.zip file is approximately 5 megabytes.  You will only need one.  Both zipfiles contain BASIC source code, some scripts, and a subdirectory with over 300 data files.  The BASIC code is generic enough that it should run under most BASIC versions on most platforms.

– The Linux files are the definitive version.  I did a translation to MS-DOS because most people still use Windows.  I haven’t been able to fully test the DOS version, but I hope it works.  All file and directory names comply with the old DOS 8.3 filename spec, for maximum backwards compatibility.

– The files in clisumd.zip assume an MS-DOS or Windows environment.  This includes old-fashioned DOS, as well as a Windows DOS box command prompt.  The files all have MS-DOS end-of-line, and the .BAT files are written to be executed under COMMAND.COM or CMD.EXE.  I don’t have a Windows or DOS machine, so I can’t be sure the BASIC code is correct for QBASIC, or whatever version you may have.  Be sure to edit the .BAT files to replace the command “bas” with whatever BASIC interpreter/compiler you’re actually using.  You may need to tweak some of the BASIC programs for minor syntax differences.  Click on clisumd.zip to download the MS-DOS files if you run DOS or Windows.

– The files in clisumu.zip assume a unix/linux/bsd environment.  They all have unix end-of-line, and the scripts are written to be executed under bash.  The BASIC programs were written for the “bas” BASIC interpreter.  “bas” can be installed using your Linux distro’s standard install command (e.g. “apt-get” in Debian/Ubuntu and derivatives; “emerge” in Gentoo).  If your distro doesn’t have “bas”, see http://www.moria.de/~michael/bas/ to download the source tarball and build it manually.  Click on clisumu.zip to download the unix/linux/bsd files if you run any of unix/linux/bsd.

– There are links to various free BASIC interpreters and compilers at http://www.thefreecountry.com/compilers/basic.shtml

Getting code and data

The first step is to download clisumd.zip (for Windows/DOS users) or clisumu.zip (for unix/linux/bsd users) and unzip it.  The result is a directory called either clisumd or clisumu.  Inside that directory are 13 or 14 files and a subdirectory “csvfiles” with over 300 CSV data files.

The next step is to download a monthly climate summary in text format.  With javascript enabled, go to the webpage:

http://climate.weatheroffice.gc.ca/prods_servs/cdn_climate_summary_e.html

Select the desired month/year, Province=”All”, Format=”Plain Text”. You should see something like this screenshot:

Once you get to this point, click on “Submit”

Save the resulting text webpage to a textfile in the clisumu or clisumd directory.  Since I used May 2011 data I named the textfile cs201105.txt. I also downloaded June as cs201106.txt.  You’ll want to download the latest month every month.  The data is generally available 7 to 10 days after the end of the month.

************************************************************************

** The following portion only needs to be run once for initial setup  **

** You do not have to do the next portion, including downloading 300+ **

** data files.  I’ve done it already and included the output in the   **

** zipfiles.  The following instructions are documentation for the    **

** sake of the scientific method, in case anybody wants to duplicate  **

** this the hard way.  The most likely use is in the case of manual   **

** over-rides, which I’ve found one case for so far.  There may be    **

** other cases.                                                       **

************************************************************************

Creating a list of candidate stations with normals data

=======================================================

The next step is to create a subset file containing only sites with data in the most recent normals.  We actually want 1951-1980 normals for comparison to GISS.  Stations with current normals are candidates for having 1951-1980 normals, and their data will be downloaded.

We need to pick out only the lines with values in the “D” (Departure from normal) field, and copy only relevant data fields to a subset file. The program subset.bas is launched by the script “subset” in linux, or the batch file “subset.bat” in DOS.  The script/batchfile sets up the name of the input and output files as environment variables before launching subset.bas.

The program “subset.bas” scans for lines with column 64 == “.”.  This signals the presence of some temperature normals data for the period 1971-2000.  For only those lines, the climate ID, monthly mean temp, count of days with missing mean temp, and station name are extracted and written to a second file.  In the case of May 2011, the subset file has monthly mean temperatures for 313 sites which have 1971-2000 normals to compute anomalies against.  In this example, I’ve called the output file “subset5.txt” to remind me that it’s for May.

The DOS batch file is invoked as…

subset

and the bash script is invoked as…

./subset

Because this only needs to be run once, I hardcoded the filenames into the batch/script files.
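For anyone who would rather read the logic in a more familiar language, here is a minimal Python sketch of the same filtering step.  Only the test on column 64 comes from the description above; the other column offsets are placeholders that would have to be checked against subset.bas and a real climate summary file, and the sketch omits the latitude/longitude fields that also appear in the subset output.

# A Python sketch of the subset step, assuming the climate summary is a
# fixed-width text file.  Offsets marked "placeholder" are illustrative only;
# the authoritative positions are the ones coded in subset.bas.
import os

INFILE  = os.environ.get("INFILE",  "cs201105.txt")   # monthly climate summary
OUTFILE = os.environ.get("OUTFILE", "subset5.txt")    # stations with 1971-2000 normals

with open(INFILE) as src, open(OUTFILE, "w") as dst:
    for line in src:
        # A decimal point in column 64 (1-based) means the "D" (departure from
        # normal) field is populated, i.e. the station has 1971-2000 normals.
        if len(line) > 63 and line[63] == ".":
            clim_id = line[0:7].strip()        # placeholder offset
            name    = line[8:38].strip()       # placeholder offset
            mean_t  = line[40:46].strip()      # placeholder offset
            missing = line[47:50].strip()      # placeholder offset
            dst.write('"%s",%s,%s,"%s"\n' % (clim_id, mean_t, missing, name))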

Downloading monthly data in CSV format

======================================

Unlike the 1961-1990 and 1971-2000 normals, the 1951-1980 Canadian climate normals do not appear to be on the web.  But since the monthly data is available online, we can download it and do the calculations ourselves.  Here is how the data was downloaded…

We search by station name.  The first line in subset5.txt is…

"1012055",9.5,17,48.829,-124.052,"LAKE COWICHAN"

The climate data advanced search page is at…

http://www.climate.weatheroffice.gc.ca/advanceSearch/searchHistoricData_e.html

Use the “Search by Station Name:” menu as shown in the screenshot:

Enter the name, or a portion thereof, as shown in the red rectangle.  Note that upper/lower case doesn’t matter, and spaces are ignored.  Thus “lakecow” matches “LAKE COWICHAN”.  Then click on the “Search” button as shown by the red arrow in the screenshot.  Alternately, you can press “Enter” on your keyboard.  This takes you to the search results page as shown in the screenshot:

We run into a problem here… there are two stations named “LAKE COWICHAN”, which does happen on occasion.  It’s not until you actually select a station that you find out if you’ve got the right one.  To select monthly data, you must first select “Monthly” in the drop-down menu under “Data Interval”, and then click on the “Go” button corresponding to the station you want.  You’ll get a display similar to the screenshot:

I’ve highlighted a couple of areas.  At the upper left is the climate ID in a red rectangle.  This must match the climate ID at the beginning of the line in the subset file, unless you’re doing a manual over-ride (more about this later).

The red arrow at the bottom right corner points to the link for downloading the data in CSV format.  I right-clicked on that link and saved the file to the csvfiles directory.  My convention is to name the file after the climate ID.  Thus, this one would be “1012055.csv”.  Note that this is merely a label for convenience only.  The files could be assigned any legal filenames, and the BASIC programs would still work, because they read the climate ID from data inside the csv data files.

Rinse/lather/repeat the above for all 300+ lines in the subset file. Now you know why you don’t want to repeat this yourself.

Now for the manual over-ride example.  Look in the climate summary file cs201106.txt.  Station “WINNIPEG RICHARDSON AWOS” with climate ID “5023226” has a mean monthly temperature, but it does not have normals data.  Searching for “winnipeg” in the climate data advanced search page yields several Winnipeg sites.  If you click “Go” on “WINNIPEG RICHARDSON AWOS” you’ll see that it’s located at 49 55’N and 97 14’W and elevation 238.7 m.  Go back to the Winnipeg search results page, select “Monthly” and click “Go” for “WINNIPEG RICHARDSON INT’L A”.  You’ll notice that it’s located at 49 55’N and 97 14’W and elevation 238.7 m.  They’re at EXACTLY the same location.  Why the split reporting, I don’t know.  Anyhow, I downloaded the CSV monthly data with filename “5023222.csv” to the csvfiles directory.  Then I opened it with a text editor, and changed the 6th line from

"Climate Identifier","5023222"

to

"Climate Identifier","5023226"

This causes the BASIC programs to treat the monthly data as belonging to the AWOS site when computing monthly normals.  Thus we will get monthly temperature anomalies versus 1951-1980 for the AWOS site, even though it’s relatively new.
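The same over-ride can be scripted instead of being done in a text editor.  Here is a small Python sketch using the Winnipeg example above; it matches the header line by its content rather than assuming it is always the 6th line of the file.

# Rewrite the "Climate Identifier" header of a downloaded monthly CSV so the
# BASIC programs attribute its data to a different climate ID (Winnipeg example).
path = "csvfiles/5023222.csv"              # CSV saved for RICHARDSON INT'L A
old_id, new_id = "5023222", "5023226"      # re-label as RICHARDSON AWOS

with open(path) as f:
    lines = f.readlines()
with open(path, "w") as f:
    for line in lines:
        if line.startswith('"Climate Identifier"'):
            line = line.replace(old_id, new_id)
        f.write(line)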

Calculating the monthly normals

===============================

The normals batch/script file needs to be run only when the contents of the csvfiles subdirectory change.  This includes individual files being added, deleted, or edited.

The program normals.bas opens a CSV file for input, and the normals file in append mode.  It then calculates the normal temperature for one station, appends one line of data and exits.  It is called serially by a FOR loop in the normals shell script or normals.bat batchfile, once for each file in the csvfiles directory.  Since lines are always being appended to normals.txt, the script deletes the normals file before starting the loop.  This starts off with a clean slate.  The script then sets the name of the normals file, and the value of the normals start and end years, and then loops through all the files in the csvfiles directory that match the spec “*.csv”.  The script is invoked in unix/linux/bsd as…

./normals

and in a DOS box (including Windows) as…

normals

Because of limitations in the DOS FOR command, normals.bat has a couple of extra steps…

1) The bash FOR command sorts filenames when evaluating “*.csv”, which results in the file normals.txt being in sorted order.  The DOS FOR command doesn’t do this.  The workaround is to write output to a scratch file (normals.000) and sort that file to normals.txt at the end.

2) The bash FOR command accepts multiple commands in a DO/DONE block.  The DOS FOR command doesn’t do this.  It has to be a “one-line-wonder”.  The workaround is to make the one command a CALL to a 2nd DOS batch file, namely normals2.bat.  normals2.bat has the multiple commands to execute.

Note that normals and normals.bat set STARTYR=1951 and ENDYR=1980. This is because the immediate goal of this project is to generate normals to compare against GISS, which happens to use 1951..1980 as its base period.  There’s nothing preventing anybody from using 1961..1990, or any other random base period for that matter.

The output format for the normals file is…

Columns 1-10: the 7-character climate ID in quotes, followed by a comma.

This is followed by 12 repetitions (1 for each month) of…

SNN.N,NNN,

where “SNN.N” is the monthly mean temp, with a minus sign if needed, and “NNN” is the number of years of data for that month.
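For reference, the same calculation can be sketched in Python.  The column headings “Year”, “Month” and “Mean Temp” are assumptions about the Environment Canada CSV layout and should be verified against an actual file; normals.bas and the normals script/batch file remain the definitive versions.

# A Python sketch of the normals step: average the monthly mean temperatures
# over STARTYR..ENDYR for every station CSV and write one line per station in
# (approximately) the normals.txt layout described above.
import csv, glob, os

STARTYR = int(os.environ.get("STARTYR", 1951))
ENDYR   = int(os.environ.get("ENDYR",   1980))

def station_normals(path):
    """Return one normals.txt-style line for a single station CSV."""
    sums, counts, clim_id = [0.0] * 12, [0] * 12, "unknown"
    with open(path) as f:
        rows = list(csv.reader(f))
    for i, row in enumerate(rows):
        if len(row) > 1 and row[0] == "Climate Identifier":
            clim_id = row[1]                  # header block near the top of the file
        # Assumed column headings; check against a real Environment Canada CSV.
        if "Year" in row and "Month" in row and any(h.startswith("Mean Temp") for h in row):
            yr, mo = row.index("Year"), row.index("Month")
            mt = next(j for j, h in enumerate(row) if h.startswith("Mean Temp"))
            for data in rows[i + 1:]:
                try:
                    year, month, temp = int(data[yr]), int(data[mo]), float(data[mt])
                except (ValueError, IndexError):
                    continue                  # missing value or short row
                if STARTYR <= year <= ENDYR:
                    sums[month - 1] += temp
                    counts[month - 1] += 1
            break
    fields = ['"%s"' % clim_id]
    for m in range(12):                       # 12 repetitions of "SNN.N,NNN"
        mean = sums[m] / counts[m] if counts[m] else 0.0
        fields.append("%5.1f,%3d" % (mean, counts[m]))
    return ",".join(fields)

# Like the bash script: start with a clean slate and process *.csv in sorted order.
with open("normals.txt", "w") as out:
    for path in sorted(glob.glob("csvfiles/*.csv")):
        out.write(station_normals(path) + "\n")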

************************************************************************

** This finishes the portion that only needs to be run once for       **

** initial setup.  The following is run every month after downloading **

** the latest monthly climate summary from Environment Canada.        **

************************************************************************

Calculating the monthly temperature anomalies

=============================================

Because the anomalies will be calculated for different months and from different input filenames, the anomaly batch/script files accept parameters.  The first parameter is the month as a number from 1 (January) to 12 (December).  The second parameter is the name of the monthly climate summary file that you’ve downloaded from the Environment Canada website.  Note that the program *ALWAYS WRITES TO THE SAME OUTPUT FILE NAMES*.  If you want to keep anomaly output files, and not have them overwritten, rename them before doing the next anomaly run.  I’ve included 2 sample monthly climate summaries, cs201105.txt for May 2011 and cs201106.txt for June 2011.  An example invocation for June 2011 data is, in DOS…

anomaly 6 cs201106.txt

and in bash

./anomaly 6 cs201106.txt

There are 2 output files.  anomaly0.txt has output for every station with a monthly mean temperature in the climate summary, and a line in the normals file.  anomaly1.txt only has those lines for which…

a) the month’s summary data shows zero days missing, and

b) there are 30 years of data for this month in the normals

This is intended as a filter to list only the best data for consideration.  You can relax the criteria if you wish, by modifying anomaly.bas.

An example line from the anomaly outputs is…

"1021830", 15.1, 0, 15.0, 30,  0.1,49.717,-124.900,"COMOX A"

The fields are…

Climate ID

Actual monthly mean temperature this month

Number of missing days this month

Normal monthly mean temperature

Number of years of normals data for this month

Temperature anomaly versus the normal value

Station latitude

Station longitude

Station name

This output is suitable for importing into a spreadsheet, and especially into a GIS program for plotting.
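To make the join concrete, here is a Python sketch of what the anomaly step does with these fields.  It loads normals.txt into a dictionary and, for each station record already extracted from the month’s climate summary, writes the nine fields above to anomaly0.txt, copying the line to anomaly1.txt only when the month has zero missing days and the normal rests on 30 years of data.  The parsing of the climate summary itself is assumed to have been done (as in the subset step); anomaly.bas is the reference.

# A Python sketch of the anomaly step.  `stations` is an iterable of tuples
# (climate ID, monthly mean, days missing, latitude, longitude, name).
import csv

def load_normals(path="normals.txt"):
    normals = {}
    with open(path) as f:
        for row in csv.reader(f):
            # climate ID followed by 12 pairs of (monthly normal, years of data)
            normals[row[0]] = [(float(row[1 + 2 * m]), int(row[2 + 2 * m])) for m in range(12)]
    return normals

def write_anomalies(month, stations, normals):
    with open("anomaly0.txt", "w") as every, open("anomaly1.txt", "w") as best:
        for clim_id, mean_t, missing, lat, lon, name in stations:
            if clim_id not in normals:
                continue                          # no line in the normals file
            normal, years = normals[clim_id][month - 1]
            line = '"%s",%5.1f,%3d,%5.1f,%3d,%5.1f,%.3f,%.3f,"%s"\n' % (
                clim_id, mean_t, missing, normal, years, mean_t - normal, lat, lon, name)
            every.write(line)
            if missing == 0 and years == 30:      # keep only the best data
                best.write(line)

# Example: the June record for COMOX A from the sample line above.
write_anomalies(6, [("1021830", 15.1, 0, 49.717, -124.900, "COMOX A")], load_normals())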

***********************************************************************

July 24, 2011 1:22 am

That’s funny. i had a reader ask me if I could get this canadian data for him.
There is actually a script –a scraper– that does this all automagically. That scraper is posted at
http://scraperwiki.com/scrapers/canada-temperature-data/
Looks like a little python maintenance is required. So no dos, no bat files, no manually going to pages. scrape it.
Ah yes, and make sure you DONT TAKE any of the data from stations that dont meet WMO standards. lots of those there

July 24, 2011 1:43 am

If you want to know who wrote the scraper.. its drJ. of clearclimatecode. these wizards actually got Gisstemp up and running, turned it into python and benchmarked against gisstemp. i’m betting it replaces the fortran code of hansen in the near future…
Then, this canadian data was formated into GHCN style formats and fed into GISSTEMP!
Yup. So what happens when we add all this extra canadian data.. to GISSTEMP?
So with MORE stations from canada.. what do we expect? remember we are just looking at changes or departures from the mean.. it might be colder in canada, but we dont make the mistake of using “unnormalized” data. every station gets normalized by its mean..when we normalize data then a station that warms from 10 to 11 gets a value of +1. and a station that warms from
-30 to -29 gets a value of 1.
Will more stations change the global mean? ONLY IF… only iff the TREND at those stations is Higher than the average station or lower than the average stations. its the trend that matters.
Well, here is what you see if you add more data for canada
http://clearclimatecode.org/analysis-of-canada-data/

Editor
Reply to  steven mosher
July 24, 2011 3:50 am

I had looked at Canadian data back in 2010 (Part 1; Part 2; Part 3; Part 4; Part 5) comparing Environment Canada with GISS. drJ did ask me if I had all the data, but, working in Excel I had only downloaded a few individual stations.
What puzzled me (Part 2) was that there was an offset between the station temperatures in GISS and those reported by Env Canada. While this doesn’t matter when working with anomalies I still wonder why it occurred, although I did resolve how (Part 4).
Part 5 looked at the curious phenomenon that the rate of warming increases dramatically when you cross the US-Canadian Border. In particular see this map http://diggingintheclay.files.wordpress.com/2010/05/49thparallelmap26key.png

“Why, when you go North of the 49th parallel, is there a sudden increase in the rate of warming?”

While I agree that more stations do not change the rate of warming, I think we have to look very carefully at the rate of change in those that are reporting post 1970. We know land use change and urban growth can increase temperatures and although it may never be possible to tease out the effects of land use changes definitively, I’d like to encourage folks to use Walter’s information and look in more detail at Canada.
If you look at the graph produced by CCC http://clearclimatecode.org/wp-content/uploads/2010/11/Zone+64+90-440.png indeed there is no difference when the Environment Canada data is used (with more stations). But look at the rates of warming ~1920-1940 cf 1980-2009. It is possible that land use change and UHI growth are contributary factors to the increased warming in the more recent period.

Hoser
July 24, 2011 1:51 am

PERL

Brian H
July 24, 2011 1:58 am

Gah. What we (some) go thru for 20-yr. deep “backwards compatibility”! But it’s amazing how much data and program code in text and Basic you get for 5 MB.
How many stations are represented?

July 24, 2011 2:21 am

Very good walk through of the data process, I just went through the total zip file of all data and extracted it to all daily values for all stations for the entire periods of record (started in March and was finished in May) with sorting the data into tables for each parameter, for each month, for each year, with reference to Station ID Long and Lat + values for all stations active for each date for the entire period of record.
Once the data was tabled I produced csv files for each date that represents the same date 6558 days apart for four cycles, and i am in the process of adding the Canadian data to Alaskan and USA 48 contiguous states, to generate a map based upon the composite of the last four cycles of the 6558 day repeat period of the 240 lunar declination and solar rotation periods (both 27.32 days) as an upgrade to my weather forecast site aerology.com Site revision for the inclusion of the new maps will be finished soon, greater detail with finer grid resolution of 3 miles squares rather than the 90 miles square grid resolution as is currently displayed.
conversion from metric to SAE was kind of a pain to get it all in the right format, and retain the decimal place in the right place, in the different sets of data due to different scales for snowfall tenths, precipitation in hundredths, and the converted temps rounded to tenths of degrees F. There are many broken sets of station data of various lengths, and few records of snow on ground before 1955 then more after about 1962-5 as the idea caught on.
I just used all of the valid station daily values that appeared in the record, so in areas or times when the data points drop out, the definition of the resultant data maps just gets a little grainier in those areas, with changes from day to day as the record gaps come and go, in areas of high record density it is not noticeable, but in areas where the station locations are more than 3 to 5 degrees apart there is some winking as you scroll through the dates of the map coverage of the forecast. The handling of the missing weekend data was new they just carry the three day total for all three days, so there are flashes of detached precipitation on the days when no other stations showed precip nearby.
Hope to have the whole rebuild of the site on line soon, Gawd I love high speed internet and access to data that gives anyone who has the time and inclination to look at what is really there.

Bob in Castlemaine
July 24, 2011 2:33 am

Certainly shows the disappearances we’d expect and the very limited geographical extent of the ground based temperature data record.
Perhaps another approach might be enlightening. Someone familiar with the data could maybe generate a pair of graphs showing the number of complete temperature data sets still in the mix that are favourable to the AGW orthodoxy (rising temp) and another showing the number of data sets that are not favourable (falling temp) for years 1880 – 2007. Also a pair of graphs showing the number of incomplete data sets that have been the subject of adjustment, and another graph showing the total number of data sets still being considered for years 1880 – 2007.

Krishna Gans
July 24, 2011 2:40 am

“Global Warming” was made in USA

Patrick Davis
July 24, 2011 4:08 am

Interesting. Appears to be a large rash of red apparently across all Australia, mostly SE corner, circa 1992, *ALL* estimates? And then almost NOTHING from there on in. Clearly, the science (of estimating) is settled.

Dave in Delaware
July 24, 2011 4:38 am

Please keep in mind that TEMPERATURE is only an approximate indicator of ENERGY. The Energy is what is important, and the energy in the atmosphere changes significantly as the humidity changes.
So when Mosher July 24, 2011 at 1:43 am says – …. “every station gets normalized by its mean..when we normalize data then a station that warms from 10 to 11 gets a value of +1. and a station that warms from -30 to -29 gets a value of 1. ”
That is only true if you take one whale and one minnow and count them each as ‘one fish’.
Yes I know, whales are mammals – you are starting to get this. 🙂
In fact, 20 Deg F (about 11 C) lower temperature can have 15% higher energy, as in this example:
==========================================================
from WUWT
http://wattsupwiththat.com/2010/06/07/some-people-claim-that-theres-a-human-to-blame/#more-20260
Max Hugoson says June 7, 2010 at 9:49 am
But to all the people playing “average temperature”, and in the spirit of trying to do GOOD ENGINEERING WORK… “average temperature” is a FICTION and MEANINGLESS. Here is why: Go to any online psychrometric calculator. (Heh, heh, I use the old English units, if you are fixated on Metric, get a calculator!)
Put in 105 F and 15% R.H. That’s Phoenix on a typical June day.
Then put in 85 F and 70% RH. That’s MN on many spring/summer days.
What’s the ENERGY CONTENT per cubic foot of air? 33 BTU for the PHX sample and 38 BTU for the MN sample.
So the LOWER TEMPERATURE has the higher amount of energy.
Thus, without knowledge of HUMIDITY we have NO CLUE as to atmospheric energy balances.

Editor
Reply to  Dave in Delaware
July 24, 2011 5:09 am

Dave in Delaware,
Thank you for that. That is what everyone forgets (myself included at times). The properties of water in the atmosphere, as well as in the oceans, modulate the temperature of our planet.

Alex
July 24, 2011 4:38 am

Wow, Textfiles, short filenames? Do we still have 1990?

George V
July 24, 2011 4:49 am

It looks like Canada isn’t the only country with missing temp records. China, Australia, South America all blank out about 1990….
I have minimal knowledge of statistical analysis. So can someone explain to me how you get an average temperature of something without having any data? Or is the GISS temperature record essentially just an extrapolation from 20 years ago or maybe made up out of thin air?
I also find it interesting that the data vanishes from this record shortly after the first US Congressional hearing on AGW.
George V.

Editor
July 24, 2011 4:55 am

Hoser says:
July 24, 2011 at 1:51 am
> PERL
Go ahead! When I decided to look for a language other than C for a lot of my personal work, I was delighted to discover Python. The oldest source file I see is from 2002. Hmm, I’m still using one from 2003. Ah, there’s the first program – April 2001.
Perl is great for some applications, and that includes some of this work, but I use Python for just about everything except OS internals and client side web scripts. (Whatever happened to Iron Python, anyway?)
steven mosher says:
July 24, 2011 at 1:43 am
> If you want to know who wrote the scraper.. its drJ. of clearclimatecode. these wizards actually got Gisstemp up and running, turned it into python and benchmarked against gisstemp. i’m betting it replaces the fortran code of hansen in the near future…
One thing I like about GISS is that they like Python. Don’t break into a MIT network without it!

Editor
July 24, 2011 4:57 am

steven mosher says:
July 24, 2011 at 1:22 am
> Ah yes, and make sure you DONT TAKE any of the data from stations that dont meet WMO standards. lots of those there
How many USHCN stations don’t meet WMO standards?

richard verney
July 24, 2011 6:04 am

in Delaware says: July 24, 2011 at 4:38 am
//////////////////////////////////////////////////////////////////////
Absolutely, and that is why ocean temps are the only relevant metric (and of course, oceans represent more than 70% of the globe and 99% of the latent heat content/capacity).

Steve McIntyre
July 24, 2011 6:48 am

Here’s a quick script in R to download and extract station info – that took about 5 minutes to write.
get.normal=function() {
download.file("http://surfacestations.org/dnes/clisumd.zip","temp.zip",mode="wb")
loc="clisumd/normals.txt"
handle=unz( "temp.zip",loc ,"r")
x=scan(handle,sep=",",what=c("",rep(0,24)))#Read 259950 items # 1733 columns
close(handle)
x=t(array(x,dim=c(25,length(x)/25) ) )
x=data.frame(x)
for(i in 2:25) x[,i]=as.numeric(as.character(x[,i]))
return(x)
}
normal=get.normal()
id=as.character(normal[,1])
get.info=function(station_id){
loc=paste("clisumd/csvfiles/",station_id,".csv",sep="")
handle=unz( "temp.zip",loc ,"r")
x=scan(handle,sep=",",what="")#Read 259950 items # 1733 columns
close(handle)
info=x[1:56]
x=x[57:length(x)]; n=length(x)
x=t(array(x,dim=c(25,n/25) ) )
x=data.frame(x)
names(x)=info[32:56]
for(i in 2:25) x[,i]=as.numeric(as.character(x[,i]))
return(x)
}
station=get.info(id[1])

Steve McIntyre
July 24, 2011 6:55 am

Re – the great dying of GISS thermometers. In all the discussion of this, people have overlooked why GISS loses track of so many stations. I noticed this when I got CRU station information via the mole – I don’t remember whether this was discussed at the time. CRU was able to locate stations that GISS wasn’t able to find.
The reason is the interaction between GISS’ reference station method and the changeover from historical WWD sources to CLIMAT monthly. GISS requires a 24 month overlap or something like that (I don’t recall the precise number offhand) between scribal versions. It treated WWD and CLIMAT as different sources. For most of the “missing” stations, the overlap between WWD and CLIMAT versions was “too short” for the unsupervised GISS methodology. Needless to say, the relationship between versions could and should have been determined by questionnaires to the various meteorological agencies. I’m sure that Environment Canada could have clarified whether the WWD station and CLIMAT stations were the same. However, this obvious clerical work doesn’t seem to have been done. Instead, Hansen introduced adjustments at the changeover.

Steve McIntyre
July 24, 2011 6:57 am

BTW, Walter, thank you very much for this service. On a couple of occasions in the past, I tried to wade through the Env Canada javascripts to get a proper database and gave up in frustration. You’ve done a good deed.

July 24, 2011 7:03 am

Last year I downloaded all the daily data from EC website, using the VB script at this site: http://cdnsurfacetemps.wordpress.com/2010/02/27/how-stations-will-be-presented/
Took 3 months.
My analysis of that data is here: http://cdnsurfacetemps.wordpress.com
Summer TMax in Canada is dropping. There are fewer heat waves in Canada now than there were in the 1930’s. Winters are less cold, which is what is driving the average up.

The Sage
July 24, 2011 7:05 am

>2011
>using BASIC
Anyone with a Windows 7 system will have PowerShell installed as part of the bundle; so I’ve ported the .bas/.bat combinations to that language and put them up on pastebincom
subset.ps1 at /rKWXsQ6e
normals.ps1 at /0gRwBWKb
anomaly.ps1 at /TL9MwrG2
having tested them to ensure that they generate the same outputs as are in the .zip bundle
@Ric Werme
It’s up to 2.7.1 (beta 1) now. see ironpythoncodeplexcom

The Sage
July 24, 2011 7:07 am

That’s pastebin[dot]com and ironpython[dot]codeplex[dot]com

Bruce
July 24, 2011 7:20 am

Mosher, thanks for the link. Since we know Canada isn’t really warming 5C per century, we have a much better idea what UHI is. And its huge.

Richard M
July 24, 2011 7:26 am

steven mosher says:
July 24, 2011 at 1:43 am

Steve provides us with a link to clearclimatecode that shows GISS landtemps records in the Arctic (64N-90N)from 1880. What I noted was that if one compares the trend pre-AGW (1880-1945) and with AGW (1945-2009) we see the latter trend is lower. Interesting.

July 24, 2011 8:32 am

Verity
“But look at the rates of warming ~1920-1940 cf 1980-2009. It is possible that land use change and UHI growth are contributary factors to the increased warming in the more recent period.”
Yes its possible. Anything is possible. Form a hypothesis. Changes in land use will lead to changes in temperatures.
1. Define “land use changes”. Objectively, what kinds of land use changes do you mean.
I can usually go find metadata IFF people can define their terms accurately.
2. Formulate a Null: Trends at Type A > trends at Type B
usually what I’ve found is that people are unwilling to formulate their ideas in these ways, and unwilling to have their ideas tested. And if they are tested and found wanting, they change the terms of the discussion.
Just sayin.

R. Gates
July 24, 2011 8:37 am

steven mosher’s post on
July 24, 2011 at 1:43 am with this link:
http://clearclimatecode.org/analysis-of-canada-data/
Tells you all you need to know about this topic. Some skeptics will be even more disappointed with the results than they were with the Berkeley Earth Temperature Project. Result: There has been no conspiracy…just warmer temps.
REPLY: You know Mr. Gates, that is quite a boneheaded comment. Walter did something useful, I posted it strictly because it was useful. Neither of us, nor anyone commenting in this thread, said anything about “conspiracy”. You did.
Knock it off – take a 24 hour timeout – Anthony

July 24, 2011 8:52 am

Ric Werme says:
July 24, 2011 at 4:57 am
> Ah yes, and make sure you DONT TAKE any of the data from stations that dont meet WMO standards. lots of those there
How many USHCN stations don’t meet WMO standards?
################
Ric, I think you misunderstand
Now what are WMO Standards? not what you think Ric,
http://www.climate.weatheroffice.gc.ca/prods_servs/normals_documentation_e.html

oldgamer56
July 24, 2011 8:59 am

If we are trying to accurately determine how much the Earth is heating or cooling, shouldn’t we be looking at the heat index vs Temp? Heat index gives the amount of energy the air is holding. I thought it was all about the amount of energy being retained that is causing Global Warming? If the temps are high with low RH, then focusing solely on the air temp is only part of the story.
Why are we not seeing historical records of heat index being compared to current heat indexes?
Is Temp just the low hanging fruit?
BTW, spent 4 years in Saudi Arabia. Hot days in the high desert are much more tolerable than warm days next to the Red Sea.

July 24, 2011 9:03 am

Dave in Delaware says:
July 24, 2011 at 4:38 am
Yes dave we are all aware of that. We are all aware that the best metric is OHC, all aware that temperature is ONLY an indicator of energy.
Do you think the MWP was warmer than today? was the LIA cooler than today.
I know that ‘average’ temperature has no meaning, but what do we mean when we say that greenland in the MWP was warmer than today? do we mean that the air temperatures were LOWER?
Heres the point. OHC is the best metric. for looking at energy balances. without a doubt.
But, if somebody wants to look at temperature to answer questions like… how warm was it in greenland 1000 years ago, well then we have to work with the inferior metric of temperature.
in a nutshell. IF you want to know the best metric for energy balance? OHC.
IF you have to work with temperatures, if that’s all you have, if you have no other choice,
make sure you normalize.

July 24, 2011 9:09 am

Jr wakefield; nice work
‘Any specific stations presented here individually will be chosen based on the length of data they contain. The criteria will be that a station must have at least 80% of the days from Jan 1, 1900 to Dec 31, 2009. Because so few stations fit into that criteria, there will hence be few stations that will fill the criteria.”
When I get a chance perhaps I’ll apply RomanM’s method to this data. and you wont have to make decisions about 80% or 77% or 91% or whatever number.

Pamela Gray
July 24, 2011 9:16 am

Some thoughts of mine.
30 year average periods do not span warm and cold phases of various atmospheric and oceanic oscillations. At most they span one half of a complete cycle. Comparisons of monthly and yearly trends to the “mean” should use data extending to a 60 year period to calculate the mean.
Using anomalies from the mean is not an indicator of climate change. It is an indicator of weather pattern variations. The extremes are where the gravy lies in terms of climate change.
Our standard climate zones (hardiness zones) are not nearly fine enough. The USDA climate zone hardiness labels are insufficient in describing regional and local climate. Sunset Magazine climate zones are much better. For example, in NE Oregon, the climate zone designations are different between the two sources. If you want to know about climate, stick to the Sunset magazine version. When those zones change, you know you have climate change going on because they are set up by local extremes, not by regional weather pattern average anomalies.
http://www.sunset.com/garden/climate-zones/climate-zones-intro-us-map-00400000036421/
http://www.usna.usda.gov/Hardzone/ushzmap.html

July 24, 2011 9:20 am

R. Gates says:
“Result: There has been no conspiracy…just naturally warmer temps.”
There. Fixed it for you.

Jeff Alberts
July 24, 2011 9:26 am

steven mosher
July 24, 2011 at 9:03 am
Why would you work with something that is irrelevant to what you’re trying to find out?

Matt G
July 24, 2011 9:27 am

Richard M says:
July 24, 2011 at 7:26 am
I did an average of all land based GISS temps in the Arctic (64N-83N) post 1930’s because too much data was missing from them before this period. (There aren’t any land based temps above 83N)
http://img141.imageshack.us/img141/7617/arctictempstrend.png
It shows the warming period up until 2008 was literally no different from the 1930’s/1940’s. There was no weighting bias used because I wanted to show how the actual data had changed over this period.

Bystander
July 24, 2011 9:50 am

@ Matt G says “There were no weighting bias used because wanted to show how the actual data had changed over this period.”
It doesn’t make any sense to discard the corrections – they are there / publicly documented and explained – for a reason.

codehead
July 24, 2011 9:53 am

Typo in article: “The result is a directory called either clisumd or clisumd.”—I think one of those should be “clisumu”.

Doug in Seattle
July 24, 2011 10:29 am

Not trying to be OT, but R.Gates mentioned BEST. Wasn’t that supposed to be released by now?

Matt G
July 24, 2011 11:09 am

Bystander says:
July 24, 2011 at 9:50 am
“It doesn’t make any sense to discard the corrections – they are there / publicly documented and explained – for a reason.”
The data is directly from the GISS, the weighting bias refers to placing them in zonal regions from the pole and treating them equally (distance around the pole). This can cause just one or two stations to influence 8 to 10 percent of the overall trend. It is unrealistic to expect this one or two stations in very isolated places to reflect this whole zonal region.

RACookPE1978
Editor
July 24, 2011 11:43 am

DMI reports openly their daily temperature values for “80 degree north,” and has data for every year going back to 1958.
Where are their thermometers? And, why do they record a STEADY and then increasingly DECLINING temperature for summer months (the only days above freezing at 80 north) for the years 1958 through 2010, when GISS/Hansen claims a 4 degree increase for the “Arctic”?

Tim Ball
July 24, 2011 12:14 pm

I appreciate Anthony publishing this article but its purpose puzzles me. As someone involved with Climate studies in Canada, including temperature reconstructions and heat island impacts, as far back as the early 1970s, I am familiar with the history of Environment Canada (EC). I realize the author is explaining that EC maintains many more stations than are used to produce climate data. Why does the data of a nation require such elaborate explanation? He made a point of saying he was retired from EC, but why did he not produce this when he worked there?
Obviously it was just as necessary then as now. Is it a response to the article on this web site critical of the agency? http://wattsupwiththat.com/2010/08/23/government-report-canadian-climate-data-quality-disturbing/
I ask these questions because of my experience with EC. It was no coincidence that an Assistant Deputy Minister of EC chaired the pivotal meeting of the IPCC in Villach, Austria in 1985. I believe they made the entire agency (EC) and process political from that time.
The author will recall the “dog and pony” show carried out across Canada by a private agency hired by EC to offset serious criticisms of the poor service, cut backs and problems associated with EC. These centred around funding redirected from weather stations and other activities to climate change research that, in my opinion, was very singular and anti-science in its objective. They even tried to charge for data already paid for by the taxpayer. They competed with private agencies for contracts and charged them for the data while they got it for free (can you say unfair competition?).
A couple of years ago I spoke with a former EC employee who resigned in protest to the direction the agency was going. The Auditor General estimated in 2005 that $6.8 billion over 7 years were directed to climate change research. In my opinion this took them a long way from their fundamental purpose of data collection. They replaced many weather stations with AWOS that were severely criticized for inaccuracy. When NavCanada was formed to run airports they initially refused to take them because they were so unreliable. This triggered a Canadian Senate investigation led by Senator Pat Carney. I was told of a former employee hired to go out early each morning in Winnipeg to compare the AWOS temperature with a hand held thermometer. I know there are many other stories.
Maurice Strong (a Canadian) set up the IPCC through the World Meteorological Organization (WMO). This meant all national weather agencies controlled who was appointed to the IPCC. This meant EC controlled almost all climate research funding, which was different than virtually all other science research funding kept at arms length from political interference.
As an active participant and promoter EC accepted the findings of the IPCC without question. It put control of climate research and funding in complete bureaucratic control. It also meant they were on a treadmill of proving the hypothesis in complete contradiction to the scientific method that requires it is disproved.
Incidentally, why was Eureka chosen as representative of the Arctic? Years ago (1980s) I worked with Sylvia Edlund, a botanist and Bea Alt a climatologist both involved in the Polar Shelf Project both looking at the unique climate of Eureka manifest in the botanic refugia. I have spoken with former EC employees who said they were surprised when the weather station was set up there because people knew it was a unique microclimate.
There are a multitude of other questions that this strange article triggers and that require answers. Maybe the author didn’t speak out until retirement because of an experience I had with EC employees. A couple of years ago I spoke in Winnipeg and afterward three EC employees identified themselves and said they agreed with me but were afraid to say anything.

Matt G
July 24, 2011 12:21 pm

racookpe1978 says:
July 24, 2011 at 11:43 am
My link shown in the previous post does show a 4 degree C increase from the late 1960’s. Just that there was also a 4 degree decrease from the 1930’s/1940’s. Notice DMI is 80N+ whereas the GISS data is from 64N-83N. The GISS has recently been trying to correct this by wrongly infilling 83N+ with observed data <83N. This can't realistically work because the Arctic temperatures where sea ice remains all year hardly change during Summer. This behaves much differently to areas <83N where especially no snow/ice remains with higher solar levels at the time of year.

rbateman
July 24, 2011 12:41 pm

What happens to GISStemp N. Hem. Anomalies when we feed DMI 80N into it?

jorgekafkazar
July 24, 2011 12:49 pm

Some really nice work here on the part of many, a very respectable effort! For anyone who didn’t grasp the Ric Werme / steven mosher exchange regarding WMO standards, what Mosher was saying (paraphrased) was: “…DONT TAKE any of the [station data] that [doesnt] meet WMO standards.”
Regarding “BEST”: According to their very transparent site, “As of July 6 2011, we have not yet run the analysis on 100% of the data.” The major disappointment was the premature release of preliminary, partial results of only land-based (thus the warmest) data by Dr. Richard Muller before Congress on 31 March, 2011. I shall no longer be reading his papers. (jk)
http://www.berkeleyearth.org/FAQ

ferd berple
July 24, 2011 12:49 pm

“steven mosher says:
July 24, 2011 at 9:03 am
But, if somebody wants to look at temperature to answer questions like… how warm was it in greenland 1000 years ago, well then we have to work with the inferior metric of temperature.”
Surely the species of vegetation growing in an area at the time is a better indication of past climate than temperature. Does Greenland support more tropical species than it did 1000 years ago, or are they more arctic species? This would appear to be a more reliable indicator of what the climate is actually like, than say average temperature.
For example, say the average temperature is 15C. Is that 15C all year round, or 45C in summer and -15C in winter. Both areas have the same average temperature but will certainly have entirely different climates and entirely different species of plants and animals. Now add to this water. Is the location humid or is it arid? This again will affect the climate and determine the species of plants and animals that can live there.
This is the fallacy of modern climate science. Average temperature tells you almost nothing about the climate and the suitability for specific forms of life. To use global average temperature as the main metric of climate science, while it tells you almost nothing about the climate or its suitability for life, says a lot about the quality of science that goes into climate science.
Climate science is supposed to be measuring climate. Without knowing both the amount of water present and well as the temperature, along with the ranges of water and temperature, you cannot know the climate.
To call temperature change climate change, without also measuring the amount of water present, is pure nonsense. It is unscientific. If anything, the amount of water available has a much larger effect on plants and animals than does temperature. With liquid water, plants and animals can typically survive much greater ranges in temperature than they can if water is limited.
As well, the amount of liquid water present serves to moderate extremes of temperature. Two regions can have identical average temperatures, but the one with the most water is likely to be more moderate. Compare coastal cities in the temperate zone with cities inland. Vancouver and Osoyoos are a few hundred miles apart with similar average annual temperatures but have entirely different climates.

Matt G
July 24, 2011 12:54 pm

Forgot to add to previous post.
Therefore during a period of warming (eg since the 1970’s) some areas <83N may have had snow/sea ice before then. This melts, causing local temperatures to be warmer compared with before this occurred. The GISS then infills this warmer bias (only down to melted ice/snow) and places it right above 83N where there is snow and ice. It is awfully incorrect science to presume this warm bias has occurred over areas of sea ice still there. The areas <83N, before this ice/snow melted, would have been as cool as before without this warm bias. The missing snow/ice is the very reason for this bias, hence it is very wrong to assume areas 83N+ have this warm bias while the ice still remains.

matt v.
July 24, 2011 12:57 pm

I don’t know if anyone else has noticed, but EC has been revising significantly the Canadian national winter temperature departures. I don’t know when the changes were made, nor do I recall any public notification of these changes. I compared the 2002 version with the 2011 issue and found that 47 of the 55 years between 1948 and 2002 were changed, some by as much as 0.5 C (for example, 1950 was changed from -2.6 C to -3.1 C). What kind of new data would change the temperature departures so much, so far back? There may be other changes but I have not checked them all.

ferd berple
July 24, 2011 1:08 pm

Vancouver is rainforest, while Osoyoos is desert. Few if any species native to Vancouver can survive in Osoyoos, and few if any species native to Osoyoos can survive in Vancouver. Yet they have similar average temperatures and are only a few hundred miles apart. This says that average temperature is a poor measure of climate. Thus, using average temperature as a metric for global climate would appear to be unscientific.

July 24, 2011 1:27 pm

“Pamela Gray says:
July 24, 2011 at 9:16 am
Some thoughts of mine.
30 year average periods do not span warm and cold phases of various atmospheric and oceanic oscillations. At most they span one half of a complete cycle. Comparisons of monthly and yearly trends to the “mean” should use data extending to a 60 year period to calculate the mean.”
The issue has more to do with reducing the variance in the estimate of the monthly mean.
Simply: when you calculate an anomaly you have to decide how many januaries you will average to come up with the mean january. does using 60 give you a different mean than 30?
That’s a testable hypothesis. What happens to the SD going from 30 to 60, well you know how to calculate that as well. So here you have a question that you can actually answer and put numbers on, rather than speculate. For grins of course I once selected stations that only had 100 years of data. then I used the entire period to normalize or scale the series. Answer? dont be surprised…
the answer doesnt change. why? law of large numbers. the wiggles change a bit here and there but the earth is getting warmer. How much is known pretty well. Why? a combination of external forcings and internal variablity. what role do each of these play? much tougher question. how bad will things get, way more difficult question.

Editor
July 24, 2011 1:31 pm

Tim Ball says:
July 24, 2011 at 12:14 pm
> Why does the data of a nation require such elaborate explanation?
There are many ways to store and present data, and they’re constantly evolving. Technical documentation is a discipline in itself. Many computer programs require more effort in writing the documentation than in writing the program. It’s the nature of the beast when you try to make complex data sets easily accessible to non-computer-savvy end-users.
> He made a point of saying he was retired from EC, but why did he not produce this when
> he worked there?
I worked in the technical end of things as a glorified end-user, passing on value-added data and analyses to consultants. I wasn’t a researcher with an advanced degree. I simply did not have the academic credentials to be taken seriously as a researcher.
I was a “data gopher” who dug up relevant data and ran preliminary analyses for those researchers, as part of my job. My data gopher experience is what enabled me to write this post. Although I wasn’t in the CS (Computer Science) job group, I did a fair bit of programming as part of the job, along the lines of “Harry the programmer”. I also assisted meteorologists in preparing weather/climate related values for building codes, etc.

July 24, 2011 1:41 pm

“ferd berple says:
July 24, 2011 at 1:08 pm
“Vancouver is rainforest, while Osoyoos is desert. Few if any species native to Vancouver can survive in Osoyoos, and few if any species native to Osoyoos can survive in Vancouver. Yet they have similar average temperatures and are only a few hundred miles apart. This says that average temperature is a poor measure of climate. Thus, using average temperature as a metric for global climate would appear to be unscientific.”
Understand what the “climate” is. Climate is about two things: Long term averages at different locations. In some sense these are idealizations. These measures or metrics can be useful for many purposes. Was the globe generally colder in the LIA? that looks like a scientific question. Would an answer to a scientific question that reported the average temperature be unscientific?
Hardly. What does it mean to say that the average temperature NOW is higher than the average temperature THEN? it means this: pick any spot on the earth. look at the temperature NOW ( say a 30 year average). When we say the past was cooler what we mean is that for any place you pick, your best estimate for the temperature at that place, during that past time, will be lower than it is today. So, the sentence has a meaning. its a very operational meaning. If I think it was cooler in the past I can even draw other hypotheses from that: I could predict that certain plants would have died out in certain locations. I can then test that by looking for evidence. I can predict that certain rivers might have frozen over that never froze before. I can check that. Wow, look at frost fairs in London. So there I take a general statement ” the global average was lower in the past” I propose a test ” Look for evidence of rivers freezing where they never froze before” and gosh I can test that. looks like science, but note that it’s more like historical science than EXPERIMENTAL science.

July 24, 2011 1:41 pm

in Delaware and energy content of moist atmosphere:
You find here the live graph of the enthalpy of moist air at meteoLCD (Diekirch, Luxembourg). It is true that humidity plays an important role, but if you stay at one location total energy content follows usually more or less air temperature in the sense that moist enthalpy is generally higher with high air temperature.

July 24, 2011 1:48 pm

Matt G says:
Matt there are several ways to handle the area above 83N
Our fundamental question is this: what does the TREND look like in places above 83N that have not been sampled. You can:
1. Assume that the trend is lower (physics suggests otherwise: polar amplification).
2. Assume that the trend is the same (GISS does this).
3. Assume the trend is higher the further north you go. Physics suggests this but nobody uses this assumption.
1 and 2 both underestimate what physics suggests. In the end the difference between 1 and 2 is not that great. Why? The land area above 83N is frightfully small. Area goes with sin() of lat.

July 24, 2011 2:12 pm

rbateman says:
July 24, 2011 at 12:41 pm
What happens to GISStemp N. Hem. Anomalies when we feed DMI 80N into it?
############
DMI 80N.. you mean the model data?
“The daily mean temperature of the Arctic area north of the 80th northern parallel is estimated from the average of the 00z and 12z analysis for all model grid points inside that area. The ERA40 reanalysis data set from ECMWF, has been applied to calculate daily mean temperatures for the period from 1958 to 2002, from 2002 to 2006 data from the global NWP model T511 is used and from 2006 to present the T799 model data are used.”
Here is a question: you want to use data from a model. Fine. That model (ECMWF) has physics in it which predict what the weather would have looked like in unsampled places. Do you accept the physics of that model? If you want to use that data, then understand that data was created by feeding data into a physics model. You are logically committed then to accept the physics as accurate. You should also be aware that the model uses data assimilation and takes ‘data’ from satellites. These satellites ALSO use physics models to create their data. Do you accept those physics?
http://www.ecmwf.int/research/ifsdocs/CY25r1/Physics/Physics-03-3.html
Err. that would be the same physics used by GCMS and the same physics that says c02 warms the planet.
So you’ve suggested using data from a model that has GCM physics in it to improve the estimate of temperatures north of 80. in short INFILL with model data driven by GCM physics

July 24, 2011 2:14 pm

I’ll just remind people that I wrote a program last year to scrape Canadian climate data directly from the site. I was interested in hourly and daily data and it seems that the web based access only gives a year or two before it hangs (not sure if this is a bug in the code or a deliberate attempt to prevent people from downloading too much data). My program is written in VB6 and just had a look at it and it’s fairly basic but it does work. I posted the material to WUWT on 24 or 25 April 2010 but my blog software mangled the program file address initially. That’s been fixed and the program is available at:
http://drgimbarzevsky.com/Downloads.html
And it’s the climateScraper program. There’s another version which just scrapes data on Weather Underground sites.
NOTE: the Canadian webserver is very simple minded and if no data exists for the year in question it will give lots of blank files of daily and hourly data so make sure you have the right years.
Documentation can be found at:
http://drgimbarzevsky.com/Boris_blog/archives/04-01-2010_04-30-2010.html
It’s not that clear and I’ll try to clean it up but it’s a beautiful day here in Kamloops and I plan on enjoying it.

July 24, 2011 2:21 pm

jorgekafkazar
Ya its funny how people
1. scrutinize every record in GHCN and complain
2. run off and include data without doing a proper QA
It’s a very long process to vet new data. That’s one reason why GISS uses GHCN. We criticize them for not doing an independent QA. But then, we run off and get additional data without any QA. Some times it makes sense to be consistent.

Matt G
July 24, 2011 3:04 pm

steven mosher says:
July 24, 2011 at 1:48 pm
83N+ is a very small area and the physics suggests it should be warming closer to the poles compared with the tropics. GISS does assume the trend is the same, but it is not the same when the environment is different (ie water (sea or ocean), land, ice, snow, tundra, desert, forest, grassland, urban etc). Temperatures just above the ocean behave much differently to those just above land, especially when there is still ice there. Even when the terrain is the same, the trend is different when there is snow/ice there compared with where there isn’t. Despite this very small area, GISS has still warmed considerably more than its surface and satellite rivals (in this context regarding such small changes over the decades, globally). Hence, there has to be a lot of warming to go off on a tangent from just a very small area, which the DMI doesn’t show, nor do Arctic ocean based buoys 80+N.
Data sources from the Arctic ocean here (80+N) at the peak of Summer are always in the range between 0c and 3c every year. This hasn’t changed for many years and is always kept cool by yearly melting ice or snow/ice. This is despite high pressure being common over recent weeks here, but it looks like this will change very soon.
http://www.uni-koeln.de/math-nat-fak/geomet/meteo/winfos/synNNWWarctis.gif

TomRude
July 24, 2011 3:07 pm

@ Mosher, @ Fred here is a quote about climate:
“(…) basic knowledge (…) about the real mechanisms of meteorological phenomena, and about the processes whereby climatic modifications are transmitted, is necessary for the analysis and understanding of climatic evolution, across all scales of intensity, space and time. ”
Marcel Leroux
It’s a tad more than computing average temperatures…

OK S.
July 24, 2011 3:15 pm

Pamela Gray says @ July 24, 2011 at 9:16 am:

Sunset Magazine climate zones are much better. For example, in NE Oregon, the climate zone designations are different between the two sources. If you want to know about climate, stick to the Sunset magazine version. When those zones change, you know you have climate change going on because they are set up by local extremes, not by regional weather pattern average anomalies.

Thanks for the Sunset link. Looks promising. Here on the Southern Plains (Zone 33 in Sunset), almost nothing recommended by most gardening books survives very long. Even for a season. Ever. And even native plants.
OK S.

Alberta Slim
July 24, 2011 3:34 pm

Tim Ball says:
July 24, 2011 at 12:14 pm
“I appreciate ……… etc..”
Your comment is very interesting. Thanks.
It seems that EC should be investigated, now that there is a majority gov’t.
Can you, or would you, contact your MP and ask them to look into this?

July 24, 2011 4:15 pm

steven mosher says:
July 24, 2011 at 9:03 am
Do you think the MWP was warmer than today? Was the LIA cooler than today?
I know that ‘average’ temperature has no meaning, but what do we mean when we say that Greenland in the MWP was warmer than today? Do we mean that the air temperatures were LOWER?
———-
You can have a “warmer” MWP simply by having short mild winters, with no change in summers. There is no proxy that I’m aware of that can show us that winters are less cold and shorter during the MWP than during the LIA. The LIA could very well have had short hot summers, with deep long cold winters.
That’s the problem with average yearly temp: it doesn’t tell us anything about the specifics of each season.

July 24, 2011 4:19 pm

Boris Gimbarzevsky says:
July 24, 2011 at 2:14 pm
I’ll just remind people that I wrote a program last year to scrape Canadian climate data directly from the site.
————-
I had the same problem with it hanging. I just had to add a bit of code to make sure it continued from where it left off, and I managed to download the entire dataset. Yes, it does include nulls for days with no data. I parsed all the data into Access databases, one for each station, with a master .mdb file to keep track of them all. Part of the parsing script deleted those missing records.
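The “continue from where it left off” part can be as simple as writing a one-line checkpoint after each station/year is safely on disk and reading it back on restart. Here is a minimal sketch of that idea only, with hypothetical file and variable names and QBASIC-style file I/O (the actual scraper is VB6, so this is just the shape of it):

100 REM checkpoint sketch: remember the last station/year completed
110 REM STN$ and YR would be set by the download loop; the names are made up
120 LET STN$ = "1234567"
130 LET YR = 1998
140 OPEN "progress.txt" FOR OUTPUT AS #1
150 WRITE #1, STN$, YR
160 CLOSE #1
170 REM on the next run, read the checkpoint and resume from the next station/year
180 OPEN "progress.txt" FOR INPUT AS #1
190 INPUT #1, LASTSTN$, LASTYR
200 CLOSE #1
210 PRINT "resume after station "; LASTSTN$; ", year"; LASTYR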

July 24, 2011 4:31 pm

steven mosher says:
July 24, 2011 at 1:41 pm
Understand what the “climate” is. Climate is about two things: long-term averages at different locations. In some sense these are idealizations. These measures or metrics can be useful for many purposes. Was the globe generally colder in the LIA? That looks like a scientific question. Would an answer to a scientific question that reported the average temperature be unscientific?
Hardly. What does it mean to say that the average temperature NOW is higher than the average temperature THEN? It means this: pick any spot on the earth and look at the temperature NOW (say, a 30-year average). When we say the past was cooler, what we mean is that for any place you pick, your best estimate for the temperature at that place, during that past time, will be lower than it is today. So the sentence has a meaning; it’s a very operational meaning. If I think it was cooler in the past I can even draw other hypotheses from that: I could predict that certain plants would have died out in certain locations. I can then test that by looking for evidence. I can predict that certain rivers might have frozen over that never froze before. I can check that. Wow, look at frost fairs in London. So there I take a general statement, “the global average was lower in the past”, I propose a test, “look for evidence of rivers freezing where they never froze before”, and gosh, I can test that. Looks like science, but note that it’s more like historical science than EXPERIMENTAL science.
——–
Not necessarily true. The problem with an average is that there is no context. You can have a long mild summer and a deep, cold, short winter that gives the same average temp for the year as a mild long winter and a short hot summer. A warmer average spring can mean two things: a quick end to a mild winter, or a fast beginning to a hot summer.
River ice only tells you whether or not winter was near the melting point. It cannot tell you that 200 years ago winter’s average December temps were below -30°C whereas today the average is -20°C. That’s “warmer”, but you can’t find a proxy for it.
That is why the average on its own is meaningless: there is no context of what each season is doing.
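To put numbers on that, here are two invented years whose monthly means are chosen so the annual averages come out identical, even though one has a mild winter and a cool summer and the other a harsh winter and a hot summer. The figures are made up purely for illustration; the little program just averages them:

100 REM two invented years with the same annual mean but very different seasons
110 DIM A(12), B(12)
120 FOR M = 1 TO 12
130 READ A(M)
140 NEXT M
150 FOR M = 1 TO 12
160 READ B(M)
170 NEXT M
180 LET SA = 0
190 LET SB = 0
200 FOR M = 1 TO 12
210 LET SA = SA + A(M)
220 LET SB = SB + B(M)
230 NEXT M
240 PRINT "year A annual mean:"; SA / 12; " winter (DJF) mean:"; (A(12) + A(1) + A(2)) / 3
250 PRINT "year B annual mean:"; SB / 12; " winter (DJF) mean:"; (B(12) + B(1) + B(2)) / 3
260 REM year A: mild winter, cool summer (Jan..Dec, deg C)
270 DATA -3, -2, 1, 6, 11, 15, 17, 16, 12, 7, 2, -2
280 REM year B: harsh winter, hot summer (Jan..Dec, deg C)
290 DATA -16, -11, -2, 8, 15, 21, 24, 23, 17, 10, 3, -12

Both years average about 6.7°C, yet their winters differ by more than ten degrees, which is exactly the information the annual figure throws away.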
BTW, Anthony, this new text box really sucks, keeps flashing, and you can’t see what you are typing as it defaults back to the short window, with each character typed. The down arrow makes the cursor go up!

Amino Acids in Meteorites
July 24, 2011 4:47 pm

Here is a short talk on dropped stations by Joseph D’Aleo

Amino Acids in Meteorites
July 24, 2011 4:56 pm

Here’s how stations can be dropped, and how using ‘anomaly’ can add warmth that doesn’t look like much in the graphs to the naked eye:
How ClimateGate scientists do the anomaly trick
Part 1

Part 2

rbateman
July 24, 2011 5:07 pm

steven mosher says:
July 24, 2011 at 2:12 pm
Where is the raw data that goes into the ‘model’?

July 24, 2011 5:25 pm

Steven Mosher says:
“Look for evidence of rivers freezing where they never froze before and gosh I can test that. looks like science, but note that it’s more like historical science than EXPERIMENTAL science.”
Some historical evidence is also solid empirical evidence. For example, Viking structures, tools and gravesites with well-preserved, frozen bodies are still reappearing as the permafrost melts, indicating that Greenland was warmer during the MWP than it is now, and that the subsequent LIA drove the colony out.

Editor
July 24, 2011 6:18 pm

Alex says:
July 24, 2011 at 4:38 am
> Wow, Textfiles, short filenames? Do we still have 1990?
Yeah, and some people (like me) can grep their email files. (At work I use Thunderbird and IMAP; I’m not quite the fossil I like to think I am.)

Editor
July 24, 2011 6:25 pm

steven mosher says:
July 24, 2011 at 8:52 am

Ric Werme says:
July 24, 2011 at 4:57 am
> Ah yes, and make sure you DON’T TAKE any of the data from stations that don’t meet WMO standards. Lots of those there.
How many USHCN stations don’t meet WMO standards?
################
Ric, I think you misunderstand
Now, what are WMO Standards? Not what you think, Ric:
http://www.climate.weatheroffice.gc.ca/prods_servs/normals_documentation_e.html

You’re right, thanks for the link.

Editor
July 24, 2011 8:25 pm

Alex says:
July 24, 2011 at 4:38 am
> Wow, Textfiles, short filenames? Do we still have 1990?
I simply wanted to be backwards compatible with everybody. At one point I did the entire thing in bash. Let’s just say that emulating floating-point division in bash is not fun. Or I could’ve used “bc” to do the calculations under bash. Either way, fewer people would’ve understood the code.

Carrick
July 25, 2011 7:34 am

Beyond the comments that Steven Mosher has been making, it’s important to note that it is difficult to accurately measure surface air temperature in the high arctic. On one hand, if you leave the sensor too close to the surface (e.g., 1.5-m height, as is typical of the sensors being discussed here), wintertime conditions threaten to have the sensor get covered by snow, so you’re no longer de facto measuring surface air temperature, but something else.
Secondly, if you move the sensor too far from the surface (in an attempt to avoid having it covered in snow), in the high arctic the wintertime vertical temperature profile can be very shallow, just a few meters tall in windless conditions, so you end up with a shallow inversion and a temperature maximum between your sensor and the ground. Most likely we’ve all seen this phenomenon before; it’s typically the reason for a thin surface-layer fog bank. However, the conditions for creating it during the long (sunless) high-arctic winter are much more prevalent than at lower latitudes.
One could rely on model-based data, like using ECMWF, to infill missing stations, but again it’s not clear what you are really getting there if the data themselves are unreliable. I think it’s a problem that needs work, but for now you might be better off doing a ±82.5° average and excluding the more northern/southern polar regions from it.
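The arithmetic for that last suggestion is just a latitude-weighted mean that stops at ±82.5°. A minimal sketch, assuming 5°-wide zonal bands, cos(latitude) weights as a stand-in for band area, and a completely made-up temperature profile in place of real band means:

100 REM area-weighted mean over zonal bands, capped at +/-82.5 degrees latitude
110 REM band temperatures T below are a made-up profile, not real data
120 LET PI = 3.14159265
130 LET SUMW = 0
140 LET SUMWT = 0
150 REM 33 bands, 5 degrees wide, centred on -80, -75, ..., +80
160 FOR I = 0 TO 32
170 LET LAT = -80 + 5 * I
180 LET W = COS(LAT * PI / 180)
190 LET S = SIN(LAT * PI / 180)
200 LET T = 14 - 30 * S * S
210 LET SUMW = SUMW + W
220 LET SUMWT = SUMWT + W * T
230 NEXT I
240 PRINT "weighted mean, 82.5S to 82.5N ="; SUMWT / SUMW

Leaving the polar caps out of the weighted sum this way at least makes the “no data poleward of 82.5°” choice explicit, instead of letting an extrapolation fill it in.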