Guest post by Walter Dnes.
There was a phenomenon known as “the great dying”, in which most of the Canadian stations used for GISS world monthly temperatures disappeared, at least according to GISS.
In reality, many of these “dead” stations are still there, putting out data every month. This post is about finding additional Canadian monthly mean temperatures and anomalies from the Environment Canada website.
First, some administrivia…
– I’m a retired Environment Canada employee. I do not speak for
Environment Canada. This is all done in my capacity as a private
citizen.
– the following is all based on publicly available data from the
Environment Canada website as of late July, 2011.
– there are 2 versions of the code, data, and scripts. Each
clisum?.zip file is approximately 5 megabytes. You will only need
one. Both zipfiles contain BASIC source code, some scripts, and a
subdirectory with over 300 data files. The BASIC code is generic
enough that it should run under most BASIC versions on most platforms.
– The linux files are the definitive version. I did a translation to
MS-DOS because most people still use Windows. I haven’t been able to
fully test the DOS version, but I hope it works. All file and
directory names comply with the old DOS 8.3 filename spec, for maximum
backwards compatibility.
– The files in clisumd.zip assume an MS-DOS or Windows environment.
This includes old-fashioned DOS, as well as a Windows DOS box command
prompt. The files all have MS-DOS end-of-line, and the .BAT files
are written to be executed under COMMAND.COM or CMD.EXE. I don’t
have a Windows or DOS machine, so I can’t be sure the BASIC code is
correct for QBASIC, or whatever version you may have. Be sure to
edit the .BAT files to replace the command “bas” with whatever BASIC
interpreter/compiler you’re actually using. You may need to tweak
some of the BASIC programs for minor syntax differences. Click
clisumd.zip to download the MS-DOS files if you run DOS or Windows.
– The files in clisumu.zip assume a unix/linux/bsd environment. They
all have unix end-of-line, and the scripts are written to be
executed under bash. The BASIC programs were written for the “bas”
BASIC interpreter. “bas” can be installed using your linux distro’s
standard install command (e.g. “apt-get” in Debian/Ubuntu and
derivatives; “emerge” in Gentoo). If your distro doesn’t have “bas”,
see http://www.moria.de/~michael/bas/ to download the source tarball
and build it manually. Click clisumu.zip to download the files if you run unix/linux/bsd.
– there are links to various free BASIC interpreters and compilers at
http://www.thefreecountry.com/compilers/basic.shtml
Getting code and data
=====================
The first step is to download clisumd.zip (for Windows/DOS users) or clisumu.zip (for unix/linux/bsd users) and unzip it. The result is a directory called either clisumd or clisumu. Inside that directory are 13 or 14 files and a subdirectory “csvfiles” with over 300 CSV data files.
The next step is to download a monthly climate summary in text format. With javascript enabled, go to the webpage:
http://climate.weatheroffice.gc.ca/prods_servs/cdn_climate_summary_e.html
Select the desired month/year, Province=”All”, Format=”Plain Text”. You should see something like this screenshot:
Once you get to this point, click on “Submit”.
Save the resulting text webpage to a textfile in the clisumu or clisumd directory. Since I used May 2011 data I named the textfile cs201105.txt. I also downloaded June as cs201106.txt. You’ll want to download the latest month every month. The data is generally available 7 to 10 days after the end of the month.
************************************************************************
** The following portion only needs to be run once for initial setup **
** You do not have to do the next portion, including downloading 300+ **
** data files. I’ve done it already and included the output in the **
** zipfiles. The following instructions are documentation for the **
** sake of the scientific method, in case anybody wants to duplicate **
** this the hard way. The most likely use is in the case of manual **
** over-rides, which I’ve found one case for so far. There may **
** be other cases. **
************************************************************************
Creating a list of candidate stations with normals data
=======================================================
The next step is to create a subset file containing only sites with data in the most recent normals. We actually want 1951-1980 normals for comparison to GISS. Stations with current normals are candidates for having 1951-1980 normals, and their data will be downloaded.
We need to pick out only the lines with values in the “D” (Departure from normal) field, and copy only relevant data fields to a subset file. The program subset.bas is launched by the script “subset” in linux, or the batch file “subset.bat” in DOS. The script/batchfile sets up the name of the input and output files as environment variables before launching subset.bas.
The program “subset.bas” scans for lines with column 64 == “.”. This signals the presence of some temperature normals data for the period 1971-2000. For only those lines, the climate ID, monthly mean temp, count of days with missing mean temp, and station name are extracted and written to a second file. In the case of May 2011, the subset file has monthly mean temperatures for 313 sites which have 1971-2000 normals to compute anomalies against. In this example, I’ve called the output file “subset5.txt” to remind me that it’s for May.
The DOS batch file is invoked as…
subset
and the bash script is invoked as…
./subset
Because this only needs to be run once, I hardcoded the filenames into the batch/script files.
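For readers who want to follow the logic without firing up a BASIC interpreter, here is a rough Python sketch of what subset.bas does. Only the test on column 64 comes from the description above; the other column positions are placeholders of mine, so check them against a real climate summary file before relying on this.

IN_FILE = "cs201105.txt"   # downloaded monthly climate summary
OUT_FILE = "subset5.txt"   # subset of stations with 1971-2000 normals

with open(IN_FILE) as src, open(OUT_FILE, "w") as dst:
    for line in src:
        # a "." in column 64 (index 63) signals 1971-2000 normals data
        if len(line) > 63 and line[63] == ".":
            climate_id = line[0:7].strip()    # placeholder field position
            mean_temp = line[25:31].strip()   # placeholder field position
            missing = line[32:35].strip()     # placeholder field position
            name = line[80:].strip()          # placeholder field position
            dst.write('"%s",%s,%s,"%s"\n' % (climate_id, mean_temp, missing, name))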
Downloading monthly data in CSV format
======================================
Unlike the 1961-1990 and 1971-2000 normals, the 1951-1980 Canadian climate normals do not appear to be on the web. But since the monthly data is available online, we can download it and do the calculations ourselves. Here is how the data was downloaded…
We search by station name. The first line in subset5.txt is…
“1012055”,9.5,17,48.829,-124.052,”LAKE COWICHAN”
The climate data advanced search page is at…
http://www.climate.weatheroffice.gc.ca/advanceSearch/searchHistoricData_e.html
Use the “Search by Station Name:” menu as shown in the screenshot:
Enter the name, or a portion thereof, as shown in the red rectangle. Note that upper/lower case doesn’t matter, and spaces are ignored. Thus “lakecow” matches “LAKE COWICHAN”. Then click on the “Search” button as shown by the red arrow in the
screenshot. Alternatively, you can press “Enter” on your keyboard. This takes you to the search results page as shown in the screenshot:
We run into a problem here… there are two stations named “LAKE COWICHAN”, which does happen on occasion. It’s not until you actually select a station that you find out if you’ve got the right one. To select monthly data, you must first select “Monthly” in the drop-down menu under “Data Interval”, and then click on the “Go” button
corresponding to the station you want. You’ll get a display similar to the screenshot:
I’ve highlighted a couple of areas. At the upper left is the climate ID in a red rectangle. This must match the climate ID at the beginning of the line in the subset file, unless you’re doing a manual over-ride (more about this later).
The red arrow at the bottom right corner points to the link for downloading the data in CSV format. I right-clicked on that link and saved the file to the csvfiles directory. My convention is to name the file after the climate ID; thus, this one would be “1012055.csv”. Note that the name is merely for convenience. The files could be assigned any legal filenames, and the BASIC programs would still work, because they read the climate ID from data inside the csv data files.
Rinse/lather/repeat the above for all 300+ lines in the subset file. Now you know why you don’t want to repeat this yourself.
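If you do repeat it, a small helper like this (a Python sketch of mine, not part of the zipfiles) at least turns the subset file into a checklist of station names to search for:

import csv

# each line looks like: "1012055",9.5,17,48.829,-124.052,"LAKE COWICHAN"
with open("subset5.txt") as f:
    for row in csv.reader(f):
        print(row[0], row[-1])   # climate ID and station name to search for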
Now for the manual over-ride example. Look in the climate summary file cs201106.txt. Station “WINNIPEG RICHARDSON AWOS” with climate ID “5023226” has a mean monthly temperature, but it does not have normals data. Searching for “winnipeg” in the climate data advanced search page yields several Winnipeg sites. If you click “Go” on “WINNIPEG RICHARDSON AWOS” you’ll see that it’s located at 49 55’N and 97 14’W and
elevation 238.7 m. Go back to the Winnipeg search results page, select “Monthly” and click “Go” for “WINNIPEG RICHARDSON INT’L A”. You’ll notice that it’s located at 49 55’N and 97 14’W and elevation 238.7 m. They’re at EXACTLY the same location. Why the split reporting, I don’t know. Anyhow, I downloaded the CSV monthly data with filename
“5023222.csv” to the csvfiles directory. Then I opened it with a text editor, and changed the 6th line from
“Climate Identifier”,”5023222″
to
“Climate Identifier”,”5023226″
This causes the BASIC programs to treat the monthly data as belonging to the AWOS site when computing monthly normals. Thus we will get monthly temperature anomalies versus 1951-1980 for the AWOS site, even though it’s relatively new.
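If you would rather not make that edit by hand, a throwaway script along these lines does the same thing (my own sketch, not part of the zipfiles):

# relabel the downloaded INT'L A monthly data as the AWOS site
path = "csvfiles/5023222.csv"
with open(path) as f:
    lines = f.readlines()
# the 6th line holds the climate identifier
lines[5] = lines[5].replace('"5023222"', '"5023226"')
with open(path, "w") as f:
    f.writelines(lines)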
Calculating the monthly normals
===============================
The normals batch/script file needs to be run only when the contents of the csvfiles subdirectory change. This includes individual files being added, deleted, or edited.
The program normals.bas opens a CSV file for input, and the normals file in append mode. It then calculates the normal temperature for one station, appends one line of data and exits. It is called serially by a FOR loop in the normals shell script or normals.bat batchfile, once for each file in the csvfiles directory. Since lines are always being appended to normals.txt, the script deletes the normals file before starting the loop. This starts off with a clean slate. The script then sets the name of the normals file, and the value of the normals start and end years, and then loops through all the files in the csvfiles
directory that match the spec “*.csv”. The file is invoked in unix/linux/bsd as…
./normals
and in a DOS box (including Windows) as…
normals
Because of limitations in the DOS FOR command, normals.bat has a couple of extra steps…
1) The bash FOR command sorts filenames when evaluating “*.csv”, which results in the file normals.txt being in sorted order. The DOS FOR command doesn’t do this. The workaround is to write output to a scratch file (normals.000) and sort that file to normals.txt at the end.
2) The bash FOR command accepts multiple commands in a DO/DONE block.
The DOS FOR command doesn’t do this. It has to be a “one-line-wonder”. The workaround is to make the one command a CALL to a 2nd DOS batch file, namely normals2.bat. normals2.bat has the multiple commands to execute.
Note that normals and normals.bat set STARTYR=1951 and ENDYR=1980. This is because the immediate goal of this project is to generate normals to compare against GISS, which happens to use 1951..1980 as its base period. There’s nothing preventing anybody from using 1961..1990, or any other random base period for that matter.
The output format for the normals file is…
Columns 1-10: the 7-character climate ID in quotes, followed by a comma.
This is followed by 12 repetitions (1 for each month) of…
SNN.N,NNN,
where “SNN.N” is the monthly mean temp (with a minus sign if needed), and “NNN” is the number of years of data for that month.
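The arithmetic in normals.bas is simple enough to sketch in Python for comparison. This is an illustration only: the column names below are my assumptions about the Environment Canada CSV layout, and the real files carry several preamble lines ahead of the header, so adjust before use.

import csv

START_YR, END_YR = 1951, 1980   # same base period the normals script sets

def monthly_normals(csv_path):
    """Average each calendar month over the base period for one station."""
    sums, counts = [0.0] * 12, [0] * 12
    with open(csv_path) as f:
        for row in csv.DictReader(f):
            try:
                year = int(row["Year"])             # assumed column name
                month = int(row["Month"])           # assumed column name
                temp = float(row["Mean Temp (C)"])  # assumed column name
            except (KeyError, TypeError, ValueError):
                continue   # skip preamble rows and missing values
            if START_YR <= year <= END_YR:
                sums[month - 1] += temp
                counts[month - 1] += 1
    # one (normal, years-of-data) pair per month, None where no data
    return [(sums[m] / counts[m] if counts[m] else None, counts[m])
            for m in range(12)]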
************************************************************************
** This finishes the portion that only needs to be run once for **
** initial setup. The following is run every month after downloading **
** the latest monthly climate summary from Environment Canada. **
************************************************************************
Calculating the monthly temperature anomalies
=============================================
Because the monthly data will be calculated using different filenames and months, the anomaly batch/script files accept parameters. The first parameter is the month as a number from 1 (January) to 12 (December). The second parameter is the name of the monthly climate summary file that you’ve downloaded from the Environment Canada website. Note that the program *ALWAYS WRITES TO THE SAME OUTPUT FILE NAMES*. If you want to keep anomaly output files and not have them overwritten, rename them before the next run. I’ve included 2 sample
monthly climate summaries, cs201105.txt for May 2011, and cs201106.txt for June 2011. An example invocation for June 2011 data is, in DOS…
anomaly 6 cs201106.txt
and in bash
./anomaly 6 cs201106.txt
There are 2 output files. anomaly0.txt has output for every station with a monthly mean temperature in the climate summary, and a line in the normals file. anomaly1.txt only has those lines for which…
a) the month’s summary data shows zero days missing, and
b) there are 30 years of data for this month in the normals
This is intended as a filter to list only the best data for consideration. You can relax the criteria if you wish, by modifying anomaly.bas.
An example line from the anomaly outputs is…
“1021830”, 15.1, 0, 15.0, 30, 0.1,49.717,-124.900,”COMOX A”
The fields are…
Climate ID
Actual monthly mean temperature this month
Number of missing days this month
Normal monthly mean temperature
Number of years of normals data for this month
Temperature anomaly versus the normal value
Station latitude
Station longitude
Station name
This output is suitable for importing into a spreadsheet, and
especially into a GIS program for plotting.
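For illustration, the core of the anomaly step looks like this in Python: read the normals file (whose format is described above), subtract the normal from this month’s mean, and apply the anomaly1.txt filter. Parsing details beyond the documented format are my own sketch.

def read_normals(path="normals.txt"):
    """Map climate ID to a list of 12 (normal temp, years of data) pairs."""
    normals = {}
    with open(path) as f:
        for line in f:
            fields = line.strip().rstrip(",").split(",")
            cid = fields[0].strip('"')
            normals[cid] = [(float(fields[1 + 2 * m]), int(fields[2 + 2 * m]))
                            for m in range(12)]
    return normals

def anomaly(cid, month, this_mean, missing_days, normals):
    """Return (anomaly, passes-best-data-filter) for one station/month."""
    norm_temp, norm_years = normals[cid][month - 1]
    best = (missing_days == 0 and norm_years == 30)  # anomaly1.txt criteria
    return this_mean - norm_temp, best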
***********************************************************************
That’s funny. I had a reader ask me if I could get this Canadian data for him.
There is actually a script – a scraper – that does this all automagically. That scraper is posted at
http://scraperwiki.com/scrapers/canada-temperature-data/
Looks like a little Python maintenance is required. So no DOS, no bat files, no manually going to pages. Scrape it.
Ah yes, and make sure you DON’T TAKE any of the data from stations that don’t meet WMO standards. Lots of those there.
If you want to know who wrote the scraper… it’s drJ of clearclimatecode. These wizards actually got GISTEMP up and running, turned it into Python and benchmarked it against GISTEMP. I’m betting it replaces Hansen’s Fortran code in the near future…
Then, this Canadian data was formatted into GHCN-style format and fed into GISTEMP!
Yup. So what happens when we add all this extra Canadian data to GISTEMP?
So with MORE stations from Canada, what do we expect? Remember we are just looking at changes or departures from the mean. It might be colder in Canada, but we don’t make the mistake of using “unnormalized” data. Every station gets normalized by its mean. When we normalize data, a station that warms from 10 to 11 gets a value of +1, and a station that warms from -30 to -29 gets a value of 1.
Will more stations change the global mean? ONLY IF the TREND at those stations is higher than the average station’s, or lower. It’s the trend that matters.
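A toy illustration of that point, with made-up numbers:

trends = [1.5, 2.0, 2.5]            # per-century trends at existing stations
print(sum(trends) / len(trends))    # average trend: 2.0
trends += [2.0, 2.0]                # add stations matching the average trend
print(sum(trends) / len(trends))    # still 2.0: the mean is unchanged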
Well, here is what you see if you add more data for Canada:
http://clearclimatecode.org/analysis-of-canada-data/
I had looked at Canadian data back in 2010 (Part 1; Part 2; Part 3; Part 4; Part 5) comparing Environment Canada with GISS. drJ did ask me if I had all the data, but, working in Excel I had only downloaded a few individual stations.
What puzzled me (Part 2) was that there was an offset between the station temperatures in GISS and those reported by Env Canada. While this doesn’t matter when working with anomalies, I still wonder why it occurred, although I did resolve how (Part 4).
Part 5 looked at the curious phenomenon that the rate of warming increases dramatically when you cross the US-Canadian Border. In particular see this map http://diggingintheclay.files.wordpress.com/2010/05/49thparallelmap26key.png
While I agree that more stations do not change the rate of warming, I think we have to look very carefully at the rate of change in those that are reporting post 1970. We know land use change and urban growth can increase temperatures and although it may never be possible to tease out the effects of land use changes definitively, I’d like to encourage folks to use Walter’s information and look in more detail at Canada.
If you look at the graph produced by CCC http://clearclimatecode.org/wp-content/uploads/2010/11/Zone+64+90-440.png indeed there is no difference when the Environment Canada data is used (with more stations). But look at the rates of warming ~1920-1940 cf 1980-2009. It is possible that land use change and UHI growth are contributory factors to the increased warming in the more recent period.
PERL
Gah. What we (some) go thru for 20-yr. deep “backwards compatibility”! But it’s amazing how much data and program code in text and Basic you get for 5 MB.
How many stations are represented?
Very good walk-through of the data process. I went through the total zip file of all data and extracted daily values for all stations for the entire periods of record (started in March and finished in May), sorting the data into tables for each parameter, for each month, for each year, with reference to Station ID, longitude, and latitude, plus values for all stations active on each date for the entire period of record.
Once the data was tabled, I produced csv files for each date that represents the same date 6558 days apart, for four cycles. I am in the process of adding the Canadian data to the Alaskan and USA 48 contiguous states data, to generate a map based upon the composite of the last four cycles of the 6558-day repeat period of the 240 lunar declination and solar rotation periods (both 27.32 days), as an upgrade to my weather forecast site aerology.com. The site revision to include the new maps will be finished soon, with greater detail: a grid resolution of 3-mile squares rather than the 90-mile squares currently displayed.
Conversion from metric to SAE was kind of a pain, getting it all in the right format and keeping the decimal point in the right place across the different sets of data, due to different scales: snowfall in tenths, precipitation in hundredths, and the converted temps rounded to tenths of degrees F. There are many broken sets of station data of various lengths, and few records of snow on ground before 1955, then more after about 1962-65 as the idea caught on.
I just used all of the valid station daily values that appeared in the record, so in areas or times when data points drop out, the resultant maps just get a little grainier there, changing from day to day as the record gaps come and go. In areas of high record density it is not noticeable, but where station locations are more than 3 to 5 degrees apart there is some winking as you scroll through the dates of the forecast maps. The handling of missing weekend data was new to me: they just carry the three-day total for all three days, so there are flashes of detached precipitation on days when no other nearby stations showed precip.
Hope to have the whole rebuild of the site online soon. Gawd, I love high-speed internet and access to data that lets anyone with the time and inclination look at what is really there.
Certainly shows the disappearances we’d expect and the very limited geographical extent of the ground based temperature data record.
Perhaps another approach might be enlightening. Someone familiar with the data could generate a pair of graphs: one showing the number of complete temperature data sets still in the mix that are favourable to the AGW orthodoxy (rising temp), and another showing the number of data sets that are not favourable (falling temp), for the years 1880-2007. Also a pair of graphs showing the number of incomplete data sets that have been the subject of adjustment, and another showing the total number of data sets still being considered, for the years 1880-2007.
“Global Warming” was made in USA
Interesting. There appears to be a large rash of red across all of Australia, mostly the SE corner, circa 1992. *ALL* estimates? And then almost NOTHING from there on in. Clearly, the science (of estimating) is settled.
Please keep in mind that TEMPERATURE is only an approximate indicator of ENERGY. The Energy is what is important, and the energy in the atmosphere changes significantly as the humidity changes.
So when Mosher July 24, 2011 at 1:43 am says – …. “every station gets normalized by its mean..when we normalize data then a station that warms from 10 to 11 gets a value of +1. and a station that warms from -30 to -29 gets a value of 1. ”
That is only true if you take one whale and one minnow and count them each as ‘one fish’.
Yes I know, whales are mammals – you are starting to get this. 🙂
In fact, 20 Deg F (about 11 C) lower temperature can have 15% higher energy, as in this example:
==========================================================
from WUWT
http://wattsupwiththat.com/2010/06/07/some-people-claim-that-theres-a-human-to-blame/#more-20260
Max Hugoson says June 7, 2010 at 9:49 am
But to all the people playing “average temperature”, and in the spirit of trying to do GOOD ENGINEERING WORK… “average temperature” is a FICTION and MEANINGLESS. Here is why: Go to any online psychrometric calculator. (Heh, heh, I use the old English units; if you are fixated on Metric, get a calculator!)
Put in 105 F and 15% R.H. That’s Phoenix on a typical June day.
Then put in 85 F and 70% RH. That’s MN on many spring/summer days.
What’s the ENERGY CONTENT per cubic foot of air? 33 BTU for the PHX sample and 38 BTU for the MN sample.
So the LOWER TEMPERATURE has the higher amount of energy.
Thus, without knowledge of HUMIDITY we have NO CLUE as to atmospheric energy balances.
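For anyone who wants to check those numbers, here is a small Python sketch of the calculation, using the standard moist-air enthalpy formula with a Magnus approximation for saturation vapor pressure. This is my own reconstruction, it assumes sea-level pressure, and it gives BTU per pound of dry air rather than per cubic foot, but it lands close to the figures quoted above.

import math

def moist_air_enthalpy(temp_f, rh_percent, pressure_hpa=1013.25):
    """Enthalpy of moist air in BTU per lb of dry air."""
    temp_c = (temp_f - 32.0) / 1.8
    # saturation vapor pressure (hPa), Magnus approximation
    es = 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))
    e = (rh_percent / 100.0) * es         # actual vapor pressure (hPa)
    w = 0.622 * e / (pressure_hpa - e)    # lb water vapor per lb dry air
    return 0.240 * temp_f + w * (1061.0 + 0.444 * temp_f)

print(round(moist_air_enthalpy(105, 15), 1))  # Phoenix: ~33.0
print(round(moist_air_enthalpy(85, 70), 1))   # Minnesota: ~40.3

The lower-temperature Minnesota air indeed carries more energy per pound of dry air.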
Dave in Delaware,
Thank you for that. That is what everyone forgets (myself included at times). The properties of water in the atmosphere, as well as in the oceans, modulate the temperature of our planet.
Wow. Text files, short filenames? Is it still 1990?
It looks like Canada isn’t the only country with missing temp records. China, Australia, South America all blank out about 1990….
I have minimal knowledge of statistical analysis. So can someone explain to me how you get an average temperature of something without having any data? Or is the GISS temperature record essentially just an extrapolation from 20 years ago or maybe made up out of thin air?
I also find it interesting that the data vanishes from this record shortly after the first US Congressional hearing on AGW.
George V.
Hoser says:
July 24, 2011 at 1:51 am
> PERL
Go ahead! When I decided to look for a language other than C for a lot of my personal work, I was delighted to discover Python. The oldest source file I see is from 2002. Hmm, I’m still using one from 2003. Ah, there’s the first program – April 2001.
Perl is great for some applications, and that includes some of this work, but I use Python for just about everything except OS internals and client side web scripts. (Whatever happened to Iron Python, anyway?)
steven mosher says:
July 24, 2011 at 1:43 am
> If you want to know who wrote the scraper… it’s drJ of clearclimatecode. These wizards actually got GISTEMP up and running, turned it into Python and benchmarked it against GISTEMP. I’m betting it replaces Hansen’s Fortran code in the near future…
One thing I like about GISS is that they like Python. Don’t break into an MIT network without it!
steven mosher says:
July 24, 2011 at 1:22 am
> Ah yes, and make sure you DON’T TAKE any of the data from stations that don’t meet WMO standards. Lots of those there.
How many USHCN stations don’t meet WMO standards?
@Dave in Delaware says: July 24, 2011 at 4:38 am
//////////////////////////////////////////////////////////////////////
Absolutely, and that is why ocean temps are the only relevant metric (and of course, oceans represent more than 70% of the globe and 99% of the latent heat content/capacity).
Here’s a quick script in R to download and extract station info – that took about 5 minutes to write.
get.normal=function() {
download.file("http://surfacestations.org/dnes/clisumd.zip","temp.zip",mode="wb")
loc="clisumd/normals.txt"
handle=unz("temp.zip",loc,"r") # read normals.txt straight out of the zipfile
x=scan(handle,sep=",",what=c("",rep(0,24))) # Read 259950 items
close(handle)
x=t(array(x,dim=c(25,length(x)/25))) # one row per station: ID + 12 (temp,years) pairs
x=data.frame(x)
for(i in 2:25) x[,i]=as.numeric(as.character(x[,i]))
return(x)
}
normal=get.normal()
id=as.character(normal[,1])
get.info=function(station_id){
loc=paste("clisumd/csvfiles/",station_id,".csv",sep="")
handle=unz("temp.zip",loc,"r")
x=scan(handle,sep=",",what="")
close(handle)
info=x[1:56] # station metadata and column headers
x=x[57:length(x)]; n=length(x)
x=t(array(x,dim=c(25,n/25)))
x=data.frame(x)
names(x)=info[32:56]
for(i in 2:25) x[,i]=as.numeric(as.character(x[,i]))
return(x)
}
station=get.info(id[1])
Re – the great dying of GISS thermometers. In all the discussion of this, people have overlooked why GISS loses track of so many stations. I noticed this when I got CRU station information via the mole – I don’t remember whether this was discussed at the time. CRU was able to locate stations that GISS wasn’t able to find.
The reason is the interaction between GISS’ reference station method and the changeover from historical WWD sources to CLIMAT monthly. GISS requires a 24 month overlap or something like that (I don’t recall the precise number offhand) between scribal versions. It treated WWD and CLIMAT as different sources. For most of the “missing” stations, the overlap between WWD and CLIMAT versions was “too short” for the unsupervised GISS methodology. Needless to say, the relationship between versions could and should have been determined by questionnaires to the various meteorological agencies. I’m sure that Environment Canada could have clarified whether the WWD station and CLIMAT stations were the same. However, this obvious clerical work doesn’t seem to have been done. Instead, Hansen introduced adjustments at the changeover.
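For readers unfamiliar with the mechanics, the combining step works roughly like this (a simplified Python sketch of my own, assuming the overlap rule described above; the real GISS code has more cases):

def combine(ver_a, ver_b, min_overlap=24):
    """Merge two scribal versions of one station's monthly series.
    ver_a, ver_b: dicts mapping (year, month) -> temperature."""
    overlap = set(ver_a) & set(ver_b)
    if len(overlap) < min_overlap:
        return None   # "too short" an overlap: the station drops out
    # offset ver_b to ver_a's level via the mean difference over the overlap
    offset = sum(ver_a[k] - ver_b[k] for k in overlap) / len(overlap)
    merged = dict(ver_a)
    for k, v in ver_b.items():
        merged.setdefault(k, v + offset)
    return merged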
BTW, Walter, thank you very much for this service. On a couple of occasions in the past, I tried to wade through the Env Canada java scripts to get a proper database and gave up in frustration. You’ve done a good deed.
Last year I downloaded all the daily data from EC website, using the VB script at this site: http://cdnsurfacetemps.wordpress.com/2010/02/27/how-stations-will-be-presented/
Took 3 months.
My analysis of that data is here: http://cdnsurfacetemps.wordpress.com
Summer TMax in Canada is dropping. There are fewer heat waves in Canada now than there were in the 1930’s. Winters are less cold, which is what is driving the average up.
>2011
>using BASIC
Anyone with a Windows 7 system will have PowerShell installed as part of the bundle; so I’ve ported the .bas/.bat combinations to that language and put them up on pastebincom
subset.ps1 at /rKWXsQ6e
normals.ps1 at /0gRwBWKb
anomaly.ps1 at /TL9MwrG2
having tested them to ensure that they generate the same outputs as are in the .zip bundle.
@Ric Werme
It’s up to 2.7.1 (beta 1) now. see ironpythoncodeplexcom
That’s pastebin[dot]com and ironpython[dot]codeplex[dot]com
Mosher, thanks for the link. Since we know Canada isn’t really warming 5C per century, we have a much better idea what UHI is. And it’s huge.
steven mosher says:
July 24, 2011 at 1:43 am
Steve provides us with a link to clearclimatecode that shows GISS land temp records in the Arctic (64N-90N) from 1880. What I noted was that if one compares the trend pre-AGW (1880-1945) with the trend during AGW (1945-2009), we see the latter trend is lower. Interesting.
Verity
“But look at the rates of warming ~1920-1940 cf 1980-2009. It is possible that land use change and UHI growth are contributory factors to the increased warming in the more recent period.”
Yes, it’s possible. Anything is possible. Form a hypothesis: changes in land use will lead to changes in temperatures.
1. Define “land use changes”. Objectively, what kinds of land use changes do you mean?
I can usually go find metadata IFF people can define their terms accurately.
2. Formulate a Null: Trends at Type A > trends at Type B
Usually what I’ve found is that people are unwilling to formulate their ideas in these ways, and unwilling to have their ideas tested. And if they are tested and found wanting, they change the terms of the discussion.
Just sayin.
steven mosher’s post on
July 24, 2011 at 1:43 am with this link:
http://clearclimatecode.org/analysis-of-canada-data/
Tells you all you need to know about this topic. Some skeptics will be even more disappointed with the results than they were with the Berkeley Earth Temperature Project. Result: There has been no conspiracy… just warmer temps.
REPLY: You know Mr. Gates, that is quite a boneheaded comment. Walter did something useful, and I posted it strictly because it was useful. Neither of us, nor anyone commenting in this thread, said anything about “conspiracy”. You did.
Knock it off – take a 24 hour timeout – Anthony
Ric Werme says:
July 24, 2011 at 4:57 am
> Ah yes, and make sure you DON’T TAKE any of the data from stations that don’t meet WMO standards. Lots of those there.
How many USHCN stations don’t meet WMO standards?
################
Ric, I think you misunderstand.
Now what are WMO standards? Not what you think, Ric:
http://www.climate.weatheroffice.gc.ca/prods_servs/normals_documentation_e.html