Guest post by Walter Dnes.
There was a phenomenon known as “the great dying”, where most of the Canadian stations used for GISS world monthly temperatures disappeared, at least according to GISS.
In reality, many of these “dead” stations are still there, putting out data every month. This
post is about finding additional Canadian monthly mean temperatures and anomalies from the Environment Canada website.
First, some administrivia…
– I’m a retired Environment Canada employee. I do not speak for
Environment Canada. This is all done in my capacity as a private
citizen.
– the following is all based on publicly available data from the
Environment Canada website as of late July, 2011.
– there are two versions of the code, data, and scripts. Each
clisum?.zip file is approximately 5 megabytes. You will only need
one. Both zipfiles contain BASIC source code, some scripts, and a
subdirectory with over 300 data files. The BASIC code is generic
enough that it should run under most BASIC versions on most platforms.
– The linux files are the definitive version. I did a translation to
MS-DOS because most people still use Windows. I haven’t been able to
fully test the DOS version, but I hope it works. All file and
directory names comply with the old DOS 8.3 filename spec, for maximum
backwards compatibility.
– The files in clisumd.zip assume an MS-DOS or Windows environment.
This includes old-fashioned DOS, as well as a Windows DOS box command
prompt. The files all have MS-DOS end-of-line, and the .BAT files
are written to be executed under COMMAND.COM or CMD.EXE. I don’t
have a Windows or DOS machine, so I can’t be sure the BASIC code is
correct for QBASIC, or whatever version you may have. Be sure to
edit the .BAT files to replace the command “bas” with whatever BASIC
interpreter/compiler you’re actually using. You may need to tweak
some of the BASIC programs for minor syntax differences. Click
clisumd.zip to download the MS-DOS files if you run DOS or Windows.
– The files in clisumu.zip assume a unix/linux/bsd environment. They
all have unix end-of-line, and the scripts are written to be
executed under bash. The BASIC programs were written for the “bas”
BASIC interpreter. “bas” can be installed using your linux distro’s
standard install command (e.g. “apt-get” in Debian/Ubuntu and
derivatives; “emerge” in Gentoo). If your distro doesn’t have “bas”,
see http://www.moria.de/~michael/bas/ to download the source tarball
and build it manually. Click clisumu.zip to download the files if you run unix/linux/bsd.
– there are links to various free BASIC interpreters and compilers at
http://www.thefreecountry.com/compilers/basic.shtml
Getting code and data
The first step is to download clisumd.zip (for Windows/DOS users) or clisumu.zip (for unix/linux/bsd users) and unzip it. The result is a directory called either clisumd or clisumu. Inside that directory are 13 or 14 files and a subdirectory “csvfiles” with over 300 CSV data files.
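For unix/linux/bsd users, the setup amounts to the following (Windows/DOS users substitute clisumd.zip and clisumd):

unzip clisumu.zip
cd clisumu
ls csvfiles/*.csv | wc -l    # should report 300+ data files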
The next step is to download a monthly climate summary in text format. With javascript enabled, go to webpage:
http://climate.weatheroffice.gc.ca/prods_servs/cdn_climate_summary_e.html
Select the desired month/year, Province=”All”, Format=”Plain Text”. You should see something like this screenshot:
Once you get to this point, click on “Submit”
Save the resulting text webpage to a text file in the clisumu or clisumd directory. Since I used May 2011 data, I named the file cs201105.txt. I also downloaded June as cs201106.txt. You’ll want to download the latest month every month; the data is generally available 7 to 10 days after the end of the month.
************************************************************************
** The following portion only needs to be run once for initial setup **
** You do not have to do the next portion, including downloading 300+ **
** data files. I’ve done it already and included the output in the **
** zipfiles. The following instructions are documentation for the **
** sake of the scientific method, in case anybody wants to duplicate **
** this the hard way. The most likely use is in the case of manual **
** over-rides, which I’ve found one case for so far. There may be **
** be other cases. **
************************************************************************
Creating a list of candidate stations with normals data
=======================================================
The next step is to create a subset file containing only sites with data in the most recent normals. We actually want 1951-1980 normals for comparison to GISS. Stations with current normals are candidates for having 1951-1980 normals, and their data will be downloaded.
We need to pick out only the lines with values in the “D” (Departure from normal) field, and copy only relevant data fields to a subset file. The program subset.bas is launched by the script “subset” in linux, or the batch file “subset.bat” in DOS. The script/batchfile sets up the name of the input and output files as environment variables before launching subset.bas.
The program “subset.bas” scans for lines with column 64 == “.”. This signals the presence of some temperature normals data for the period 1971-2000. For only those lines, the climate ID, monthly mean temp, count of days with missing mean temp, and station name are extracted and written to a second file. In the case of May 2011, the subset file has
monthly mean temperatures for 313 sites which have 1971-2000 normals to compute anomalies against. In this example, I’ve named the output file “subset5.txt” to remind me that it’s for May.
The DOS batch file is invoked as…
subset
and the bash script is invoked as…
./subset
Because this only needs to be run once, I hardcoded the filenames into the batch/script files.
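For readers who don’t want to wade through BASIC, here is a minimal bash/awk sketch of what subset.bas does. The column-64 test is from the description above; every other substr() offset is an illustrative guess, not the real record layout (subset.bas is authoritative, and the real subset file also carries latitude and longitude, as the sample line in the next section shows):

INFILE=cs201105.txt
OUTFILE=subset5.txt
awk '
substr($0, 64, 1) == "." {               # normals data present
    id   = substr($0,  1,  7)            # climate ID (offset assumed)
    mean = substr($0, 30,  6)            # monthly mean temp (offset assumed)
    miss = substr($0, 38,  3)            # days with missing mean (offset assumed)
    name = substr($0, 70, 30)            # station name (offset assumed)
    printf("\"%s\",%s,%s,\"%s\"\n", id, mean + 0, miss + 0, name)
}' "$INFILE" > "$OUTFILE"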
Downloading monthly data in CSV format
======================================
Unlike the 1961-1990 and 1971-2000 normals, the 1951-1980 Canadian climate normals do not appear to be on the web. But since the monthly data is available online, we can download it and do the calculations ourselves. Here is how the data was downloaded…
We search by station name. The first line in subset5.txt is…
“1012055”,9.5,17,48.829,-124.052,”LAKE COWICHAN”
The climate data advanced search page is at…
http://www.climate.weatheroffice.gc.ca/advanceSearch/searchHistoricData_e.html
Use the “Search by Station Name:” menu as shown in the screenshot:
Enter the name, or a portion thereof, as shown in the red rectangle. Note that upper/lower case doesn’t matter, and spaces are ignored. Thus “lakecow” matches “LAKE COWICHAN”. Then click on the “Search” button as shown by the red arrow in the
screenshot. Alternatively, you can press “Enter” on your keyboard. This takes you to the search results page as shown in the screenshot:
We run into a problem here… there are two stations named “LAKE COWICHAN”, which does happen on occasion. It’s not until you actually select a station that you find out if you’ve got the right one. To select monthly data, you must first select “Monthly” in the drop-down menu under “Data Interval”, and then click on the “Go” button
corresponding to the station you want. You’ll get a display similar to the screenshot:
I’ve highlighted a couple of areas. At the upper left is the climate ID in a red rectangle. This must match the climate ID at the beginning of the line in the subset file, unless you’re doing a manual over-ride (more about this later).
The red arrow at the bottom right corner points to the link for downloading the data in CSV format. I right-clicked on that link and saved the file to the csvfiles directory. My convention is to name the file after the climate ID; thus, this one would be “1012055.csv”. Note that this is merely a label for convenience. The files could be assigned any legal filenames and the BASIC programs would still work, because they read the climate ID from data inside the csv files.
Rinse/lather/repeat the above for all 300+ lines in the subset file. Now you know why you don’t want to repeat this yourself.
Now for the manual over-ride example. Look in the climate summary file cs201106.txt. Station “WINNIPEG RICHARDSON AWOS” with climate ID “5023226” has a mean monthly temperature, but it does not have normals data. Searching for “winnipeg” in the climate data advanced search page yields several Winnipeg sites. If you click “Go” on “WINNIPEG RICHARDSON AWOS” you’ll see that it’s located at 49 55’N and 97 14’W and
elevation 238.7 m. Go back to the Winnipeg search results page, select “Monthly” and click “Go” for “WINNIPEG RICHARDSON INT’L A”. You’ll notice that it’s located at 49 55’N and 97 14’W and elevation 238.7 m. They’re at EXACTLY the same location. Why the split reporting, I don’t know. Anyhow, I downloaded the CSV monthly data with filename
“5023222.csv” to the csvfiles directory. Then I opened it with a text editor, and changed the 6th line from
“Climate Identifier”,”5023222″
to
“Climate Identifier”,”5023226″
This causes the BASIC programs to treat the monthly data as belonging to the AWOS site when computing monthly normals. Thus we will get monthly temperature anomalies versus 1951-1980 for the AWOS site, even though it’s relatively new.
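A quick sanity check after a download session (and after any manual over-ride) is to compare each filename against the climate identifier on line 6 of the file. A minimal bash sketch, assuming the naming convention above and the line-6 format just shown; note that the edited Winnipeg file will be flagged, which conveniently documents the over-ride:

cd csvfiles
for f in *.csv; do
    id=$(sed -n '6p' "$f" | cut -d'"' -f4)    # pull the ID off line 6
    [ "$id" = "${f%.csv}" ] || echo "$f: internal climate ID is $id"
done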
Calculating the monthly normals
===============================
The normals batch/script file needs to be run only when the contents of the csvfiles subdirectory change. This includes individual files being added, deleted, or edited.
The program normals.bas opens a CSV file for input, and the normals file in append mode. It then calculates the normal temperature for one station, appends one line of data and exits. It is called serially by a FOR loop in the normals shell script or normals.bat batchfile, once for each file in the csvfiles directory. Since lines are always being appended to normals.txt, the script deletes the normals file before starting the loop. This starts off with a clean slate. The script then sets the name of the normals file, and the value of the normals start and end years, and then loops through all the files in the csvfiles
directory that match the spec “*.csv”. The script is invoked in unix/linux/bsd as…
./normals
and in a DOS box (including Windows) as…
normals
Because of limitations in the DOS FOR command, normals.bat has a couple of extra steps…
1) The bash FOR command sorts filenames when evaluating “*.csv”, which results in the file normals.txt being in sorted order. The DOS FOR command doesn’t do this. The workaround is to write output to a scratch file (normals.000) and sort that file to normals.txt at the end.
2) The bash FOR command accepts multiple commands in a DO/DONE block.
The DOS FOR command doesn’t do this. It has to be a “one-line-wonder”. The workaround is to make the one command a CALL to a 2nd DOS batch file, namely normals2.bat. normals2.bat has the multiple commands to execute.
Note that normals and normals.bat set STARTYR=1951 and ENDYR=1980. This is because the immediate goal of this project is to generate normals to compare against GISS, which happens to use 1951..1980 as its base period. There’s nothing preventing anybody from using 1961..1990, or any other random base period for that matter.
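For reference, the bash driver boils down to something like the sketch below. The variable names (NORMFILE, STARTYR, ENDYR, INFILE) are assumptions for illustration; the shipped normals script is the authoritative version:

#!/bin/bash
rm -f normals.txt              # clean slate, since normals.bas appends
export NORMFILE=normals.txt
export STARTYR=1951
export ENDYR=1980
for f in csvfiles/*.csv; do    # bash expands the glob in sorted order
    export INFILE="$f"
    bas normals.bas            # appends one line for this station
done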
The output format for the normals file is…
Columns 1-10: the 7-character climate ID in quotes, followed by a comma.
This is followed by 12 repetitions (one for each month) of…
SNN.N,NNN,
where “SNN.N” is the monthly mean temp, with a minus sign if needed, and “NNN” is the number of years of data for that month.
************************************************************************
** This finishes the portion that only needs to be run once for **
** initial setup. The following is run every month after downloading **
** the latest monthly climate summary from Environment Canada. **
************************************************************************
Calculating the monthly temperature anomalies
=============================================
Because the monthly data will be calculated using different filenames and months, the anomaly batch/script files accept parameters. The first parameter is the month as a number from 1 (January) to 12 (December). The second parameter is the name of the monthly climate summary file that you’ve downloaded from the Environment Canada website. Note that the program *ALWAYS WRITES TO THE SAME OUTPUT FILE NAMES*. If you want to keep anomaly output files, and not have them overwritten, rename them before the next run. I’ve included two sample
monthly climate summaries, cs201105.txt for May 2011, and cs201106.txt for June 2011. An example invocation for June 2011 data is, in DOS…
anomaly 6 cs201106.txt
and in bash
./anomaly 6 cs201106.txt
There are 2 output files. anomaly0.txt has output for every station with a monthly mean temperature in the climate summary, and a line in the normals file. anomaly1.txt only has those lines for which…
a) the month’s summary data shows zero days missing, and
b) there are 30 years of data for this month in the normals
This is intended as a filter to list only the best data for consideration. You can relax the criteria if you wish, by modifying anomaly.bas.
An example line from the anomaly outputs is…
“1021830”, 15.1, 0, 15.0, 30, 0.1,49.717,-124.900,”COMOX A”
The fields are…
Climate ID
Actual monthly mean temperature this month
Number of missing days this month
Normal monthly mean temperature
Number of years of normals data for this month
Temperature anomaly versus the normal value
Station latitude
Station longitude
Station name
This output is suitable for importing into a spreadsheet, and
especially into a GIS program for plotting.
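Given that field layout, the anomaly1.txt criteria can be re-applied (or relaxed) outside the BASIC code. A one-line awk sketch that filters anomaly0.txt on “zero missing days, 30 years of normals” (field numbers per the list above; this assumes station names contain no embedded commas):

awk -F',' '$3 + 0 == 0 && $5 + 0 == 30' anomaly0.txt > anomaly1.chk

To relax the criteria, loosen the comparisons, e.g. “$5 + 0 >= 25” to accept stations with at least 25 years of normals.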
***********************************************************************
If we are trying to accurately determine how much the Earth is heating or cooling, shouldn’t we be looking at the heat index vs Temp? Heat index gives the amount of energy the air is holding. I thought it was all about the amount of energy being retained that is causing Global Warming? If the temps are high with low RH, then focusing solely on the air temp is only part of the story.
Why are we not seeing historical records of heat index being compared to current heat indexes?
Is Temp just the low hanging fruit?
BTW, I spent 4 years in Saudi Arabia. Hot days in the high desert were much more tolerable than warm days next to the Red Sea.
Dave in Delaware says:
July 24, 2011 at 4:38 am
Yes dave, we are all aware of that. We are all aware that the best metric is OHC, all aware that temperature is ONLY an indicator of energy.
Do you think the MWP was warmer than today? Was the LIA cooler than today?
I know that ‘average’ temperature has no meaning, but what do we mean when we say that Greenland in the MWP was warmer than today? Do we mean that the air temperatures were LOWER?
Here’s the point. OHC is the best metric for looking at energy balances, without a doubt.
But, if somebody wants to look at temperature to answer questions like… how warm was it in greenland 1000 years ago, well then we have to work with the inferior metric of temperature.
in a nutshell. IF you want to know the best metric for energy balance? OHC.
IF you have to work with temperatures, if that’s all you have, if you have no other choice,
make sure you normalize.
Jr wakefield; nice work
“Any specific stations presented here individually will be chosen based on the length of data they contain. The criteria will be that a station must have at least 80% of the days from Jan 1, 1900 to Dec 31, 2009. Because so few stations fit into that criteria, there will hence be few stations that will fill the criteria.”
When I get a chance perhaps I’ll apply RomanM’s method to this data, and you won’t have to make decisions about 80% or 77% or 91% or whatever number.
Some thoughts of mine.
30 year average periods do not span warm and cold phases of various atmospheric and oceanic oscillations. At most they span one half of a complete cycle. Comparisons of monthly and yearly trends to the “mean” should use data extending to a 60 year period to calculate the mean.
Using anomalies from the mean is not an indicator of climate change. It is an indicator of weather pattern variations. The extremes are where the gravy lies in terms of climate change.
Our standard climate zones (hardiness zones) are not nearly fine enough. The USDA climate zone hardiness labels are insufficient in describing regional and local climate. Sunset Magazine climate zones are much better. For example, in NE Oregon, the climate zone designations are different between the two sources. If you want to know about climate, stick to the Sunset magazine version. When those zones change, you know you have climate change going on because they are set up by local extremes, not by regional weather pattern average anomalies.
http://www.sunset.com/garden/climate-zones/climate-zones-intro-us-map-00400000036421/
http://www.usna.usda.gov/Hardzone/ushzmap.html
R. Gates says:
“Result: There has been no conspiracy…just naturally warmer temps.”
There. Fixed it for you.
steven mosher
July 24, 2011 at 9:03 am
Why would you work with something that is irrelevant to what you’re trying to find out?
Richard M says:
July 24, 2011 at 7:26 am
I did an average of all land based GISS temps in the Arctic (64N-83N) post 1930’s because too much data was missing from them before this period. (There aren’t any land based temps above 83N)
http://img141.imageshack.us/img141/7617/arctictempstrend.png
It shows the warming period up until 2008 was literally no different from the 1930’s/1940’s. No weighting bias was used, because I wanted to show how the actual data had changed over this period.
Matt G says “No weighting bias was used, because I wanted to show how the actual data had changed over this period.”
It doesn’t make any sense to discard the corrections – they are there / publicly documented and explained – for a reason.
Typo in article: “The result is a directory called either clisumd or clisumd.”—I think one of those should be “clisumu”.
Not trying to be OT, but R.Gates mentioned BEST. Wasn’t that supposed to be released by now?
Bystander says:
July 24, 2011 at 9:50 am
“It doesn’t make any sense to discard the corrections – they are there / publicly documented and explained – for a reason.”
The data is directly from GISS; the weighting bias refers to placing stations in zonal regions from the pole and treating them equally (by distance around the pole). This can cause just one or two stations to influence 8 to 10 percent of the overall trend. It is unrealistic to expect these one or two stations in very isolated places to reflect the whole zonal region.
DMI reports openly their daily temperature values for “80 degree north,” and has data for every year going back to 1958.
Where are their thermometers? And, why do they record a STEADY and then increasingly DECLINING temperature for summer months (the only days above freezing at 80 north) for the years 1958 through 2010, when GISS/Hansen claims a 4 degree increase for the “Arctic”?
I appreciate Anthony publishing this article, but its purpose puzzles me. As someone involved with climate studies in Canada as far back as the early 1970s, including temperature reconstructions and heat island impacts, I am familiar with the history of Environment Canada (EC). I realize the author is explaining that EC maintains many more stations than are used to produce climate data. Why does the data of a nation require such elaborate explanation? He made a point of saying he was retired from EC, but why did he not produce this when he worked there?
Obviously it was just as necessary then as now. Is it a response to the article on this web site critical of the agency? http://wattsupwiththat.com/2010/08/23/government-report-canadian-climate-data-quality-disturbing/
I ask these questions because of my experience with EC. It was no coincidence that an Assistant Deputy Minister of EC chaired the pivotal meeting of the IPCC in Villach, Austria in 1985. I believe they made the entire agency (EC) and process political from that time.
The author will recall the “dog and pony” show carried out across Canada by a private agency hired by EC to offset serious criticisms of the poor service, cut backs and problems associated with EC. These centred around funding redirected from weather stations and other activities to climate change research that, in my opinion, was very singular and anti-science in its objective. They even tried to charge for data already paid for by the taxpayer. They competed with private agencies for contracts and charged them for the data while they got it for free (can you say unfair competition?).
A couple of years ago I spoke with a former EC employee who resigned in protest at the direction the agency was going. The Auditor General estimated in 2005 that $6.8 billion over 7 years was directed to climate change research. In my opinion this took them a long way from their fundamental purpose of data collection. They replaced many weather stations with AWOS units that were severely criticized for inaccuracy. When NavCanada was formed to run airports, it initially refused to take them because they were so unreliable. This triggered a Canadian Senate investigation led by Senator Pat Carney. I was told of a former employee hired to go out early each morning in Winnipeg to compare the AWOS temperature with a hand-held thermometer. I know there are many other stories.
Maurice Strong (a Canadian) set up the IPCC through the World Meteorological Organization (WMO). This meant all national weather agencies controlled who was appointed to the IPCC. It also meant EC controlled almost all climate research funding, which, unlike virtually all other science research funding, was not kept at arm’s length from political interference.
As an active participant and promoter, EC accepted the findings of the IPCC without question. That put climate research and funding under complete bureaucratic control. It also meant they were on a treadmill of proving the hypothesis, in complete contradiction to the scientific method, which requires trying to disprove it.
Incidentally, why was Eureka chosen as representative of the Arctic? Years ago (1980s) I worked with Sylvia Edlund, a botanist, and Bea Alt, a climatologist, both involved in the Polar Shelf Project and both looking at the unique climate of Eureka manifest in the botanic refugia. I have spoken with former EC employees who said they were surprised when the weather station was set up there, because people knew it was a unique microclimate.
There are a multitude of other questions that this strange article triggers and that require answers. Maybe the author didn’t speak out until retirement because of an experience I had with EC employees. A couple of years ago I spoke in Winnipeg and afterward three EC employees identified themselves and said they agreed with me but were afraid to say anything.
racookpe1978 says:
July 24, 2011 at 11:43 am
My link shown in the previous post does show a 4 degree C increase from the late 1960’s. Just that there was also a 4 degree decrease from the 1930’s/1940’s. Notice DMI is 80N+ whereas the GISS data is from 64N-83N. GISS has recently been trying to correct this by wrongly infilling 83N+ with observed data from <83N. This can’t realistically work, because Arctic temperatures where sea ice remains all year hardly change during summer. That behaves much differently from areas <83N, especially where no snow/ice remains with higher solar levels at that time of year.
What happens to GISStemp N. Hem. Anomalies when we feed DMI 80N into it?
Some really nice work here on the part of many, a very respectable effort! For anyone who didn’t grasp the Ric Werme / steven mosher exchange regarding WMO standards, what Mosher was saying (paraphrased) was: “…DONT TAKE any of the [station data] that [doesnt] meet WMO standards.”
Regarding “BEST”: According to their very transparent site, “As of July 6 2011, we have not yet run the analysis on 100% of the data.” The major disappointment was the premature release of preliminary, partial results of only land-based (thus the warmest) data by Dr. Richard Muller before Congress on 31 March, 2011. I shall no longer be reading his papers. (jk)
http://www.berkeleyearth.org/FAQ
“steven mosher says:
July 24, 2011 at 9:03 am
But, if somebody wants to look at temperature to answer questions like… how warm was it in greenland 1000 years ago, well then we have to work with the inferior metric of temperature.”
Surely the species of vegetation growing in an area at the time is a better indication of past climate than temperature. Does Greenland support more tropical species than it did 1000 years ago, or are they more arctic species? This would appear to be a more reliable indicator of what the climate is actually like, than say average temperature.
For example, say the average temperature is 15C. Is that 15C all year round, or 45C in summer and -15C in winter? Both areas have the same average temperature but will certainly have entirely different climates and entirely different species of plants and animals. Now add to this water. Is the location humid or is it arid? This again will affect the climate and determine the species of plants and animals that can live there.
This is the fallacy of modern climate science. Average temperature tells you almost nothing about the climate and its suitability for specific forms of life. To use global average temperature as the main metric of climate science, while it tells you almost nothing about the climate or its suitability for life, says a lot about the quality of science that goes into climate science.
Climate science is supposed to be measuring climate. Without knowing both the amount of water present as well as the temperature, along with the ranges of water and temperature, you cannot know the climate.
To call temperature change climate change, without also measuring the amount of water present, is pure nonsense. It is unscientific. If anything, the amount of water available has a much larger effect on plants and animals than does temperature. With liquid water, plants and animals can typically survive much greater ranges in temperature than they can if water is limited.
As well, the amount of liquid water present serves to moderate extremes of temperature. Two regions can have identical average temperatures, but the one with the most water is likely to be more moderate. Compare coastal cities in the temperate zone with cities inland. Vancouver and Osoyoos are a few hundred miles apart with similar average annual temperatures but have entirely different climates.
Forgot to add to previous post.
Therefore, during a period of warming (e.g. since the 1970’s), some areas <83N may have had snow/sea ice before then. This melts, causing local temperatures to be warmer compared to before it occurred. GISS then infills this warmer bias (which arises only where the ice/snow melted) and places it right above 83N, where there is still snow and ice. It is awfully incorrect science to presume this warm bias has occurred over areas of sea ice that are still there. The areas <83N would have been as cool as before until the ice/snow melted; the missing snow/ice is the very reason for the bias, hence it is very wrong to assume areas 83N+ show this warm bias while the ice still remains.
I don’t know if anyone else has noticed, but EC has been significantly revising the Canadian national winter temperature departures. I don’t know when the changes were made, nor do I recall any public notification of them. I compared the 2002 version with the 2011 issue and found that 47 of the 55 years between 1948 and 2002 were changed, some by as much as 0.5 C (for example, 1950 was changed from -2.6 C to -3.1 C). What kind of new data would change the temperature departures so much, so far back? There may be other changes but I have not checked them all.
Vancouver is rainforest, while Osoyoos is desert. Few if any species native to Vancouver can survive in Osoyoos, and few if any species native to Osoyoos can survive in Vancouver. Yet they have similar average temperatures and are only a few hundred miles apart. This says that average temperature is a poor measure of climate. Thus, using average temperature as a metric for global climate would appear to be unscientific.
“Pamela Gray says:
July 24, 2011 at 9:16 am
Some thoughts of mine.
30 year average periods do not span warm and cold phases of various atmospheric and oceanic oscillations. At most they span one half of a complete cycle. Comparisons of monthly and yearly trends to the “mean” should use data extending to a 60 year period to calculate the mean.”
The issue has more to do with reducing the variance in the estimate of the monthly mean.
Simply: when you calculate an anomaly you have to decide how many januaries you will average to come up with the mean january. does using 60 give you a different mean than 30?
That’s a testable hypothesis. What happens to the SD going from 30 to 60? Well, you know how to calculate that as well. So here you have a question that you can actually answer and put numbers on, rather than speculate. For grins, of course, I once selected stations that only had 100 years of data, then used the entire period to normalize or scale the series. Answer? Don’t be surprised…
the answer doesn’t change. Why? Law of large numbers. The wiggles change a bit here and there, but the earth is getting warmer. How much is known pretty well. Why? A combination of external forcings and internal variability. What role do each of these play? Much tougher question. How bad will things get? Way more difficult question.
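(A minimal sketch of the test described above, assuming a hypothetical file jan.txt with one January mean temperature per line, oldest year first:

for n in 30 60; do
    head -n "$n" jan.txt | awk -v n="$n" '
        { s += $1; ss += $1 * $1 }
        END { m = s / n             # mean of the first n januaries
              printf("n=%d  mean=%.2f  sd=%.2f\n", n, m, sqrt(ss / n - m * m)) }'
done

This prints the baseline mean and the population SD for the 30-year and 60-year cases side by side.)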
Tim Ball says:
July 24, 2011 at 12:14 pm
> Why does the data of a nation require such elaborate explanation?
There are many ways to store and present data, and they’re constantly evolving. Technical documentation is a discipline in itself. Many computer programs require more effort in writing the documentation than in writing the program. It’s the nature of the beast when you try to make complex data sets easily accessible to non-computer-savvy end-users.
> He made a point of saying he was retired from EC, but why did he not produce this when
> he worked there?
I worked in the technical end of things as a glorified end-user, passing on value-added data and analyses to consultants. I wasn’t a researcher with an advanced degree. I simply did not have the academic credentials to be taken seriously as a researcher.
I was a “data gopher” who dug up relevant data and ran preliminary analyses for those researchers, as part of my job. My data gopher experience is what enabled me to write this post. Although I wasn’t in the CS (Computer Science) job group, I did a fair bit of programming as part of the job, along the lines of “Harry the programmer”. I also assisted meteorologists in preparing weather/climate related values for building codes, etc.
“ferd berple says:
July 24, 2011 at 1:08 pm
“Vancouver is rainforest, while Osoyoos is desert. Few if any species native to Vancouver can survive in Osoyoos, and few if any species native to Osoyoos can survive in Vancouver. Yet they have similar average temperatures and are only a few hundred miles apart. This says that average temperature is a poor measure of climate. Thus, using average temperature as a metric for global climate would appear to be unscientific.”
Understand what “climate” is. Climate is about two things: long-term averages, at different locations. In some sense these are idealizations. These measures or metrics can be useful for many purposes. Was the globe generally colder in the LIA? That looks like a scientific question. Would an answer to a scientific question that reported the average temperature be unscientific?
Hardly. What does it mean to say that the average temperature NOW is higher than the average temperature THEN? It means this: pick any spot on the earth and look at the temperature NOW (say a 30-year average). When we say the past was cooler, what we mean is that for any place you pick, your best estimate for the temperature at that place, during that past time, will be lower than it is today. So the sentence has a meaning; it’s a very operational meaning. If I think it was cooler in the past I can even draw other hypotheses from that: I could predict that certain plants would have died out in certain locations. I can then test that by looking for evidence. I can predict that certain rivers might have frozen over that never froze before. I can check that. Wow, look at frost fairs in London. So there I take a general statement (“the global average was lower in the past”), I propose a test (“look for evidence of rivers freezing where they never froze before”), and gosh, I can test that. Looks like science, but note that it’s more like historical science than EXPERIMENTAL science.
@Dave in Delaware and energy content of moist atmosphere:
You will find here the live graph of the enthalpy of moist air at meteoLCD (Diekirch, Luxembourg). It is true that humidity plays an important role, but if you stay at one location, total energy content usually more or less follows air temperature, in the sense that moist enthalpy is generally higher when air temperature is high.
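For reference, a common first-order expression for moist enthalpy (per kg of dry air, with q the specific humidity) is H ≈ cp·T + Lv·q, with cp ≈ 1005 J/(kg·K) and Lv ≈ 2.5e6 J/kg; the meteoLCD definition may differ in detail. The Lv·q term is why two days at the same air temperature can hold different amounts of energy.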
Matt G says:
Matt, there are several ways to handle the area above 83N.
Our fundamental question is this: what does the TREND look like in places above 83N that have not been sampled? You can:
1. Assume that the trend is lower (physics suggests otherwise… polar amplification)
2. Assume that the trend is the same (GISS does this)
3. Assume the trend is higher the further north you go (physics suggests this, but nobody uses this assumption)
1 and 2 both underestimate what physics suggests. In the end the difference between 1 and 2 is not that great. Why? The land area above 83N is frightfully small. Area goes with sin() of lat.