DayRec: An Interface for Exploring United States Record-Maximum/Minimum Daily Temperatures
Essay by Greg Kent
Foreword: There is a new resource for obtaining high/low temperature extremes. The DOE released the DayRec tool that has requirements for long and complete station records. I think this is a nifty source of information for cutting through the BS since so many stations have short histories or have very incomplete records. I’ve done a little analysis of the data as it is presented to see the distribution of max/min records by decade over the last century by the 424 stations with the most complete records. But there is a lot more that could be done with the data by someone else who can use tools more sophisticated than pivot tables. I’m hoping that your regular readers will find this useful.
The NCDC website http://www.ncdc.noaa.gov/cdo-web/datatools/records tracks the number of “record” daily high TMax or low TMin temperatures for each of the US stations.
The NCDC has a rather loose requirement for which stations count toward a record. Their website has some problems with the map plotter at the moment, but if I recall correctly, a station can be included if it has a 30-year record and each year has coverage of at least 182 days. Therefore, any station that has recorded temperature data for half of each year since 1984 is eligible to set a new record.
Such a loose definition, however, is likely very misleading to the media, politicians, and the general public. When most people hear that a new “record high temperature” has been set, they usually have a longer period of time in mind. I suspect that most people would react differently to an announcement of a “record high temperature” if they understood that the records began when Ronald Reagan was president. Of course, it is convenient for political reasons to maximize the number of “record high temperatures” because it fits the mandatory narrative about global warming or climate change.
Skeptics have been quick to point out that many of the stations in question have relatively short records, so these so-called record temperatures are relatively meaningless in the context of the longer time spans that most people have in mind. Skeptics are also quick to point out when even the skewed NCDC records go the wrong way. For instance, the fact that 2013 had more daily record lows than daily record highs was fairly well communicated on skeptic blogs. (BTW, this trend has also continued for January and February of 2014, with more record lows than record highs.)
But even if the NCDC’s records website is of limited value, it would still be interesting and informative to understand how “extreme” recent years’ temperatures have been in the context of a longer time period. And wouldn’t it be useful if this were an “official” set of temperature records that everyone (warmists and skeptics alike) could agree on? In fact, such a tool has recently been released.
The Carbon Dioxide Information Analysis Center of the Department of Energy has released a tool that is intended to provide more meaningful analysis of extreme temperatures over time. The DayRec tool, as it is known, is described at the following link: http://cdiac.ornl.gov/climate/temp/us_recordtemps/dayrec.html
What makes this tool and its corresponding dataset useful is that it has more stringent (and realistic) criteria for determining which stations should be tracked for record temperatures. Instead of the 30 years used by NCDC, the DayRec tool requires a 100-year temperature record, from 1911-2010. This is a much more meaningful timeframe for speaking about records. The DayRec tool also has stringent criteria for the amount of data that can be missing. Only a small amount of missing data is permitted, and those “missing observations [must] be spread out relatively evenly over time (both seasonally and over the full period of record), so as to avoid time-dependent bias that could on its own give misleading impressions of changes in the frequency of record-setting temperatures over time at a given station.” Only those GHCN stations that meet these criteria are included in the DayRec analysis. As a result, only the stations with the longest, most complete, and least distorted records are considered for extreme high and low temperatures. This makes the DayRec records much more meaningful than the NCDC records website.
There are two versions of DayRec. The first version was released in 2012 and had very stringent missing data requirements (98.5% complete data). This contained 200 stations, but was not geographically well distributed. The desert southwest, for example, was completely excluded. In November 2013, a new version was released containing 224 additional stations. The aggregate 424 stations provide more thorough geographic coverage, but with slightly more missing data (97.6% complete data). This link shows the geographic distribution of the stations.
http://cdiac.ornl.gov/climate/temp/us_recordtemps/ui.html
The blue markers indicate the original 200 stations with the most complete records. The additional 224 stations, with less complete records, are shown as green markers; these help fill in the gaps. The 424 stations cover most of the lower 48 states (all except CT and DE).
The primary intended use appears to be through a GUI to select a single station and generate reports. However, data files are available with all of the records for each day for each station. The files and layouts are located here:
http://cdiac.ornl.gov/ftp/us_recordtemps/sta424/ and http://cdiac.ornl.gov/ftp/us_recordtemps/sta200
The DayRec tools that are provided on the website are intended to provide analysis of daily record temperatures for a single station. However, using the data files makes it possible to analyze all of the stations.
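As a rough illustration of the decade bucketing done with pivot tables below, here is a minimal Python sketch. The parsing of the actual DayRec station files (whose column layout is documented in the FTP directories) is assumed to have already produced one `datetime.date` per standing record; only the tally step is shown, with hypothetical sample dates.

```python
from collections import Counter
from datetime import date

def records_by_decade(record_dates):
    """Tally standing daily records into decade buckets (1911-1920, 1921-1930, ...).

    `record_dates` is an iterable of datetime.date objects, one per standing
    record, assumed to have been parsed from the DayRec station files.
    """
    counts = Counter()
    for d in record_dates:
        # Map e.g. 1936 -> "1931-1940" so buckets match the 1911-2010 span.
        start = ((d.year - 1911) // 10) * 10 + 1911
        counts[f"{start}-{start + 9}"] += 1
    return counts

# Hypothetical sample: three records from the 1930s, one from the 2000s.
sample = [date(1936, 7, 14), date(1934, 8, 2), date(1933, 6, 1), date(2006, 7, 20)]
print(records_by_decade(sample))
```

With all 424 station files concatenated, the same function produces the decade distribution discussed below.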
When the record daily high temperatures across all 424 stations are analyzed, it’s easy to determine whether it has gotten hotter in recent decades. When record temperatures are put into buckets by decade, it’s easy to see that the 1930s was by far the hottest decade across the continental United States. The number of record highs in the 1930s is double the number of record highs in the 2000s. The 2nd, 3rd, 4th, and 5th hottest decades were also in the early period, 1911-1960. The most recent decade, which has been affected the longest by global warming and is supposed to have the most “climate change”, is fairly anemic in 7th place with respect to record high temperatures. At least in the United States, the 2000s were not very extreme at all.
The same message holds for record high temperatures during the summer months. When the average person hears about global warming and record high temperatures, their mind likely goes to a sweltering July day. The good news is the high temperature records for the summer months (JJA) are even more skewed to older decades. Whereas 60% of all daily record highs occurred before 1960, 70% of all record highs for the summer months occurred before 1960. The hottest days at the hottest time of the year were much more likely to occur in the past than in the present. The 1990s and 2000s set relatively few summer-time record highs. The vast majority of high temperature records in recent decades were in the cold or cool seasons. For example, in the 1990s, more than twice as many daily record high temperatures were set in the winter months (4708) as were set in the summer months (2473). To the extent that global warming is happening, it is raising winter temperatures a few degrees, which is something most of the public would not be terribly concerned about. In fact, many folks would consider this a good thing.
Taking a more granular look at individual years gives largely the same picture. The year with the most record high temperatures was 1936. The next few hottest years were also in the 1930s. Analyzing the 15 years with the most high temperature records shows that every single one of them occurred before the era of global warming began in the 1970s. The more recent years, which are supposed to have been particularly extreme, are dwarfed by the number of records set in 1911, 1925, and the 1930s. Using a century-long scale makes recent years look neither particularly hot nor particularly extreme.
What about record low temperatures? The data shows that there has been a reduction in the number of record lows in recent decades, especially in the winter months. The last two decades, for example, had very few record low temperatures. In other words, winter nights have become a little warmer over the last two decades. Most people would likely consider this a good thing, or at least not something to be especially worried about. (Note: The number of low temperature records is shown as a negative number for orientation purposes, to differentiate it from high temperature records.)
Comparing the number of record highs and record lows across decades actually gives a good clue as to what “global warming” actually means in the continental United States. Global warming does not mean that summer days are getting warmer in comparison to the last century. Instead, it means that winter nights are getting less cold and therefore setting fewer low temperature records. In the 2000s there were half as many record cold winter night-time lows (1693) as there were record hot summer day-time highs (3151). In the context of other decades, this doesn’t mean that there were a lot of high temperature records – as discussed above, the number of high temperature records in the 2000s was rather low by historical standards. Rather, it is because the number of low temperature records was disproportionately low.
The DayRec website discusses that “changes through time in record high and low temperatures (extremes) are also an important manifestation of climate change.” It goes on to discuss the Meehl et al (2009) finding that “twice as many high temperature records are being set as low temperature records.” Analyzing the complete DayRec data set sheds light on this finding. The common interpretation of Meehl et al is that the number of record high temperatures has increased. The DayRec data (the longest-term, most complete data sets available) show that this is incorrect. The reason for Meehl’s finding is that although the number of record high temperatures is lower than in the first half of the century, the number of record low temperatures has decreased even further in recent years.
To illustrate the types of reports that are available, the DayRec website includes a sample plot of high and low records for one example station. The text says: “Decadal frequency of record-setting Tmax’s and Tmin’s for Fort Collins, Colorado. The plot reflects record temperatures set over all days of the year. Note that over 2001-2010 there were just three record-low Tmin’s set, while there were 104 record-high Tmax’s set.” The plot is shown below:
The Fort Collins graph shows a definite stair-step pattern with an increasing number of high temperature records and a decreasing number of low temperature records. This plot corresponds remarkably well with the global warming/climate change narrative, and the stair-step rapid rise in the number of high records in the 2000s is almost frightening. Two relevant facts, though, can put our minds at ease. First, Fort Collins (station ID 53005) was rated by the surface stations project and assigned a CRN rating of 4, which indicates poor siting that can distort temperature readings by up to 2 degrees. Second, the distribution of high temperature records for this station is not representative of the entire population of 424 sites. Comparing the Fort Collins graph to the graph shown earlier of all 424 stations shows how extreme the Fort Collins temperature record is. As mentioned previously, the decade of 2001-2010 ranked 7th overall in terms of number of high temperature records. Clearly, Fort Collins is an outlier. The Fort Collins station has the 5th highest number of records in the 2000s (104) out of all 424 stations. In comparison, the average (mean) number of high temperature records in the 2000s for all 424 stations is 39, and the median is 35.
The DayRec website states the public “would benefit from additional ways to get climate extremes information … and assess it.” That is certainly true. The trouble for the warmist establishment is that the data goes against the common global warming/climate change narrative. Of course, that is true only if you consider the data on extreme temperatures for all 424 stations. An analysis of the original 200 stations included in the DayRec database tells largely the same story. This is in contrast to the impression given by the Fort Collins plot that is shown on the webpage. It is intriguing that this particular station, which so nicely fits with the global warming narrative, was chosen for the webpage, even though it is an outlier. One wonders what might have motivated this choice.
*************************************
Further avenues for analysis for the DayRec stations:
1) Updating the data with records set in 2011-2013 (2012 will likely have a bunch of highs, but 2013 will have a bunch of lows)
2) Looking at the full record for these 424 stations and calculating a yearly temperature and trend vs. the “official” NCDC contiguous US temperature trend
3) Looking at the missing data and in-filling. As pointed out in a recent paper, skipping missing data implicitly assumes that the missing data is the average of the data that is present. In-filling where possible might give a more accurate result.
4) Correlating these 424 station histories with their CRN score (or Leroy ratings) and seeing what the relationships are
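On avenue (3), a minimal sketch of day-of-year in-filling: skipping missing days implicitly assumes they equal the overall average, whereas filling each gap with that calendar day's long-term mean is a less biased default. The list layout here is an assumption for illustration (one value per day, `None` for missing).

```python
def infill(series, climatology):
    """Fill gaps (None) in a daily temperature series using each day's
    long-term mean. `climatology[i]` is the long-term mean for day i."""
    return [climatology[i] if v is None else v for i, v in enumerate(series)]

# Toy example: a 5-day series with two gaps.
series = [10.0, None, 12.0, 11.5, None]
climatology = [10.5, 10.8, 11.2, 11.5, 11.9]
print(infill(series, climatology))
```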
Smoking Frog
Do you know for a fact that they are records as of now,
Yes. If you check the Fort Collins example, you get (at an eyeball count) around 450 high max records, whereas if you counted every time a record had been set, you would probably be into the thousands. (This also answers my question – yes, it does include ties.)
You can check this out with an individual database, such as this one in Alabama.
http://cdiac.ornl.gov/cgi-bin/broker?id=013816&_PROGRAM=prog.tempdoy_data_d9k_424.sas&_SERVICE=default&_DEBUG=0&tempvar=htxctx
Click on Data File, and you get the list of daily records. There is one for each day, except where there are ties.
In other words, these are all time daily records.
fwiw, the world’s all-time hottest temperature, 56.7C, 134F, was set in 1913, and remains unbroken. Second-hottest appears to be 55C, 131F, in 1931. Odd that catastrophic warming would not yet have broken either.
And the coldest, -89.2C, -128.6F, was set in 1983; the second-coldest, -82.5C, -116.5F, in 2005.
Interesting note on this page:
http://cdiac.ornl.gov/climate/temp/us_recordtemps/ui.html
Note: Currently, DayRec does not include any stations in Connecticut and Delaware.
Also, New York has 9 stations listed, but not from the areas I expected.
Interesting you chose Ft Collins, CO as an example. I moved there in January of 1972, when it had about 24,000 people; it is now 150,000. Something people may not understand back east and in other high-density population areas where towns are joined together: often most of the population is in the county, just outside town limits. So population growth is not limited to the town proper, and would cause something of a UHI effect beyond or different from a scenario of just a growing town. I now live on the western slope, in a valley with four towns of about 5,000 people each. Another 30,000 people live in the county, outside of the towns, in a 75 sq. mile area.
Numbers of hi-lo are not necessarily indicative:
1.) Early on in a series there will tend to be far more records (high or low) than later, for reasons that should be obvious.
2.) If we are on a high but non-increasing plateau, we will tend to see more warm records than cold because (on the whole), from up on that “plateau”, only a teeny nudge up is needed to get a warm record, but a big drop is needed to get a record cool.
In my opinion, this whole “record temperatures” fooferaw is about warmie PR, not science – because quantity is removed, and science is only quantity. After all, a “record” could be anywhere from a millidegree to a kilodegree above the previous “record,” so what does a “record” tell us about the heat energy content of the biosphere?
Patrick: to my knowledge, there is nothing wrong with the data, as with all data it’s the way it’s handled that can be problematical.
It can be both, of course.
I suggested 1910 because before that, temperature data becomes “iffy” due to instrument exposure issues.
Around four out of five of today’s USHCN stations (the “crown jewel” of the HCN) are “iffy” because of instrument exposure issues.
Furthermore, the data from a particular station is worse than useless without the accompanying metadata. If there is no record of TOBS and station moves, the data from the station essentially conveys no meaning.
But the problem arises that there is no decent or consistent metadata for the USHCN going back more than 40-50 years or so. We don’t know how well they were located. We don’t know the time of observation history (utterly critical). And, for that matter, we don’t know the history of station moves. And the USHCN is the best. For the rest of the world (except maybe Aus/NZ), it’s even worse.
So they are pulling those earlier trends out of their behinds, anyway.
With fewer records in either direction being set in the most recent decade, doesn’t that firmly contradict the recent press releases stating that we’re seeing more extreme weather? Every time I hear the “warm means cold” explanation, I think that it’s an extraordinary claim that requires extraordinary proof. This data seems to show that it’s just not true.
The lack of “cold records” reflects the fact that many of the stations only date back to after the war.
Therefore, they include plenty of cold records from the 1960’s and 70’s, but have none of the hot records from the 1930’s and 40’s.
The lack of more cold records could easily be due to UHI. UHI is much stronger at night (where I live) with very little seen at the daytime highs. It’s too bad something like this doesn’t exist for global data. The alarmists will simply dismiss this as regional.
steveta_uk
As has been noted before, yes, records are easier to set at the start of a series than later; however, what is important is that those records have not been broken. They are standing records, not just records that were broken at the time.
I do understand your confusion, however, as every measurement at the start of a time series is a record.
DaveE.
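The running-versus-standing distinction can be checked with a quick Monte Carlo sketch: in a stationary (iid) series the standing record is equally likely to sit in any year, so under a stable climate the standing records should spread roughly evenly across decades rather than pile up in the 1930s. This toy simulation assumes iid draws, which real temperatures are not, so it is only an illustration of the statistical point.

```python
import random

def standing_record_decade(n_years=100, trials=20000, seed=1):
    """For many simulated 100-year iid series, record which decade holds
    the standing (all-time) record; return hits per decade."""
    random.seed(seed)
    decade_hits = [0] * (n_years // 10)
    for _ in range(trials):
        series = [random.random() for _ in range(n_years)]
        record_year = series.index(max(series))  # year holding the standing record
        decade_hits[record_year // 10] += 1
    return decade_hits

hits = standing_record_decade()
print(hits)  # each decade should get roughly trials/10 = 2000 hits
```

So while running records do cluster early (they get harder to break each year), standing records carry no such bias, which is what makes the decade skew in the DayRec data informative.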
D Carroll
These are standing records, not since beaten
DaveE.
Not sure how useful the number of max records in a year/decade really is. Saying that more records were broken on a daily basis at the beginning of the record keeping than at the end doesn’t seem to say much (am I missing something here?). As noted above, you will see many more early on and fewer as the max temp gets higher. I can see that if you list by year/decade the number of existing record Tmax values of the 424 stations, so that there are 424 (excluding ties) data points, you might get some information from the numbers. Alternatively, showing how many days the Tmax was above the average for the whole period might be of interest.
I note links above to BOM data sets which have issues. The rounding which occurred in the pre-Celsius era is of great concern, and is new to me. I note also that the BOM’s own stations appear in the “good” column, i.e. no rounding, either in F or C. Some of those stations are “climate reference stations”; I have no idea what other (i.e. “volunteer” or “contract”) stations are used for calculating averages etc. Presumably, stations with evidence of rounding would be left out. As for the metadata, all stations have a filed paper history, but not all are accurate, as early record-keeping was not always efficient.
I do have an issue with Oz maxima, especially for the eastern states. It’s not well known, but maximum thermometers aren’t reset until 9am (which is actually 8am, in DST) so one could have a mild day followed by a hot one. It’s not unusual for the temperature at 8am the following day to be higher than the previous day’s max temp, by 9am it’s well above… the maximum to 9am is recorded for the previous day. So, as an example, it’s 26C max on Tuesday, by 9am Wednesday the temp is 29C… that’s what goes into the record for Tuesday, 29C, not 26C. Ridiculous, really, but as far as I know it has always been done this way so the whole record is skewed. A similar thing can happen with minima in winter, the minimum thermometer is reset at 9am, it gets colder during the day, then warms up through the night (cloud comes over, or whatever). It might get down to 9C during the day, but nothing colder than 12C overnight- but 9C is the minimum temperature for the next day despite the fact it might have occurred at 10am the previous day. Given the fact that observers around the world would rarely work until midnight (when the 24 hour maximum clock should stop) this may be an issue in other countries’ data sets as well.
Records Australia – Before the Stevenson screen, most used the Glaisher stand. Apparently BOM don’t like it as the records vary. However, from Warwick Hughes’ page –
‘A pdf of the above can be downloaded
BoM swatting me down. But note on page 709 they conclude – Over the year, the mean temperatures were about 0.2 degrees C warmer in the Glaisher stand, relative to the Stevenson screen. – So it would seem that any questionable pre-1910 Australian data could be used after all in “global warming” studies, with this minor adjustment. I could be happy with that.’
extract: http://www.warwickhughes.com/papers/ozstev.htm
‘In 1991-92 when I asked Australian Bureau of Meteorology (BoM) people about the late 19C warmth, I was told that this was due to older style open thermometer stands being used before the BoM took over weather recording for the Commonwealth in 1907 (now they say the BoM was formed in 1908), when the Stevenson Screen was introduced across Australia.
This lead me in 1991 to researching in the BoM library to see what documentary evidence there was for this and I was unable to confirm the BoM version of history. I did find some proceedings of late 19C Intercolonial Conferences held in 1879, 1881 and 1888. It was clear that the scientists running the meteorological observatories for the various Colonies (that now make up Australia), were all aware of the Stevenson Screen and there was much evidence of its use, or the use of a similar local variant.’
Same source,
So if BOM was formed in 07-08, why do the records only go back to 1910- let alone the previous record keeping?
Paul Homewood says:
March 11, 2014 at 5:52 am
Thanks.
Using records proves nothing… as time goes on, the chance of reaching a new record gets less likely, as we would need either a higher Tmax or a lower Tmin than the previous one to set a “new” record… sort of self-defeating, isn’t it? In reality, that means as the dataset’s time record grows, new records get harder and harder to reach each time a new one is set. Unless we do what they do in sport these days and start recording temperatures to thousandths of a degree?
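That intuition has a standard quantitative form: in an iid series, year n breaks the running record with probability 1/n, so the expected number of record-breaks in n years is the harmonic number H_n = 1 + 1/2 + … + 1/n. A quick sketch:

```python
def expected_running_records(n):
    """Expected number of running-record breaks in n iid years:
    the harmonic number H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

# Over a 100-year stationary series, only about 5 record-breaks are expected.
print(round(expected_running_records(100), 2))  # ≈ 5.19
```

This is why raw counts of running records say little on their own; only standing records, compared across equal-length periods, carry a climate signal.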