GOOD NEWS AND BAD NEWS
Guest post by Craig Loehle, Ph.D.
I have good news and bad news. The good news is that USA rainfall is, according to NOAA, rising at 6.5 inches per century. The bad news is that this number is wildly wrong. I also have some observations about temperature.
Data from http://www.ncdc.noaa.gov/cag/time-series/us on March 20. Graphs from http://www.cpc.ncep.noaa.gov/trendusa.gif on March 20.
First the good news. In Figure 1, we have the precipitation trend according to NOAA.
Figure 1. Precipitation trends for lower 48 US state region according to NOAA.
I am not happy that they give neither the time period for the trend value nor confidence intervals, but the problem is worse than that. Precipitation is shown rising for every month, and for the year as a whole it is rising at 0.65 inches per decade, which is 6.5 inches per century. Since the mean annual value for the region is 29.14 inches according to the data link page, that is a 26% rise over the period since 1895 (the length of the data record here). Wow! This is exciting! Good for farmers and gardeners! Except no one has ever noticed or mentioned this rise. Have you heard about it? Aren't we supposedly getting more drought? Contradictions make me itchy, so I downloaded the underlying data from http://www.ncdc.noaa.gov/cag/time-series/us on March 20 (Fig. 2).
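As a sanity check on the arithmetic, here is a short Python sketch (mine, not NOAA's; the inputs are simply the numbers quoted above) converting the bar-chart rate to a percentage rise over the record:

```python
# Check the bar-chart arithmetic quoted above (illustrative only).
per_decade = 0.65                 # bar-chart trend, inches per decade
per_century = per_decade * 10.0   # = 6.5 inches per century
mean_annual = 29.14               # lower-48 mean annual precipitation, inches
span_years = 2012 - 1895          # elapsed span of the 1895-2012 record

total_rise = per_century * span_years / 100.0  # implied rise over the record
pct_rise = 100.0 * total_rise / mean_annual    # as a percent of the mean
print(f"{per_century} in/century -> {total_rise:.1f} in, a {pct_rise:.0f}% rise")
```

That is where the 26% figure comes from.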
Figure 2. Annual total mean precipitation (anomalies vs 1901-2000 mean) for lower 48 USA. This graph matches the one from the website.
The linear trend for the full period is 1.6 inches per century (c.i. 0.5 to 2.8 in), or 5.5% per century, not 6.5 inches. If you pick the period 1976 to 2012, as people like to do, you get an even lower trend of 0.58 inches per century. Picking 1976 to 2005, as some have done and as NOAA appears to have done according to the NOAA email below, is even worse: 0.35 inches per century, roughly 18 times lower than the bar chart. Also note that 2012 (remember? We were all going to die?) was not particularly dry compared to many previous years, at least in terms of total precipitation.
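For readers who want to replicate the trend-with-interval calculation, a minimal Python sketch follows. My own analyses were done in Mathematica; this is the equivalent ordinary-least-squares computation, run here on a synthetic stand-in series with a built-in 1.6 in/century trend, since I cannot reproduce the NOAA data in this post:

```python
import numpy as np

def trend_per_century(years, values, z=1.96):
    """OLS slope with an approximate 95% confidence interval, per century.

    z = 1.96 is the normal approximation to the t critical value,
    adequate for a century-scale annual series (n > 100).
    """
    years = np.asarray(years, dtype=float)
    values = np.asarray(values, dtype=float)
    n = len(years)
    xc = years - years.mean()
    slope = (xc @ (values - values.mean())) / (xc @ xc)
    resid = values - values.mean() - slope * xc
    se = np.sqrt((resid @ resid) / (n - 2) / (xc @ xc))
    return 100 * slope, 100 * (slope - z * se), 100 * (slope + z * se)

# Stand-in series: 1.6 in/century trend plus noise (illustrative only;
# download the real series from the NCDC link above).
rng = np.random.default_rng(0)
yrs = np.arange(1895, 2013)
anoms = 0.016 * (yrs - yrs.mean()) + rng.normal(0.0, 2.0, yrs.size)
slope, lo, hi = trend_per_century(yrs, anoms)
print(f"trend: {slope:.2f} in/century (c.i. {lo:.2f} to {hi:.2f})")
```

Swapping in the downloaded NOAA anomalies for the synthetic series gives the numbers reported above.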
In order to check my work, I emailed NOAA to ask about the discrepancy. I quickly got the following reply:
Hi Dr. Loehle,
There are several major factors which lead to the differences. The most important are:
- According to the notes, the Climate Prediction Center bar graph depicts the trend from 1976-2005, while the data from NCDC (“Climate at a Glance”) is for the entire period of record (1895-2012). I do not know when the bar graph was created or if it was updated. The enveloping web pages were created on Jan 5, 2005.
- NCDC’s underlying dataset is an area-weighted aggregate of underlying climate division data derived from stations. I believe (but do not know) that CPC uses a “reanalysis” dataset as their underlying data. I do not know whether their national value is constructed from climate division data or is a gridded representation.
- NCDC uses a simple linear regression to determine trend, while CPC uses a “best fit” approach. I do not know the specifics of the “best fit” approach but the CPC notes explicitly note it is different than the linear trend.
Clearly the NOAA staff member who replied (name withheld to protect the innocent) is trying his best to answer and is not responsible for creating either the 2005 maps and bar charts or the methods used, so please do not pick on him. Following the links back, including the Livezey and Smith 1999 reference (available free via scholar.google.com), I do not see how even a more sophisticated "fitting" algorithm could get a trend of 6.5 inches/century out of this data, unless it simply took the very low 1976 value (-3.52 in) and the very high 2004 value (3.74 in) and let the fancy trend be governed by them, perhaps via some sort of polynomial or spline fitting, which in my opinion is unjustified for a period as short as 1976 to 2005. In any case, we can consider this a prediction as of 2005 which has turned out to be false: extrapolating through 2012, the 1976 to 2012 trend is 0.58 inches/century, not 6.5, so the forecast tested 7 years later is off by a factor of more than ten. If the difference is due to different data being used, then either the data used for the trends is wrong, since the two are so different, or the trend data is right, in which case the underlying data needs to be made public and the public needs to hear the good news of rapidly increasing precipitation.
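To illustrate the conjecture about endpoint-dominated fitting, consider how a "trend" drawn through the first and last points differs from ordinary least squares on the same series. The sketch below is purely illustrative, not NOAA's method or data: a flat anomaly record whose endpoints happen to mimic the quoted extremes produces a huge endpoint slope while OLS stays modest.

```python
import numpy as np

def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    xc = x - x.mean()
    return (xc @ (y - y.mean())) / (xc @ xc)

def endpoint_slope(x, y):
    """Slope implied by the first and last points alone."""
    return (y[-1] - y[0]) / (x[-1] - x[0])

# Constructed example: an otherwise flat 1976-2005 anomaly series whose
# endpoints mimic the quoted extremes (-3.52 in, +3.74 in).
years = np.arange(1976, 2006, dtype=float)
anoms = np.zeros_like(years)
anoms[0], anoms[-1] = -3.52, 3.74

print(f"OLS trend:      {100 * ols_slope(years, anoms):.1f} in/century")
print(f"Endpoint trend: {100 * endpoint_slope(years, anoms):.1f} in/century")
```

A fitting scheme whose result is governed by two extreme endpoints can thus report a rate several times the OLS rate on identical data, which is the kind of divergence at issue here.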
Next, I downloaded the maps of regional trends from www.cpc.ncep.noaa.gov/anltrend.gif.
Figure 3. National map of annual mean precipitation trends by region. Trends (apparently) from 1976 to 2005.
This figure looks consistent with the 6.5 inches per century rate from the bar chart, but not with the underlying data, with most of the country medium green or darker. So this map is wrong too. Notice the blue regions, which are asserted to be gaining rainfall at a rate of 15 inches (or more) per century. Really? Do they have any idea how much 15 inches of rain is? It is equivalent to a direct hit from a hurricane landfall. Even 6.5 inches would most assuredly be noticed in the dry western states. We can also ask why they have not updated this map since 2005 if it is truly based on data only through 2005 (and is not merely a failure to update the legends); that was more than 7 years ago. If the regional trends for 1976 through 2005 are correctly reflected in this map (which I can't test but doubt), this says that the regional data is not adequate for making a forecast. Note that this map also does not show the recent drought in the Southwest, so it clearly is not based on very recent (say, the past 5 years of) data.
For comparison, I found the EPA precipitation map (their fig 3 from http://www.epa.gov/climatechange/science/indicators/weather-climate/precipitation.html and supposedly based on the NOAA data).
This is a 110-year change map (note: gray is zero change), and it gives a 5.9% increase per 100 years for the country as a whole (including Alaska and Hawaii), which is very close to the 5.5% I got above from the underlying NOAA data for the lower 48. This map is more sensible and has fewer extreme values. I am not saying the details are right, of course.
For completeness I also looked at the temperature data from the same sites.
Figure 4. Annual temperature trends (Deg F).
The annual trend is 0.4 F per decade or 4 degrees per century. There is no statement about the time period used and no confidence intervals on the results. This is disconcerting. However, if we plot the data from their site, we get the following (Fig. 5):
Figure 5. Temperatures (mean annual, deg F) since 1895 for lower 48 states. Matches graph from the NOAA website.
The linear trend (ignoring autocorrelation) is 1.3 deg F/century (c.i. 0.87 to 1.73), which closely matches the global rise over this period. The very high 2012 value seems anomalous, and the post-1998 trend through 2011 would have been negative without it. The value of 4 deg F per century in the bar chart clearly depends on cherry-picking recent decades. If we use 1976 to 2012, the slope is 5.6 (c.i. 3.1 to 8.1) deg F/century. If we use 1976 to 2005, to match the figure legends for the maps, we get 6.4 (c.i. 3.2 to 9.6) deg F/century. So I fail to see a way to get exactly a 4 deg F warming trend from their own data. Perhaps the maps and bar charts have not been updated since the new versions of GHCN and so on recently came out, while the data download has been. The 4 degree (or 6 degree) trend is so far out of line with the 118-year trend of only 1.3 deg F/century, a record that includes a comparable warming period in the early part of the century and intervening cooling periods, that only the assumption that the system has radically changed could allow this number to be accepted at face value. What about the map?
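The sensitivity of a trend to the chosen start year is easy to demonstrate. The sketch below uses a constructed temperature-like series (not the NOAA record) with a modest long-term slope and a steeper recent segment; picking only the recent window inflates the per-century rate several-fold, which is the cherry-picking effect just described.

```python
import numpy as np

def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    xc = x - x.mean()
    return (xc @ (y - y.mean())) / (xc @ xc)

# Constructed series: 0.005 F/yr through 1975, then 0.05 F/yr afterward.
years = np.arange(1895, 2013, dtype=float)
temps = np.where(years <= 1975,
                 0.005 * (years - 1895),
                 0.005 * (1975 - 1895) + 0.05 * (years - 1975))

full = 100 * ols_slope(years, temps)                           # 1895-2012
mask = years >= 1976
recent = 100 * ols_slope(years[mask], temps[mask])             # 1976-2012
print(f"1895-2012 trend: {full:.1f} F/century")
print(f"1976-2012 trend: {recent:.1f} F/century")
```

Neither window is "the" trend; the point is simply that quoting the recent window alone, without stating the period, presents a rate several times the full-record rate.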
Figure 6. National map of temperature trends (deg F).
This map is broadly consistent with the high rates of rise from the recent data. If the legend is correct, we might again wonder why it has not been updated in more than 7 years. We can also note the rate of rise in the Southwest of over 10 degrees per century. Surely those regions would be uninhabitable by now, since they were already hot. I wonder how the sparseness of thermometers in the West, and their sensitivity to placement, might influence these results. The neutral trend, or even cooling, in the South is notable and is more prominent in other maps I have seen but cannot locate right now.
The data, trends, and maps of NOAA are used without question by EPA, by other government agencies, and in summary reports on climate change, as well as in countless research studies and student reports. A simple script could update these maps far more often than once since 2005. This degree of error is not exactly comforting. The precipitation trend plot and map are simply impossible to believe and do not match the underlying data provided by NOAA, so something is very wrong there. These maps and trend estimates are crucial to discussions of future impacts on agriculture, forestry, and ecosystems, and they contradict claims of impending droughts (either with or without the errors). The extreme values for precipitation and warming trends in certain regions surely suggest that a reexamination of the data or methods is warranted. Bar charts of trends on a national database site should point to specific documentation of methods and data, and should include confidence limits. Even the general public knows what a confidence interval is, since they see them on opinion polls and voting projections ad nauseam.
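On the automation point: a tiny script along the following lines could recompute and report the trend whenever a fresh data file appears, so stale 2005-era graphics need never persist. This is a sketch only; the two-column CSV layout and the column names are my assumptions, not NOAA's actual file format.

```python
import csv
import io

def trend_report(csv_text, label="precipitation"):
    """Parse 'year,value' rows and report the OLS trend per century."""
    rows = [(int(r["year"]), float(r["value"]))
            for r in csv.DictReader(io.StringIO(csv_text))]
    xs = [y for y, _ in rows]
    ys = [v for _, v in rows]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in rows)
             / sum((x - xbar) ** 2 for x in xs))
    return f"{label} trend: {100 * slope:+.2f} per century ({xs[0]}-{xs[-1]})"

# Example with a made-up two-column file:
sample = "year,value\n2000,0.0\n2001,0.1\n2002,0.2\n2003,0.3\n"
print(trend_report(sample))
```

Hook a download step and a plotting step onto either end and the maps could be regenerated nightly.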
Note: all my analyses and data are turnkey in Mathematica notebooks and are available to anyone requesting them: craigloehl at ncasi dot org.