With the close of 2010, NOAA’s National Climatic Data Center (NCDC) has commenced calculation of the 1981–2010 Normals. Climate Normals are the latest three-decade averages of climatological variables, including temperature and precipitation. This new product will be replacing the current 1971–2000 Normals product. NCDC is still acquiring data for the year 2010. We expect the final input data will be ingested and processed by the end of April 2011. NCDC will be releasing the new Normals in two phases. The most widely used Normals will be released by July 2011, and all other Normals will be made available in a supplemental release by the end of 2011.
FAQs
1. What are Normals?
In the strictest sense, a “normal” of a particular variable (e.g., temperature) is defined as the 30-year average. For example, the minimum temperature normal in January for a station in Chicago, Illinois, would be computed by taking the average of the 30 January values of monthly-averaged minimum temperatures from 1981 to 2010. Each of the 30 monthly values was in turn derived from averaging the daily observations of minimum temperature for the station. In practice, however, much more goes into NCDC’s Normals product than simple 30-year averages. Procedures are put in place to deal with missing and suspect data values. In addition, Normals include quantities other than averages such as degree days, probabilities, standard deviations, etc. Normals are a large suite of data products that provide users with many tools to understand typical climate conditions for thousands of locations across the United States.
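The averaging described above can be sketched in a few lines. The station values and the missing-month tolerance below are hypothetical stand-ins, not NCDC’s actual quality-control procedure:

```python
# Sketch: a January minimum-temperature normal as the average of 30
# monthly mean values (1981-2010). The data and the missing-month rule
# are illustrative only, not NCDC's actual procedure.

def monthly_normal(monthly_means, max_missing=3):
    """Average the available monthly values; return None if too many
    months are missing to define a meaningful normal."""
    present = [v for v in monthly_means if v is not None]
    if len(monthly_means) - len(present) > max_missing:
        return None
    return sum(present) / len(present)

# 30 hypothetical January mean minimum temperatures (deg F), one missing
jan_tmin = [18.2, 20.1, 15.7, None, 19.4] + [17.5] * 25
print(round(monthly_normal(jan_tmin), 2))  # 17.62
```

As the FAQ notes, a real normal involves much more than this simple mean: homogeneity adjustments, estimation of missing values, and derived quantities all enter the actual product.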
2. When will the 1981–2010 Normals be available?
The new Normals will be made available in two releases. Core Normals will be made available by July 2011, and supplemental Normals will be available by January 2012. Initial access to both releases will be via file transfer protocol (FTP), with links to the FTP access on NCDC’s Web site. We expect to provide more advanced (and user-friendly) Web services and selection capabilities to the new Normals from NCDC’s Web site by November 2011 for the core Normals and April 2012 for the supplemental Normals.
3. What are considered “core” Normals?
The core 1981–2010 Normals are the most widely used Normals as identified by NCDC in close consultation with the National Weather Service (NWS) and a wide array of climate data users. Specifically, core Normals refer to the daily and monthly station-based Normals of temperature, precipitation, snowfall, snow depth, and heating and cooling degree days. Generally, this coincides with the key products (CLIM81 and CLIM84) produced for each observation station for the 1971–2000 Normals (except for snowfall and snow depth Normals).
4. What are “supplemental” Normals?
Supplemental Normals are a catchall category for all Normals products that will not be released in the core Normals release. An example is our population-weighted degree day normals product, which cannot be computed until the U.S. Census Bureau releases its final population figures.
5. Why aren’t the Normals released sooner?
The key factor determining when Normals will be made available is the acquisition of data from 2010 (especially December 2010). Currently, some station time series are “keyed” manually and/or are not fully incorporated into daily/monthly datasets for three to four months after being observed. NOAA waits until all relevant data have been incorporated before issuing the new set of Normals. However, the timelines for making the 1981–2010 Normals available described above represent a substantial reduction in development time compared to previous installments of Normals.
6. Where can I find more information on NOAA’s Normals?
Information on the current version of NOAA’s Normals, as well as the history of the Normals, can be found here:
http://www.ncdc.noaa.gov/oa/climate/normals/usnormals.html
The most current information on the development of 1981–2010 Normals is an extended abstract by Arguez et al. (2011) and can be accessed here:
ftp://ftp.ncdc.noaa.gov/pub/data/aarguez/Normals/1981-2010/Arguez-Extended-Normals-AMS2011.pdf
7. Why does NOAA produce Normals?
NOAA’s computation of climate Normals is in accordance with the recommendation of the World Meteorological Organization (WMO), of which the United States is a member. While the WMO mandates each member nation to compute 30-year averages of meteorological quantities at least every 30 years (1931–1960, 1961–1990, 1991–2020, etc.), the WMO recommends a decadal update, in part to incorporate newer weather stations. NOAA’s 1971–2000 Normals were calculated in accordance with this recommendation, and the 1981–2010 Normals will be as well. Further, NOAA’s NCDC has a responsibility to fulfill the mandate of Congress “… to establish and record the climatic conditions of the United States.” This responsibility stems from a provision of the Organic Act of October 1, 1890, which established the Weather Bureau as a civilian agency (15 U.S.C. 311).
8. What are Normals used for?
Meteorologists and climatologists regularly use Normals for placing recent climate conditions into a historical context. NOAA’s Normals are commonly seen on local weather news segments for comparisons with the day’s weather conditions. In addition to weather and climate comparisons, Normals are utilized in seemingly countless applications across a variety of sectors. These include: regulation of power companies, energy load forecasting, crop selection and planting times, construction planning, building design, and many others.
9. What changes are being made in the computation of the 1981–2010 Normals versus previous versions?
Several changes and additions are being incorporated into the 1981–2010 Normals. Monthly temperature and precipitation Normals will be based on underlying data values that have undergone additional quality control. Monthly temperatures have also undergone enhanced bias corrections to account for the effects of station moves, changes in instrumentation, etc. These enhancements are described in more detail in the following peer-reviewed papers:
ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2/monthly/menne-etal2009.pdf and
ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2/monthly/menne-williams2009.pdf.
Unlike the 1971–2000 Normals, daily data will be used extensively in the computation of daily temperature and precipitation Normals as well as heating and cooling degree day Normals, providing greater precision of intraseasonal features. In previous installments, daily precipitation Normals were computed as a spline fit through the monthly values. For 1981–2010, this metric will be replaced with a suite of metrics, including daily probabilities of precipitation as well as month-to-date and year-to-date precipitation Normals. New for 1981–2010 will be climate Normals derived from hourly data values as well as from radiosonde observations. More details can be found in Arguez et al. (2011) (see question 6 for access).
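One of the new daily precipitation metrics mentioned above, the daily probability of precipitation, can be estimated very simply. The 0.01-inch threshold and the sample data here are assumptions for illustration, not NCDC’s published method:

```python
# Estimate the probability of measurable precipitation on a given
# calendar day as the fraction of years (1981-2010) with at least
# 0.01 in on that day. Threshold and data are illustrative.

MEASURABLE = 0.01  # inches; assumed threshold for "measurable"

def daily_precip_probability(amounts):
    """amounts: one precipitation total per year for the same calendar day."""
    wet_years = sum(1 for a in amounts if a >= MEASURABLE)
    return wet_years / len(amounts)

# 30 hypothetical July 15 totals: 9 wet years out of 30
july15 = [0.0] * 21 + [0.25, 0.02, 1.10, 0.05, 0.40, 0.01, 0.75, 0.30, 0.12]
print(daily_precip_probability(july15))  # 0.3
```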
10. What qualifies or disqualifies a station to be included in Normals products?
Normals are computed for as many NWS stations as reasonably possible. Some stations do not have sufficient data over the 1981–2010 period to be included in Normals, and this is the primary reason a station may not be included. Normals are computed for stations that are part of the NWS’s Cooperative Observer Program (COOP) Network. Some additional stations are included that have a Weather Bureau – Army – Navy (WBAN) station identification number. Normals are only computed for stations in the United States, including Alaska, Hawaii, and U.S. territories.
11. How many stations will be included in the Normals?
For 1981–2010, we plan to compute precipitation normals for almost 8,000 precipitation stations; a fraction of these will have snowfall and snow depth normals as well. Temperature normals (including derived products such as degree days) will be computed for about 6,000 stations.
12. How do the 1981–2010 Normals compare with the 1971–2000 Normals?
It is common and expected for the Normals to change with each decadal installment. This is due to random processes such as sampling variability as well as systematic processes such as station moves and changes in methodology. Further, climate change can also lead to coordinated shifts in normal values. Once the 1981–2010 Normals are made available, relevant comparisons between the new version and previous versions will be highlighted. This will include direct comparisons between the 1971–2000 and the 1981–2010 Normals, as well as so-called “apples-to-apples” comparisons that account for changes in methodology between the two installments. However, observational evidence shows that the 2000–2009 timeframe is the warmest decade on record. See the State of the Climate in 2009, a special supplement to the July 2010 Bulletin of the American Meteorological Society, which can be accessed here: http://www.ncdc.noaa.gov/bams-state-of-the-climate/2009.php. In light of the observed warming, we expect that the new Normals will generally be warmer on average for many stations but not uniformly warmer for all stations and all seasons. In fact, some station Normals in certain seasons will be cooler this time. However, we expect the number of warmer normal values to significantly exceed the number of cooler normal values. In the case of precipitation, preliminary results suggest that the differences between the 1971–2000 and 1981–2010 periods, while spatially coherent within certain regions, are highly variable from region to region and season to season.
13. What do climate Normals tell us about global warming or climate change?
Climate Normals were never designed to be metrics of climate change. In fact, when the widespread practice of computing climate Normals commenced in the 1930s, the generally-accepted notion of the climate was that underlying long-term averages of climate time series were constant. Changes from one installment of climate Normals to the next do, nonetheless, provide a crude metric of climate change impacts. However, care must be taken when interpreting changes between one Normals period and another. For instance, a +0.2°F change may not be statistically significant. Further, changes from the 1971–2000 Normals to the 1981–2010 Normals may be due to station moves, changes in methodology, urban heating, etc. that are not reflective of real changes in the underlying climate signal.
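The significance caveat can be made concrete with a rough two-sample t statistic. The 2.0 °F interannual standard deviation below is an assumed, typical value, and the calculation treats the two periods as independent even though successive normals share 20 years, so this is only an order-of-magnitude sketch:

```python
import math

# Rough check: is a +0.2 deg F shift between two 30-year means large
# relative to sampling noise? (Assumed std = 2.0 deg F; periods treated
# as independent, which overlapping normals are not.)

def t_stat(diff, std, n):
    """Two-sample t statistic for a difference of two n-year means,
    assuming equal standard deviations in both periods."""
    se = math.sqrt(2 * std ** 2 / n)  # standard error of the difference
    return diff / se

t = t_stat(diff=0.2, std=2.0, n=30)
print(round(t, 2))  # ~0.39, far below the ~2 needed for significance
```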
14. How can I obtain historic Normals from previous Normal periods?
To obtain 1961–1990 climate Normals or earlier versions, please contact NCDC’s User Engagement & Services Branch.
15. What are Heating and Cooling Degree Days? What are Growing Degree Days?
Heating and cooling degree days are metrics of energy demand associated with the variation of mean temperature across space and time. Growing degree days are metrics of agricultural output, also as a function of mean temperature. The computation of degree days involves certain threshold temperatures, e.g., 65°F for heating and cooling degree days. These thresholds are referred to as base temperatures.
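The threshold logic is simple enough to show directly. The 65 °F base matches the standard described above; the 50 °F growing-degree-day base is one commonly used crop value, assumed here for illustration:

```python
# Degree days from a day's max/min temperature. Daily mean is taken as
# (max + min) / 2, the conventional approximation.

def degree_days(tmax, tmin, base=65.0):
    """Return (heating, cooling) degree days for one day (deg F)."""
    mean = (tmax + tmin) / 2.0
    return max(0.0, base - mean), max(0.0, mean - base)

def growing_degree_days(tmax, tmin, base=50.0):
    """Growing degree days; the 50 deg F base is an assumed crop value."""
    mean = (tmax + tmin) / 2.0
    return max(0.0, mean - base)

print(degree_days(40, 20))  # cold day, mean 30 deg F: (35.0, 0.0)
print(degree_days(95, 75))  # hot day, mean 85 deg F: (0.0, 20.0)
```

Seasonal or annual degree-day normals are then simply sums of these daily values over the period of interest.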
16. How can I obtain Heating and Cooling Degree Day Normals set to different base temperatures? And for Growing Degree Units?
While NCDC utilizes 65°F as the base temperature for the standard calculation of heating and cooling degree days, NCDC’s climate normal products include alternative computations of heating and cooling degree days for various base temperatures. In addition, growing degree days are computed for various crop-specific base temperatures. Please contact NCDC’s User Engagement & Services Branch for more information.
17. How can I obtain hourly, daily, and monthly Normals for additional weather elements such as dew point, sea level pressure, and wind?
The vast majority of weather stations utilized in Normals only routinely report air temperature and precipitation. A smaller set of stations have fairly complete records of additional variables such as dew point temperature, sea level pressure, and wind speed and direction. For about 200–300 of these stations, we are able to provide hourly Normals of temperature, dew point temperature, sea level pressure, and wind.
18. How are the new hourly Normals computed, and how are hourly data used to calculate daily and monthly Normals?
The computation of hourly climate normals is still in development, so a final description of the methodology is not available at this time. This information will be finalized and available when the hourly normals are released.
19. How does the transition to ASOS affect the computation of Normals?
Automated Surface Observing System (ASOS) stations were implemented in the mid-1990s, largely replacing human observers. As a result, there are inhomogeneities in the 1981–2010 records due to changes in observing practices. These inhomogeneities will be addressed to some degree. More information on this will be made available when the core Normals are released.
20. How do the Normals compare to Alternative Normals and Dynamic Normals?
In response to observed climate change, NOAA’s NCDC has been investigating a suite of experimental products that attempt to provide a better estimate of “normal” than the traditional 30-year average Normals of temperature and precipitation. This project is known as Alternative Normals. This project is parallel to the computation of NOAA’s official 1981–2010 Normals and is ongoing. There are no plans to discontinue the computation of official Normals every ten years in response to results obtained from the Alternative Normals project. For more information on Alternative Normals, please contact NCDC’s Anthony Arguez. Dynamic Normals refers to a tool available on NCDC’s Web site that allows users to create their own Normals for a particular station by selecting customized start and end years for the averages. This tool has not been updated since 2001 and there are no plans to update this tool in the foreseeable future. For more information on Dynamic Normals, please contact NCDC’s User Engagement & Services Branch.
21. NOAA’s Climate Prediction Center has already changed its Normals to the 1981–2010 base period. Why are those Normals not available?
Many organizations, including NOAA’s Climate Prediction Center (CPC), develop their own averages and change base periods for internal use. However, NCDC’s climate Normals are the official United States Normals as recognized by the World Meteorological Organization and the main Normals made available for a variety of variables. Below is a brief summary of changes to the CPC products due to the change in climate base period from 1971–2000 to 1981–2010:
Climate Monitoring:
• In January 2011, the CPC completed development of new climate normals based on the 1981–2010 period. This effort was done for all of the Climate Data Assimilation System (CDAS) and Global Ocean Data Assimilation System (GODAS) data products that are used for real-time monitoring of the global climate system.
• This new climate base period was used to prepare numerous operational climate monitoring products, including the Climate Diagnostics Bulletin (CDB) and ocean monitoring products in February 2011. For example, the CDB and ocean products released in February 2011 that describe conditions during January 2011 use climate anomalies based on the new climate base period.
• A notification of this change to the CPC normals was placed on the CPC website prior to the change in January 2011.
Climate Prediction:
• CPC normals for stations and climate divisions, which are used in CPC’s operational forecasts, will be officially updated in mid-May.
• CPC normals for heating and cooling degree days will be updated in mid-June.
22. How can I reach NCDC’s User Engagement & Services Branch? Who can I contact for more information?
For general questions about Normals or help accessing the 1971–2000 product, please contact NCDC’s User Engagement & Services Branch at 828-271-4800, option 2. For questions regarding the development of the 1981–2010 Normals, please contact Anthony Arguez (Anthony.Arguez@noaa.gov).
==============================================
This document is also available as a PDF file here: NOAANormalsFAQs

Further, changes from the 1971–2000 Normals to the 1981–2010 Normals may be due to station moves, changes in methodology, urban heating, etc. that are not reflective of real changes in the underlying climate signal.
Whoa, they actually mention that? Quelle surprise!
First thing Monday morning I see
“NOAA” and “normal” in the same sentence!
I’m going to need another cup of coffee before I read the second sentence.
🙂
Seriously, I would assume it is normal to adjust to a new “normal” each decade.
From the article:
However, care must be taken when interpreting changes between one Normals period and the other. For instance, a +0.2°F change may not be statistically significant.
But, but… if that happens every decade, in 100 years it would be +2.0°F. Wouldn’t that be significant? Presumably, a -0.2°F change wouldn’t be statistically significant either.
Further, changes from the 1971–2000 Normals to the 1981–2010 Normals may be due to station moves, changes in methodology, urban heating, etc. that are not reflective of real changes in the underlying climate signal.
The new Normal may not be reflective of the real climate signal? Was the old Normal? It will, however, be reflective of the different way NOAA now determines it. Um, OK.
I guess I’ll just sit back like I normally do and wait to see the new, improved, updated, very much fooled around with, Normal.
🙂
…And SMHI here in Sweden {aka Ultima Thule…} are still using 1961-1990…for some very mysterious reasons…
For some reason, we are supposed to be upset about some arbitrary 30 year “normal”.
Their constant obsession with making wrong predictions, in that short 30 year time frame….
…is more upsetting
…since the 1930s when we thought climate was stable in the long term. Why stick to such a slavish metric? Now that we know the climate varies in cycles over 50–60 years, it would seem more intelligent to select a range that starts and ends on appropriate points on the cycle. This is how we got these burning-in-hell projections from the last “normals”.
The farce continues…
With the US climate under the influence of PDO variations, I would like to see the normals determined over a longer time period, at least 60 years.
So we’ve got normal Normals, alternative Normals, and Dynamic Normals. Clear as mud what normal is then, isn’t it?
Here in the western MD rain-shadow (just east of the Allegheny Front), and despite all the AGW angst, the avg annual precipitation has risen a significant 15-20% from 1920-1990 values — from ~35″ to 40″ (my avg for the last 7 yrs measured here is almost 42″ (1067 mm)). But this is just getting back to ~1900 values (during a cold period).
Why 30? Why year?
What is so special about that?
Any average is meaningless without a measure of variance. I have yet to see anyone refer to climate “Normals” and include a measure of variance – even a simple standard deviation or standard error of the mean would enable people to have an idea whether a particular measurement was notably high or low.
By using the term “normal” to mean an arithmetic mean, we give the impression that this actually has some significance in predicting what the temperature will be on a particular day. Thus, we end up with far too many commentators (often the amateur meteorologists on TV) saying today’s temperature is above what it “should” be – based on nothing more than the simplest (and most misleading) of statistical treatments!
Sorry for the rant, but this makes my blood boil. I will accept that I can be a rather pedantic Englishman when it comes to language, but this isn’t poor English – it is a complete misrepresentation of what the “normal” stands for and is at the bottom of the misunderstanding of science that passes for public discussion.
OK, I will now go and have a cup of tea and a lie down in a darkened room!
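The commenter’s point about variance can be illustrated with a toy example (the temperatures are invented): report a spread alongside the mean, and a single observation can actually be judged against it.

```python
import statistics

# A "normal" plus a spread says whether a day is unusual; the mean alone
# does not. Ten hypothetical January daily highs (deg F):
jan_highs = [30, 34, 28, 41, 33, 25, 38, 31, 29, 36]

mean = statistics.mean(jan_highs)  # 32.5
sd = statistics.stdev(jan_highs)   # sample standard deviation

def z_score(value):
    """How many standard deviations a value sits from the mean."""
    return (value - mean) / sd

# 45 deg F is "above normal" -- and notably so, at about 2.6 sd above the mean
print(round(z_score(45), 2))
```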
WOW, they just keep digging, looking for the discredited global warming hoax that they can somehow sell to ignorant people.
You should have seen the local SWFL weather trying to hide the cold front that went through yesterday, hit Cuba and then the Bahamas.
Does anyone know why heating and cooling degree days are population weighted? What does that matter to the basic calculation?
Like all things connected with man-made global warming, even the idea that there is a climate ‘normal’ is nonsensical. Climate is driven by deterministic chaos and using averages to try to understand what is happening has no meaning.
Only by understanding how the individual climate processes combine to produce the non-linear quasi-cyclic behaviour we observe will progress be made. The current statistical methods used by so called climate scientists will never produce meaningful results about future outcomes and are a complete waste of money.
Is the ice still “screaming” I wonder? I only ask because cryosphere has been clearly showing central melt and refreeze zones and NOAA ESRL temps are showing low temps.
Will the normals be lowered to show a current warming where no such warming exists? They certainly tried to lower the 30s to show a rising trend post war. I guess if the data does not show warming the data must be wrong and it must be adjusted (tortured) until it fesses up.
30-year “normals” are nonsensical anyway considering that the PDO greatly influences US climate and it has a 30 year cycle. So we end up with 30 years above “normal” and then 30 years below “normal” over and over.
They should switch to a 60-year normal to capture most of a warm and cold cycle.
Tomorrow our third winter storm in May will hit LA.
May rain has always been a rare fluke here, either an early monsoon or a late Pineapple Express storm from the southwest.
This year, however, ‘climate change’ has brought unprecedented winter storms from Alaska, the chill of which will soon become the ‘Norm’ in the looming Global Colding.
All they have to do is wait forty years and their precious Global Warming will return.
Has anyone seen the latest weekly CO2 level?
ftp://ftp.cmdl.noaa.gov/ccg/co2/trends/co2_weekly_mlo.txt
2011 5 8 2011.3493 392.94 7 392.51 373.88 109.68
Warning: last value appears low, could be a measurement problem
It’s rather low, to the extent that NOAA have flagged it as a possible measurement problem.
Some of you may recall a debate about CO2 levels and the fact it follows ENSO.
This is interesting. I wonder whether May which is typically the high point in the CO2 calendar will be barely up on last year.
Of course it may just be a measurement error, but we should be watching this with great interest, especially if the ocean cooling continues to intensify over the next couple of years.
The “Normals” are a lose-lose situation for the Warmista. The higher the Normals, the more dramatic the decline in the coming years. Lose. The higher the acceleration in temperature increase during the Normals, the greater the deceleration during the coming years. Lose. The fact that they have their highest ever Normals adds nothing to their existing hype. It’s an accountant’s poetic justice. The very fact that you keep records works against your hype. Or “Hype is a rush but records are forever.”
Simple question – why 30 years? Why not 50, or 100, or even 1000? (OK, I know that we don’t have 1000 years of data!)
But seriously, on the subject of ‘normals’ it seems that for a given parameter or metric – anything that has happened before could/should be considered normal, in the climate variability sense. One could argue that Ice ages are ‘normal’ since we’ve had a few in the last million years or whatever. If there was only one – maybe it could be considered an outlier – but really, any ‘repeat’ event must fall within the ‘normal variability’ category?
we talk of 100 yr and 300 year ‘flood’ events – why should temp/climate be any different?
It is sensible and acceptable to generate some baseline from which to make assessments of changes, but the term ‘normal’ suggests that it is, well, normal! – and clearly, the extremes of temperature over the last few decades would stand out like sore thumbs against this background ‘normal’ average, when in truth, these may well be themselves ‘normal’, just less FREQUENT!
So I guess my main objection is the lack of clarity in explaining said situation to Joe Public and of course the use of the term ‘Normal’.
Instead of having a straight average figure, any subsequent comparison of, say, a given month’s temperature should be against both the ‘normal’ AND the ‘range’ of previous actual averages for that month. So for example, a plotted temp of say 11.5C against a ‘normal’ baseline of 10C but say with a shaded background showing the range of previous max and min average month temps (e.g. 5C to 15C). Of course, such a plot would give the real information to the casual observer and prevent the alarmism getting portrayed, so I guess it won’t happen!
The Royal Meteorological Institute did the same thing for Belgium.
http://www.meteo.be/meteo/view/fr/123763-Ce+mois-ci.html
Given that the last 30 years included the very cold decade of the 1970s, aren’t the new numbers going to be considerably warmer on average? Even with the stagnation of temps during the last decade, we’ll see an effect in the “Normals” reflecting the warming from 1980–1999. Is that a good thing or a bad thing? Will we be comparing the coming additional cooling to a warmer dataset thus making the cooling look even more “alarming”, or will people be comparing the previous normals to the “New Normal” and lamenting how much worse things really are?
Well now that the normals contain the ’82 and ’98 Ninos, I’d expect an immediate cooling crisis to set in.
Where are the “post-normals?”
Gary says:
May 16, 2011 at 10:20 am
Where are the “Post-Normals?”
I refer you to Real Climate.