The New NOAA Climate “Normals”

With the close of 2010, NOAA’s National Climatic Data Center (NCDC) has commenced calculation of the 1981–2010 Normals. Climate Normals are the latest three-decade averages of climatological variables, including temperature and precipitation. This new product will replace the current 1971–2000 Normals product. NCDC is still acquiring data for the year 2010; we expect the final input data to be ingested and processed by the end of April 2011. NCDC will release the new Normals in two phases: the most widely used Normals by July 2011, and all other Normals in a supplemental release by the end of 2011.

FAQs

1. What are Normals?

In the strictest sense, a “normal” of a particular variable (e.g., temperature) is defined as the 30-year average. For example, the minimum temperature normal in January for a station in Chicago, Illinois, would be computed by taking the average of the 30 January values of monthly-averaged minimum temperatures from 1981 to 2010. Each of the 30 monthly values is in turn derived by averaging the daily observations of minimum temperature at the station. In practice, however, much more goes into NCDC’s Normals product than simple 30-year averages. Procedures are put in place to deal with missing and suspect data values, and the Normals include quantities other than averages, such as degree days, probabilities, and standard deviations. Normals are a large suite of data products that give users many tools for understanding typical climate conditions at thousands of locations across the United States.
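
As a concrete illustration of the basic arithmetic, here is a minimal sketch; the station values below are hypothetical, and NCDC’s real procedures include quality control and more careful handling of missing values.

```python
# Minimal sketch: a January minimum-temperature normal for one station.
# The 30 values are hypothetical monthly means (deg F), one per January
# from 1981 to 2010; None marks a missing month.

jan_tmin_monthly_means = [
    18.2, 15.7, 21.3, 17.9, 14.1, 20.5, 19.8, 16.4, 22.0, 18.7,
    17.2, 19.1, 15.9, 21.6, 18.3, 16.8, 20.2, 17.5, 19.4, 14.9,
    None, 18.8, 20.9, 16.1, 17.7, 21.2, 19.6, 15.4, 18.0, 20.7,
]

# In the strictest sense the normal is just the 30-year average; here we
# simply skip missing months (NCDC's actual procedures are more involved).
valid = [v for v in jan_tmin_monthly_means if v is not None]
normal = sum(valid) / len(valid)
print(f"January Tmin normal (1981-2010): {normal:.1f} F from {len(valid)} years")
```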

2. When will the 1981–2010 Normals be available?

The new Normals will be made available in two releases. Core Normals will be made available by July 2011, and supplemental Normals will be available by January 2012. Initial access to both releases will be via file transfer protocol (FTP), with links to the FTP site on NCDC’s Web site. We expect to provide more advanced (and more user-friendly) Web services and selection capabilities for the new Normals on NCDC’s Web site by November 2011 for the core Normals and by April 2012 for the supplemental Normals.

3. What are considered “core” Normals?

The core 1981–2010 Normals are the most widely used Normals, as identified by NCDC in close consultation with the National Weather Service (NWS) and a wide array of climate data users. Specifically, core Normals refer to the daily and monthly station-based Normals of temperature, precipitation, snowfall, snow depth, and heating and cooling degree days. Generally, this set coincides with the key per-station products released for the 1971–2000 Normals, CLIM81 and CLIM84 (except for the snowfall and snow depth Normals).

4. What are “supplemental” Normals?

Supplemental Normals are a catchall category for all Normals products that will not be released in the core Normals release. An example is our population-weighted degree day normals product, which cannot be computed until the U.S. Census Bureau releases its final population figures.

5. Why aren’t the Normals released sooner?

The key factor determining when Normals will be made available is the acquisition of data from 2010 (especially December 2010). Currently, some station time series are “keyed” manually and/or are not fully incorporated into daily/monthly datasets for three to four months after being observed. NOAA waits until all relevant data have been incorporated before issuing the new set of Normals. However, the timelines described above for making the 1981–2010 Normals available represent a substantial reduction in development time compared to previous installments of Normals.

6. Where can I find more information on NOAA’s Normals?

Information on the current version of NOAA’s Normals, as well as the history of the Normals, can be found here:

http://www.ncdc.noaa.gov/oa/climate/normals/usnormals.html

The most current information on the development of the 1981–2010 Normals is an extended abstract by Arguez et al. (2011), which can be accessed here:

ftp://ftp.ncdc.noaa.gov/pub/data/aarguez/Normals/1981-2010/Arguez-Extended-Normals-AMS2011.pdf

7. Why does NOAA produce Normals?

NOAA’s computation of climate Normals is in accordance with the recommendation of the World Meteorological Organization (WMO), of which the United States is a member. While the WMO mandates that each member nation compute 30-year averages of meteorological quantities at least every 30 years (1931–1960, 1961–1990, 1991–2020, etc.), it recommends a decadal update, in part to incorporate newer weather stations. NOAA’s 1971–2000 Normals were calculated in accordance with this recommendation, and the 1981–2010 Normals will be as well. Further, NOAA’s NCDC has a responsibility to fulfill the mandate of Congress “… to establish and record the climatic conditions of the United States.” This responsibility stems from a provision of the Organic Act of October 1, 1890, which established the Weather Bureau as a civilian agency (15 U.S.C. 311).

8. What are Normals used for?

Meteorologists and climatologists regularly use Normals for placing recent climate conditions into a historical context. NOAA’s Normals are commonly seen on local weather news segments for comparisons with the day’s weather conditions. In addition to weather and climate comparisons, Normals are utilized in seemingly countless applications across a variety of sectors. These include regulation of power companies, energy load forecasting, crop selection and planting times, construction planning, building design, and many others.

9. What changes are being made in the computation of the 1981–2010 Normals versus previous versions?

Several changes and additions are being incorporated into the 1981–2010 Normals. Monthly temperature and precipitation Normals will be based on underlying data values that have undergone additional quality control. Monthly temperatures have also undergone enhanced bias corrections to account for the effects of station moves, changes in instrumentation, etc. These enhancements are described in more detail in the following peer-reviewed papers:

ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2/monthly/menne-etal2009.pdf and

ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2/monthly/menne-williams2009.pdf.

Unlike the 1971–2000 Normals, daily data will be used extensively in the computation of daily temperature and precipitation Normals as well as heating and cooling degree day Normals, providing greater precision in resolving intraseasonal features. In previous installments, daily precipitation Normals were computed as a spline fit through the monthly values. For 1981–2010, this metric will be replaced with a suite of metrics, including daily probabilities of precipitation as well as month-to-date and year-to-date precipitation Normals. New for 1981–2010 will be climate Normals derived from hourly data values as well as from radiosonde observations. More details can be found in Arguez et al. (2011) (see question 6 for access).
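
To make the methodological shift concrete, here is a rough schematic (not NCDC’s actual procedure) contrasting the old approach of interpolating daily values from monthly normals with direct daily averaging; the monthly values are hypothetical mean temperatures, chosen for readability.

```python
# Schematic contrast of the two approaches to daily normals, using
# hypothetical monthly mean temperatures (deg F). Requires numpy and scipy.
import numpy as np
from scipy.interpolate import CubicSpline

monthly_normals = np.array([28.0, 31.5, 41.2, 52.3, 62.8, 72.1,
                            76.9, 74.8, 67.3, 55.6, 44.0, 32.7])
mid_month_doy = np.array([15, 45, 74, 105, 135, 166,
                          196, 227, 258, 288, 319, 349])

# Old approach: fit a smooth curve through the 12 monthly normals and
# read off one value per calendar day (extrapolates slightly at year ends).
spline = CubicSpline(mid_month_doy, monthly_normals)
days = np.arange(1, 366)
daily_from_spline = spline(days)

# New approach: average the 30 observed values for each calendar day
# directly. daily_obs is a stand-in (30 years x 365 days) array of
# synthetic observations around the seasonal cycle.
rng = np.random.default_rng(42)
seasonal = np.interp(days, mid_month_doy, monthly_normals)
daily_obs = seasonal + rng.normal(0, 8, size=(30, 365))
daily_from_obs = daily_obs.mean(axis=0)

print(daily_from_spline[:5].round(1), daily_from_obs[:5].round(1))
```

The design point: a spline can only reproduce structure already present at the monthly scale, whereas averaging the daily observations directly can preserve shorter-lived intraseasonal features.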

10. What qualifies or disqualifies a station to be included in Normals products?

Normals are computed for as many NWS stations as reasonably possible. Some stations do not have sufficient data over the 1981–2010 period to be included in the Normals; this is the primary reason a station may not be included. Normals are computed for stations that are part of the NWS’s Cooperative Observer Program (COOP) Network. Some additional stations are included that have a Weather Bureau – Army – Navy (WBAN) station identification number. Normals are only computed for stations in the United States, including Alaska, Hawaii, and U.S. territories.

11. How many stations will be included in the Normals?

For 1981–2010, we plan to compute precipitation Normals for almost 8,000 precipitation stations; a fraction of these will have snowfall and snow depth Normals as well. Temperature Normals (including derived products such as degree days) will be computed for about 6,000 stations.

12. How do the 1981–2010 Normals compare with the 1971–2000 Normals?

It is common and expected for the Normals to change with each decadal installment. This is due to random processes such as sampling variability as well as systematic processes such as station moves and changes in methodology. Further, climate change can also lead to coordinated shifts in normal values. Once the 1981–2010 Normals are made available, relevant comparisons between the new version and previous versions will be highlighted. These will include direct comparisons between the 1971–2000 and the 1981–2010 Normals, as well as so-called “apples-to-apples” comparisons that account for changes in methodology between the two installments. Observational evidence shows that 2000–2009 was the warmest decade on record; see the State of the Climate in 2009, a special supplement to the July 2010 Bulletin of the American Meteorological Society, which can be accessed here: http://www.ncdc.noaa.gov/bams-state-of-the-climate/2009.php. In light of the observed warming, we expect that the new Normals will generally be warmer on average for many stations, but not uniformly warmer for all stations and all seasons. In fact, some station Normals in certain seasons will be cooler this time. However, we expect the number of warmer normal values to significantly exceed the number of cooler normal values. In the case of precipitation, preliminary results suggest that the differences between the 1971–2000 and 1981–2010 periods, while spatially coherent within certain regions, are highly variable from region to region and season to season.

13. What do climate Normals tell us about global warming or climate change?

Climate Normals were never designed to be metrics of climate change. In fact, when the widespread practice of computing climate Normals commenced in the 1930s, the generally accepted notion was that the underlying long-term averages of climate time series were constant. Changes from one installment of climate Normals to the next do, nonetheless, provide a crude metric of climate change impacts. However, care must be taken when interpreting changes from one Normals period to the next. For instance, a +0.2°F change may not be statistically significant. Further, changes from the 1971–2000 Normals to the 1981–2010 Normals may be due to station moves, changes in methodology, urban heating, etc. that are not reflective of real changes in the underlying climate signal.
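
To see why a +0.2°F change can fail to be significant, consider a deliberately naive back-of-the-envelope check. The numbers are hypothetical, and a rigorous test would also have to account for the 20 years shared by the 1971–2000 and 1981–2010 periods and for autocorrelation.

```python
# Naive significance check for a +0.2 F shift between two 30-year means.
# Hypothetical interannual standard deviation; ignores the 20-year overlap
# between the two periods and any autocorrelation.
import math

n = 30       # years per Normals period
sigma = 2.0  # hypothetical interannual std dev of annual means (deg F)
delta = 0.2  # observed change between the two Normals (deg F)

# Standard error of each 30-year mean, and of their difference
# (treating the periods, incorrectly but simply, as independent).
se_mean = sigma / math.sqrt(n)
se_diff = math.sqrt(2) * se_mean
z = delta / se_diff

print(f"SE of one 30-yr mean: {se_mean:.2f} F")  # ~0.37 F
print(f"z-score of +0.2 F change: {z:.2f}")      # ~0.39, far from significant
```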

14. How can I obtain historic Normals from previous Normal periods?

To obtain 1961–1990 climate Normals or earlier versions, please contact NCDC’s User Engagement & Services Branch.

15. What are Heating and Cooling Degree Days? What are Growing Degree Days?

Heating and cooling degree days are metrics of energy demand associated with the variation of mean temperature across space and time. Growing degree days are metrics of agricultural output, also as a function of mean temperature. The computation of degree days involves certain threshold temperatures, e.g., 65°F for heating and cooling degree days. These thresholds are referred to as base temperatures.
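
The day-by-day arithmetic is simple: a day with mean temperature T contributes max(0, base − T) heating degree days and max(0, T − base) cooling degree days. Here is a minimal sketch with hypothetical daily temperatures; the 65°F base follows the FAQ, while the 50°F growing degree day base is purely illustrative, since bases are crop-specific.

```python
# Degree-day arithmetic for a handful of hypothetical daily mean
# temperatures (deg F). HDD and CDD use the standard 65 F base;
# GDD uses an illustrative 50 F base.

daily_mean_temps = [42.0, 55.5, 64.0, 65.0, 71.5, 80.0]

BASE_HDD_CDD = 65.0
BASE_GDD = 50.0

hdd = sum(max(0.0, BASE_HDD_CDD - t) for t in daily_mean_temps)
cdd = sum(max(0.0, t - BASE_HDD_CDD) for t in daily_mean_temps)
gdd = sum(max(0.0, t - BASE_GDD) for t in daily_mean_temps)

print(f"HDD: {hdd:.1f}  CDD: {cdd:.1f}  GDD: {gdd:.1f}")
```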

16. How can I obtain Heating and Cooling Degree Day Normals set to different base temperatures? And for Growing Degree Units?

While NCDC utilizes 65°F as the base temperature for the standard calculation of heating and cooling degree days, NCDC’s climate normal products include alternative computations of heating and cooling degree days for various base temperatures. In addition, growing degree days are computed for various crop-specific base temperatures. Please contact NCDC’s User Engagement & Services Branch for more information.

17. How can I obtain hourly, daily, and monthly Normals for additional weather elements such as dew point, sea level pressure, and wind?

The vast majority of weather stations utilized in the Normals routinely report only air temperature and precipitation. A smaller set of stations have fairly complete records of additional variables such as dew point temperature, sea level pressure, and wind speed and direction. For about 200–300 of these stations, we are able to provide hourly Normals of temperature, dew point temperature, sea level pressure, and wind.

18. How are the new hourly Normals computed, and how are hourly data used to calculate daily and monthly Normals?

The computation of hourly climate normals is still in development, so a final description of the methodology is not available at this time. This information will be finalized and available when the hourly normals are released.

19. How does the transition to ASOS affect the computation of Normals?

Automated Surface Observing System (ASOS) stations were implemented in the mid-1990s, largely replacing human observers. As a result, there are inhomogeneities in the 1981–2010 records due to changes in observing practices. These inhomogeneities will be addressed to some degree. More information on this will be made available when the core Normals are released.

20. How do the Normals compare to Alternative Normals and Dynamic Normals?

In response to observed climate change, NOAA’s NCDC has been investigating a suite of experimental products that attempt to provide a better estimate of “normal” than the traditional 30-year average Normals of temperature and precipitation. This project is known as Alternative Normals. It is running in parallel with the computation of NOAA’s official 1981–2010 Normals and is ongoing. There are no plans to discontinue the computation of official Normals every ten years in response to results obtained from the Alternative Normals project. For more information on Alternative Normals, please contact NCDC’s Anthony Arguez.

Dynamic Normals refers to a tool available on NCDC’s Web site that allows users to create their own Normals for a particular station by selecting customized start and end years for the averages. This tool has not been updated since 2001, and there are no plans to update it in the foreseeable future. For more information on Dynamic Normals, please contact NCDC’s User Engagement & Services Branch.

21. NOAA’s Climate Prediction Center has already changed its Normals to the 1981–2010 base period. Why are those Normals not available?

Many organizations, including NOAA’s Climate Prediction Center (CPC), develop their own averages and change base periods for internal use. However, NCDC’s climate Normals are the official United States Normals as recognized by the World Meteorological Organization and the main Normals made available for a variety of variables. Below is a brief summary of changes to the CPC products due to the change in climate base period from 1971–2000 to 1981–2010:

Climate Monitoring:

• In January 2011, the CPC completed development of new climate normals based on the 1981–2010 period. This effort was done for all of the Climate Data Assimilation System (CDAS) and Global Ocean Data Assimilation System (GODAS) data products that are used for real-time monitoring of the global climate system.

• This new climate base period was used to prepare numerous operational climate monitoring products, including the Climate Diagnostics Bulletin (CDB) and ocean monitoring products, in February 2011. For example, the CDB and ocean products released in February 2011 that describe conditions during January 2011 use climate anomalies based on the new climate base period.

• A notification of this change to the CPC normals was placed on the CPC website prior to the change in January 2011.

Climate Prediction:

• CPC normals for stations and climate divisions, which are used in CPC’s operational forecasts, will be officially updated in mid-May.

• CPC normals for heating and cooling degree days will be updated in mid-June.

22. How can I reach NCDC’s User Engagement & Services Branch? Who can I contact for more information?

For general questions about Normals or help accessing the 1971–2000 product, please contact NCDC’s User Engagement & Services Branch at 828-271-4800, option 2. For questions regarding the development of the 1981–2010 Normals, please contact Anthony Arguez (Anthony.Arguez@noaa.gov).

==============================================

This document is also available as a PDF file here: NOAANormalsFAQs

42 Comments
Katherine
May 16, 2011 5:38 am

Further, changes from the 1971–2000 Normals to the 1981–2010 Normals may be due to station moves, changes in methodology, urban heating, etc. that are not reflective of real changes in the underlying climate signal.
Whoa, they actually mention that? Quelle surprise!

May 16, 2011 5:48 am

First thing Monday morning I see
“NOAA” and “normal” in the same sentence!
I’m going to need another cup of coffee before I read the second sentence.
🙂
Seriously, I would assume it is normal to adjust to a new “normal” each decade.
From the article:
However, care must be taken when interpreting changes between one Normals period and the other. For instance, a +0.2°F change may not be statistically significant.
But, but… if that happens every decade, in 100 years it would be +2.0° F. Wouldn’t that be significant? Presumably, a -0.2°F change wouldn’t be statistically significant either.
Further, changes from the 1971–2000 Normals to the 1981–2010 Normals may be due to station moves, changes in methodology, urban heating, etc. that are not reflective of real changes in the underlying climate signal.
The new Normal may not be reflective of the real climate signal? Was the old Normal? It will, however, be reflective of the different way NOAA now determines it. Um, OK.
I guess I’ll just sit back like I normally do and wait to see the new, improved, updated, very much fooled around with, Normal.
🙂

Staffan Lindström
May 16, 2011 5:58 am

…And SMHI here in Sweden {aka Ultima Thule…} are still using 1961-1990…for some very mysterious reasons…

Latitude
May 16, 2011 6:02 am

For some reason, we are supposed to be upset about some arbitrary 30 year “normal”.
Their constant obsession with making wrong predictions, in that short 30 year time frame….
…is more upsetting

Gary Pearse
May 16, 2011 6:03 am

…since the 1930s when we thought climate was stable in the long term. Why stick to such a slavish metric? Now that we know the climate varies in cycles over 50–60 years, it would seem more intelligent to select a range that starts and ends at appropriate points on the cycle. This is how we got these burning-in-hell projections from the last “normals”.

Alex
May 16, 2011 6:48 am

The farce continues…

coaldust
May 16, 2011 6:56 am

With the US climate under the influence of PDO variations, I would like to see the normals determined over a longer time period, at least 60 years.

Huth
May 16, 2011 6:59 am

So we’ve got normal Normals, alternative Normals, and Dynamic Normals. Clear as mud what normal is then, isn’t it?

beng
May 16, 2011 7:08 am

Here in the western MD rain-shadow (just east of the Allegheny Front), and despite all the AGW angst, the avg annual precipitation has risen a significant 15-20% from 1920-1990 values — from ~35″ to 40″ (my avg for the last 7 yrs measured here is almost 42″ (1067 mm)). But this is just getting back to ~1900 values (during a cold period).

John Silver
May 16, 2011 7:46 am

Why 30? Why year?
What is so special about that?

Rob Potter
May 16, 2011 7:47 am

Any average is meaningless without a measure of variance. I have yet to see anyone refer to climate “Normals” and include a measure of variance – even a simple standard deviation or standard error of the mean would enable people to have an idea whether a particular measurement was notably high or low.
By using the term “normal” to mean an arithmetic mean, we give the impression that this actually has some significance in predicting what the temperature will be on a particular day. Thus, we end up with far too many commentators (often the amateur meteorologists on TV) saying today’s temperature is above what it “should” be – based on nothing more than the simplest (and most misleading) of statistical treatments!
Sorry for the rant, but this makes my blood boil. I will accept that I can be a rather pedantic Englishman when it comes to language, but this isn’t poor English – it is a complete misrepresentation of what the “normal” stands for and is at the bottom of the misunderstanding of science that passes for public discussion.
OK, I will now go and have a cup of tea and a lie down in a darkened room!
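
[A minimal sketch of the commenter’s point, using hypothetical January mean temperatures: once the spread is reported alongside the mean, it is immediately clear how unusual any particular value is.]

```python
# Mean alone vs. mean with spread, for 30 hypothetical January means (F).
import statistics

jan_means = [28.4, 31.2, 25.9, 33.0, 29.7, 27.1, 35.4, 30.8, 26.3, 32.5,
             29.0, 24.7, 31.9, 28.8, 34.1, 27.6, 30.2, 25.4, 33.6, 29.3,
             26.9, 32.2, 28.1, 30.5, 23.8, 31.4, 27.9, 34.8, 29.9, 26.0]

mean = statistics.mean(jan_means)
sd = statistics.stdev(jan_means)          # sample standard deviation
sem = sd / len(jan_means) ** 0.5          # standard error of the mean

today = 36.0                              # hypothetical observation
z = (today - mean) / sd
print(f"normal={mean:.1f} F, sd={sd:.1f} F, sem={sem:.2f} F")
print(f"today is {z:+.1f} standard deviations from the normal")
```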

May 16, 2011 7:54 am

WOW, they just keep digging, looking for the discredited global warming hoax that they can somehow sell to ignorant people.
You should have seen the local SWFL weather trying to hide the cold front that went through yesterday, hit Cuba and then the Bahamas.

starzmom
May 16, 2011 8:25 am

Does anyone know why heating and cooling degree days are population weighted? What does that matter to the basic calculation?

Tenuc
May 16, 2011 8:28 am

Like all things connected with man-made global warming, even the idea that there is a climate ‘normal’ is nonsensical. Climate is driven by deterministic chaos and using averages to try to understand what is happening has no meaning.
Only by understanding how the individual climate processes combine to produce the non-linear quasi-cyclic behaviour we observe will progress be made. The current statistical methods used by so called climate scientists will never produce meaningful results about future outcomes and are a complete waste of money.

Cassandra King
May 16, 2011 8:35 am

Is the ice still “screaming” I wonder? I only ask because cryosphere has been clearly showing central melt and refreeze zones and NOAA ESRL temps are showing low temps.
Will the normals be lowered to show a current warming where no such warming exists? They certainly tried to lower the 30s to show a rising trend post-war. I guess if the data does not show warming, the data must be wrong and must be adjusted (tortured) until it fesses up.

crosspatch
May 16, 2011 8:47 am

30-year “normals” are nonsensical anyway considering that the PDO greatly influences US climate and it has a 30 year cycle. So we end up with 30 years above “normal” and then 30 years below “normal” over and over.
They should switch to a 60-year normal to capture most of a warm and cold cycle.

Interstellar Bill
May 16, 2011 8:51 am

Tomorrow our third winter storm in May will hit LA.
May rain has always been a rare fluke here, either an early monsoon or a late Pineapple Express storm from the southwest.
This year, however, ‘climate change’ has brought unprecedented winter storms from Alaska, the chill of which will soon become the ‘Norm’ in the looming Global Colding.
All they have to do is wait forty years and their precious Global Warming will return.

Bob from the UK
May 16, 2011 9:02 am

Has anyone seen the latest weekly CO2 level?
ftp://ftp.cmdl.noaa.gov/ccg/co2/trends/co2_weekly_mlo.txt
2011 5 8 2011.3493 392.94 7 392.51 373.88 109.68
Warning: last value appears low, could be a measurement problem
It’s rather low, to the extent that NOAA have flagged it as a possible measurement problem.
Some of you may recall a debate about CO2 levels and the fact it follows ENSO.
This is interesting. I wonder whether May which is typically the high point in the CO2 calendar will be barely up on last year.
Of course it may just be a measurement error, but we should be watching this with great interest, especially if the ocean cooling continues to intensify over the next couple of years.

Theo Goodwin
May 16, 2011 9:17 am

The “Normals” are a lose-lose situation for the Warmista. The higher the Normals, the more dramatic the decline in the coming years. Lose. The higher the acceleration in temperature increase during the Normals, the greater the deceleration during the coming years. Lose. The fact that they have their highest ever Normals adds nothing to their existing hype. It’s an accountant’s poetic justice. The very fact that you keep records works against your hype. Or “Hype is a rush but records are forever.”

Kev-in-Uk
May 16, 2011 9:47 am

Simple question – why 30 years? why not 50, or 100, or even 1000? (Ok, I know that we don’t have 1000 years of data!)
But seriously, on the subject of ‘normals’ it seems that for a given parameter or metric, anything that has happened before could/should be considered normal, in the climate variability sense. One could argue that ice ages are ‘normal’ since we’ve had a few in the last million years or whatever. If there was only one, maybe it could be considered an outlier – but really, any ‘repeat’ event must fall within the ‘normal variability’ category?
we talk of 100 yr and 300 year ‘flood’ events – why should temp/climate be any different?
It is sensible and acceptable to generate some baseline from which to make assessments of changes, but the term ‘normal’ suggests that it is, well, normal! – and clearly, the extremes of temperature over the last few decades would stand out like sore thumbs against this background ‘normal’ average, when in truth, these may well be themselves ‘normal’, just less FREQUENT!
So I guess my main objection is the lack of clarity in explaining said situation to Joe Public and of course the use of the term ‘Normal’.
Instead of having a straight average figure, any subsequent comparison of say, a given months temperature should be compared to the ‘normal’ AND the ‘range’ of previous actual averages for that month. So for example, a plotted temp of say 11.5C against a ‘normal’ baseline of 10C but say with a shaded background showing the range of previous max and min average month temps (e.g. 5C to 15C). Of course, such a plot would give the real information to the casual observer and prevent the alarmism getting portrayed, so I guess it won’t happen!

May 16, 2011 9:51 am

The Royal Meteorological Institute did the same thing for Belgium.
http://www.meteo.be/meteo/view/fr/123763-Ce+mois-ci.html

Mike O
May 16, 2011 9:55 am

Given that the previous 30-year window included the very cold decade of the 1970s, aren’t the new numbers going to be considerably warmer on average? Even with the stagnation of temps during the last decade, we’ll see an effect in the “Normals” reflecting the warming from 1980–1999. Is that a good thing or a bad thing? Will we be comparing the coming additional cooling to a warmer dataset, thus making the cooling look even more “alarming”, or will people be comparing the previous normals to the “New Normal” and lamenting how much worse things really are?

SteveSadlov
May 16, 2011 10:01 am

Well now that the normals contain the ’82 and ’98 Ninos, I’d expect an immediate cooling crisis to set in.

Gary
May 16, 2011 10:20 am

Where are the “post-normals?”

David, UK
May 16, 2011 10:33 am

Gary says:
May 16, 2011 at 10:20 am
Where are the “Post-Normals?”

I refer you to Real Climate.

feet2thefire
May 16, 2011 10:38 am

This bothers me.
This sounds like changing the baseline. It is bad enough that they change past data downward (or in ANY direction, really). Once data is determined for certain, there should be no reason under the sun to change it. If it is set against a baseline, then every time they change the baseline, it is another opportunity to screw around and adjust the data – and there is far too much of that already. Set the darned baseline and keep it, people!
Once a baseline is set, IMHO, it needs to be adhered to indefinitely. To me, it is like changing the number of the year because a king died and a new one was crowned. A floating baseline is like not having B.C. and A.D., but moving the 0 year back and forth, just because some time has passed.
I just think this is a bad principle to be following.

Buffoon
May 16, 2011 10:51 am

30 year normal?
What about 30 years should be considered normal, in geospherical terms?

Wade
May 16, 2011 10:57 am

I too have wondered why it is a 30 year normal. Why not an average of every temperature reading on record?

Rhoda Ramirez
May 16, 2011 11:15 am

The “Normal” approach is probably directed toward an assumed public only interested in surface information. If NOAA allows access to the raw data then anyone interested in some other approach can just DO that approach.

kadaka (KD Knoebel)
May 16, 2011 11:40 am

We expect the final input data will be ingested and processed by the end of April 2011.
Warning! Freudian slippage detected.
Currently I’m ingesting and processing a microwaved burrito. I expect some will find value in both NOAA-NCDC’s final product and mine, as fertilizer.

Frank K.
May 16, 2011 11:52 am

kadaka (KD Knoebel) says:
May 16, 2011 at 11:40 am
“We expect the final input data will be ingested and processed by the end of April 2011.”
Warning! Freudian slippage detected.
Currently I’m ingesting and processing a microwaved burrito. I expect some will find value in both NOAA-NCDC’s final product and mine, as fertilizer.

Yes – the correct grammar for the NOAA statement should be as follows:
“We expect the final input data will be ingested and excreted by the end of April 2011.”
The data thus processed could then be rightfully referred to as “excrement”.
Moreover, information based upon NOAA “products” should be accurately referred to as “climate byproducts”.
/sarc

Hoser
May 16, 2011 12:09 pm

Theo Goodwin says:
May 16, 2011 at 9:17 am
___________________________
Their hope is to maintain hysteria long enough to solidify the AGW-based power grab. They have apparently succeeded in the UK, at least until the torches and pitchforks come out. How much will the Royal Family profit from carbon trading (like Prince Albert of Tennessee)? Prince Charles’ green con job won’t help the British people. I’m not surprised he’s a little bent. I don’t think I’d like to be an English King named Charles. The track record is pretty bad.
Charles I – Beheaded.
Charles II – “Restless he rolls from whore to whore A merry monarch, scandalous and poor”, wrote John Wilmot, 2nd Earl of Rochester.
Green socialism is simply a way to take away freedom, and make us serfs again.
Just to be fair, we have our own crop of nutters. Hopefully, most of those will be gone in 2012.
http://www.telegraph.co.uk/earth/energy/6491195/Al-Gore-could-become-worlds-first-carbon-billionaire.html

Charles Higley
May 16, 2011 12:27 pm

Besides the obvious confusion regarding the value and use of Normals, how much confidence can we have in them when we consider the source, the NCDC? They have altered the data before and probably will again. So, why would they not incorporate their altered data into the Normals?
A ruler made of silly putty is not a ruler just because they put markings on it.

rbateman
May 16, 2011 12:35 pm

We expect the final input data will be ingested and processed by the end of April 2011.
I hope they have an archived copy of the data before the DOG eats it. /sarc

Legatus
May 16, 2011 4:08 pm

I don’t get it, 30 years is all for a “normal”? That is barely more than two sunspot cycles. The climate goes through warming and cooling based on about two sunspot cycles and whether the sun is becoming more or less active across those two cycles. Don’t they know about the “ice age scare” that ended in 1979, at the end of two quiet sun cycles? Talk to me about a 60-year “normal” and maybe I will consider it; until then, this arbitrary 30-year “normal” is a no go in my book.
And they might want to rethink this. Comparing all our temperatures to the “normal” of 1979, when the ice age scare ended and temperatures went up for several decades, is very convenient for showing “warming”. If they base it on a later date, in the middle of that warming period, while we are now at the start of what looks like another two-sunspot-cycle ice-age-scare cooling period, then temperatures have nowhere to go but down. It was sooooo convenient to base all temperatures on the “normal” cold 1979 so that you can say “it’s getting warmer”. Without that cold year to compare today’s temperatures to, how will you be able to say that?
What if they complete this, and it makes it look like temperatures are going down compared to “normal”? Will they quietly drop the whole idea, or perhaps do some “adjustments” to the data to “reveal the warming” (dangerous, someone might catch you)? This whole idea is setting them up for a world of hurt. Couldn’t happen to a nicer bunch.

ew-3
May 16, 2011 5:42 pm

Hope someone has a copy of the last 30 years “normal”. Be interesting to compare with the new “normal”.
Will NOAA openly discuss how they are “tweaking” the data?

Tom in Florida
May 16, 2011 5:47 pm

I hate the use of the word normal for short periods. The average man on the street takes it to mean that is how it always has been and how it always should be. They do not consider normal in the context of specified periods.

Deborah
May 16, 2011 8:48 pm

Normal? Being an artist I haven’t quite got a firm grasp on normal but I’ll give it a go. The normal temperature range of planet Earth is between -128.6 F and 136 F. Seems to be a lot of play there but what do I know? 😉

Editor
May 16, 2011 9:37 pm

Speaking of things being re-computed, I just downloaded NOAA monthly global data for April 2011. It seems to have been monkeyed with *AGAIN*. Using a 12-month running mean, and eyeballing the delta from March 2011’s data, linear trendlines for the delta would appear to be…
* starting approx +0.01 in 1880 down to -0.02 around 1920
* from -0.02 around 1920 to +0.0125 around 1965
* flat +0.0125 around 1965 to 1979
* step jump to +0.015 at 1980, and remaining there till now
Several months of this, and we’ll *REALLY* get “man-made global warming”. Winston Smith strikes again.
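
[For anyone who wants to replicate this sort of check, here is a rough sketch, assuming two vintages of the same monthly series have been saved locally; the file names and column layout are hypothetical.]

```python
# Rough sketch of a version-to-version check: 12-month running means of
# two vintages of the same monthly series, then their difference.
# File names and the (year, month, anomaly) column layout are hypothetical.
import pandas as pd

march = pd.read_csv("noaa_global_monthly_2011-03.csv")  # hypothetical file
april = pd.read_csv("noaa_global_monthly_2011-04.csv")  # hypothetical file

for df in (march, april):
    df["smoothed"] = df["anomaly"].rolling(12, center=True).mean()

delta = april["smoothed"] - march["smoothed"]
print(delta.describe())  # any drift here means past values were revised
```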

Edim
May 17, 2011 12:43 am

“Of course it may just be a measurement error, but we should be watching this with great interest, especially if the ocean cooling continues to intensify over the next couple of years.”
I agree with Bob from the UK.
I have said before that significant global cooling (even 30 years) might not be enough to disprove CO2-AGW if CO2 continues to rise. Too many people are living on the hysteria, and from experience we know that they are good dodgers.
But if CO2 decreases as well, which must happen IMO, then it’s checkmate!
Watch for CO2 leveling off and a subsequent decrease!

Gator
May 17, 2011 8:58 am

Somebody please tell the folks at NOAA that there are no ‘normals’ in climate. This is just another excuse to screw with the data.

hotrod (Larry L)
May 17, 2011 8:47 pm

Although I understand that the 30-year normal was agreed to a long time ago as a standard method, it is also important to recognize how the world has changed since then. In 1971, the computer power of the period and the mostly hand-entered data made a relatively short normal period almost mandatory to keep things manageable.
Given that 1971 approximates the time period when reliable instrumental data first became available in quantity, and we had the means to record it and also have confirming data from satellites and high-altitude aircraft flights, we should begin now to shift to longer terms like 50- or 100-year normals. To do that, they should simply recompute the normals every 10 years from 1971 to the current decade, until they reach a sensible time period like 100 years. There is no reason at all that they could not do both: replace the 1971–2000 30-year normal with the new 1981–2010 normal, but also compute the aggregate 1971–2010 40-year normals. Then in 10 more years do the same with a 1991–2020 normal and an aggregate 1971–2020 normal.
That would provide a sensible progression from the artificially short 30-year normals to a more reasonable period that spans multiple climatic oscillation cycles. The longer-duration normal would also converge on a more realistic estimate of the true normal for the last 500 years or so, and possibly the difference between the shorter terms and the longer aggregate period would show just how poorly (or correctly) they both represent the truth.
30-year periods have repeatedly been shown to be too short for climate-related decisions. One example is the Colorado River Compact of 1922, which was based on the average precipitation over a relatively wet period. It was assumed that typical flow for the river system was 16.4 million acre-feet per year (641 m³/s), but it has since been learned that that estimate was probably high and based on a relatively wet period of study. Current estimates are that typical flow is actually closer to 13.2 million acre-feet per year (516 m³/s) to 14.3 million acre-feet per year (559 m³/s).
This shows the risk of determining statutory limits based on too short an observation period. The same applies to climate legislation.
Once the basic process is in place to crunch the numbers, the difference in staff time between processing a stand-alone 30-year normal based on a new 10-year period of data and adding that same data to a long-term running normal based on 1971-to-date is trivial.
Larry