This from NOAA, which I missed last week while I was at ICCC6. But wait, there’s more! Have a look at the maps I produced below the “Continue reading>” line using NOAA’s own tools.
Average U.S. temperature increases by 0.5 degrees F
New 1981-2010 ‘normals’ to be released this week
June 29, 2011
Statewide changes in annual “normal temperatures” (1981 – 2010 compared to 1971 – 2000).
Download here. (Credit: NOAA)
According to the 1981-2010 normals to be released by NOAA’s National Climatic Data Center (NCDC) on July 1, temperatures across the United States were, on average, approximately 0.5 degrees F warmer than during the 1971-2000 period.
Normals serve as a 30-year baseline average of important climate variables, used to understand average climate conditions at any location and to serve as a consistent point of reference. The new normals update the 30-year averages of climatological variables, including average temperature and precipitation, for more than 7,500 locations across the United States. This once-a-decade update will replace the current 1971–2000 normals.
In the continental United States, every state’s annual maximum and minimum temperature increased on average. “The climate of the 2000s is about 1.5 degree F warmer than the 1970s, so we would expect the updated 30-year normals to be warmer,” said Thomas R. Karl, L.H.D., NCDC director.
Using standards established by the World Meteorological Organization, the 30-year normals are used to compare current climate conditions with recent history. Local weathercasters traditionally use normals for comparisons with the day’s weather conditions.
In addition to their application in the weather sector, normals are used extensively by electric and gas companies for short- and long-term energy use projections. NOAA’s normals are also used by some states as the standard benchmark by which they determine the statewide rate that utilities are allowed to charge their customers.
The agricultural sector also heavily depends on normals. Farmers rely on normals to help make decisions on both crop selection and planting times. Agribusinesses use normals to monitor “departures from normal conditions” throughout the growing season and to assess past and current crop yields.
NCDC made many improvements and additions to the scientific methodology used to calculate the 1981-2010 normals. They include improved scientific quality control and statistical techniques. Comparisons to previous normals take these new techniques into account. The 1981-2010 normals provide a more comprehensive suite of precipitation and snowfall statistics. In addition, NCDC is providing hourly normals for more than 250 stations at the request of users, such as the energy industry.
Some of the key climate normals include: monthly and daily maximum temperature; monthly and daily minimum temperature; daily and monthly precipitation and snowfall statistics; and daily and monthly heating and cooling degree days. The 1981-2010 climate normals are one of the suite of climate services NOAA provides to government, business and community leaders so they can make informed decisions. NOAA and its predecessor agencies have been providing updated 30-year normals once every decade since the 1921-1950 normals were released in 1956.
NOAA’s mission is to understand and predict changes in the Earth’s environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources. Join us on Facebook, Twitter and our other social media channels.
==============================================================
The interesting thing about baselines is that by choosing your baseline, you can make an anomaly from that baseline look like just about anything you want it to be. For example, the NOAA Earth System Research Laboratory offers a handy-dandy map creator for US Climate Divisions that allows you to choose the baseline period (from a predetermined set they offer), including the new 1981-2010 normals released on July 1st.
Here are some examples I get simply by changing the base period.
Here’s the most current (new) base period of 1981-2010. Doesn’t look so bad, does it?
Now if we plot the entire period, look out, we are on fire! Global warming is running amok!
Oh, wait, look at the scale. Looks like about 0.08 to 0.13F warming. The ESRL plotting system arbitrarily assigns a range to the scale, based on data magnitude, but keeps the same color range.
The point of this? Temperature anomalies can be anything you want them to be; because they force a choice of baseline, they force you to cherry-pick the periods in your presentation of the data. The choice is up to the publisher.
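A minimal sketch of this baseline effect (the series here is synthetic, not NOAA data): re-baselining shifts every anomaly by the same constant, so the chart’s zero line moves while the underlying series, and its trend, is unchanged.

```python
# Sketch: one synthetic temperature series expressed as anomalies
# from two different 30-year base periods. The data are made up for
# illustration -- a gentle linear warming, not real NOAA values.
years = list(range(1971, 2011))
temps = [52.0 + 0.0125 * (y - 1971) for y in years]  # deg F

def anomalies(temps, years, base_start, base_end):
    """Anomalies relative to the mean over [base_start, base_end]."""
    base = [t for t, y in zip(temps, years) if base_start <= y <= base_end]
    baseline = sum(base) / len(base)
    return [t - baseline for t in temps]

a_old = anomalies(temps, years, 1971, 2000)  # old normals
a_new = anomalies(temps, years, 1981, 2010)  # new, warmer normals

# Re-baselining shifts every anomaly by the same constant: the chart's
# zero line moves, but the shape of the series is identical.
offset = a_old[0] - a_new[0]
assert all(abs((o - n) - offset) < 1e-9 for o, n in zip(a_old, a_new))
```

Against the warmer 1981-2010 baseline every anomaly is uniformly lower, which is exactly why the choice of base period changes how alarming a map looks without changing the data.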
Now the next question: with NOAA/NCDC following standard WMO procedure for a new 30-year base period (Roy Spencer at UAH did the same a few months back), will NASA GISS start using a modern normals baseline for their publicly presented default graphs, rather than the cooler and long-outdated 1951-1980 baseline used in the press releases and graphs they foist on the media? Of course, for a variety of reasons, they’ll tell you no. But it is interesting to see the rest of the world moving on while GISS remains stoically static. In case you are wondering, here is the offset difference between the 1951-1990 and 1951-1980 base periods. This will of course change again with a more recent baseline, such as NOAA has adopted; while the slope won’t change, the magnitude offset will. More here.

The issue is about public presentation of data. I figure if making such a baseline change is something NOAA does, should not NASA follow in the materials they release to the press?
Anomalies – any way you want them:

wake me when NOAA produces some science …
I am surprised that nobody ever seems to question the use of the 30-year baselines that are used so religiously in climate studies. As far as I can tell, it is merely a compromise figure that the WMO got everybody to agree to back in the day, and has no actual scientific significance whatsoever.
Given that the major oceanic cycles seem to repeat over c.65 years, surely that would be a more appropriate and significant period to use when establishing baselines.
Anyone else have thoughts on this?
I am 65. My average age is 33. I feel 65. My average age for the past 32 years is about 50. I still feel 65. My average age between 1946 and 1978 is 16. I feel 65. No matter how I spin it I don’t feel anything but 65. This doesn’t seem productive.
I’m tired… I’m going home now… -Forrest Gump
The 30-year rule was developed when the books the records were hand-entered into were published in sets of 10 years. When a decade book became filled, there were pages for the monthly averages, annual averages, and the decade average for each date.
They settled on 30 years because that was about all you could manage: opening three old sets of records plus the current one on a 3 x 6 foot desk. You could scan through all four books a page at a time, but try opening two more books and it became a shuffle-them-around nightmare.
Anything is possible says (at 9:30 am) – “I am surprised that nobody ever seems to question the use of the 30-year baselines that are used so religiously in climate studies.”
I assume the baseline was picked to be 30 years because that gives you a nice sample size, statistically speaking (n = 30), to estimate sigma, which in turn can be used to calculate a 95% CI.
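The commenter’s statistical rationale can be sketched like this (all numbers are synthetic; the station values and the t value of 2.045 for 29 degrees of freedom are illustrative, not from any real record):

```python
# Sketch of the statistical rationale: with n = 30 annual values you
# can estimate the standard deviation and form a rough 95% confidence
# interval for the mean. All numbers here are synthetic.
import math
import random

random.seed(0)
sample = [55.0 + random.gauss(0, 1.5) for _ in range(30)]  # 30 "annual means", deg F

n = len(sample)
mean = sum(sample) / n
sigma = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))  # sample sd

# The t critical value for 29 degrees of freedom is about 2.045,
# already close to the large-sample 1.96 -- one reason n = 30 is a
# traditional "big enough" sample size.
ci_half_width = 2.045 * sigma / math.sqrt(n)
```

Whether that textbook rule is actually why the WMO settled on 30 years is a separate question, taken up below.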
What else about the AMO does NOAA not understand?
I think the 30 year period comes from historical perspective. Look at the temp data since 1880 to today, you see about 30 years of warming followed by 30 years of cooling. Probably the basis of the Farmers Almanac predictions. The PDO, AO, AMDO and other natural cyclical events are just window dressing. (sarcasm off)
Anthony:
Actually, NCDC could be violating WMO standards, and probably is if they applied these normals to GHCN data. The WMO states in their guidelines that normals should NOT be changed unless there are more stations in the new normals than in the old normals:
http://www.wmo.int/pages/prog/wcp/documents/Guide2.pdf
As shown in the above section from the WMO guide, UAH switched to a 30 year baseline as per the WMO: ” Countries should calculate climatological standard normals as soon as possible after the end of a standard normal period.”
As soon as they had 30 years matching the guidelines, they made standard normals. NCDC, on the other hand, did not follow WMO guidelines if they switched normals on the GHCN dataset, because of the “Great Dying of Thermometers”. There were more stations during the 71-2000 baseline than there are in the 81-2010 baseline in GHCN, so it would violate this: “Where climatological standard normals are used as a reference, there are no clear advantages to updating the normals frequently unless an update provides normals for a significantly greater number of stations. Frequent updating carries the disadvantage that it requires recalculation of not only the normals themselves but also numerous data sets that use the normals as a reference.”
At a minimum, the WMO does not support the type of change in normals that NCDC is making. Now, if there are more stations in USHCN for 81-2010 than for 71-2000, it is an appropriate update for that dataset, but only for that dataset.
As to GISS, they should have updated from 51-80 to 61-90 because there were more stations in GHCN, and again to 71-2000. However, the WMO does give them an out for not doing so: “However, the disadvantages of frequent updating could be considered to offset this advantage when the normals are being used for reference purposes.”
I suspect almost all of that 0.5 degrees F is the removal of the cold 1970’s which I remember well in upstate NY. I’m originally from Cincinnati, and the savage winter of 1977-78 there and throughout the midwest was one for the books. Here’s some info about where I was born during that winter- http://www.enquirer.com/editions/2000/12/31/loc_dont_look_for_river.html
Given what part of the 60-yr cycle we’re in, they’re really setting themselves (and their “clients”) up for a nasty fall. Planning on “continued warming” is a definite losing strategy.
No the 30 year period was not picked for Stats purposes as shown in the WMO guidelines:
It was picked because when they first started they only had 30 years of data. If you wanted to pick normals for Stats purposes according to the WMO 30 years is not the ideal length for temperature studies:
http://www.wmo.int/pages/prog/wcp/documents/Guide2.pdf
So there you go: the whole “it has to be 30 years for it to be climate” meme is based on a cherry-pick (that was the length of data they had), and it is not the optimal averaging period for temperatures.
Yes, anyone can pick and choose the begin and end points of a series of data if they have an agenda to make the data seem to indicate something it may not. But if you are consistent in your choices of starting and ending points, and use data that is gathered in a consistent manner, you should be able to see real trends in the data over a long period of time. It will be interesting to see how the decade of 2010-2019 compares to the previous decade of 2000-2009. This will be a good test for learning much more about the sensitivity of the climate to rapid increases in CO2 brought about by human activity.
The selection of a base period for GLOBAL presentations ( like CRU and NASA do ) as the WMO recognizes is complicated.
“As to GISS they should have updated form 51-80 to 61-90 because there was more stations in GHCN and again to 71-2000. ”
That’s not exactly true.
A long while back I did a study of all possible base periods, counting the number of stations available.
“Available” does not mean “in the database.” If you are using a CAM method, stations need to fit TWO criteria: they have to be “in the database,” and they have to be reporting ENOUGH data in the reference period.
So, for example, if you are looking at GLOBAL data, the period would be 1953-1982 (as I recall).
However, if you pick that period, you will lose some southern hemisphere stations.
To maximize the Southern Hemisphere stations (of which there are fewer), you want to pick 1961-1990. While this has a marginally smaller number of stations in the NH, you maximize the SH.
That’s important to minimize the SD in the monthly mean of the base period.
With Hansen’s RSM there are also rules for counting which stations to include and which don’t have enough data. So you CANNOT simply look at the number of stations in the database at any particular time. It’s not hard; it’s just not as simple as suggested here.
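The two criteria described above can be sketched in a few lines. The 15-year reporting threshold, the station records, and the function name are all invented for illustration; they are not GISS’s actual rules.

```python
# Toy version of the two criteria: a station must be "in the database"
# AND report enough years inside the base period. The min_years=15
# threshold is an invented placeholder, not a real GISS parameter.
def usable_in_base(record, base_start, base_end, min_years=15):
    """A station counts only if it reports enough years in the base period."""
    present = sum(1 for y in record if base_start <= y <= base_end)
    return present >= min_years

# Two hypothetical station records (year -> annual mean, deg C)
station_a = {y: 14.0 for y in range((1950), 1991)}  # stops reporting in 1990
station_b = {y: 14.0 for y in range(1975, 2011)}    # starts in 1975

# Station A is "in the database" but fails the reporting criterion
# for a 1981-2010 base period; station B passes.
assert not usable_in_base(station_a, 1981, 2010)
assert usable_in_base(station_b, 1981, 2010)
```

This is why a raw count of stations in the database overstates the number actually usable for any particular base period.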
The base period has no significant effect on the trend. It CAN affect the noise, and you want to maximize the number of reports (locally or globally, as the case may be) in the base period.
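That claim is easy to check on synthetic data: changing the base period only shifts the anomaly series by a constant, and a constant shift leaves the least-squares trend untouched. The series below is invented purely for the check.

```python
# Quick check on synthetic data: a change of base period shifts
# anomalies by a constant, and a constant shift cannot change the
# ordinary least-squares slope. Values are illustrative.
def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = list(range(1951, 2011))
# Linear warming plus a small repeating wiggle standing in for noise
temps = [14.0 + 0.012 * (y - 1951) + 0.1 * ((y % 7) - 3) for y in years]

def anomaly(temps, years, b0, b1):
    base_mean = sum(t for t, y in zip(temps, years) if b0 <= y <= b1) / (b1 - b0 + 1)
    return [t - base_mean for t in temps]

slope_a = ols_slope(years, anomaly(temps, years, 1951, 1980))
slope_b = ols_slope(years, anomaly(temps, years, 1961, 1990))
assert abs(slope_a - slope_b) < 1e-9  # same trend either way
```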
Folks who want to play around with this can download the R package RghcnV3.
Maybe I’ll do a special bit of demo code to show folks how to look at these things properly.
At some point I’ll put the GISS code into the package… but first, the Berkeley method, which has no base period, or RomanM’s, which also has no base period.
When people see that station count and base period MAKE NO SUBSTANTIAL DIFFERENCE, then perhaps we can get the crowd focused on the key issue, which is sensitivity.
Most of the data is being used for presentation purposes.
That means, they need to show the greatest rise above “normal” as they can get, with the scary thick red lines to back it up.
Comparing the current values to the past decade doesn’t give them the “punch” they need to say it’s worse than we thought.
Besides, it’s the trend that matters, not the “zero”.
Or so they say…
Anything is possible says:
July 7, 2011 at 9:30 am
Given that the major oceanic cycles seem to repeat over c.65 years, surely that would be a more appropriate and significant period to use when establishing baselines.
Anyone else have thoughts on this?
But if they used a 65 yr baseline that would have made their old baseline from 1935-2000. If you look at any graphs of the temps for the last century, particularly those from the days before they began to attempt to render the same fate on the Dust Bowl that they had on the Medieval Climate Optimum, it should be intuitively obvious why that was not even a slightly viable alternative for the lads on The Team.
So looking at the three decade-long periods in the normals, by swapping out 1971-1980 for 2001-2010 while retaining 1981-1990 and 1991-2000, you gain 0.5°F.
Was the average temperature for the continental US for the 2001-2010 period actually 0.5°F more than that for 1971-1980?
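The arithmetic of a 30-year mean answers the question: since the normal is the mean of three decadal means, swapping one decade moves the normal by only a third of the difference between the decades. A quick sketch, with decadal values invented solely to reproduce the reported 0.5 F rise:

```python
# A 30-year normal is the mean of three decadal means, so swapping
# one decade moves the normal by a third of the difference between
# the swapped decades. The decadal values below are invented,
# chosen only to reproduce the reported 0.5 F rise.
dec_1970s, dec_1980s, dec_1990s, dec_2000s = 52.0, 52.5, 53.0, 53.5  # deg F

old_normal = (dec_1970s + dec_1980s + dec_1990s) / 3  # 1971-2000
new_normal = (dec_1980s + dec_1990s + dec_2000s) / 3  # 1981-2010

# For the normal to rise 0.5 F, the 2000s must be 3 x 0.5 = 1.5 F
# warmer than the 1970s -- consistent with Karl's "about 1.5 degrees
# F warmer than the 1970s" quoted above.
assert abs((new_normal - old_normal) - 0.5) < 1e-9
assert abs((dec_2000s - dec_1970s) - 1.5) < 1e-9
```

So the answer is no: a 0.5 °F rise in the normal requires the swapped-in decade to be about 1.5 °F warmer than the swapped-out one, not 0.5 °F.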
Making a dream come true.
“boballab says: July 7, 2011 at 11:03 am No the 30 year period was not picked for Stats purposes as shown in the WMO guidelines:”
Thanks for the details of why n = 30 was used vs. what could (or should) be used.
Nice way of putting it, but even GISS admits that there were more stations in the database for 1961-90 and 71-2000, as shown in Hansen et al. 1999 and their Figure 1:
http://pubs.giss.nasa.gov/docs/1999/1999_Hansen_etal.pdf
This is the same graph which can be seen on the Station Data page:
As to the meme you spout that “the loss of stations doesn’t matter”: it is contradicted not only by the WMO but also by GISS in that same paper. It seems you left a very important part out:
Last time I looked, the Arctic region doesn’t even come close to “dense coverage,” and that is one area where GISS finds its greatest warming, and where they even admit that poor coverage affects their estimates up there.
The problem with adding on a decade and then changing the normals is that you have a twenty-year counterweight. The fact that the last 10 years have been cooling is being downsized by the previous hot period (a period of solar max activity); ten years from now, the “hottest” ten years still in the picture will probably make the cooling period appear flat. Why not have a more permanent “climatological” base of a much longer period? How else are we going to evaluate what is really going on? The reason we started with 30-year periods is because at the time there was only 30 years of data (duh). I believe 1850-1950 would do, or match up a warm period on one end and a cold one on the other (1900-2000?). This would give a permanent baseline to measure what is happening going forward. The people who have been supporters of CAGPCW (Catastrophic Anthropogenic Goal Post Changing Warmists) appear to love layering it with game-changing, horizon-shifting, upside-down-trending complications. They don’t really want to see what is really happening.
I read here somewhere that US temperatures have been noticeably declining for 15 years. I hope someone can post a chart showing that.
This is another piece of outrageous propaganda from the Climate Central Planning Office (formerly NOAA).
PS: This chart would help to debunk the claim that recent extreme weather events in the US are related to rising temperatures.
R. Gates says: “… It will be interesting to see how the decade of 2010-2019 compares to the previous decade of 2000-2009. This will be a good test for knowing much more about the sensitivity of the climate to rapid increases in CO2 brought about by human activity.”
OK, so now can you persuade the developed world’s politicians to wait for the test results, before levying huge fines on all of us.
If you do not believe station dropout has any effect on temperature, I suggest you read this…
http://scienceandpublicpolicy.org/images/stories/papers/originals/surface_temp.pdf
And if you do not want to digest the entire report, just check out page 18.