This is from NOAA; I missed it last week while I was at ICCC6. But wait, there’s more! Have a look at the maps I produced below the “Continue reading>” line using NOAA’s own tools.
Average U.S. temperature increases by 0.5 degrees F
New 1981-2010 ‘normals’ to be released this week
June 29, 2011
Statewide changes in annual “normal temperatures” (1981 – 2010 compared to 1971 – 2000).
Download here. (Credit: NOAA)
According to the 1981-2010 normals to be released by NOAA’s National Climatic Data Center (NCDC) on July 1, temperatures across the United States were, on average, approximately 0.5 degrees F warmer than during the 1971-2000 period.
Normals serve as a 30-year baseline average of important climate variables, used to understand average climate conditions at any location and to provide a consistent point of reference. The new normals update the 30-year averages of climatological variables, including average temperature and precipitation, for more than 7,500 locations across the United States. This once-a-decade update will replace the current 1971–2000 normals.
In the continental United States, every state’s annual maximum and minimum temperature increased on average. “The climate of the 2000s is about 1.5 degree F warmer than the 1970s, so we would expect the updated 30-year normals to be warmer,” said Thomas R. Karl, L.H.D., NCDC director.
Using standards established by the World Meteorological Organization, the 30-year normals are used to compare current climate conditions with recent history. Local weathercasters traditionally use normals for comparisons with the day’s weather conditions.
In addition to their application in the weather sector, normals are used extensively by electric and gas companies for short- and long-term energy use projections. NOAA’s normals are also used by some states as the standard benchmark by which they determine the statewide rate that utilities are allowed to charge their customers.
The agricultural sector also heavily depends on normals. Farmers rely on normals to help make decisions on both crop selection and planting times. Agribusinesses use normals to monitor “departures from normal conditions” throughout the growing season and to assess past and current crop yields.
NCDC made many improvements and additions to the scientific methodology used to calculate the 1981-2010 normals. They include improved scientific quality control and statistical techniques. Comparisons to previous normals take these new techniques into account. The 1981-2010 normals provide a more comprehensive suite of precipitation and snowfall statistics. In addition, NCDC is providing hourly normals for more than 250 stations at the request of users, such as the energy industry.
Some of the key climate normals include: monthly and daily maximum temperature; monthly and daily minimum temperature; daily and monthly precipitation and snowfall statistics; and daily and monthly heating and cooling degree days. The 1981-2010 climate normals are one of the suite of climate services NOAA provides to government, business and community leaders so they can make informed decisions. NOAA and its predecessor agencies have been providing updated 30-year normals once every decade since the 1921-1950 normals were released in 1956.
NOAA’s mission is to understand and predict changes in the Earth’s environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources. Join us on Facebook, Twitter and our other social media channels.
==============================================================
The interesting thing about baselines is that by choosing your baseline, you can make an anomaly from that baseline look like just about anything you want it to be. For example, the NOAA Earth System Research Laboratory offers a handy dandy map creator for US Climate Divisions that allows you to choose the baseline period (from a predetermined set they offer), including the new 1981-2010 normals released on July 1st.
Here are some examples I get simply by changing the base period.
Here’s the most current (new) base period of 1981-2010. Doesn’t look so bad does it?
Now if we plot the entire period, look out, we are on fire! Global warming is running amok!
Oh, wait, look at the scale. Looks like about 0.08 to 0.13F warming. The ESRL plotting system arbitrarily assigns a range to the scale, based on data magnitude, but keeps the same color range.
The point of this? Temperature anomalies can be anything you want them to be: by forcing a choice of baseline, they force you to cherry-pick periods in your presentation of the data. The choice is up to the publisher.
Now the next question is, with NOAA/NCDC following standard WMO procedure for a new 30-year base period (as Roy Spencer at UAH did a few months back), will NASA GISS start using a modern normals baseline for their publicly presented default graphs, rather than the cooler and long-outdated 1951-1980 baseline for the press releases and graphs they foist on the media? Of course, for a variety of reasons, they’ll tell you no. But it is interesting to see the rest of the world moving on while GISS remains stoically static. In case you are wondering, here is the offset difference based on the 1951-1990 vs. the 1951-1980 base period. This will of course change again with a more recent baseline, such as NOAA has adopted. While the slope won’t change, the magnitude of the offset will. More here.

The issue is about public presentation of data. I figure if making such a baseline change is something NOAA does, should not NASA follow in the materials they release to the press?
Anomalies – any way you want them:
Anthony
This question is fairly boneheaded and I am almost embarrassed to ask it, but I’m having a brain-fade day and am a bit confused. It’s in regard to the set of anomaly maps in your post. My question is: in the maps, which set functions as the baseline and which provides the anomaly values? When I first looked at them I assumed the longer record was the baseline and the shorter record was the anomaly, but when I came back to them the linguistic structure of the headings sort of suggests the reverse. Trying to reason it out was just adding to my already screaming headache, so I thought I’d just ask.
Dave Wendt:
The part that says “Versus the long term average” is the baseline period
– Anthony
Steven Mosher says:
July 7, 2011 at 12:24 pm
“When people see that station count and base period MAKE NO SUBSTANTIAL DIFFERENCE
then perhaps we can get the crowd focused on the key issue which is sensitivity”
=========
Discounting that Anthony clearly shows the base period does matter (at least in the U.S. ), you call the “key issue” sensitivity.
Do you think we have a good enough grasp of sensitivity, to be burdening entire economies and their taxpayers in the effort to mitigate its effects?
Personally, I have yet to be convinced.
You’re right that farmers depend on “normals.” Our conditions here in Washington state have been colder and wetter than normal for 3 years now. Thermal units are lower, the growing season has been shorter, and crops have been lost or damaged all over the state.
A +0.5F step change, my foot. Your 1981-2010 baseline version looks about right to those of us on the ground.
The funny thing is that this new, warmer normal will make any upcoming cooling look more significant and harder for them to hide. If the weather/climate cycles replicate the cooler ’70s, then compared to the newer normals the cooling will be much more dramatic.
I will enjoy laughing at the AGW folks when they shoot themselves in the foot due to their own data manipulations.
Larry
Roger Knights says:
July 7, 2011 at 2:18 pm
I read here somewhere that US temperatures have been noticeably declining for 15 years. I hope someone can post a chart showing that.
_____
If they do, it would be a fabrication as this is most definitely not true.
Here is the US Monthly Temperature Anomaly (in Celsius and using an 11 month moving average because US temperatures on a monthly basis are quite variable – last month May, 2011 was
-0.6C but March, 2011 was +0.8C)
http://img69.imageshack.us/img69/788/ustempscmay11.png
Have a look at these – Going to the Sun Road
Summer June 2006 – http://www.flickr.com/photos/jacdupree/178695793/
Summer July 2011 – http://www.flickr.com/photos/glaciernps
Certainly doesn’t seem to be much global warming here this year.
Bill;
My eyeball trendline analyser sez the slope is -0.5°C/decade since 1995.
Purely an illusion and lie, of course. Just ask R. Gates!
u.k.(us) says:
July 7, 2011 at 3:14 pm (Edit)
Steven Mosher says:
July 7, 2011 at 12:24 pm
“When people see that station count and base period MAKE NO SUBSTANTIAL DIFFERENCE
then perhaps we can get the crowd focused on the key issue which is sensitivity”
=========
Discounting that Anthony clearly shows the base period does matter (at least in the U.S. ), you call the “key issue” sensitivity.
Do you think we have a good enough grasp of sensitivity, to be burdening entire economies and their taxpayers in the effort to mitigate its effects?
Personally, I have yet to be convinced.
##########################
Let me be clear.
The base period does not matter mathematically to the calculation of trends.
it does not matter.
You have a series of temperatures x1,x2,x3,x4,,,,,xn
To turn it into anomalies or NORMALS you subtract a series of constants.
Then you look at TRENDS
If you change the base period you are changing the constants. This has NO EFFECT WHATSOEVER on the calculation of the trends.
Now, Anthony’s point with this has always been the “marketing” aspect. If you pick a cool period for the base, you get “redder” colors. Pick a warm period, you get “bluer” colors.
BUT, it makes no difference to the MATH OF TRENDS.
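That constant-subtraction claim is easy to check numerically. Here is a minimal sketch with made-up data (nothing below is NOAA’s; the series and baselines are invented purely to illustrate the point):

```python
import numpy as np

# Synthetic annual-mean series: a 0.04 F/yr trend plus noise.
years = np.arange(1971, 2011)
rng = np.random.default_rng(42)
temps = 52.0 + 0.04 * (years - 1971) + rng.normal(0, 0.3, years.size)

# Two different 30-year baselines -- each is just a constant.
base_a = temps[0:30].mean()    # "1971-2000" normal
base_b = temps[10:40].mean()   # "1981-2010" normal

# Fit a linear trend to the raw series and to both anomaly series.
slope_raw = np.polyfit(years, temps, 1)[0]
slope_a = np.polyfit(years, temps - base_a, 1)[0]
slope_b = np.polyfit(years, temps - base_b, 1)[0]

# The anomaly values shift, but all three fitted slopes are identical.
assert abs(slope_raw - slope_a) < 1e-12
assert abs(slope_a - slope_b) < 1e-12
```

Pick any baseline you like: the map gets redder or bluer, but the fitted slope is the same number.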
My larger point is this. A lot of ink and energy is spent on these marketing issues.
That distracts people from the most important issue. that issue is sensitivity.
That is where THE BEST ARGUMENT AGAINST CAGW HAPPENS.
So I am complaining about the distractions. The IPCC doesn’t have a very solid SCIENCE STORY about sensitivity. And I think it makes MORE SENSE to focus on that than it does to focus on the colors in charts. There is only so much focus that people can have, and if they focus on tangents rather than the big issue, then that’s a lost opportunity.
Same with the temperature record. Many people have focused on BIAS when the real issue is UNCERTAINTY. Of course BIAS (they are cheating) makes for better headlines, but the real issue is uncertainty.
Glancing through the comments so far I don’t think anyone has correctly answered Anthony’s question: The point of this? Ok, here is why it is done …
If you change the baseline to a warmer ‘normal’, then all subsequent cooling is masked or at least tamped down. This would be done in anticipation of cooling temperatures which will thereby be reduced in impact. The AGW cultists will get to say: it hasn’t really cooled, just look at this trend! It’s simple really.
Running averages with shifting baselines will merely create a ratchet effect going forward. It is a game of statistics, not science.
Steven Mosher says: July 7, 2011 at 10:35 pm
Choice of 30 year base line CAN make a difference.
For averaged yearly values it will be insignificant
For monthly or daily there MAY be some significance.
e.g. monthly: if winter months are warming faster than summer months across the two 30-year periods, then averaging all 30 Januaries for the latter period gives a higher figure to difference against (and thus lower anomalies) for each individual year. Similarly, if July has not warmed as much, the July anomalies will be comparatively greater than the January ones. This must skew the data at the monthly level, and possibly therefore at the yearly level?
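The month-by-month effect can be sketched with hypothetical warming rates (the 0.03 F/yr and 0.01 F/yr figures below are invented for illustration, not measured values):

```python
import numpy as np

# Hypothetical: January warming 0.03 F/yr, July only 0.01 F/yr,
# both perfectly linear with no noise.
years = np.arange(1971, 2011)
jan = 30.0 + 0.03 * (years - 1971)
jul = 75.0 + 0.01 * (years - 1971)

# How much each month's "normal" shifts when the 30-year window moves
# from 1971-2000 (indices 0:30) to 1981-2010 (indices 10:40).
jan_shift = jan[10:40].mean() - jan[0:30].mean()
jul_shift = jul[10:40].mean() - jul[0:30].mean()

# The faster-warming month's normal shifts more (0.3 F vs 0.1 F here),
# so each month's anomalies are offset by a different constant...
print(round(jan_shift, 2), round(jul_shift, 2))  # 0.3 0.1

# ...but within any one month the trend is unchanged by the baseline:
slope = np.polyfit(years, jan - jan[10:40].mean(), 1)[0]
print(round(slope, 3))  # 0.03
```

So the baseline change re-levels each month differently (which matters for “departure from normal” comparisons), while per-month trends are untouched.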
0.5F sounds about right for the continental US. It won’t be anywhere near that large globally of course since, contrary to urban AGW legend, greenhouse gases don’t raise ocean surface temperatures. The GHG effect is a land-only phenomenon. You CANNOT heat a body of water by shining long wave infrared radiation onto the surface. The energy is immediately carried away through evaporation. If the climate boffins would simply acknowledge that physical fact they would know why they can’t find the so-called “missing heat”. It’s missing because it never entered the ocean. Thus about 2/3’s of the theoretical surface warming from increased GHGs is absent because about 2/3’s of the earth’s surface is a body of water. You can heat dirt with LWIR but you cannot heat water with it.
I tried plotting the data for some of the base periods against the base periods themselves. (The first instinct of any software tester – start with the simplest case!) Try Jan-Dec 1961-1990 versus 1961-1990 longterm average, for example.
I expected the map to be uniformly white, since the difference between any function and itself is – ZERO.
But what I got, in each case, was a mixture of yellow and white. Small discrepancies, less than +-0.03F. But still discrepancies.
Watts up with that?
Finally got the answer to my last post myself (thank you to the wonderfully helpful people on this site, who apparently are all on vacation).
The national NOAA-NCDC data are here (took forever to find):
http://www7.ncdc.noaa.gov/CDO/CDODivisionalSelect.jsp
Tossing the monthly values into a spreadsheet and averaging the periods yields the following:
1971-1980 52.5°F
1981-1990 53.2
1991-2000 53.6
2001-2010 54.0
I forgot about the “averaging by three”; there’s a 1.5°F difference, not 0.5.
So according to NOAA-NCDC, 2001-2010 was 1.5°F (0.8°C) warmer than 1971-1980. Seems high. Averaging the decadal periods together, the 1971-2000 value was 53.1°F, 1981-2010 was 53.6, showing the 0.5°F difference in normals as mentioned above.
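The “averaging by three” reconciliation is simple arithmetic on the decade averages quoted above:

```python
# The decade averages quoted above (contiguous-U.S. annual means, deg F).
d70s, d80s, d90s, d00s = 52.5, 53.2, 53.6, 54.0

# Each 30-year normal is just the mean of three consecutive decade averages.
normal_1971_2000 = (d70s + d80s + d90s) / 3   # 53.1
normal_1981_2010 = (d80s + d90s + d00s) / 3   # 53.6

print(round(normal_1981_2010 - normal_1971_2000, 1))  # 0.5 -- shift in the normals
print(round(d00s - d70s, 1))                          # 1.5 -- decade-to-decade change
```

Both numbers are “right”; the 0.5°F figure is the 1.5°F decadal change smoothed by the overlapping 30-year windows.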
===
From Steven Mosher on July 7, 2011 at 10:35 pm:
Dang, I was just averaging, didn’t subtract any constants. And still got the right answer.
===
From Brian H on July 7, 2011 at 8:26 pm:
Using the monthly numbers I found to make yearly averages (violating dozens of official data-massaging Climate Science™ rules, I’m sure), I graphed the results and let the spreadsheet figure out the linear fit. From 1995 to 2010 inclusive, the slope is 0.027°F/yr, 0.15°C/decade. From 1971-2010 inclusive, the range I was examining, 0.044°F/yr, 0.24°C/decade. Looked at this way, the rate of increase has sure slowed down.
If looking at just the last full decade, 2001 to 2010 inclusive, the slope is –0.093°F/yr, –0.52°C/decade. Looks like a strong cooling signal.
Perhaps your eyeball trendline analyzer just got the starting year wrong?
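The start-year sensitivity described here is easy to demonstrate. A minimal sketch with synthetic data (the numbers below are invented, merely shaped like the behavior described above, not NOAA’s record):

```python
import numpy as np

# Synthetic annual means: steady warming through 2000, then a cooling decade.
years = np.arange(1971, 2011)
temps = np.where(years <= 2000,
                 52.3 + 0.045 * (years - 1971),   # warming phase
                 53.9 - 0.05 * (years - 2001))    # cooling final decade

slope_full = np.polyfit(years, temps, 1)[0]                   # 1971-2010 trend
recent = years >= 2001
slope_last = np.polyfit(years[recent], temps[recent], 1)[0]   # 2001-2010 trend

# Same data, opposite signs, purely from the choice of start year.
print(slope_full > 0, slope_last < 0)  # True True
```

Which is the whole point: an eyeball (or least-squares) trend over a short window can flip sign relative to the long record without any contradiction.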
Steven Mosher says:
July 7, 2011 at 10:35 pm
Let me be clear……………
==========
Well said, and I agree.
Thanks for the expansive comment, it is duly noted.
Why do we have the 30 year convention?
boballab quotes WMO:
The WMO is right in dating the 30-year convention to the early 20th century (when I think it was adopted by the IMO), but the data-availability argument (even if it was made at the time) requires more explanation. At the time it was commonly recognised that thermometer data availability, and quality, varied enormously across the globe. But certainly there were many European and colonial records going back much more than 30 years, and a significant set that had ticked over the 100-year mark.
We would need to see some evidence as to why the 30-year line was regarded as especially significant in terms of data availability at this time. Charles Schott, in a famous early statistical intervention in the climate change controversy, used data across a 100-year time span. That was in 1899, and in an argument against the widespread belief in localised anthropogenic climatic change.
Early statistical climatology tended to argue against climate change; against both popular belief in anthropogenic change and against the new findings of geologists supporting natural variation on an historical scale. Later, in hindsight, Hubert Lamb saw the position of the statisticians as rather too convenient. If climate varied on all scales, from the human scale to the geological scale, then finding statistical norms becomes difficult. Given their conclusion/assumption of no climate change, we would expect the statisticians to choose a period that could mostly flatten what they saw as random yearly variation, and 30 years might have been minimally sufficient for that.
Lamb’s account of statistical climatology, as he discovered it in the 1940s, was of little more than the “book-keeping branch of meteorology”; this fits with Richard Holle’s more pragmatic explanation of the 30-year convention:
Interesting. I would love to know more!
But perhaps there was some influence from the other side, from geology and folklore. It was in the 1890s that Bruckner used thermometer and proxy data to present a case for a variable 35-year global climate cycle. And it was soon noted that this revived a piece of folklore, as recorded and supported by Francis Bacon in the early 17th century:
Of Vicissitude of Things
kadaka;
Could be, in which case, OMG it’s more worser than I thought! About -4.717K by 2100!! (Give or take about 4K.)
😉
P.S. The above was calculated strictly following the procedures vaguely outlined in the Engineering Handbook of Pretend Precision.
berniel said at 11:26pm (7/8/11)
“…. this fits with Richard Holle’s more pragmatic explanation of the 30 year convention:……..” Reminds me the joys of figuring out how much data to collect back in the days of paper records were it actually took a fair amount of time (and $) to collect historical bits of information. Data mining the old fashioned way was time consuming! Even worse for those of us that couldn’t type well was putting those bits of information into a format to evaluate it- I think I spent more time fixing the errors in my punch cards then I did in originally putting them together for the compiler/card reader. I recall giving up on punch cards once HP came out with their statistical calculator. The joy of using cricket graph vs. the alternatives for presenting information come to mind as well.
Berniel, thanks for the Bacon reference! I am sure he would find the discussions on cause and effect in the climate, and the models we develop to explain it, of interest. If anyone would know the difference between correlation and causation, it would be Bacon.