Announcing the first ever CONUS yearly average temperature from the Climate Reference Network

UPDATE: NOAA plans to release SOTC at 1PM EST today. Look for updates soon and a special report on today’s release. The map below will automatically update when we have the new December COOP Tavg value, probably later today. I’ll have another post on the differences between the CRN and COOP in the near future. – Anthony

Pursuant to our previous story showing issues with diverging data and claims over time, NCDC has updated the Climate Reference Network Data for December 2012. I’m still waiting on the NCDC State of the Climate report to come in with their number, and I’ll update the graphic (in yellow) when it is available.

Being a state-of-the-art system, it is well sited, requires no adjustments, and its data are well distributed spatially by design so that they are representative of the CONUS. Here's the current plot (click to enlarge):

Each small number in blue represents one of the NCDC-operated U.S. Climate Reference Network stations in the CONUS that we use. Here are the data reports for December and the entire year:


2012 Average Monthly Reports – text files

Source for all data:


The December report looks like this:


TOTALS

Totals for T_MONTHLY_MEAN (column 8) = 296.7

Totals for T_MONTHLY_AVG (column 9) = 302.4

Total Number of CRN Stations Included in this Report = 116 out of 117 CONUS stations possible (stations with missing data excluded – see below)


Average of T_MONTHLY_MEAN Totals = 296.7 / 116 = 2.55775862068965, or 2.6° C

Average of T_MONTHLY_AVG Totals = 302.4 / 116 = 2.60689655172414, or 2.6° C

Average of T_MONTHLY_MEAN Totals in Fahrenheit = (2.55775862068965 * 1.8) + 32 = 36.6039655172414, or 36.6° F

Average of T_MONTHLY_AVG Totals in Fahrenheit = (2.60689655172414 * 1.8) + 32 = 36.6924137931034 or 36.7° F

SUMMARY

National Average of Monthly Mean Temperatures = 2.6° C or 36.6° F
National Average of Monthly Average Temperatures = 2.6° C or 36.7° F

EXCLUDED STATIONS

The following stations reported no data (-9999.0) for either T_MONTHLY_MEAN or T_MONTHLY_AVG and were not used:



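The arithmetic in the report above is simple enough to sketch in a few lines. This is a minimal illustration of the calculation, not the actual processing code; it assumes station values arrive as plain floats with -9999.0 as the missing-data flag, as in the NCDC monthly files:

```python
MISSING = -9999.0

def conus_monthly_mean_c(station_means_c):
    """Average the per-station monthly means (deg C), skipping -9999.0 flags."""
    valid = [v for v in station_means_c if v != MISSING]
    return sum(valid) / len(valid)

def c_to_f(temp_c):
    """Convert Celsius to Fahrenheit."""
    return temp_c * 1.8 + 32

# Sanity check against the December totals quoted above:
# 296.7 deg C over 116 stations -> ~2.6 C, ~36.6 F
mean_c = 296.7 / 116
print(round(mean_c, 1), round(c_to_f(mean_c), 1))  # 2.6 36.6
```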
From the NCDC-provided FTP data files we can calculate a yearly CONUS Tavg, which to my knowledge has never been done before by NCDC. Odd that it falls to somebody outside the organization, don't you think?

Climate Reference Network Data for 2012

Month Tavg
1 36.8
2 38.1
3 50.6
4 54.8
5 63.3
6 70.8
7 75.6
8 72.9
9 65.6
10 53.9
11 43.9
12 36.7
Sum 663
/12 55.25

Therefore, from this data, the Average Annual Temperature for the Contiguous United States for 2012 is 55.25°F.
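The table above reduces to a one-line check. A quick sketch, using the twelve monthly values as listed:

```python
# Monthly CONUS Tavg values (deg F) from the table above.
monthly_f = [36.8, 38.1, 50.6, 54.8, 63.3, 70.8, 75.6, 72.9,
             65.6, 53.9, 43.9, 36.7]

annual_f = sum(monthly_f) / 12
print(round(annual_f, 2))  # 55.25
```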

Note also the CRN value for July 2012, 75.6°F, far lower than the 77.6°F NCDC reported in the SOTC and the 76.93°F that appeared later in the database, as discussed here.

Makes you wonder why NCDC never mentions their new state-of-the-art, well-sited climate monitoring network in those press releases, doesn't it? The CRN has been fully operational since late 2008, and we never hear a peep about it in the SOTC. Maybe they don't wish to report adverse results.

I look forward to seeing what NCDC comes up with for the Cooperative Observer Network (COOP) in their “preliminary” State of the Climate Report for Dec 2012 and the year, and what the final number will be in 1-2 months when all the data from the COOP network comes in.

I’ll have more on this in the near future. I’ll be offline for the rest of the day traveling.

UPDATE: 10:30PM PST, Climatebeagle and others have been puzzled over the 117 stations used and can't reconcile them with the larger list. Here's the logic:

Some stations, such as those at Oak Ridge, TN and Sterling, VA, were removed because they do not report regularly or at all (they are test sites). The one CRN station in Egbert, Ontario, Canada is not part of the CONUS and is removed also. None of the stations in Alaska are used, as they are also not part of the CONUS.

Here is the list: conus_stations_master_list_1-8-13 (PDF)

UPDATE2: 9:30AM PST, 1/8. Reader Lance Wallace noted a mistake, which has to do with version control on our end. One CRN station in Egbert, Ontario was inadvertently included in the monthly code, where it was not in the daily code we run. We'll rerun it all and update. I'm thankful for the many eyes of WUWT readers – Anthony



Nitpick, my apologies:

“NCDC has updated the Climate Reference Network Data for December 2012. I’m still waiting on the NCDC State of the Climate report to come in with their number, and I’ll update the graphc when it is available…”


Hmm another “Hoist with their owne Petard” moment…


You need to start putting in invoices to the US Government for all the work that they can’t be bothered to do for themselves.


Global Warming Has Left The Building!

jonny old boy

Nice Work….. So it looks like a SECOND state high temperature record being broken this century may be less likely…. 😉 Poor old South Dakota !


So impressive, Anthony, congratulations!! Looking forward to seeing the implications of this…. So the CRN is proving inconvenient to someone’s Cause….


Don't you need to multiply each monthly average by the number of days in that month, add all the months up, and then divide by 365 or 366 to get an unbiased average?
REPLY: Already handled in code; we took each station's monthly Tavg (which NCDC calculates from daily data) and calculated a CONUS monthly Tavg. All the data is there in case anyone wants to replicate it independently. – Anthony


Anthony – Did Al Gore share some of his Big Oil money with you?

Lance Wallace

For those interested in the CRN network, there is a recent article summarizing the first 10 years of operation.

Mike Smith

Oh dear. This isn’t looking good. Not good at all.
Methinks someone has a lot of explaining to do.
And that should be interesting. Time to order in more popcorn!


I’ve been simple averaging the USCRN hourly data and 2012 consistently comes out as the hottest US year regardless of the stations. E.g. hottest since 2003 when only looking at stations with a complete record since 2003, hottest since 2008 when only looking at stations with a complete record since 2008 etc.
I’m not actually a great believer in averaging temps, but should a spatially weighted average be used, rather than a simple one? E.g. a couple of areas have two nearby stations rather than just one.


This will help keep the B*st*rds honest…

Not sure if that's how you calculate the average, adding the monthly values and dividing by 12. If I go for the weighted average, i.e. including the month duration in days, I get 55.305 degrees F.
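The day-weighted figure mentioned above is easy to verify. A quick sketch, assuming the twelve monthly values from the post and 2012's leap-year month lengths:

```python
# Monthly CONUS Tavg (deg F) from the post, with 2012 (leap year) day counts.
monthly_f = [36.8, 38.1, 50.6, 54.8, 63.3, 70.8, 75.6, 72.9,
             65.6, 53.9, 43.9, 36.7]
days = [31, 29, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]

simple = sum(monthly_f) / 12
weighted = sum(t * d for t, d in zip(monthly_f, days)) / sum(days)
print(round(simple, 2), round(weighted, 3))  # 55.25 55.305
```

The two answers differ by only a few hundredths of a degree, so the choice of weighting barely moves the annual number here.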


On the hourly numbers I calculated a USCRN 2012 yearly average of 12.1°C or 53.7°F, but I think my list of USCRN stations is different, since I have 124. Probably at least because I'm using all the USCRN stations, thus not the same set as CONUS.
However, is there a listing of WBANNO numbers for the USCRN stations? I haven’t found a simple list online, thus generated one manually and may have made mistakes.


OK, am I going stupid or have I missed something? Is Anthony really saying that one set of records is about 21 degrees F lower than the other(s)? I must be wrong, I know, as somebody must have noticed.
OK I’ll get on this link when the crack wears off and find my mistake.


A much needed corrective to NASA & NOAA’s cooked books. Appears close to one station per 26,666 sq. miles (with a few gaps), so, as you note, automatically adjusted for elevation, urban, rural & all other parameters.


Contacted my Congressman and requested a GAO audit of the NOAA/NCDC temp reporting practices. Hansen, Anthony… touch em up.

Paul Marko

December’s sun was really quiet. SSN ~ 40; 10.7 ~ 108; Ap ~ 3. Dalton or Maunder?

John F. Hultquist

The higher number you see is for July, not the year.

A man with a watch knows what time it is.
A man with two watches is never sure….
So at a minimum this says that “station selection” has a 2 F variation in it. So much for the assertion that “station dropout” doesn’t matter…
That the NCDC/SOTC data / report is 2 F warmer and the CRN stations are supposed to be ‘the best’ strongly implies that the NCDC/SOTC data are skewed high by 2 F. As that is more than the “Global Warming” they claim to have detected, that ought to mean we are colder now rather than warmer.
As I’m experiencing a colder winter than in the ’90s that accords with my ‘reality check’.
Looks to me like it’s pretty clear that “Global Warming” is an instrument error artifact.

Liberal Skeptic

Someone, somewhere, has massively cocked SOMETHING up if there is a difference as large as that between this temperature record for 2012 and the old record for 2012. (I make it about 10 degrees centigrade??)
A deeper investigation has to be done.

Liberal Skeptic

^ Actually not 10C, got confused by the previous story.
Still a major difference, enough to put global warming scare stories into doubt, at least in the United States. And the temperature record for the rest of the world can't be of any better quality either.



Berényi Péter

using the same dataset we get annual average contiguous US temperatures for the last 5 years
2008 52.57°F
2009 52.48°F
2010 52.98°F
2011 53.21°F
2012 55.25°F
Trend is +61°F/century, truly worse than we thought
/sarc off
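The quoted slope can be reproduced with an ordinary least-squares fit through the five annual values. A sketch:

```python
# Annual CONUS averages (deg F) as listed in the comment above.
years = [2008, 2009, 2010, 2011, 2012]
temps = [52.57, 52.48, 52.98, 53.21, 55.25]

n = len(years)
x_mean = sum(years) / n
y_mean = sum(temps) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(years, temps))
         / sum((x - x_mean) ** 2 for x in years))
print(round(slope * 100))  # ~61 deg F per century
```

Five points is far too short a record for a meaningful trend, which is of course the joke.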

Lance Wallace

Anthony and Climate Beagle–
Using the CRN monthly dataset, I get rather different numbers for 2008-2011 (about 52 F) compared to Anthony’s number of 55 for 2012. I haven’t downloaded the data for 2012 yet.
Here are the values for 2008-2011. These are obtained by averaging across all months for a year rather than averaging across each month and then dividing by 12 the way Anthony did, although I would think this would not make much difference.
year sites months mean (F) Std. Err. (F)
2008 112 1382 51.9 0.50
2009 114 1449 51.7 0.49
2010 116 1467 52.0 0.49
2011 116 1493 51.9 0.50
Incredible stability for those four years!


E.M.Smith says:
Looks to me like it’s pretty clear that “Global Warming” is an instrument error artifact
LOL – that is one way to look at it. Other ways are to call it human error, hubris, pseudoscience or just plain wrong.
Thanks Anthony !! I love it when you skewer NOAA with their own “data”. Great work, and sorely needed these days.


Last file listed contains this message:
** This is a United States Department of Commerce computer **
** system, which may be accessed and used only for **
** official Government business by authorized personnel. **
** Unauthorized access or use of this computer system may **
** subject violators to criminal, civil, and/or administrative **
** action. All information on this computer system may be **
** intercepted, recorded, read, copied, and disclosed by and **
** to authorized personnel for official purposes, including **
** criminal investigations. Access or use of this computer **
** system by any person, whether authorized or unauthorized, **
** constitutes consent to these terms. **
[Really ? . . mod]

This is hugely important, it is proof of conspiracy. The “hottest year ever” will still be announced, I fear, I think they figure more people will see that than read here. What they haven’t figured out yet is that the general public are losing interest, and rapidly. Particularly as COLD weather and record snows are causing such inconvenience everywhere – not to mention the deaths. These guys are just ramming their foot (feet?) deeper and deeper down their throats. Someone ought to trip them up while they are in that position – they deserve to fall flat (and much more!).
Is this a good time to remind them what treason is?

Problem is you can't compare the averages without correcting for altitude differences.
For example, have a look at Roy Spencer's average using ISH. Note that he does a lapse rate adjustment. So for example, if you have 1000 stations at 500 meters above sea level and you average them, you come up with say 14C. Now average 1000 stations at sea level; guess what, the lower stations will be slightly warmer per the lapse rate.
This is especially important if you have any missing data, as that will skew the answer even more.
From the looks of it a simple average was computed while taking no account of altitude differences. Even Roy understands why that matters.
You'd be amazed what a difference of 100 meters gets you. In fact, if you take CRN data and compare it to nearby stations (one CRN has 14 ISH hourly stations nearby) you might find cases where the lower CRN station, although well sited, is warmer than the horrible ISHs at airports.
Why? Because the airports happen to be at higher, colder elevations. For reference there are around 400 ISH hourly stations within 100 km of CRN stations, so it's not that hard to illustrate.
So, before you compare averages of absolute temperature you MUST ensure that the sampling distributions come from the same altitude OR correct for lapse rate. Of course if you work in anomalies you don't have to account for this.
The environmental lapse rate is around 6.5C per 1 km, or 0.65C per 100 meters. On average the CRN stations tend to be lower in altitude than other collections of stations. Not by a lot, but precision matters; after all, if you apply imprecise methods to gold standard data, you lose what you thought to gain.
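The adjustment described in this comment can be sketched as follows. This is a minimal illustration using the ~6.5 C/km environmental lapse rate the comment cites; the function name and interface are mine, not from any actual processing code:

```python
LAPSE_C_PER_M = 0.0065  # environmental lapse rate: ~6.5 C per 1000 m

def reduce_to_reference(temp_c, elev_m, ref_elev_m=0.0):
    """Lapse-rate reduction of a station temperature to a reference
    elevation: moving down (elev_m > ref_elev_m) warms the reading."""
    return temp_c + (elev_m - ref_elev_m) * LAPSE_C_PER_M

# A 14.0 C reading at 500 m corresponds to roughly 17.25 C at sea level,
# illustrating the size of bias the comment warns about.
print(reduce_to_reference(14.0, 500.0))  # 17.25
```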

Anthony, you seem to be doing all the work that NCDC does not dare to do. I hope you get your share of their Big Oil Money.

Claude Harvey

Anthony, how can I trust your numbers when they have not been statistically mutilated? Having read a wide variety of “AGW consensus papers” it is clear to me that the man-made warming signal cannot be teased out of such noisy data without very sophisticated, statistical mutilation of the raw numbers. The test for proper mutilation is that earlier-century temperatures move down from the raw values and later-century values move up from the raw figures.


Just wondering where I can find the MONTHLY CRN CONUS graphic that was used in the article as viewed above?
Is it from this website? It states that it is “coming soon”… I’m just a bit confused. Thanks.


John H.,
Thanks for the tip off – I see the July now!

Edward Martin

According to the last figure in, they were right all along.


Steven Mosher – What is wrong with having the least amount of data alteration? If all the data stays as raw as possible, doesn’t that remove most of the criticisms of the mainstream data sets? Also, given the sheer fact that it is early afternoon on the east coast and mid-morning on the west coast, there are differences in how much of the day’s temperature change has already happened and how much is left. All of which I’m sure can be “corrected”, but shouldn’t the goal be to have as little “corrected” or modified data as possible? Shouldn’t consistency be the most important yardstick?
The point I’m intending to make is: shouldn’t what skews the data, and in what direction, be less important than consistent data usage?


Steven M.,
Do you have to correct the lapse rate with latitude also?
It seems to me the whole exercise is fairly meaningless as a comparative measure is all that is required. There is no such thing as “absolute average temperature for a continent”. What is needed is information on whatever processes are used to generate what is in effect a “comparative average temperature”. They might in principle be doing anything.


EM.Smith “So at a minimum this says that “station selection” has a 2 F variation in it. So much for the assertion that “station dropout” doesn’t matter…”
Yep, that is a large difference. I have always suspected that the loss of data from remote stations (how did they manage that, must have been a big effort) and the “adjustments” (lol), and the lack of proper allowance for UHI and the increased reliance on urban temp data etc all contributed to the rise in the “global average urban land temperature” in the pre-1998 period
The satellite record shows two basically zero trend sections 1979-1997, and 2000 – 2012, with a step change in the middle.
I doubt there was much real warming in the 1970-1998 period at all, yet that is the period that the CAGW hoax is built on. The small rise in the land temp was all from data manipulation and lies.

Problem is you can’t compare the averages without correcting for altitude differences.

I believe that is an unnecessary complication of the problem. I do not want to know what the temperature WOULD be if the entire US were ironed smooth to sea level, I want to know what it *IS*. But more importantly, I don’t need a lapse rate adjustment for surface stations if what I am interested is trend over time. A station at 6,000 feet in Colorado will remain at 6,000 for the rest of this interglacial. I am interested in the change in temperature over time, not trying to “correct” it to sea level.
An average of all reported stations is good enough when those stations do not require a correction for station hijinks and UHI. It just is what it is. If you start messing around and adjusting for lapse rate, then that opens the door to making all sorts of other adjustments. What about a wind direction adjustment? In some places one can get a very warm condition when wind is blowing from a certain direction and one gets adiabatic warming from air flowing downhill (Chinook or föhn winds). Conditions are quite different when the wind blows in the other direction.
No, let’s just leave things as they are. Part of the temptation to do this comes from the desire to create a “fill” value where one is missing. That’s bogus, too, because in many parts of the country there are microclimates that make doing that an exercise in futility anyway. Trying to “fill” a missing value at a California station, for example, by using a value from a distant station is likely to be futile. Which CRN station are you going to use to fill data for a missing value at Truckee? You will notice on the map above that no station anywhere around it has anything like Truckee’s temperatures. Once you start doing adjustments, it is over.

I am not sure you have got the correct figure for the temperature/altitude equation.
There is a certain amount of ‘it depends’ as well, due to temperature inversions and other factors

Thank you


Re: Government warning. Yes, indeed, the NOAA ftp site link given at the end of this post does indeed lead you to a file directory with the scary U.S. Gov’t warning message. Do I need to start looking over my shoulder?


See, the “Mosher” is an unhappy bunny. It’s not just about this data; it’s the way you AGW crowd have got it wrong again and again. After the Aqua satellite debacle this should have been over in 2002. Has anyone ever discovered the missing heat in the troposphere? In 2004 AGW became “Climate Change”; after all, it sounded so much better than “Global Warming Freezing”, which was what some of your comedians were debating (refer to “Climategate” on this site). Today the MET Office reckons no warming again for a long time yet; the technical reason for this is that the Hansens, Gores, Kings, Manns, Trenberths, Joneses et al. will be long gone.

Crosspatch has it right.
CONUS? The man drew you a pic.
The hardest job in the world: picking fly specks out of black pepper.

Lil Fella from OZ

Just some things money cannot buy. Thanks Anthony


I agree with crosspatch, it depends on what you are trying to get an average of. An average of the actual US surface seems more valuable than an average at a fixed height above sea-level.

Lance Wallace

One explanation for the somewhat different averages listed above is the very confusing choice made by the CRN data group of descriptions of two different numbers. One is the “traditional” Tavg = (Tmin+Tmax)/2. This is described as T_Monthly_mean in their data description quoted below. The other is the average of all available “continuous” measurements, i.e. the hourly averages across the entire month. This they call T_monthly_avg. This latter measure is much closer to what most people would consider the “true” average. I discussed the difference (with maps of all the stations) in a guest post or two a few months ago. We can see for example that almost all stations have a consistent difference across all years (either positive or negative), averaging about 0.5 C, between the “traditional” average from the min and max measurements and the better estimate using hourly averages.
This is the CRN definition of the two terms.
cols 57 — 63 [7 chars] T_MONTHLY_MEAN
The mean temperature, in degrees C, calculated using the typical
historical approach of (T_MONTHLY_MAX + T_MONTHLY_MIN) / 2
cols 65 — 71 [7 chars] T_MONTHLY_AVG
The average air temperature, in degrees C, for the month. This average
is calculated using all available day-averages, each derived from
24 one-hour averages. To be valid there must be less than 4 consecutive
day averages missing, and no more than 5 total day averages missing.
The difference between the N of 116 and 124 is basically that 7 locations have 2 stations (in one case 3) sited close by. My calculations weight every site equally, so actually have 124 sites in 116 locations (probably should have made that clear earlier, since my table listed only the 116 locations.) I believe this is better than averaging the two sites at the same city, since they are not true duplicates but often separated by some miles, and may be in quite different locales and subject to different meteorological conditions.
I think that properly we should just look at the individual stations to see how they are varying. Also since CRN is so recent, and “trends” are limited to about 4 years (4 datapoints) for the full network, I can’t think that a trend analysis would mean a thing until after about a decade or two. I do hope that this network, well planned and maintained as it is, will retain funding for the future.
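The T_MONTHLY_AVG validity rule quoted in the definition above is easy to express in code. A sketch, representing a missing day-average as None (my convention for illustration, not CRN's):

```python
def t_monthly_avg_valid(day_avgs):
    """Apply the CRN rule quoted above: the monthly average is valid only
    if fewer than 4 consecutive day-averages are missing and no more than
    5 are missing in total."""
    missing = [d is None for d in day_avgs]
    if sum(missing) > 5:
        return False
    longest = run = 0
    for m in missing:
        run = run + 1 if m else 0
        longest = max(longest, run)
    return longest < 4

# 3 consecutive missing days: still valid; 4 consecutive: invalid.
month = [10.0] * 27 + [None] * 3
print(t_monthly_avg_valid(month))           # True
print(t_monthly_avg_valid(month + [None]))  # False
```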


That will change the absolute values but not the slopes when plotted as a function of time, correct?
So, if you want an “accurate” “average” temperature then altitude will matter but if you just want the trend it should not matter, correct?

Mike Bromley the Canucklehead back in Kurdistan

Glad to have all the siting just right, but: does an average temperature of something that covers subtropical, alpine and temperate zones really have any meaning? Crickets….

“Did Al Gore share some of his Big Oil money with you?”
I’ll bet he just put it with the Big Tobacco money and the Anti-Big Tobacco money he has already taken from both sides.


Elevation, UHI, and various other surface temperature data tweaks are red herrings. An “average global temperature” based on the atmosphere does not account for humidity, wind kinetic energy, and latent heat. No matter how many corrections we make, we’ll never have a meaningful apples-to-apples comparison.
Worse yet, this so-called “global temperature,” even with the best corrections for elevation, etc., is irrelevant compared to the energy balance of the oceans. The thermal capacity (BTU/°F) of the oceans is about 1100 times greater than the atmosphere’s.
Instead, we’re trying to measure the most transient part of the system, the one with the most noise. Is it any wonder the climatasters are using arcane statistical methods to tease out a signal?