UPDATE: NOAA plans to release SOTC at 1PM EST today. Look for updates soon and a special report on today’s release. The map below will automatically update when we have the new December COOP Tavg value, probably later today. I’ll have another post on the differences between the CRN and COOP in the near future. – Anthony
Pursuant to our previous story showing issues with diverging data and claims over time, NCDC has updated the Climate Reference Network Data for December 2012. I’m still waiting on the NCDC State of the Climate report to come in with their number, and I’ll update the graphic (in yellow) when it is available.
As a state-of-the-art system it is well sited, requires no adjustments, and its stations are spatially well distributed by design so that they are representative of the CONUS. Here's the current plot (click to enlarge):
Each (small) number in blue represents one of the NCDC-operated U.S. Climate Reference Network stations in the CONUS that we use. Here are the data reports for December and the entire year:
==========================================================
2012 Average Monthly Reports – text files
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201201.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201202.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201203.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201204.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201205.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201206.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201207.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201208.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201209.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201210.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201211.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201212.txt
Source for all data: ftp://ftp.ncdc.noaa.gov/pub/data/uscrn/products/monthly01/
==========================================================
The December report looks like this:
==========================================================
TOTALS Totals for T_MONTHLY_MEAN (column 8) = 296.7
Totals for T_MONTHLY_AVG (column 9) = 302.4
Total Number of CRN Stations Included in this Report = 116 out of 117 CONUS stations possible (stations with missing data excluded – see below)
AVERAGING CALCULATIONS
Average of T_MONTHLY_MEAN Totals = 296.7 / 116 = 2.55775862068965 or 2.6° C
Average of T_MONTHLY_AVG Totals = 302.4 / 116 = 2.60689655172414 or 2.6° C
Average of T_MONTHLY_MEAN Totals in Fahrenheit = (2.55775862068965 * 1.8) + 32 = 36.6039655172414 or 36.6° F
Average of T_MONTHLY_AVG Totals in Fahrenheit = (2.60689655172414 * 1.8) + 32 = 36.6924137931034 or 36.7° F
SUMMARY
National Average of Monthly Mean Temperatures = 2.6° C or 36.6° F
National Average of Monthly Average Temperatures = 2.6° C or 36.7° F
EXCLUDED STATIONS The following stations reported no data (-9999.0) for either T_MONTHLY_MEAN or T_MONTHLY_AVG and were not used:
CRNM0101-PA_Avondale_2_N.txt
================================================================
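For anyone who wants to check the report's arithmetic, here is a quick Python sketch; the two totals and the station count are copied straight from the report text above.

```python
# Check the December 2012 CRN national-average arithmetic from the report.
t_monthly_mean_total = 296.7  # sum of T_MONTHLY_MEAN over included stations, deg C
t_monthly_avg_total = 302.4   # sum of T_MONTHLY_AVG over included stations, deg C
n = 116                       # 117 CONUS stations minus 1 excluded

def c_to_f(c):
    """Convert Celsius to Fahrenheit."""
    return c * 1.8 + 32

mean_c = t_monthly_mean_total / n
avg_c = t_monthly_avg_total / n
print(round(mean_c, 1), round(c_to_f(mean_c), 1))  # 2.6 36.6
print(round(avg_c, 1), round(c_to_f(avg_c), 1))    # 2.6 36.7
```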
From the NCDC-provided FTP data files we can calculate a yearly CONUS Tavg, which to my knowledge has never been done before by NCDC. Odd that it falls to somebody outside the organization, don't you think?
Climate Reference Network Data for 2012
| Month | Tavg (°F) |
|-------|-----------|
| 1 | 36.8 |
| 2 | 38.1 |
| 3 | 50.6 |
| 4 | 54.8 |
| 5 | 63.3 |
| 6 | 70.8 |
| 7 | 75.6 |
| 8 | 72.9 |
| 9 | 65.6 |
| 10 | 53.9 |
| 11 | 43.9 |
| 12 | 36.7 |
| Sum | 663.0 |
| /12 | 55.25 |
Therefore, from these data, the average annual temperature for the contiguous United States for 2012 is 55.25°F.
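The annual figure is simply the mean of the twelve monthly values in the table above, which a few lines of Python confirm:

```python
# Annual CONUS mean from the twelve CRN monthly Tavg values tabulated above.
monthly_tavg_f = [36.8, 38.1, 50.6, 54.8, 63.3, 70.8,
                  75.6, 72.9, 65.6, 53.9, 43.9, 36.7]
annual_f = sum(monthly_tavg_f) / 12
print(round(annual_f, 2))  # 55.25
```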
Note also the CRN value for July 2012, 75.6°F, far lower than the 77.6°F NCDC reported in the SOTC and the 76.93°F that appeared later in the database, as discussed here.
Makes you wonder why NCDC never mentions their new state-of-the-art, well-sited climate monitoring network in those press releases, doesn't it? The CRN has been fully operational since late 2008, and we never hear a peep about it in the SOTC. Maybe they don't wish to report adverse results.
I look forward to seeing what NCDC comes up with for the Cooperative Observer Network (COOP) in their “preliminary” State of the Climate Report for Dec 2012 and the year, and what the final number will be in 1-2 months when all the data from the COOP network comes in.
I’ll have more on this in the near future. I’ll be offline for the rest of the day traveling.
UPDATE: 10:30PM PST, Climatebeagle and others have been puzzled over the 117 stations used and can't reconcile that count with the larger list. Here's the logic:
Some stations, such as those at Oak Ridge, TN and Sterling, VA, were removed because they do not report regularly or at all (they are test sites). The one CRN station in Egbert, Ontario, Canada is not part of the CONUS and is removed also. None of the stations in Alaska are used, as they are also not part of the CONUS.
Here is the list: conus_stations_master_list_1-8-13 (PDF)
UPDATE2: 9:30AM PST, 1/8 Reader Lance Wallace noted a mistake, which has to do with versioning control on our end. One CRN station in Egbert Ontario was inadvertently included in the monthly code, where it was not in the daily code we run. We’ll rerun it all and update. I’m thankful for the many eyes of WUWT readers – Anthony

@Mosher:
So can you point me at where GIStemp does their lapse rate adjustment?
Didn’t the folks at NCDC say it doesn’t matter if stations come and go? Where is their lapse rate adjustment?
“Now average 1000 stations at sea level. guess what, the lower stations will be slightly warmer per the lapse rate.”
So is that why the GHCN consistently drops high altitude stations and replaces them with sea level stations? Nice to know. Thanks!
http://chiefio.wordpress.com/2009/11/16/ghcn-south-america-andes-what-andes/
http://chiefio.wordpress.com/2009/12/01/ncdc-ghcn-africa-by-altitude/
http://chiefio.wordpress.com/2009/11/17/ghcn-the-under-mountain-western-usa/
http://chiefio.wordpress.com/2009/11/13/ghcn-pacific-islands-sinking-from-the-top-down/
Oh, yeah, the fictional “anomaly” is supposed to fix all that… the one that isn’t done until it’s all ‘grid/boxes’ in the last step….
“Why? because the airports happen to be at higher colder elevations.”
Must not fly much… FYI Airports are built where there is a lot of flat land, typically. As often as possible down in the valley floors or even next to water (so a long approach can be made with a low flat surface). Examples? SFO approach over the bay. Moffett Field approach over the bay. SJC San Jose approach over the bay. Not one of them up in the surrounding hills. ORD Chicago on flat land (as is all of Chicago near the lake). Denver down on the flatter part down slope from downtown. Etc. etc. etc.
Folks only put airports on mountains and mountain tops when there is no alternative. One finds LAX down low, not in Beverly Hills… Even Reno and Lake Tahoe airports are on the flat land, not in the hills you can see from them… It is easier to get enough density altitude and runway speed on a lower flat runway than on a high bumpy one.
@climatebeagle:
Averaging temperatures can have no meaning.
http://chiefio.wordpress.com/2011/07/01/intrinsic-extrinsic-intensive-extensive/
Yet it is widely done. GIStemp keeps temperatures AS temperatures all through the various transformations and creation of a fictional “grid/box” value (that they call a temperature). Then at the very end they make a ‘grid/box anomaly’ between two of these fictional grid/box temperatures. All fundamentally hokum due to averaging intensive properties and not dealing with enthalpy. But “it’s what they do”. So your instinct is sound.
I’ve averaged temperatures for the purpose of seeing the ‘shape of the data’, which causes some “climate scientists” to have a hissy fit, as they presume I think that results in a temperature when it doesn’t. (But it is a good way to see what the basic nature of the change in the numbers might be… bigger, smaller, more variation. Metadata about the data…)
With that said, to do it with some sanity you need to weight things for a variety of stuff that approximates enthalpy and sample bias. It ought to include areal weighting, altitude, distance from water, relative humidity, phase change of fluid (snow, ice, evaporation) and a few more. “Climate scientists” pick a couple from the list that would give an extrinsic property and ignore the rest. So you can ‘cherry pick’ a few too.
What I did was to never average a temperature. (Other than that the input I had available was a min-max average already… I really ought to re-do this with just the mins and just the maxs). Just do a ‘first differences’ style anomaly creation on one, and only one, instrument record at a time. I think that gives the cleanest view of what is going on. At that point, averaging the anomalies is valid.
http://chiefio.wordpress.com/category/dtdt/
and in theory you can ignore things like altitude and areal weighting (though in reality there are still a couple of ‘issues’ with station change…)
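A minimal sketch of the 'first differences' idea described above: each instrument compared only to itself, with gaps bridged rather than infilled. The station record here is hypothetical, just to show the mechanics.

```python
def first_differences(record):
    """Year-over-year changes for one instrument's record of a single month.
    Gaps (None) are bridged: the next valid value is compared with the last
    valid value rather than being infilled from other stations."""
    deltas = []
    last = None
    for year, value in record:
        if value is None:
            continue  # bridge the gap for this instrument and place
        if last is not None:
            deltas.append((year, value - last))
        last = value
    return deltas

# Hypothetical station: July monthly means (deg C) with 2002 missing.
july = [(2000, 24.1), (2001, 24.4), (2002, None), (2003, 24.0)]
print([(y, round(d, 1)) for y, d in first_differences(july)])  # [(2001, 0.3), (2003, -0.4)]
```

Only after this per-instrument step are the anomalies averaged, which is where averaging becomes defensible.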
@Eco-geek:
Good one!
@crosspatch:
That’s why what I did does the anomaly as the very first step and does not ‘fix’ missing data but just ‘bridges the gap’ for that particular instrument and place.
http://chiefio.wordpress.com/category/dtdt/
Each instrument and month compared only to itself. Nothing else.
What it shows is that at any given place any given month may be going up or down in trend. Overall, not much changing on the globe. Some nations warming, some cooling (often next door to each other). Overall impression is that it is “data are variable”.
Yet there are sea change moments, such as the point when the MMTS is rolled out, where a ‘jump’ happens. Is it the instrument or the ‘fix’ for the change? In either case, it’s not the reality, it’s the fiddling…
There is NO global warming. There are some instrument records for some places in some months that rise (often from the lows being lifted, not the highs getting hotter). Only averaging that in with all the ‘no change’ or ‘cooling’ places makes anything “global”, and that number is a data artifact ridden fantasy…
If you read the pdf you will discover that one aim of the CRN network is to provide long-term, uninterrupted sites. However, they have planned for possible relocations due to owners' requirements, failures, etc.
Yet already they are talking about 'adjustments' to the data:
“(Collow et al. 2012). It is now planned that if a station must be removed for non-emergency reasons, such as the changing needs of the site host, there would ideally be one or two years of time to run a new USCRN station at a nearby site so as to develop an accurate calibration of the differences in climate between the sites and adjust the data of the discontinued site to match the new site. This process is currently underway for one station in Goodwell, OK, that is required to be removed because of unanticipated planned local LULC. Given such sufficient advance notice,”
This assumes things that should be up for argument:
a.) We know that nearby sites have 'correlated' data, but the assumption that a universal long-term adjustment can be made is absolutely silly.
b.) If the second replacement site is close enough (I am not sure how to properly define that), it is more reasonable to assume the new site is 'equivalent' to the old site, and any common-time data should perhaps be averaged. But using a prior or later site to permanently adjust the record just opens the door to further adjustments.
Even if one site is seen to be systematically warmer or colder over the approximately two-year window, that would only show that micro-climate variations exist and that we cannot begin to account for them for every station. It just shows that weather varies between locales. To argue that one site is a BETTER representative of the area than another when both meet siting guidelines is silly.
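To make the quoted procedure concrete, here is a minimal sketch (with invented numbers) of the constant-offset calibration being described: run old and new sites in parallel, take the mean difference over the overlap, and apply it to the discontinued record. Whether a single constant captures the real between-site difference in all seasons is exactly what is in dispute here.

```python
# Overlap calibration sketch: mean old-minus-new difference over the
# parallel-operation window, used as a fixed offset. Numbers are invented.
old_site = [3.1, 10.2, 18.4, 9.7]  # overlapping monthly means, deg C
new_site = [2.8, 9.8, 18.1, 9.2]
offset = sum(o - n for o, n in zip(old_site, new_site)) / len(old_site)
print(round(offset, 2))
```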
USHCN(2.5) vs. CRN 2012. CRN difference from USHCN in ().
Jan – 36.1 vs 36.8(0.7)
Feb – 37.7 vs 38.1(0.4)
Mar – 50.3 vs 50.6(0.3)
Apr – 54.6 vs 54.8(0.2)
May – 63.4 vs 63.3(-0.1)
Jun – 70.5 vs 70.8(0.3)
Jul – 76.9 vs 75.6(-1.3)
Aug – 73.8 vs 72.9(-0.9)
Sep – 66.2 vs 65.6(-0.6)
Oct – 53.9 vs 53.9(0.0)
Nov – 43.9 vs 43.9(0.0)
Dec – ? vs 36.7(?)
Ann – ? vs 55.25(?)
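The differences in parentheses, and the January-to-November averages below, can be rechecked with a few lines of Python (values copied from the table above):

```python
# Recompute the CRN-minus-USHCN differences shown in parentheses (Jan-Nov).
ushcn = [36.1, 37.7, 50.3, 54.6, 63.4, 70.5, 76.9, 73.8, 66.2, 53.9, 43.9]
crn   = [36.8, 38.1, 50.6, 54.8, 63.3, 70.8, 75.6, 72.9, 65.6, 53.9, 43.9]
diffs = [round(c - u, 1) for c, u in zip(crn, ushcn)]
print(diffs)  # [0.7, 0.4, 0.3, 0.2, -0.1, 0.3, -1.3, -0.9, -0.6, 0.0, 0.0]
print(round(sum(ushcn) / 11, 2), round(sum(crn) / 11, 2))  # 57.03 56.94
```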
USHCN has 1998 at 54.32 as the warmest in the record, followed by 2006 at 54.30. CRN 2012 is 0.93 warmer than the USHCN record year 1998. Up until Nov, USHCN has an avg of 57.03 and CRN 56.94 (-0.09).
I used the table/year setting for the months and then table/rank for the annual here.
http://www.ncdc.noaa.gov/oa/climate/research/cag3/na.html
Looks like CRN showed warmer in the colder months and cooler in the hotter months. Interesting. Wonder if that plays out like that in the previous few years.
I thought maybe the world was ending, but alas it was only the cacophony of all the wooopsies…
So I plugged all these values into Minitab. First, I was really surprised that the data are fairly well normally distributed. Second, summary statistics show the average as 36.6 with a 95% confidence interval of 2.4. Assumptions of accuracy or precision better than 2.4F are BS. So the discrepancies between the data sets are, from a statistical standpoint, meaningless: both averages are tolerable estimates of the true population average. But it's not known to within 0.1F, folks!
Climate REFERENCE Network. Keep repeating, NCDC…
I'm really glad that you're doing this, Anthony. Quite the achievement, and now… in their face. And it isn't like you didn't give fair warning…
;-D
And Mosh, saying that the lapse rate adjustment should be 6.5C/km is a furphy to.
The rate depends on many things. Moisture content most specifically.
There is NO WAY you know the proper adjustment to correct for altitude at any specific place or time.
Just another fudge factor available for the AGW/GISS/Hadcrud brethren to use.
[“a furphy to.” ??? Mod]
I still think this average business is foolishness. It has no real value for anything; it is a nice but only marginally useful number, just more oversimplified metadata for the spin doctors on all sides of the issue to use. The reality is it gets you nothing except a pork-barrel grant, and it is about as useful and meaningful as a politician. If we want to look at tightly defined geographic areas, altitude-corrected, etc., then at least we have a number that the people in those regions understand and that may be useful for them. I think the land and water masses cause difficulties that further confound a worldwide average into an even more meaningless number.
Paul Marko says:
January 7, 2013 at 11:34 am
December’s sun was really quiet. SSN ~ 40; 10.7 ~ 108; Ap ~ 3. Dalton or Maunder?
“Eddy”
http://wattsupwiththat.com/2009/06/13/online-petition-the-next-solar-minimum-should-be-called-the-eddy-minimum/
possibly a silly question, but for average yearly temperature should the monthly averages be weighted by number of days in the month? Or is the yearly average normally just the mean of the month averages for comparisons over several years? I suspect there would be very little difference in the results anyhow when looking for trends over years/decades….but just a mean of the monthly averages would be skewed a bit by an exceptionally warm or cold February, for example…or maybe the monthly averages are already somehow normalized?
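The question above is easy to answer numerically: using the 2012 CRN monthly values from the post (2012 was a leap year, so February gets 29 days), the simple mean and the day-weighted mean differ by only about 0.06°F, because the short, cold February carries slightly less weight when weighted.

```python
# Simple mean of monthly averages vs a day-weighted mean, 2012 CRN values.
days_2012 = [31, 29, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]  # leap year
monthly = [36.8, 38.1, 50.6, 54.8, 63.3, 70.8,
           75.6, 72.9, 65.6, 53.9, 43.9, 36.7]

simple_mean = sum(monthly) / 12
weighted_mean = sum(d * t for d, t in zip(days_2012, monthly)) / sum(days_2012)
print(round(simple_mean, 2), round(weighted_mean, 2))  # 55.25 55.31
```

For trend work over years the two track each other closely, but as the commenter suspects, an exceptionally warm or cold February moves the simple mean slightly more than the weighted one.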
More mistakes can happen when records from other countries are fed into the global compilations. CLIMAT data feeds are based on whatever a country supplies, not necessarily on (min+max)/2. About 40% of countries (the largest being China and the former Soviet Union) use the mean of evenly spaced observations. About 20%, mostly in Europe and Latin America, try to replicate a true mean through weighted temperature averages at particular hours.
It’s believed by some authorities that as long as this is consistent through time it doesn’t distort long-term trends, although it can affect comparisons between countries. Most have been consistent but some have changed methods or observation timing, sometimes conflicting with daylight saving.
The consensus seems to be that these conflicts negate each other and don’t cause a systemic bias in global temperatures. Personally, I remain to be convinced.
We know that Australian data fed to GISS from 1994 was on average 0.15C below the BoM's HQ figures, an error that wasn't noticed until around 2003; the GISS and NCDC records from 1994 to 2004 weren't corrected till 2009. Is it corrected now?
This happened because it was agreed that Australia would send (min+max)/2 data to the US, but that was not done for nearly a decade. A 0.15 deg discrepancy could be way, way off the mark, higher or lower.
Australia’s BoM processes raw data from observer sheets to produce a homogenised version, then a High Quality network of more than 100 stations that vanished fairly quickly, now a new version called ACORN including some different stations. There are periods of a year or more when these versions can differ at a nominated station by more than 1C.
Working back from deg C to deg F, you find that if one place after the decimal is used, you can get two solutions for one conversion; if you try to rationalise them, you find a problem that cannot be solved. It turns out that at many Australian sites, temperatures were recorded in whole deg F more than 30% of the time.
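One way to see the Fahrenheit heritage Geoff describes: whole-degree Fahrenheit readings leave a fingerprint in a one-decimal Celsius archive. (The specific check below is my illustration, not necessarily the method he used.) Because 9°F equals exactly 5°C, the tenths digits cycle with period nine, and a tenths digit of 5 never occurs; an excess of the other nine digits flags Fahrenheit-era records.

```python
# Which tenths digits can a whole-degF reading produce in a 1-decimal-degC
# archive? The pattern repeats every 9 degF (= 5 degC exactly).
tenths = sorted({round((f - 32) / 1.8 * 10) % 10 for f in range(32, 122)})
print(tenths)  # [0, 1, 2, 3, 4, 6, 7, 8, 9] -- the digit 5 is missing
```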
Grids and interpolation are applied before numbers are sent to the US, and of course no interpolation scheme is perfect. One issue, in a situation like the current one, arises in areas with steep gradients and sparse networks, as Steven Mosher notes above. A good example is Australia's Nullarbor. Long-term averages from Cook station go into the average fields, which are about 5-6C warmer in summer than the coast. But without any current Cook observations to "anchor" the analysis, on very hot days the anomaly from Nullarbor Roadhouse will be projected too far inland. So, for example, if you have a 45-degree day at Nullarbor (which is 18 degrees above average), that +18 anomaly will be applied to Cook's 32-degree average to give an analysis of 50 – but in reality on very hot days there's usually little difference between the inland and coastal sites.
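The Nullarbor example reduces to simple arithmetic: the coastal anomaly is added to the inland long-term average. (The 27.0 coastal average is implied by "45 is 18 above average"; the figures are as given in the comment.)

```python
# Anomaly projection as described: coastal anomaly applied to inland average.
nullarbor_avg = 27.0   # coastal long-term average, deg C (implied: 45 - 18)
nullarbor_obs = 45.0   # the very hot day at Nullarbor Roadhouse
cook_avg = 32.0        # Cook's inland long-term average, ~5 C warmer
anomaly = nullarbor_obs - nullarbor_avg
cook_estimate = cook_avg + anomaly
print(anomaly, cook_estimate)  # 18.0 50.0 -- likely too hot on such days
```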
Personally, I think absolute values are far more important than trends, for one of the main uses is in proxy reconstructions which don’t seem to have a natural mechanism for change at country borders or when a new normal is introduced.
This whole topic suffers from the old hymn, “Build on the rock and not upon the sand” for the sands here are forever shifting.
Mosh, the lapse rate surely depends on the time available for the rising, cooling air mass to shed its excess heat, by whatever mechanism. It can’t be 100% instant radiation loss. There must be some conduction loss to move a thermometer. Can’t see how one value fits all.
Dennis Nikols says:
January 7, 2013 at 4:26 pm
“I still think this average business is foolishness.”
Have to agree, totally. Isn’t the whole problem caused by the ridiculous idea that Earth has an average temperature?
Tom in Florida:
At January 7, 2013 at 5:14 pm you say
That depends on what you mean by “Earth has an average temperature”.
If you have not seen it then I think you will want to read this, especially its Appendix B.
http://www.publications.parliament.uk/pa/cm200910/cmselect/cmsctech/memo/climatedata/uc0102.htm
And please note how the paper was prevented from publication by the frequent data changes which have been much discussed on several threads of WUWT today.
Richard
Dennis Nikols says:
January 7, 2013 at 4:26 pm
“I still think this average business is foolishness.”
Agreed, completely. The earth has temperature variation due to altitude, longitude, latitude, and season, amongst other factors. To boil all this down to an average number is meaningless. It's like measuring the temperature inside my furnace, inside my fridge, inside my freezer, inside my garage, and inside every room of my house, including basement and attic, then quoting an average of all measurements, and then saying the average is trending upward because the sun comes out and warms the attic. Total misuse of statistics, and physically meaningless.
Geoff Sherrington, even the automatically recorded BOM data is a mess.
As you know, we are experiencing a spot of hot weather in south eastern Australia. I keep an eye on the Canberra readings (which come from the airport, but that’s another story.) Anyway, the other night, according to the BOM, the temperature dropped from about 22 to 5 degrees in 10 minutes. I assure you that the temperature did not change much at all. Furthermore, that ‘minimum’ stayed on the chart as the lowest minimum for the rest of the reporting period. The other thing I noticed is that when it got really hot (high 30s) the chart would just blank out altogether for sometimes hours at a time. I would not trust these readings, which no-one seems to check (see the absurd drop to 5 degrees in the middle of a heatwave, uncorrected for at least 12 hours, if ever) as far as I can throw Al Gore.
Back on topic, I am awestruck that a citizen scientist with a family, a business and the world’s biggest science blog manages to do quality control for massively funded public agencies in his spare time, such as it is. The overpaid and lazy slobs who are supposed to be in charge of this stuff should hang their heads in shame. Anthony, if you were being paid adequately for doing their jobs for them, you would never have to work again.
I guess the real point is that you cannot compare temperatures measured with different systems.
The CRN is a new system, and it will take several years before they can draw any comparisons.
The same when you "disappear" 2/3 of your measuring stations. You are then working with a new measuring system. And when you allow urban encroachment within that system, you have a continually changing measurement system.
You CANNOT reliably compare calculated /averaged readings even a couple of years apart because the overall system has changed.
Because of the massive unreliability of the measurement system, it is very easy to fudge the data to say what you want someone else to believe, if you have an agenda to do so.
The whole thing is a mess and totally unreliable. Why the heck are they wasting so much money on idiocies related to temperature rise when, in reality, we have NO IDEA whether any real rise has actually occurred?
Lance Wallace says:
January 7, 2013 at 12:34 pm
year sites months mean (F) Std. Err. (F)
2008 112 1382 51.9 0.50
2009 114 1449 51.7 0.49
2010 116 1467 52.0 0.49
2011 116 1493 51.9 0.50
Incredible stability for those four years!
Average 51.88F/11.04C.
2008-2011: change 0.0C
But http://journals.ametsoc.org/doi/pdf/10.1175/BAMS-D-12-00170
Fig. 6, pg 42 CONUS annual mean temperatures derived independently using a first difference method
relative to the 2006-2010 mean of each data source: the USCRN (blue) and the USHCN (red).
Revised and updated from Menne et al., 2009.
2006: +0.60C,
2008: -0.41C
2011: +0.20C
Change 2006-2008: a drop of 1.01C, or 1.82F.
Change 2008-2011: a rise of 0.61C, or 1.10F.
……confusion here, not that this is unusual ….
. . . And the even-more ridiculous claim that this fictitious ‘global’ temperature can provide evidence that mankind is about to overheat the planet by burning fossil fuels. It’s a very clever magical trick; easy to pretend you’re sawing the Earth in half while the audience is distracted by smoke and mirrors—or rather, smokestacks and ice floes.
P.T. Barnum would have been proud of the Climatists.
/Mr Lynn
Joseph Stalin's take on this: "It's not who measures the temperature, it's who averages the temperatures and writes the report to be published for the official records."
“Agenda control job one.”
They intend to lay the U.S. low by any means.
It's all agenda. We too are just things to be adjusted to fit the agenda.
GeoLurking says:
January 7, 2013 at 4:29 pm
“Eddy”
Thanks for link.
David L says:
January 7, 2013 at 5:55 pm (Edit)
Dennis Nikols says:
January 7, 2013 at 4:26 pm
“I still think this average business is foolishness.”
Agreed, completely. The earth has temperature variation due to altitude, longitude, latitude, and season, amongst other factors. To boil all this down to an average number is meaningless. It's like measuring the temperature inside my furnace, inside my fridge, inside my freezer, inside my garage, and inside every room of my house, including basement and attic, then quoting an average of all measurements, and then saying the average is trending upward because the sun comes out and warms the attic. Total misuse of statistics, and physically meaningless.
##########################
Actually it is not meaningless. People need to get the notion out of their heads that Hansen, Jones, etc. are calculating an average temperature. They (and we) are doing something quite different, although "averaging" is used and people call it "an average". What it is mathematically (forget the PR and focus on the science) is an estimate of temperature at UNMEASURED places, such that, if I take all the measures together and use the correct techniques, I can win the following game.
1. Pick a place, any place on the planet. Hide a thermometer there for 1 month.
2. I will now calculate “the average” NOT USING that point.
3. Challenge people to guess the temperature at your unknown location.
All players get the time (the month) and all known data for that month.
The job is to estimate the temperature at an undisclosed location.
The "average" (done right) will be the best estimate. Now tell me the month, the latitude, the longitude, and the altitude, and my estimate will be even closer. And we can test this synthetically by generating centuries of synthetic data (that looks like weather) for the entire globe, sampling that complete field sparsely, and seeing how well our estimating procedure works.
Or, I can use USHCN to "predict" what you will see at CRN.
So, we use an "average" (it's really not an average) to come up with an estimate for the temperature at any given spot. People call it an average, but when you look down into the math of things you see: "Oh, this is an estimate of the temperature at unknown locations that minimizes the error of prediction."
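The guessing game described above can be sketched with a crude inverse-distance estimator. This is only a stand-in for the spatial estimation real products use (they employ more careful methods, and real distances rather than degrees), and the stations here are hypothetical.

```python
import math

def idw_estimate(stations, lat, lon, power=2):
    """Inverse-distance-weighted temperature estimate at (lat, lon) from
    known station readings -- a crude illustration of estimating the
    temperature at an unmeasured place from its neighbors."""
    num = den = 0.0
    for s_lat, s_lon, temp in stations:
        d = math.hypot(s_lat - lat, s_lon - lon)
        if d == 0:
            return temp  # we are exactly at a station
        w = d ** -power
        num += w * temp
        den += w
    return num / den

# Hypothetical stations: (lat, lon, monthly mean deg F). Hold one out and
# "guess" its value from the rest -- the game above in miniature.
stations = [(40.0, -105.0, 35.0), (41.0, -104.0, 33.5), (39.0, -104.5, 36.2)]
target = stations[0]
guess = idw_estimate(stations[1:], target[0], target[1])
print(round(guess, 1), "vs actual", target[2])  # 35.2 vs actual 35.0
```

Running this leave-one-out test over many stations is one way to measure how well any such estimating procedure actually predicts unmeasured locations.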
Finally “average” temperature also has a meaning when we talk about things like the LIA and say
“It” was cooler in the LIA.. or “It” was warmer in the MWP.
We all say that. Saying "it was warmer in the MWP" is not meaningless.
“If you read the pdf you will discover that one aspect of the CRN network is to provide long term, un-interrupted sites. However, they have planned for possible relocations due to owners requirements, failures etc…”
Yes, and one such move will supply data on the effect that nearby roads have on temperature measures. So instead of speculation ("roads will corrupt the data") you'll actually have data and magnitudes and all sorts of science.
“[“a furphy to.” ??? Mod]”
too !!!
sorry !, me bad typist 🙁
MangoChutney says:
January 7, 2013 at 12:21 pm
CONUS or CON US?
—————————
E.Pluribus fool em.
john from DB