UPDATE: NOAA plans to release SOTC at 1PM EST today. Look for updates soon and a special report on today’s release. The map below will automatically update when we have the new December COOP Tavg value, probably later today. I’ll have another post on the differences between the CRN and COOP in the near future. – Anthony
Pursuant to our previous story showing issues with diverging data and claims over time, NCDC has updated the Climate Reference Network Data for December 2012. I’m still waiting on the NCDC State of the Climate report to come in with their number, and I’ll update the graphic (in yellow) when it is available.
As a state-of-the-art system, it is well sited, requires no adjustments, and its stations are spatially distributed by design to be representative of the CONUS. Here's the current plot (click to enlarge):
Each (small) number in blue represents one of the NCDC-operated U.S. Climate Reference Network stations in the CONUS that we use. Here are the data reports for December and the entire year:
==========================================================
2012 Average Monthly Reports – text files
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201201.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201202.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201203.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201204.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201205.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201206.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201207.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201208.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201209.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201210.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201211.txt
http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201212.txt
Source for all data: ftp://ftp.ncdc.noaa.gov/pub/data/uscrn/products/monthly01/
==========================================================
The December report looks like this:
==========================================================
TOTALS
Totals for T_MONTHLY_MEAN (column 8) = 296.7
Totals for T_MONTHLY_AVG (column 9) = 302.4
Total Number of CRN Stations Included in this Report = 116 out of 117 CONUS stations possible (stations with missing data excluded – see below)
AVERAGING CALCULATIONS
Average of T_MONTHLY_MEAN Totals = 296.7 / 116 = 2.55775862068965 or 2.6° C
Average of T_MONTHLY_AVG Totals = 302.4 / 116 = 2.60689655172414 or 2.6° C
Average of T_MONTHLY_MEAN Totals in Fahrenheit = (2.55775862068965 * 1.8) + 32 = 36.6039655172414 or 36.6° F
Average of T_MONTHLY_AVG Totals in Fahrenheit = (2.60689655172414 * 1.8) + 32 = 36.6924137931034 or 36.7° F
SUMMARY
National Average of Monthly Mean Temperatures = 2.6° C or 36.6° F
National Average of Monthly Average Temperatures = 2.6° C or 36.7° F
EXCLUDED STATIONS
The following stations reported no data (-9999.0) for either T_MONTHLY_MEAN or T_MONTHLY_AVG and were not used:
CRNM0101-PA_Avondale_2_N.txt
================================================================
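The report's arithmetic is easy to reproduce with a short script. The sketch below is a rough illustration, not NCDC's code: it assumes whitespace-delimited station rows and takes the column index as a parameter (the report labels T_MONTHLY_MEAN as column 8), since the exact layout of the FTP files may include additional leading fields.

```python
MISSING = -9999.0  # the missing-data flag used in the report above

def conus_monthly_average(rows, col=7):
    """Average one whitespace-delimited column across station rows,
    skipping stations that report the missing-data flag
    (e.g. PA_Avondale_2_N in the December report)."""
    vals = []
    for line in rows:
        fields = line.split()
        v = float(fields[col])
        if v == MISSING:
            continue  # excluded station
        vals.append(v)
    return sum(vals) / len(vals), len(vals)

def c_to_f(c):
    """Convert Celsius to Fahrenheit."""
    return c * 1.8 + 32.0
```

With the report's own numbers, `c_to_f(296.7 / 116)` gives 36.60..., matching the 36.6° F figure above.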
From the NCDC-provided FTP data files we can calculate a yearly CONUS Tavg, which to my knowledge has never been done before by NCDC. Odd that it falls to somebody outside the organization, don't you think?
Climate Reference Network Data for 2012
| Month | Tavg (°F) |
| 1 | 36.8 |
| 2 | 38.1 |
| 3 | 50.6 |
| 4 | 54.8 |
| 5 | 63.3 |
| 6 | 70.8 |
| 7 | 75.6 |
| 8 | 72.9 |
| 9 | 65.6 |
| 10 | 53.9 |
| 11 | 43.9 |
| 12 | 36.7 |
| Sum | 663.0 |
| /12 | 55.25 |
Therefore, from this data, the Average Annual Temperature for the Contiguous United States for 2012 is 55.25°F.
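The annual figure is the simple mean of the 12 monthly Tavg values, as a quick check confirms. (Note this is an unweighted mean of months; a day-weighted mean would differ slightly since months have different lengths.)

```python
# Monthly CONUS Tavg values from the table above, Jan..Dec 2012, in °F
monthly_tavg = [36.8, 38.1, 50.6, 54.8, 63.3, 70.8, 75.6,
                72.9, 65.6, 53.9, 43.9, 36.7]

annual = sum(monthly_tavg) / 12
print(round(annual, 2))  # 55.25
```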
Note also the value from the CRN for July 2012, 75.6°F, far lower than the 77.6°F NCDC reported in the SOTC and the 76.93°F that later appeared in the database, as discussed here.
Makes you wonder why NCDC never mentions their new state-of-the-art, well-sited climate monitoring network in those press releases, doesn't it? The CRN has been fully operational since late 2008, and we never hear a peep about it in the SOTC. Maybe they don't wish to report adverse results.
I look forward to seeing what NCDC comes up with for the Cooperative Observer Network (COOP) in their “preliminary” State of the Climate Report for Dec 2012 and the year, and what the final number will be in 1-2 months when all the data from the COOP network comes in.
I’ll have more on this in the near future. I’ll be offline for the rest of the day traveling.
UPDATE: 10:30PM PST, Climatebeagle and others have been puzzled over the 117 stations used, and can't reconcile it with the larger list. Here's the logic:
Some stations, such as Oak Ridge, TN and Sterling, VA, were removed because they are test sites that do not report regularly or at all. The one CRN station in Egbert, Ontario, Canada is not part of the CONUS and is removed also. None of the stations in Alaska are used, as they are also not part of the CONUS.
Here is the list: conus_stations_master_list_1-8-13 (PDF)
UPDATE2: 9:30AM PST, 1/8 Reader Lance Wallace noted a mistake, which has to do with version control on our end. One CRN station in Egbert, Ontario was inadvertently included in the monthly code, where it was not in the daily code we run. We'll rerun it all and update. I'm thankful for the many eyes of WUWT readers – Anthony

My earlier table above was for the full USCRN dataset (125 stations, including 8 in AK, 2 in HI, and 1 in Ontario, CAN). This table is for the continental US (114 stations, though about two fewer were operating in 2008 and 2009). Presumably if Anthony removes the two Hawaiian stations his overall average will drop from 55 to a bit closer to the values of about 53 listed here.
| Year | Mean | SE |
| 2008 | 52.8 | 0.50 |
| 2009 | 52.6 | 0.49 |
| 2010 | 53.1 | 0.50 |
| 2011 | 53.3 | 0.50 |
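For reference, a yearly mean and standard error like those above can be computed across the stations' annual means. A minimal sketch (the station values here are made up purely for illustration, not actual CRN data):

```python
import math

def mean_and_se(values):
    """Mean and standard error of the mean across a list of
    per-station annual mean temperatures."""
    n = len(values)
    m = sum(values) / n
    var = sum((v - m) ** 2 for v in values) / (n - 1)  # sample variance
    return m, math.sqrt(var / n)  # SE = sample std dev / sqrt(n)

# Hypothetical per-station annual means for one year:
m, se = mean_and_se([52.1, 53.4, 52.9, 53.8, 51.7])
print(round(m, 1), round(se, 2))
```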
Gawd Mosher you really are getting on my nerves.
1. Spatial averages of temperature are meaningless in reality. You claim to want to know what the temperature is in a region that is not being measured? Fine, you estimate temperature using the usual algorithms, but that is not the same as claiming you know the temperature or its response/history over time; it should merely give you a rough figure to work with. Such a figure would merely be used prior to the temperature actually being measured (you would only want to know the temperature prior to something being done at that location). It serves no purpose otherwise, and it certainly does not serve the purpose it is being used for (to comment on the state of the climate)! Looking at the change in spatial averages over time is not meaningful because nothing on the planet responds to such an average; it merely allows for statistical masturbation.
2. Meaningful comparison for climate science purposes can be made by looking at the rate of change in the temperature readings of single stations over time. If those stations are well sited, it significantly reduces or eliminates most of the siting biases. More specifically, as the climate and weather patterns are driven by differences in temperature across the globe, examining those changes is far more meaningful than some pointless averages. Given this, there is zero need for altitude/lapse rate correction, as this is constant. Why are you trying to compare an airport with a sea-level site in any case? There is no need to do such a thing!
3. Core samples, tree rings, etc. are useful in identifying localised climatic conditions only. Averaging these to come up with a spatial average is even more meaningless than doing the same with thermometers. The correct usage of such data would be to compare various points on the planet over time, look at the changes in climate, and deduce whether various events were localised or global in nature. Beyond that you are trying to coax information from whence none exists.
Let’s get real about what you should and shouldn’t be doing here.
Anthony, a smaller (XX.XX °C XXX.XX K) display somewhere sure would be appreciated by many, I bet; let the user round to whatever precision they feel is proper. Now that would be fast, easy and very useful for everyone without a calculator wristwatch. 😉
The remaining discrepancy between my list and Anthony’s is a single station at Goodwell OK. My list includes only one station at Goodwell (at the Panhandle Center) and Anthony’s list (also Climate Beagle’s) includes two. I expect his list of 115 sites (dropping the two in Hawaii) is correct. I have matched the other 113 sites with Anthony’s in the Excel file in Dropbox.
OK Goodwell 2 E 20040227 Panhandle Research & Extn. Center (Native Grassland Site)
OK Goodwell 2 SE 20110618 Oklahoma Panhandle State Univ., School of Agriculture
https://www.dropbox.com/s/js1p7tns1gyepp9/CRN%20list%20of%20locations%20and%20sites%20compared%20to%20Watts%20CONUS%20list.xlsx?m
Trying that dropbox link again to the Excel file with Anthony’s list of 115 sites.
https://www.dropbox.com/s/js1p7tns1gyepp9/CRN%20list%20of%20locations%20and%20sites%20compared%20to%20Watts%20CONUS%20list.xlsx
Anthony,
I haven’t read all the comments yet, so I’ll post what I’ve found even if it has been posted already.
There are some problems with the list of stations you are using. The USCRN folder of site data has many more stations in it than are part of the network. Maybe you could get a concise list from someone at USCRN. A BAMS article recently put up on the USCRN site says:
http://journals.ametsoc.org/doi/abs/10.1175/BAMS-D-12-00170
I noticed site number 64757 CRNM0101-ON_Egbert_1_W.txt is located north of Toronto. Also, there are 7 paired sites, and I found you used all of them except number 53927 CRNM0101-OK_Stillwater_5_WNW.txt. Since those 14 sites are so close to each other, I wonder if it might be appropriate to average their data.
Here is the pair list I came up with by looking at the map in this photo file.
http://www1.ncdc.noaa.gov/pub/data/uscrn/documentation/site/photos/stationsbystate_lores.pdf
54796, 54797 RI 1 km apart.
54794, 54795 NH 7 km apart.
53877, 53878 NC 9 km apart.
63828, 63829 GA 13 km apart.
53926, 53927 OK 2 km apart.
94995, 94996 NE 29 km apart.
94059, 94060 MT 21 km apart.
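One way to act on this suggestion would be to collapse each co-located pair into a single pseudo-station before taking the national average, so close-together pairs don't get double weight. A hypothetical sketch, using the WBAN numbers listed above and assuming a simple dict of station number to monthly mean temperature:

```python
# The seven pairs listed above (WBAN numbers)
PAIRS = [(54796, 54797), (54794, 54795), (53877, 53878),
         (63828, 63829), (53926, 53927), (94995, 94996), (94059, 94060)]

def collapse_pairs(temps, pairs=PAIRS):
    """temps: dict of WBAN number -> monthly mean temperature.
    Returns a new dict with each co-located pair replaced by
    a single averaged entry (keyed by the first member)."""
    out = dict(temps)
    for a, b in pairs:
        if a in out and b in out:
            out[a] = (out.pop(a) + out.pop(b)) / 2
    return out
```

The national average would then be taken over the collapsed dict, giving each location equal weight regardless of how many instruments sit there.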
I emailed the Met Office recently and asked: if there is a trend (as they claim) for rising average temperatures across the UK/US etc., then why is there not a trend-towards-now for absolute temperature records being broken? I stated that surely this illustrates that their data/assumptions cannot be correct, because basic (and I mean VERY basic) statistical theory says extreme records should be tumbling left, right and centre, but they are not. Needless to say, the answer I got back was unscientific gibberish and counter-logical. I also asked someone recently how the surface temperature measurements taken on the QUEEN MARY 2 are calibrated and verified, since they are used to calibrate and verify satellite temperature measurements. No one seems to know! I also asked if the ship's own "heat island" was isolated from any measurements; no one seemed to know or care. So maybe this could be another area for Anthony to delve into: is there a floating heat island taking dubious measurements on a daily basis, as well as probably consistently serving Champagne at the incorrect temperature?
Mosh
Could you please clarify where you get your figure of a 6.5C change in temperature per 1000 metres of altitude? Thanks
Tonyb
tonyb
The dry adiabatic lapse rate is 9.8C/km, but this is heavily affected by H2O in its various states. The 100% humidity (saturated) lapse rate is about 4.5C/km (iirc), and since there is nearly always H2O in the atmosphere, a generalised average value of 6.5C/km is taken in lieu of other information.
In other words, at any time, the lapse rate could be anywhere between 4.5C/km and 9.8C/km.
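That spread can be made concrete: the same reading, adjusted to sea level under the three lapse rates quoted above (treated here purely as illustrative values), gives answers differing by several degrees.

```python
# Illustration: "sea-level" temperature implied by a single 10 C reading
# taken at 1500 m elevation, under three assumed lapse rates (C per km)
RATES = {"saturated": 4.5, "average": 6.5, "dry": 9.8}

def to_sea_level(t_c, elev_km, rate_c_per_km):
    """Naive lapse-rate adjustment: add rate * elevation to the reading."""
    return t_c + rate_c_per_km * elev_km

for name, rate in RATES.items():
    print(name, round(to_sea_level(10.0, 1.5, rate), 2))
```

The implied sea-level values range from 16.75 C (saturated) to 24.7 C (dry), a spread of about 8 C from the choice of rate alone.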
So really its just another way to introduce further “variables” for temperature “adjustments” 😉
ps, I can just see what Hansen would do with lapse rate adjustments.
”
Adjust using 6.5C/km… oh, reading is still too cold,
must have been dry, so I'll adjust using 8.2C/km instead, spurious reason given..
that looks better.. maybe try 9.2 and get it even warmer at sea level.
”
They have nearly run out of adjustments they can use, so the global urban average temp has levelled off..
Let's not give them another variable to mess with!!
AndyG55
Thanks for that.
It would be interesting to see Mosh's BEST work with the full range of possible adjustments instead of using his 6.5C figure.
Mind you, to have belief in the accuracy of your data you would first need to check each temperature point, instead of averaging a historic, probably incorrect, one-off figure with thousands of other probably incorrect one-off figures in the belief that averaging somehow makes lots of wrongs right.
tony
AndyG55 says:
January 8, 2013 at 4:13 am
ps, I can just see what Hansen would do with lapse rate adjustments.
”
Adjust using 6.5C/km… oh, reading is still too cold,
must have been dry, so I'll adjust using 8.2C/km instead, spurious reason given..
that looks better.. maybe try 9.2 and get it even warmer at sea level.
”
AndyG55 – The whole temperature “correction” for lapse rate idea is just silly. Why on Earth would anyone do this? It appears that people who are motivated towards this correction think that an average “temperature” that is adjusted so that every point on the planet is at some effective “sea level” is somehow meaningful. And as you point out, the actual lapse rate depends on the local weather, and so introduces even more complications.
The whole idea is nonsense anyway. They are effectively assigning the same temperature to a very large area, without knowing if it's in any way applicable to that area.
In the old network, UHI-increased urban temps are applied over large areas of countryside that are in no way affected by urban heat. Nearby readings that might be unaffected by urban expansion are homogenised so that they are.
The whole issue of a global average land temperature is a joke and a farce.. !
“in the belief that averaging somehow makes lots of wrongs right”
did you mention climate models ???? 😉
Meanwhile…
Cold wave unabated in North India, 24 more die.
Plummeting mercury, coupled with thick fog cover, threw normal life out of gear in the entire North India on Monday, with 24 more people succumbing to the cold wave in various parts of the region.
—
I suppose these poor people didn’t realize that their lapse-rate corrected temperature was really warmer than what the thermometer was telling them, so they should not have succumbed to the cold.
/climate-science
Steve Mosher wrote: “Lapse rate, it will get you every time if you are not careful.”
I believe it’s got Steve this time. Here’s anecdotal evidence why you can’t apply lapse rate (even if you wanted to). Here in Austin, TX, the new international airport east of town is sited on a plain several hundred meters lower than the western half of the city proper, which occupies a hilly area. So the airport temps should be warmer than temps in west Austin due to the lapse rate. But in fact the airport is almost never warmer. Reason? Urban heat island effect. Temps at higher elevations in Austin run up to 10 degrees warmer than those at the lower-level airport.
Taking the UHI effect into account, it seems to me there is no valid way to make a blanket adjustment of temperature, and it all points to the absurdity of trying to find a valid "average" temp. Reminds me of my days at a major Texas daily newspaper, arguing with a business editor against his plan to average 10 economic forecasts to arrive at a prediction of future growth — to two decimal places!
Using statistical analysis of anomaly or absolute temperature data to prove or refute AGW is simply numerology, the lowest form of scientific observation and discussion. Let me state it another way: regardless of the cause, small-scale weather pattern variation trends cause temperature variation trends at sensor level. But importantly, larger-scale oceanic and atmospheric parameters remain in control of the resultant weather pattern variation trends. Therefore, to prove or disprove anthropogenic cause, you must look at large-scale oceanic and atmospheric parameter trends, not temperature trends. Hansen and his ilk prefer to dabble in low-hanging fruit (temperature trends), hoping gullible sheeple will eat it all up and lick the plate.
As long as those who seriously doubt this AGW fad continue to argue over low hanging fruit, we will get nowhere fast.
What we really need is data on large oceanic and atmospheric parameters (i.e. semi-permanent pressure systems, global cloud cover data, smaller pressure-system tracks, etc.) over at least a 60 to 100 year span of time, and to ignore temperature altogether. Why? For humans to definitively cause large-scale temperature change, what we are supposedly doing must first definitively affect large-scale climate and temperature drivers beyond their normal random walk.
Frank K. says:
January 7, 2013 at 10:07 pm
“… To put it another way, if it’s 50 degrees F in Denver, CO, will I feel colder/warmer in Denver than I would if I were exposed to 50 degree air in Charleston, SC? Of course not!”
Frank, you probably realized after you posted this that it is too simplistic. You probably realized that you must also consider wind, humidity and insolation on exposed skin when saying "will I feel colder/warmer".
For Mosh and others discussing lapse rate.
A lapse rate correction will do nothing but add noise to the data because weather yanks the lapse rate all over the place. Any fixed lapse rate correction against temperature data over time without knowing how the atmosphere over the station changed would be wrong. A dry lapse rate of 5.5°F/1,000 ft (10°C/km) is often used to calculate temperature changes in air not at 100% relative humidity. A wet lapse rate of 3°F/1,000 ft (5.5°C/km) is used to calculate the temperature changes in air that is saturated (i.e., air at 100% relative humidity). Pick one and try to link it to the local weather the station experiences.
Note that your suggested 0.65C/100m rate is an average and would be wrong for anything other than a rough estimate. It is not a real piece of data like the temperature measurement. Got a balloonsonde site nearby with daily lapse rate data in parallel? THAT would be real data.
Remember, this is 2-meter surface temperature, not barometric pressure. The thermometers are not changing altitude relative to the surface. Changes in air density and temperature (updraft/downdraft) are what lapse rate is used to predict for aviation and models, but it isn't accurate without knowing the state of the atmosphere at that location at that point in time. Otherwise we wouldn't need a global balloonsonde network measuring twice daily.
For example, try using a general value for density altitude while taking off in a loaded Cessna at Leadville, Colorado on a hot and humid summer day, and cross your fingers that you get off the runway to altitude before smacking into the mountain. This is why density altitude is calculated by pilots from hourly data just prior to takeoff, so they know if the plane will fly or not. Without the hourly data (baro/dewpoint/temp), knowing the state of the atmosphere (and thus lapse rate) is a crapshoot. Using a standard average lapse rate to correct temperature over elevation to any degree of accuracy is equally a crapshoot. All it will do is add noise the way you propose it.
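The Leadville example can be made concrete with the standard aviation rule of thumb: density altitude rises roughly 120 ft for every degree C the outside air temperature sits above the ISA value for that altitude. The numbers below are illustrative only, using Leadville's field elevation of roughly 9,934 ft as the pressure altitude.

```python
def isa_temp_c(pressure_alt_ft):
    """ISA temperature at altitude: 15 C at sea level,
    falling about 1.98 C per 1000 ft."""
    return 15.0 - 1.98 * pressure_alt_ft / 1000.0

def density_altitude_ft(pressure_alt_ft, oat_c):
    """Rule of thumb: density altitude rises ~120 ft per
    degree C of deviation above ISA temperature."""
    return pressure_alt_ft + 120.0 * (oat_c - isa_temp_c(pressure_alt_ft))

# Leadville (~9,934 ft) on a 25 C summer afternoon:
print(round(density_altitude_ft(9934, 25.0)))  # about 13,494 ft
```

In other words, the airplane performs as if it were taking off from well above 13,000 ft, which is exactly why pilots recompute this hourly rather than trust a standard-atmosphere average.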
Comparing station-to-station data at different altitudes nearby, yes, you need a lapse rate correction, but you also need the other variables (baro, DP) for it to be accurate. Comparing two roughly similar networks averaged at the 2-meter surface height, you don't (remember NCDC designed the CRN to have roughly the same distribution as the COOP, so while the station count is lower, distribution by area and altitude placement is similar). When I present more data, you'll see that the CRN and COOP networks actually match some months, a good indication that they are similar.
It’s a non issue. Nice try though.
Anthony
Unless Mosh knows the daily weather conditions at the time of the readings for each of the instrumental records he used for BEST, surely his rough rule of thumb for the lapse rate is so approximate as to undermine the idea of a robust database?
tonyb
@Lance Wallace
Thanks for also looking into this, but where did your 125 station list with 8 in AK come from?
I’ve worked off the NOAA list here:
http://www.ncdc.noaa.gov/isis/stationlist.htm?networkid=1
which I assumed was the official list. The list I gave earlier is manually derived from that list.
It has 12 AK stations.
It’s a pity NOAA doesn’t seem to have an easily consumable version of their metadata, e.g. a csv file, for all the reference stations.
@Anthony
Thanks for the list. I also took the approach of looking at the mismatches in WBANNO numbers between your 116 stations (from one of your monthly files) and my CONUS list, which then confused me even more.
Your monthly file included these that were not in my list.
NM_Santa_Fe_20WNW,03087
CO_Colorado_Springs_23_NW,53007
UT_Blanding_26_SSW,53012
AL_Selma_6_SSE,63897
Note that these four are not USCRN stations according to:
http://www.ncdc.noaa.gov/isis/stationlist.htm?networkid=1
Your monthly file also included
ON_Egbert_1_W,64757,USCRN
My list included these that were not on your list.
NM_Los_Alamos_13_W,03062,USCRN
PA_Avondale_2_N,03761,USCRN
CA_Santa_Barbara_11_W,53152,USCRN
OK_Stillwater_5_WNW,53927,USCRN
REPLY: To satisfy your request while traveling, I recreated the list from scratch last night in my hotel room, and obviously failed. Let that be a lesson not to do detailed work after a full day of driving. I’ll have my office forward the correct list (which I don’t have on my laptop) later today and then I’ll post it. Until then, please just stop speculating. – Anthony
I have commented recently on solar threads about the possibility there has been no real warming, which fits very nicely with Leif's new sunspot count. That is, if all the warming due to UHI and adjustments is removed, the real temperature of the planet has changed very little over the last 150 years. This squares nicely with a sunspot count that also hasn't changed very much.
The primary changes in temperature would be the variation due to ENSO and the AMO (although other minor factors exist). These variations can lead to melting ice caps, glaciers, etc. But that will soon stop now that the PDO has flipped, and the AMO will flip in the not too distant future. Add to this real solar cooling due to the L-P effect, and it is likely we will see cooling over the next few decades.
One thing to beware of is any theory that uses the temperature record in any manner. Even if it is skeptical in nature it may very well be correlating to bad data.
What does this mean for the GHE? Why isn’t it warming like it should? I’ve stated my opinion in the past. The GHE is only one of the effects of adding GHGs to the atmosphere. There are other effects and some of them cool the planet. When all effects of GHGs are taken into consideration they more or less cancel out at the current concentrations and temperatures.
Mosher puts his foot in his mouth once again. Just can’t resist tampering with records, eh?
Tom in Florida says:
January 8, 2013 at 6:09 am
Frank K. says:
January 7, 2013 at 10:07 pm
“… To put it another way, if it’s 50 degrees F in Denver, CO, will I feel colder/warmer in Denver than I would if I were exposed to 50 degree air in Charleston, SC? Of course not!”
Frank you probably realized after you posted this how it is too simplistic. You probably realized that you must also consider wind, humidity and insolation on exposed skin when saying “will I feel colder/warmer” .
—
Hi Tom – 50 deg F will feel the same to me provided all other variables are the same (as you say wind, humidity) – it won’t matter if I’m in Denver or Miami 🙂 That’s because my reference point is my body temperature (98.6 F). My point is that lapse rate corrections for spatial temperature averages are a silly idea.
Anthony Watts says:
January 8, 2013 at 6:54 am
“Comparing station to station data at different altitudes nearby, yes you need a lapse rate correction, but you also need the other variables (baro, DP) for it to be accurate.”
My point Anthony is that you really don’t need any correction at all, no matter the altitude. If it’s 50 F on top of a local mountain and 60 F in a nearby valley, I would be fine with averaging the two readings together. The reference point for temperature climatology is the 2 meter height of your thermometers from the ** surface of the Earth **.
Here’s another example: if today ALL thermometers in the CONUS had a reading of 50 F (all stations identical, no regard to altitude), would the CONUS average temperature be 50 F or some different lapse-rate adjusted temperature?
You have it right Anthony. Adjusting data is no simple task.
It was illustrated today during a morning when we were at 10F in Nashua, NH (119 feet elevation) at 1015 UTC with a steep, very low inversion. Jaffrey, NH airport at 1013 feet was at 18F, Worcester, MA at 1009 feet was at 29F, and Boston at sea level was at 33F at the same time. Not sure how one would even attempt to 'adjust'.
Lapse rates, as you stated, vary considerably, from sharp inversions to superadiabatic, depending on surface factors like snow cover, humidity (saturated soils vs. dry ground), vegetation type and state, air masses and fronts, time of day and year, clouds, winds, and microclimate factors. Pressure adjustments are more straightforward, but even those have issues, relying on a standard atmosphere when the atmosphere is rarely standard, in much the same way as temperatures are rarely at the 'average'.