Announcing the first ever CONUS yearly average temperature from the Climate Reference Network

UPDATE: NOAA plans to release SOTC at 1PM EST today. Look for updates soon and a special report on today’s release. The map below will automatically update when we have the new December COOP Tavg value, probably later today. I’ll have another post on the differences between the CRN and COOP in the near future. – Anthony

Pursuant to our previous story showing issues with diverging data and claims over time, NCDC has updated the Climate Reference Network Data for December 2012. I’m still waiting on the NCDC State of the Climate report to come in with their number, and I’ll update the graphic (in yellow) when it is available.

As a state-of-the-art system, the CRN is well sited, requires no adjustments, and its stations are spatially distributed by design so that they are representative of the CONUS. Here’s the current plot (click to enlarge):

Each (small) number in blue represents one of the NCDC-operated U.S. Climate Reference Network stations in the CONUS that we use. Here are the data reports for December and the entire year:

==========================================================

2012 Average Monthly Reports – text files

http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201201.txt

http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201202.txt

http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201203.txt

http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201204.txt

http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201205.txt

http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201206.txt

http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201207.txt

http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201208.txt

http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201209.txt

http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201210.txt

http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201211.txt

http://crn.intelliweather.net/imagery/crn/crn_temps_national_monthly_average_report_201212.txt

Source for all data: ftp://ftp.ncdc.noaa.gov/pub/data/uscrn/products/monthly01/

==========================================================
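The twelve URLs above follow one pattern, so they can be generated rather than hand-typed. A minimal sketch (the URL template is taken directly from the links listed above):

```python
# Generate the 2012 monthly CRN report URLs from the pattern above.
BASE = ("http://crn.intelliweather.net/imagery/crn/"
        "crn_temps_national_monthly_average_report_{year}{month:02d}.txt")

def monthly_report_urls(year):
    """Return the twelve monthly report URLs for the given year."""
    return [BASE.format(year=year, month=m) for m in range(1, 13)]

urls = monthly_report_urls(2012)
print(urls[0])   # ends in _report_201201.txt
print(urls[-1])  # ends in _report_201212.txt
```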

The December report looks like this:

==========================================================

TOTALS
Totals for T_MONTHLY_MEAN (column 8) = 296.7

Totals for T_MONTHLY_AVG (column 9) = 302.4

Total Number of CRN Stations Included in this Report = 116 out of 117 CONUS stations possible (stations with missing data excluded – see below)

AVERAGING CALCULATIONS

Average of T_MONTHLY_MEAN Totals = 296.7 / 116 = 2.55775862068965 or 2.6° C

Average of T_MONTHLY_AVG Totals = 302.4 / 116 = 2.60689655172414 or 2.6° C

Average of T_MONTHLY_MEAN Totals in Fahrenheit = (2.55775862068965 * 1.8) + 32 = 36.6039655172414 or 36.6° F

Average of T_MONTHLY_AVG Totals in Fahrenheit = (2.60689655172414 * 1.8) + 32 = 36.6924137931034 or 36.7° F

SUMMARY
National Average of Monthly Mean Temperatures = 2.6° C or 36.6° F
National Average of Monthly Average Temperatures = 2.6° C or 36.7° F

EXCLUDED STATIONS
The following stations reported no data (-9999.0) for either T_MONTHLY_MEAN or T_MONTHLY_AVG and were not used:

CRNM0101-PA_Avondale_2_N.txt

================================================================
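The arithmetic in the report is easy to check independently. A quick sketch reproducing the December numbers above (totals and station count taken from the report):

```python
# Reproduce the December 2012 averaging calculations from the report above.
total_mean = 296.7   # sum of T_MONTHLY_MEAN over included stations (deg C)
total_avg  = 302.4   # sum of T_MONTHLY_AVG over included stations (deg C)
n_stations = 116     # stations with valid data (117 CONUS minus 1 excluded)

def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 1.8 + 32

mean_c = total_mean / n_stations
avg_c = total_avg / n_stations
print(round(mean_c, 1), round(c_to_f(mean_c), 1))  # 2.6 36.6
print(round(avg_c, 1), round(c_to_f(avg_c), 1))    # 2.6 36.7
```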

From the NCDC-provided FTP data files we can calculate a yearly CONUS Tavg, which to my knowledge has never been done before by NCDC. Odd that it falls to somebody outside the organization, don’t you think?

Climate Reference Network Data for 2012

Month Tavg
1 36.8
2 38.1
3 50.6
4 54.8
5 63.3
6 70.8
7 75.6
8 72.9
9 65.6
10 53.9
11 43.9
12 36.7
Sum 663
/12 55.25

Therefore, from this data, the Average Annual Temperature for the Contiguous United States for 2012 is 55.25°F.
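The table above is a straight sum-and-divide; a sketch that reproduces it:

```python
# Monthly CONUS Tavg values (deg F) from the table above.
monthly_tavg = [36.8, 38.1, 50.6, 54.8, 63.3, 70.8, 75.6,
                72.9, 65.6, 53.9, 43.9, 36.7]

annual = sum(monthly_tavg) / 12
print(round(sum(monthly_tavg), 1))  # 663.0
print(round(annual, 2))             # 55.25
```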

Note also that the CRN value for July 2012, 75.6°F, is far lower than the 77.6°F NCDC reported in the SOTC, and the 76.93°F that later appeared in the database, as discussed here.

Makes you wonder why NCDC never mentions their new state-of-the-art, well-sited climate monitoring network in those press releases, doesn’t it? The CRN has been fully operational since late 2008, and we never hear a peep about it in the SOTC. Maybe they don’t wish to report adverse results.

I look forward to seeing what NCDC comes up with for the Cooperative Observer Network (COOP) in their “preliminary” State of the Climate Report for Dec 2012 and the year, and what the final number will be in 1-2 months when all the data from the COOP network comes in.

I’ll have more on this in the near future. I’ll be offline for the rest of the day traveling.

UPDATE: 10:30PM PST, Climatebeagle and others have been puzzled over the 117 stations used, and can’t reconcile it with the larger list. Here’s the logic:

Some stations, such as those at Oak Ridge, TN and Sterling, VA, were removed because they do not report regularly or at all (they are test sites). The one CRN station in Egbert, Ontario, Canada is not part of the CONUS and is removed as well. None of the stations in Alaska are used, as they are also not part of the CONUS.

Here is the list: conus_stations_master_list_1-8-13 (PDF)

UPDATE2: 9:30AM PST, 1/8 – Reader Lance Wallace noted a mistake, which comes down to version control on our end. One CRN station in Egbert, Ontario was inadvertently included in the monthly code, though it was not in the daily code we run. We’ll rerun it all and update. I’m thankful for the many eyes of WUWT readers – Anthony

January 7, 2013 10:45 am

Nitpick, my apologies:

“NCDC has updated the Climate Reference Network Data for December 2012. I’m still waiting on the NCDC State of the Climate report to come in with their number, and I’ll update the graphc when it is available…”

tgmccoy
January 7, 2013 10:48 am

Hmm another “Hoist with their owne Petard” moment…

January 7, 2013 10:48 am

You need to start putting in invoices to the US Government for all the work that they can’t be bothered to do for themselves.

DirkH
January 7, 2013 10:49 am

Global Warming Has Left The Building!

jonny old boy
January 7, 2013 10:52 am

Nice Work….. So it looks like a SECOND state high temperature record being broken this century may be less likely…. 😉 Poor old South Dakota !

Skiphil
January 7, 2013 10:52 am

So impressive, Anthony, congratulations!! Looking forward to seeing the implications of this…. So the CRN is proving inconvenient to someone’s Cause….

daved46
January 7, 2013 10:56 am

Don’t you need to multiply each monthly average by the number of days in that month, add all the months up, and then divide by 365 or 366 to get an unbiased average?
REPLY: already handled in code; we took each station’s monthly Tavg (which NCDC calculates from daily data) and calculated a CONUS monthly Tavg. All the data is there in case anyone wants to replicate it independently. – Anthony

leon0112
January 7, 2013 10:57 am

Anthony – Did Al Gore share some of his Big Oil money with you?

Lance Wallace
January 7, 2013 11:00 am

For those interested in the CRN network, there is a recent article summarizing the first 10 years of operation
http://journals.ametsoc.org/doi/pdf/10.1175/BAMS-D-12-00170

Mike Smith
January 7, 2013 11:07 am

Oh dear. This isn’t looking good. Not good at all.
Methinks someone has a lot of explaining to do.
And that should be interesting. Time to order in more popcorn!

climatebeagle
January 7, 2013 11:08 am

I’ve been simple averaging the USCRN hourly data and 2012 consistently comes out as the hottest US year regardless of the stations. E.g. hottest since 2003 when only looking at stations with a complete record since 2003, hottest since 2008 when only looking at stations with a complete record since 2008 etc.
I’m not actually a great believer in averaging temps, but shouldn’t a spatially weighted average be used, rather than a simple one? E.g. a couple of areas have two nearby stations rather than just one.

mpainter
January 7, 2013 11:12 am

This will help keep the B*st*rds honest…

January 7, 2013 11:13 am

Not sure if that’s how you calculate an average, adding the monthly values and dividing by 12. If I go for the weighted average, i.e. including each month’s duration in days, I get 55.305 degrees F.
https://dl.dropbox.com/u/22026080/conustemp2012.xls
Andre
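Andre’s day-weighted figure is easy to reproduce; a sketch (monthly values from the table in the post, 2012 month lengths including the leap day):

```python
# Day-weighted annual average for 2012 (a leap year).
tavg = [36.8, 38.1, 50.6, 54.8, 63.3, 70.8, 75.6,
        72.9, 65.6, 53.9, 43.9, 36.7]
days = [31, 29, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]  # 366 total

weighted = sum(t * d for t, d in zip(tavg, days)) / sum(days)
print(round(weighted, 3))  # 55.305
```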

climatebeagle
January 7, 2013 11:17 am

On the hourly numbers I calculated a USCRN 2012 yearly average of 12.1°C or 53.7°F, but I think my list of USCRN stations is different since I have 124. Probably at least because I’m using all the USCRN stations, thus not the same as CONUS.
However, is there a listing of WBANNO numbers for the USCRN stations? I haven’t found a simple list online, thus generated one manually and may have made mistakes.

eco-geek
January 7, 2013 11:26 am

OK am I going stupid or have I missed something? Is Anthony really saying that one set of records is about 21 degrees F lower than the other(s)? I must be wrong I know as somebody must have noticed.
OK I’ll get on this link when the crack wears off and find my mistake.

milodonharlani
January 7, 2013 11:31 am

A much needed corrective to NASA & NOAA’s cooked books. Appears close to one station per 26,666 sq. miles (with a few gaps), so, as you note, automatically adjusted for elevation, urban, rural & all other parameters.

troe
January 7, 2013 11:33 am

Contacted my Congressman and requested a GAO audit of the NOAA/NCDC temp reporting practices. Hansen, Anthony… touch em up.

Paul Marko
January 7, 2013 11:34 am

December’s sun was really quiet. SSN ~ 40; 10.7 ~ 108; Ap ~ 3. Dalton or Maunder?
http://www.swpc.noaa.gov/SolarCycle/

John F. Hultquist
January 7, 2013 12:07 pm

eco,
The higher number you see is for July, not the year.

E.M.Smith
Editor
January 7, 2013 12:08 pm

A man with a watch knows what time it is.
A man with two watches is never sure….
So at a minimum this says that “station selection” has a 2 F variation in it. So much for the assertion that “station dropout” doesn’t matter…
That the NCDC/SOTC data / report is 2 F warmer and the CRN stations are supposed to be ‘the best’ strongly implies that the NCDC/SOTC data are skewed high by 2 F. As that is more than the “Global Warming” they claim to have detected, that ought to mean we are colder now rather than warmer.
As I’m experiencing a colder winter than in the ’90s that accords with my ‘reality check’.
Looks to me like it’s pretty clear that “Global Warming” is an instrument error artifact.

Liberal Skeptic
January 7, 2013 12:11 pm

Someone, Somewhere, has massively cocked SOMETHING up if there is a difference as large as that between this temperature record for 2012 and the old record for 2012. (I make it about 10 degrees centigrade??)
A deeper investigation has to be done.

Liberal Skeptic
January 7, 2013 12:15 pm

^ Actually not 10C, got confused by the previous story.
Still a major difference, enough to put global warming scare stories into doubt, at least in the United States. And the temperature record for the rest of the world can’t be of any better quality either.

MangoChutney
January 7, 2013 12:21 pm

CONUS or CON US?

Berényi Péter
January 7, 2013 12:25 pm

using the same dataset we get annual average contiguous US temperatures for the last 5 years
2008 52.57°F
2009 52.48°F
2010 52.98°F
2011 53.21°F
2012 55.25°F
Trend is +61°F/century, truly worse than we thought
/sarc off
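The +61°F/century figure follows from an ordinary least-squares fit to the five annual values quoted; a sketch:

```python
# Least-squares trend through the five annual averages quoted above.
years = [2008, 2009, 2010, 2011, 2012]
temps = [52.57, 52.48, 52.98, 53.21, 55.25]  # deg F

n = len(years)
x_mean = sum(years) / n
y_mean = sum(temps) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(years, temps))
         / sum((x - x_mean) ** 2 for x in years))
print(round(slope * 100))  # deg F per century, ~61
```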

Lance Wallace
January 7, 2013 12:34 pm

Anthony and Climate Beagle–
Using the CRN monthly dataset, I get rather different numbers for 2008-2011 (about 52 F) compared to Anthony’s number of 55 for 2012. I haven’t downloaded the data for 2012 yet.
Here are the values for 2008-2011. These are obtained by averaging across all months for a year rather than averaging across each month and then dividing by 12 the way Anthony did, although I would think this would not make much difference.
year sites months mean (F) Std. Err. (F)
2008 112 1382 51.9 0.50
2009 114 1449 51.7 0.49
2010 116 1467 52.0 0.49
2011 116 1493 51.9 0.50
Incredible stability for those four years!

BioBob
January 7, 2013 12:37 pm

E.M.Smith says:
Looks to me like it’s pretty clear that “Global Warming” is an instrument error artifact
===============================================================
LOL – that is one way to look at it. Other ways are to call it human error, hubris, pseudoscience or just plain wrong.
Thanks Anthony !! I love it when you skewer NOAA with their own “data”. Great work, and sorely needed these days.

Gene
January 7, 2013 12:38 pm

Last file listed contains this message:
****** WARNING ** WARNING ** WARNING ** WARNING ** WARNING ******
** This is a United States Department of Commerce computer **
** system, which may be accessed and used only for **
** official Government business by authorized personnel. **
** Unauthorized access or use of this computer system may **
** subject violators to criminal, civil, and/or administrative **
** action. All information on this computer system may be **
** intercepted, recorded, read, copied, and disclosed by and **
** to authorized personnel for official purposes, including **
** criminal investigations. Access or use of this computer **
** system by any person, whether authorized or unauthorized, **
** constitutes consent to these terms. **
****** WARNING ** WARNING ** WARNING ** WARNING ** WARNING ******
[Really ? . . mod]

January 7, 2013 1:05 pm

This is hugely important, it is proof of conspiracy. The “hottest year ever” will still be announced, I fear, I think they figure more people will see that than read here. What they haven’t figured out yet is that the general public are losing interest, and rapidly. Particularly as COLD weather and record snows are causing such inconvenience everywhere – not to mention the deaths. These guys are just ramming their foot (feet?) deeper and deeper down their throats. Someone ought to trip them up while they are in that position – they deserve to fall flat (and much more!).
Is this a good time to remind them what treason is?

January 7, 2013 1:09 pm

Problem is you can’t compare the averages without correcting for altitude differences.
For example, have a look at Roy Spencer’s average using ISH. Note that he does a lapse rate adjustment. So, for example, if you have 1000 stations at 500 meters above sea level and average them, you come up with, say, 14C. Now average 1000 stations at sea level: guess what, the lower stations will be slightly warmer per the lapse rate.
This is especially important if you have any missing data, as that will skew the answer even more.
From the looks of it, a simple average was computed taking no account of altitude differences. Even Roy understands why that matters.
You’d be amazed what a difference of 100 meters gets you. In fact, if you take CRN data and compare it to nearby stations (one CRN station has 14 ISH hourly stations nearby) you might find cases where the lower CRN station, although well sited, is warmer than the horrible ISH stations at airports.
Why? Because the airports happen to be at higher, colder elevations. For reference, there are around 400 ISH hourly stations within 100 km of CRN stations, so it’s not that hard to illustrate.
So, before you compare averages of absolute temperature you MUST ensure that the sampling distributions come from the same altitude OR correct for lapse rate. Of course, if you work in anomalies you don’t have to account for this.
The environmental lapse rate is around 6.5C per 1 km, or 0.65C per 100 meters. On average the CRN stations tend to be lower in altitude than other collections of stations. Not by a lot, but precision matters; after all, if you apply imprecise methods to gold-standard data, you lose what you thought to gain.
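The adjustment described here is straightforward to sketch using the environmental lapse rate quoted above (the station readings below are made up for illustration; the function name is hypothetical):

```python
# Reduce a station temperature to sea level using the environmental
# lapse rate of ~6.5 deg C per km (0.0065 deg C per meter).
LAPSE_RATE_C_PER_M = 0.0065

def to_sea_level(temp_c, elevation_m):
    """Estimate the sea-level-equivalent temperature for a station reading."""
    return temp_c + LAPSE_RATE_C_PER_M * elevation_m

# Hypothetical readings: a 500 m station reads ~3.25 C cooler than its
# sea-level equivalent; a sea-level station needs no correction.
print(to_sea_level(14.0, 500))  # 17.25
print(to_sea_level(17.25, 0))   # 17.25
```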

January 7, 2013 1:15 pm

Anthony, you seem to be doing all the work that NCDC does not dare to do. I hope you get your share of their Big Oil Money.

Claude Harvey
January 7, 2013 1:16 pm

Anthony, how can I trust your numbers when they have not been statistically mutilated? Having read a wide variety of “AGW consensus papers”, it is clear to me that the man-made warming signal cannot be teased out of such noisy data without very sophisticated statistical mutilation of the raw numbers. The test for proper mutilation is that earlier-century temperatures move down from the raw values and latter-century values move up from them.

Smitty
January 7, 2013 1:21 pm

Just wondering where I can find the MONTHLY CRN CONUS graphic that was used in the article as viewed above? ( http://wattsupwiththat.files.wordpress.com/2013/01/crnmap-monthly-avg-temp-f_stations_national_1920x1080_201212.jpg )
Is it from this website? http://climatereferencenetwork.org/ ? It states that it is “coming soon”…I’m just a bit confused. Thanks.

eco-geek
January 7, 2013 1:26 pm

John H.,
Thanks for the tip off – I see the July now!

Edward Martin
January 7, 2013 1:28 pm

According to the last figure in http://journals.ametsoc.org/doi/pdf/10.1175/BAMS-D-12-00170, they were right all along.

RHS
January 7, 2013 1:28 pm

Steven Mosher – What is wrong with having the least amount of data alteration? If all the data stays as raw as possible, doesn’t that remove most of the criticisms of the mainstream data sets? Also, given the sheer fact that it is early afternoon on the east coast and mid-morning on the west coast, there are differences in how much of the day’s temperature change has already happened and how much is left. All of which I’m sure can be “corrected”, but shouldn’t the goal be to have as little “corrected” or modified data as possible? Shouldn’t consistency be the most important yardstick?
The point I’m intending to make is: what skews the data, and in what direction, should be less important than consistent data usage.

eco-geek
January 7, 2013 1:33 pm

Steven M.,
Do you have to correct the lapse rate with latitude also?
It seems to me the whole exercise is fairly meaningless as a comparative measure is all that is required. There is no such thing as “absolute average temperature for a continent”. What is needed is information on whatever processes are used to generate what is in effect a “comparative average temperature”. They might in principle be doing anything.

AndyG55
January 7, 2013 1:34 pm

EM.Smith “So at a minimum this says that “station selection” has a 2 F variation in it. So much for the assertion that “station dropout” doesn’t matter…”
Yep, that is a large difference. I have always suspected that the loss of data from remote stations (how did they manage that, must have been a big effort) and the “adjustments” (lol), and the lack of proper allowance for UHI and the increased reliance on urban temp data etc all contributed to the rise in the “global average urban land temperature” in the pre-1998 period
The satellite record shows two basically zero trend sections 1979-1997, and 2000 – 2012, with a step change in the middle.
I doubt there was much real warming in the 1970-1998 period at all, yet that is the period that the CAGW hoax is built on. The small rise in the land temp was all from data manipulation and lies.

January 7, 2013 1:34 pm

Problem is you cant compare the averages without correcting for altitude differences.

I believe that is an unnecessary complication of the problem. I do not want to know what the temperature WOULD be if the entire US were ironed smooth to sea level, I want to know what it *IS*. But more importantly, I don’t need a lapse rate adjustment for surface stations if what I am interested in is the trend over time. A station at 6,000 feet in Colorado will remain at 6,000 feet for the rest of this interglacial. I am interested in the change in temperature over time, not in trying to “correct” it to sea level.
An average of all reported stations is good enough when those stations do not require a correction for station hijinks and UHI. It just is what it is. If you start messing around and adjusting for lapse rate, then that opens the door to making all sorts of other adjustments. What about a wind direction adjustment? In some places one can get a very warm condition when wind is blowing from a certain direction and one gets adiabatic warming from air flowing downhill (Chinook or föhn winds). Conditions are quite different when the wind blows in the other direction.
No, let’s just leave things as they are. Part of the temptation to do this comes from the desire to create a “fill” value where one is missing. That’s bogus, too, because in many parts of the country there are microclimates that make doing that an exercise in futility anyway. Trying to “fill” a missing value in a California station, for example, by using a value from a distant station is likely to be futile. Which CRN station are you going to use to fill data for a missing value at Truckee? You will notice on the map above that no station anywhere around it has anything like Truckee’s temperatures. Once you start doing adjustments, it is over.

climatereason
Editor
January 7, 2013 1:44 pm

Mosh
I am not sure you have got the correct figure for the temperature/altitude equation.
http://www.grc.nasa.gov/WWW/k-12/problems/Jim_Naus/TEMPandALTITUDE_ans.htm
There is a certain amount of ‘it depends’ as well, due to temperature inversions and other factors
Tonyb

January 7, 2013 1:47 pm

Thank you

SCheesman
January 7, 2013 1:53 pm

Re: Government warning. Yes, indeed, the NOAA ftp site link given at the end of this post does indeed lead you to a file directory with the scary U.S. Gov’t warning message. Do I need to start looking over my shoulder?

Holbrook
January 7, 2013 1:58 pm

I see the “Mosher” is an unhappy bunny. It’s not just about this data…it’s the way you AGW crowd have got it wrong again and again. After the Aqua satellite debacle this should have been over in 2002…has anyone ever discovered the missing heat in the troposphere? In 2004 AGW became “Climate Change”…after all, it sounded so much better than “Global Warming Freezing”, which was what some of your comedians were debating…refer to “Climategate” on this site. Today the MET Office reckons no warming again for a long time yet…and by then the Hansens, Gores, Kings, Manns, Trenberths, Joneses et al will be long gone.

January 7, 2013 2:22 pm

Crosspatch has it right
CONUS? The man drew you a pic.
The hardest job in the world: picking fly specks out of black pepper.

Lil Fella from OZ
January 7, 2013 2:30 pm

Just some things money cannot buy. Thanks Anthony

climatebeagle
January 7, 2013 2:43 pm

I agree with crosspatch, it depends on what you are trying to get an average of. An average of the actual US surface seems more valuable than an average at a fixed height above sea-level.

Lance Wallace
January 7, 2013 2:46 pm

One explanation for the somewhat different averages listed above is the very confusing choice made by the CRN data group of descriptions of two different numbers. One is the “traditional” Tavg = (Tmin+Tmax)/2. This is described as T_Monthly_mean in their data description quoted below. The other is the average of all available “continuous” measurements, i.e. the hourly averages across the entire month. This they call T_monthly_avg. This latter measure is much closer to what most people would consider the “true” average. I discussed the difference (with maps of all the stations) in a guest post or two a few months ago. We can see for example that almost all stations have a consistent difference across all years (either positive or negative), averaging about 0.5 C, between the “traditional” average from the min and max measurements and the better estimate using hourly averages.
http://wattsupwiththat.com/2012/08/30/errors-in-estimating-temperatures-using-the-average-of-tmax-and-tmin-analysis-of-the-uscrn-temperature-stations/
http://wattsupwiththat.com/2012/08/30/errors-in-estimating-temperatures-using-the-average-of-tmax-and-tmin-analysis-of-the-uscrn-temperature-stations/
This is the CRN definition of the two terms.
cols 57 — 63 [7 chars] T_MONTHLY_MEAN
The mean temperature, in degrees C, calculated using the typical
historical approach of (T_MONTHLY_MAX + T_MONTHLY_MIN) / 2
cols 65 — 71 [7 chars] T_MONTHLY_AVG
The average air temperature, in degrees C, for the month. This average
is calculated using all available day-averages, each derived from
24 one-hour averages. To be valid there must be less than 4 consecutive
day averages missing, and no more than 5 total day averages missing.
The difference between the N of 116 and 124 is basically that 7 locations have 2 stations (in one case 3) sited close by. My calculations weight every site equally, so actually have 124 sites in 116 locations (probably should have made that clear earlier, since my table listed only the 116 locations.) I believe this is better than averaging the two sites at the same city, since they are not true duplicates but often separated by some miles, and may be in quite different locales and subject to different meteorological conditions.
I think that properly we should just look at the individual stations to see how they are varying. Also since CRN is so recent, and “trends” are limited to about 4 years (4 datapoints) for the full network, I can’t think that a trend analysis would mean a thing until after about a decade or two. I do hope that this network, well planned and maintained as it is, will retain funding for the future.
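Given the column definitions quoted above, pulling the two fields out of a monthly record is a fixed-width slice. A sketch (the function name is mine, and the sample line below is synthetic, padded to the documented column positions):

```python
# Parse T_MONTHLY_MEAN (cols 57-63) and T_MONTHLY_AVG (cols 65-71)
# from a CRN monthly record, per the format description quoted above.
MISSING = -9999.0  # sentinel the reports use for missing data

def parse_monthly_temps(line):
    """Return (t_monthly_mean, t_monthly_avg), or None if either is missing."""
    t_mean = float(line[56:63])  # cols 57-63, 1-indexed inclusive
    t_avg = float(line[64:71])   # cols 65-71
    if t_mean == MISSING or t_avg == MISSING:
        return None  # excluded, as in the December report above
    return t_mean, t_avg

# Synthetic record: 56 filler characters, then the two 7-char fields.
sample = "x" * 56 + "   2.56" + " " + "   2.61"
print(parse_monthly_temps(sample))  # (2.56, 2.61)
```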

Bill
January 7, 2013 2:49 pm

Mosh,
That will change the absolute values but not the slopes when plotted as a function of time, correct?
So, if you want an “accurate” “average” temperature then altitude will matter but if you just want the trend it should not matter, correct?

Mike Bromley the Canucklehead back in Kurdistan
January 7, 2013 3:09 pm

Glad to have all the siting just right, but: does an average temperature of something that covers subtropical, alpine and temperate zones really have any meaning? Crickets….

January 7, 2013 3:13 pm

“Did Al Gore share some of his Big Oil money with you?”
I’ll bet he just put it with the Big Tobacco money and the Anti-Big Tobacco money he has already taken from both sides.

jorgekafkazar
January 7, 2013 3:22 pm

Elevation, UHI, and various other surface temperature data tweaks are red herrings. An “average global temperature” based on the atmosphere does not account for humidity, wind kinetic energy, or latent heat. No matter how many corrections we make, we’ll never have a meaningful apples-to-apples comparison.
Worse yet, this so-called “global temperature,” even with the best corrections for elevation, etc., is irrelevant compared to the energy balance of the oceans. The thermal capacity (BTU/°F) of the oceans is 1100 times greater than that of the atmosphere.
Instead, we’re trying to measure the most transient part of the system, the one with the most noise. Is it any wonder the climatasters are using arcane statistical methods to tease out a signal?

E.M.Smith
Editor
January 7, 2013 3:22 pm

@Mosher:
So can you point me at where GIStemp does their lapse rate adjustment?
Didn’t the folks at NCDC say it doesn’t matter if stations come and go? Where is their lapse rate adjustment?
“Now average 1000 stations at sea level. guess what, the lower stations will be slightly warmer per the lapse rate.”
So is that why the GHCN consistently drops high altitude stations and replaces them with sea level stations? Nice to know. Thanks!
http://chiefio.wordpress.com/2009/11/16/ghcn-south-america-andes-what-andes/
http://chiefio.wordpress.com/2009/12/01/ncdc-ghcn-africa-by-altitude/
http://chiefio.wordpress.com/2009/11/17/ghcn-the-under-mountain-western-usa/
http://chiefio.wordpress.com/2009/11/13/ghcn-pacific-islands-sinking-from-the-top-down/
Oh, yeah, the fictional “anomaly” is supposed to fix all that… the one that isn’t done until it’s all ‘grid/boxes’ in the last step….
“Why? because the airports happen to be at higher colder elevations.”
Must not fly much… FYI Airports are built where there is a lot of flat land, typically. As often as possible down in the valley floors or even next to water (so a long approach can be made with a low flat surface). Examples? SFO approach over the bay. Moffett Field approach over the bay. SJC San Jose approach over the bay. Not one of them up in the surrounding hills. ORD Chicago on flat land (as is all of Chicago near the lake). Denver down on the flatter part down slope from downtown. Etc. etc. etc.
Folks only put airports on mountains and mountain tops when there is no alternative. One finds LAX down low, not in Beverly Hills… Even Reno and Lake Tahoe airports are on the flat land, not in the hills you can see from them… It is easier to get enough density altitude and runway speed on a lower flat runway than on a high bumpy one.
:
Averaging temperatures can have no meaning.
http://chiefio.wordpress.com/2011/07/01/intrinsic-extrinsic-intensive-extensive/
Yet it is widely done. GIStemp keeps temperatures AS temperatures all through the various transformations and creation of a fictional “grid/box” value (that they call a temperature). Then at the very end they make a ‘grid/box anomaly’ between two of these fictional grid/box temperatures. All fundamentally hokum due to averaging intensive properties and not dealing with enthalpy. But “it’s what they do”. So your instinct is sound.
I’ve averaged temperatures for the purpose of seeing the ‘shape of the data’ that causes some “climate scientists’ to have a hissy fit as they presume I think that results in a temperature when it doesn’t. (But it is a good way to see what the basic nature of the change in the numbers might be… bigger, smaller, more variation. Metadata about the data…)
With that said, to do it with some sanity you need to weight things for a variety of stuff that approximates enthalpy and sample bias. It ought to include areal weighting, altitude, distance from water, relative humidity, phase change of fluid (snow, ice, evaporation) and a few more. “Climate scientists” pick a couple from the list that would give an extrinsic property and ignore the rest. So you can ‘cherry pick’ a few too.
What I did was to never average a temperature. (Other than that the input I had available was a min-max average already… I really ought to re-do this with just the mins and just the maxs). Just do a ‘first differences’ style anomaly creation on one, and only one, instrument record at a time. I think that gives the cleanest view of what is going on. At that point, averaging the anomalies is valid.
http://chiefio.wordpress.com/category/dtdt/
and in theory you can ignore things like altitude and areal weighting (though in reality there are still a couple of ‘issues’ with station change…)
@Eco-geek:
Good one!
:
That’s why what I did does the anomaly as the very first step and does not ‘fix’ missing data but just ‘bridges the gap’ for that particular instrument and place.
http://chiefio.wordpress.com/category/dtdt/
Each instrument and month compared only to itself. Nothing else.
What it shows is that at any given place any given month may be going up or down in trend. Overall, not much changing on the globe. Some nations warming, some cooling (often next door to each other). Overall impression is that it is “data are variable”.
Yet there are sea change moments, such as the point when the MMTS is rolled out, where a ‘jump’ happens. Is it the instrument or the ‘fix’ for the change? In either case, it’s not the reality, it’s the fiddling…
There is NO global warming. There are some instrument records for some places in some months that rise (often from the lows being lifted, not the highs getting hotter). Only averaging that in with all the ‘no change’ or ‘cooling’ places makes anything “global”, and that number is a data artifact ridden fantasy…
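The “first differences” approach described here can be sketched briefly: difference each instrument record only against itself, and only then average the changes across stations (the two station records below are synthetic, for illustration only):

```python
# First-differences anomalies: difference each station's record against
# itself, then average the per-station changes at each time step.
def first_differences(series):
    """Year-over-year changes for a single instrument record."""
    return [b - a for a, b in zip(series, series[1:])]

# Two synthetic station records (deg C); never averaged as raw temperatures.
station_a = [10.0, 10.2, 10.1, 10.4]
station_b = [22.0, 22.1, 22.3, 22.2]

diffs = [first_differences(s) for s in (station_a, station_b)]
mean_change = [sum(col) / len(col) for col in zip(*diffs)]
print([round(c, 2) for c in mean_change])  # [0.15, 0.05, 0.1]
```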

Timothy Sorenson
January 7, 2013 3:24 pm

If you read the PDF you will discover that one aspect of the CRN network is to provide long-term, uninterrupted sites. They have, however, planned for possible relocations due to owner requirements, failures, etc.
Yet already they are talking about ‘adjustments’ to the data:
“(Collow et al. 2012). It is now planned that if a station must be removed for non-emergency reasons, such as the changing needs of the site host, there would ideally be one or two years of time to run a new USCRN station at a nearby site so as to develop an accurate calibration of the differences in climate between the sites and adjust the data of the discontinued site to match the new site. This process is currently underway for one station in Goodwell, OK, that is required to be removed because of unanticipated planned local LULC. Given such sufficient advance notice, …”
This assumes things that should be up for argument:
a.) We know that nearby sites have ‘correlated’ data, but the assumption that a universal long-term adjustment can be made is absolutely silly.
b.) If the second replacement site is close enough (I am not sure how to properly define it), it is more reasonable to assume the new site is ‘equivalent’ to the old site and any common-time data should, perhaps, be averaged. But using a prior or later site to permanently adjust the record just opens the door to further adjustments.
Even if one site is seen to be systematically warmer or colder over the approximate two-year window, that would only show that micro-climate variations exist and we cannot begin to account for them for every station. It just shows that weather varies between locales. How you can argue that one is a BETTER representative of the area vs. another, if they both meet siting guidelines, is silly.

Brian D
January 7, 2013 3:28 pm

USHCN(2.5) vs. CRN 2012. CRN difference from USHCN in ().
Jan – 36.1 vs 36.8(0.7)
Feb – 37.7 vs 38.1(0.4)
Mar – 50.3 vs 50.6(0.3)
Apr – 54.6 vs 54.8(0.2)
May – 63.4 vs 63.3(-0.1)
Jun – 70.5 vs 70.8(0.3)
Jul – 76.9 vs 75.6(-1.3)
Aug – 73.8 vs 72.9(-0.9)
Sep – 66.2 vs 65.6(-0.6)
Oct – 53.9 vs 53.9(0.0)
Nov – 43.9 vs 43.9(0.0)
Dec – ? vs 36.7(?)
Ann – ? vs 55.25(?)
USHCN has 1998 @ 54.32 as the warmest in the record followed by 2006 @ 54.30. CRN is 0.93 warmer than the USHCN record in 1998. Up until Nov, USHCN has an avg of 57.03 and CRN at 56.94 (-0.09).
I used the table/year setting for the months and then table/rank for the annual here.
http://www.ncdc.noaa.gov/oa/climate/research/cag3/na.html
Looks like CRN showed warmer in the colder months and cooler in the hotter months. Interesting. Wonder if that plays out like that in the previous few years.

D D Leone
January 7, 2013 3:29 pm

I thought maybe the world was ending, but alas it was only the cacophony of all the wooopsies…

David L
January 7, 2013 3:32 pm

So I plugged all these values into Minitab. First, I was really surprised that the data are fairly well normally distributed. Second, summary statistics show the average as 36.6 with a 95% confidence interval of 2.4. Assumptions of accuracy or precision better than 2.4F are BS. So the discrepancies between the data sets are, from a statistical standpoint, meaningless: both averages are tolerable estimates of the true population average. But it’s not known to within 0.1F, folks!
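David L’s Minitab check is easy to reproduce in Python. The sample below is a hypothetical stand-in for the map values he actually used, and the 1.96 multiplier is the normal approximation (a t value would be more appropriate at small n):

```python
import math
import statistics

# Hypothetical station annual means (deg F) standing in for the map values:
temps_f = [34.1, 35.8, 36.2, 37.0, 36.6, 38.4, 35.1, 37.9, 36.3, 36.9]

mean_f = statistics.mean(temps_f)
sem = statistics.stdev(temps_f) / math.sqrt(len(temps_f))
ci95 = 1.96 * sem   # normal approximation; use a t multiplier for small n

print(f"{mean_f:.2f} +/- {ci95:.2f} F")
```

The point stands either way: the interval half-width, not the quoted 0.1F, is the honest precision.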

LearDog
January 7, 2013 3:57 pm

Climate REFERENCE Network. Keep repeating, NCDC…
I’m really glad that you’re doing this, Anthony. Quite the achievement – and now… in their face. And it isn’t like you didn’t give fair warning…
;-D

AndyG55
January 7, 2013 4:00 pm

And Mosh, saying that the lapse rate adjustment should be 6.5C/km is a furphy to.
The rate depends on many things. Moisture content most specifically.
There is NO WAY you know the proper adjustment to correct for altitude at any specific place or time.
Just another fudge factor available for the AGW/Giss/Hadcrud brethren to use.
[“a furphy to.” ??? Mod]

January 7, 2013 4:26 pm

I still think this average business is foolishness. It has no real value for anything. A nice but only marginally useful number. It is just another oversimplified piece of metadata that is used by the spin doctors on all sides of the issue. The reality is it gets you nothing, except a pork-barrel grant, and is about as useful and meaningful as a politician. If we want to look at tightly defined geographic areas, altitude-corrected, etc., then at least we have a number that the people in those regions understand and may find useful. I think the land and water masses cause some difficulties that further confound a worldwide average into an even more meaningless number.

January 7, 2013 4:29 pm

Paul Marko says:
January 7, 2013 at 11:34 am
December’s sun was really quiet. SSN ~ 40; 10.7 ~ 108; Ap ~ 3. Dalton or Maunder?
“Eddy”
http://wattsupwiththat.com/2009/06/13/online-petition-the-next-solar-minimum-should-be-called-the-eddy-minimum/

ChrisD
January 7, 2013 4:38 pm

Possibly a silly question, but for the average yearly temperature, should the monthly averages be weighted by the number of days in the month? Or is the yearly average normally just the mean of the monthly averages for comparisons over several years? I suspect there would be very little difference in the results when looking for trends over years/decades… but a plain mean of the monthly averages would be skewed a bit by an exceptionally warm or cold February, for example… or maybe the monthly averages are already somehow normalized?
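A quick check of the day-weighting question, using the CRN monthly values Brian D posted above (2012 was a leap year, hence 29 days in February). The two annual figures differ by only about 0.06 F here, mostly because the simple mean under-weights the short, cold February:

```python
# Days per month for 2012 (a leap year):
days = [31, 29, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
# CRN monthly means (deg F) as posted above:
monthly_f = [36.8, 38.1, 50.6, 54.8, 63.3, 70.8,
             75.6, 72.9, 65.6, 53.9, 43.9, 36.7]

simple = sum(monthly_f) / 12.0
weighted = sum(d * t for d, t in zip(days, monthly_f)) / sum(days)

print(round(simple, 2))             # 55.25, matching the annual figure above
print(round(weighted, 2))           # 55.31
print(round(weighted - simple, 3))  # ~0.055 F
```

So for trend-hunting the distinction barely matters, but it is not exactly zero.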

January 7, 2013 4:53 pm

More mistakes can happen when records from other countries are fed into the global compilations. CLIMAT data feeds are based on whatever a country supplies, not necessarily on (min+max)/2. About 40% of countries, the largest China and the former Soviet Union, use the mean of evenly-spaced observations. About 20% try to replicate a true mean through weighted temp averages at particular hours, mostly in Europe and Latin America.
It’s believed by some authorities that as long as this is consistent through time it doesn’t distort long-term trends, although it can affect comparisons between countries. Most have been consistent but some have changed methods or observation timing, sometimes conflicting with daylight saving.
The consensus seems to be that these conflicts negate each other and don’t cause a systemic bias in global temperatures. Personally, I remain to be convinced.
We know that Australian data fed to GISS from 1994 was on average .15C below the BoM’s HQ figures, an error that wasn’t noticed until around 2003, and the GISS and NCDC records from 1994 to 2004 weren’t corrected till 2009. Is it corrected now?
This happened because it was agreed that Australia would send (min+max)/2 data to the US, but that wasn’t done for nearly a decade. A 0.15 deg discrepancy could be way, way off the mark, higher or lower.
Australia’s BoM processes raw data from observer sheets to produce a homogenised version, then a High Quality network of more than 100 stations that vanished fairly quickly, now a new version called ACORN including some different stations. There are periods of a year or more when these versions can differ at a nominated station by more than 1C.
Working back from deg C to deg F you find that if one place after the decimal is used, you can get two solutions for one conversion. If you try to rationalise, you find a problem that cannot be solved. It turns out that at many Australian sites, temperatures were recorded in whole deg F more than 30% of the time.
Working with grids and interpolation before sending numbers to the US: of course, no interpolation scheme is perfect. One issue with a situation like the current one arises in areas with steep gradients and sparse networks, as Steven Mosher notes above. A good example is Australia’s Nullarbor. Long-term averages from Cook station go into the average fields, which are about 5-6C warmer in summer than the coast. But without any current Cook observations to “anchor” the analysis, on very hot days the anomaly from Nullarbor Roadhouse will be projected too far inland. So, for example, if you have a 45-degree day at Nullarbor (which is 18 degrees above average), that +18 anomaly will be applied to Cook’s 32-degree average to give an analysis of 50 – but in reality on very hot days there’s usually little difference between the inland and coastal sites.
Personally, I think absolute values are far more important than trends, for one of the main uses is in proxy reconstructions which don’t seem to have a natural mechanism for change at country borders or when a new normal is introduced.
This whole topic suffers from the old hymn, “Build on the rock and not upon the sand” for the sands here are forever shifting.
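Geoff’s whole-degrees-F point can be demonstrated: if observations taken in whole deg F are archived in deg C to one decimal place, the back-conversion always lands within rounding distance (0.05 C, about 0.09 F) of a whole number, which is one way such records get detected. The ranges and tolerance below are illustrative assumptions, not anyone’s published method:

```python
def f_to_c_1dp(f):
    """Convert deg F to deg C, archived to one decimal place."""
    return round((f - 32.0) * 5.0 / 9.0, 1)

def near_whole_f(c, tol=0.09):
    """Is this archived deg C value consistent with a whole-F original?"""
    f = c * 9.0 / 5.0 + 32.0
    return abs(f - round(f)) <= tol

# Every value that began life as a whole deg F reading passes the test:
whole_f_origin = [f_to_c_1dp(f) for f in range(30, 101)]
print(all(near_whole_f(c) for c in whole_f_origin))   # True

# Values recorded natively at 0.1 C resolution mostly do not:
native_c = [round(20.0 + 0.1 * k, 1) for k in range(50)]
flagged = sum(near_whole_f(c) for c in native_c)
print(flagged, "of", len(native_c), "flagged")
```

Run over a real archive, the fraction flagged is what supports the “whole deg F more than 30% of the time” estimate.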

January 7, 2013 5:04 pm

Mosh, the lapse rate surely depends on the time available for the rising, cooling air mass to shed its excess heat, by whatever mechanism. It can’t be 100% instant radiation loss. There must be some conduction loss to move a thermometer. Can’t see how one value fits all.

Tom in Florida
January 7, 2013 5:14 pm

Dennis Nikols says:
January 7, 2013 at 4:26 pm
“I still think this average business is foolishness.”
Have to agree, totally. Isn’t the whole problem caused by the ridiculous idea that Earth has an average temperature?

richardscourtney
January 7, 2013 5:53 pm

Tom in Florida:
At January 7, 2013 at 5:14 pm you say

Have to agree, totally. Isn’t the whole problem caused by the ridiculous idea that Earth has an average temperature?

That depends on what you mean by “Earth has an average temperature”.
If you have not seen it then I think you will want to read this, especially its Appendix B.
http://www.publications.parliament.uk/pa/cm200910/cmselect/cmsctech/memo/climatedata/uc0102.htm
And please note how the paper was prevented from publication by the frequent data changes which have been much discussed on several threads of WUWT today.
Richard

David L
January 7, 2013 5:55 pm

Dennis Nikols says:
January 7, 2013 at 4:26 pm
“I still think this average business is foolishness.”
Agreed. Completely. The earth has temperature variation due to altitude, longitude, latitude, and season, amongst other factors. To boil all this down to an average number is meaningless. It’s like measuring the temperature inside my furnace, inside my fridge, inside my freezer, inside my garage, and inside every room of my house, including basement and attic, and then quoting an average of all measurements, and then saying the average is trending upward because the sun comes out and warms the attic. Total misuse of statistics and physically meaningless.

johanna
January 7, 2013 6:14 pm

Geoff Sherrington, even the automatically recorded BOM data is a mess.
As you know, we are experiencing a spot of hot weather in south eastern Australia. I keep an eye on the Canberra readings (which come from the airport, but that’s another story.) Anyway, the other night, according to the BOM, the temperature dropped from about 22 to 5 degrees in 10 minutes. I assure you that the temperature did not change much at all. Furthermore, that ‘minimum’ stayed on the chart as the lowest minimum for the rest of the reporting period. The other thing I noticed is that when it got really hot (high 30s) the chart would just blank out altogether for sometimes hours at a time. I would not trust these readings, which no-one seems to check (see the absurd drop to 5 degrees in the middle of a heatwave, uncorrected for at least 12 hours, if ever) as far as I can throw Al Gore.
Back on topic, I am awestruck that a citizen scientist with a family, a business and the world’s biggest science blog manages to do quality control for massively funded public agencies in his spare time, such as it is. The overpaid and lazy slobs who are supposed to be in charge of this stuff should hang their heads in shame. Anthony, if you were being paid adequately for doing their jobs for them, you would never have to work again.

AndyG55
January 7, 2013 6:25 pm

I guess the real point is that you cannot compare temperatures measured with different systems.
The CRN is a new system, and it will take several years before they can draw any comparisons.
The same when you “disappear” 2/3 of your measuring stations. You are then working with a new measuring system. And when you allow urban encroachment within that system, you have a continually changing measurement system.
You CANNOT reliably compare calculated/averaged readings even a couple of years apart, because the overall system has changed.
Because of the massive unreliability of the measurement system, it is very easy to fudge the data to say what you want someone else to believe, if you have an agenda to do so.
The whole thing is a mess and totally unreliable. Why the heck are they wasting so much money on idiocies related to temperature rise when, in reality, we actually have NO IDEA whether any real rise has actually occurred?

January 7, 2013 6:26 pm

Lance Wallace says:
January 7, 2013 at 12:34 pm
year sites months mean (F) Std. Err. (F)
2008 112 1382 51.9 0.50
2009 114 1449 51.7 0.49
2010 116 1467 52.0 0.49
2011 116 1493 51.9 0.50
Incredible stability for those four years!
Average 51.88F/11.06C.
2008-2011: change 0.0C
But http://journals.ametsoc.org/doi/pdf/10.1175/BAMS-D-12-00170
Fig. 6, pg 42 CONUS annual mean temperatures derived independently using a first difference method
relative to the 2006-2010 mean of each data source: the USCRN (blue) and the USHCN (red).
Revised and updated from Menne et al., 2009.
2006: +0.60C,
2008: -0.41C
2011: +0.20C
Change 2006-2008: -1.01C, or -1.82F.
Change 2008-2011: +0.61C, or +1.10F.
… confusion here, not that this is unusual …

January 7, 2013 6:37 pm

Tom in Florida says:
January 7, 2013 at 5:14 pm
Dennis Nikols says:
January 7, 2013 at 4:26 pm
“I still think this average business is foolishness.”
Have to agree, totally. Isn’t the whole problem caused by the ridiculous idea that Earth has an average temperature?

. . . And the even-more ridiculous claim that this fictitious ‘global’ temperature can provide evidence that mankind is about to overheat the planet by burning fossil fuels. It’s a very clever magical trick; easy to pretend you’re sawing the Earth in half while the audience is distracted by smoke and mirrors—or rather, smokestacks and ice floes.
P.T. Barnum would have been proud of the Climatists.
/Mr Lynn

apachewhoknows
January 7, 2013 6:48 pm

Joseph Stalin’s take on this: “It’s not who measures the temperature, it’s who averages the temperatures and writes the report to be published for the official records.”
“Agenda control, job one.”
They intend to lay the U.S. low by any means.
It’s all agenda. We too are just things to be adjusted to fit the agenda.

Paul Marko
January 7, 2013 6:50 pm

GeoLurking says:
January 7, 2013 at 4:29 pm
“Eddy”
Thanks for link.

January 7, 2013 7:28 pm

David L says:
January 7, 2013 at 5:55 pm (Edit)
Dennis Nikols says:
January 7, 2013 at 4:26 pm
“I still think this average business is foolishness.”
Agreed. Completely. The earth has temperature variation due to altitude, longitude, latitude, and season, amongst other factors. To boil all this down to an average number is meaningless. It’s like measuring the temperature inside my furnace, inside my fridge, inside my freezer, inside my garage, and inside every room of my house, including basement and attic, and then quoting an average of all measurements, and then saying the average is trending upward because the sun comes out and warms the attic. Total misuse of statistics and physically meaningless.
##########################
Actually it is not meaningless. People need to get the notion out of their heads that Hansen, Jones, etc. are calculating an average temperature. They (and we) are doing something quite different, although “averaging” is used and people call it “an average”. What it is, what it mathematically is (forget the PR and focus on the science), is an estimate of temperature at UNMEASURED places. Such that, if I take all the measures together and use the correct techniques, I can win the following game.
1. Pick a place, any place on the planet. Hide a thermometer there for 1 month.
2. I will now calculate “the average” NOT USING that point.
3. Challenge people to guess the temperature at your unknown location.
All players get the time (the month) and all known data for that month.
The job is to estimate the temperature at an undisclosed location.
The “average” (done right) will be the best estimate. Now tell me the month, the latitude, the longitude and the altitude, and my estimate will be even closer. And we can test this synthetically by generating centuries of synthetic data (that looks like weather) for the entire globe, then sampling that complete field sparsely and seeing how well our estimating procedure works.
Or, I can use USHCN to “predict” what you will see at CRN.
So, we use an “average” (it’s really not an average) to come up with an estimate for the temperature at any given spot. It’s not really an average, although people call it an average. But when you look down into the math of things you see: “Oh, this is an estimate of the temperature at unknown locations that minimizes the error of prediction.”
Finally, “average” temperature also has a meaning when we talk about things like the LIA and say
“It” was cooler in the LIA, or “It” was warmer in the MWP.
We all say that. “It was warmer in the MWP”: that’s not meaningless.
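Mosher’s “game” can be played in miniature. The sketch below uses inverse-distance weighting purely as a stand-in for the far more careful spatial statistics real products use, and the station coordinates and temperatures are invented:

```python
import math

# Hypothetical stations: (lat, lon, monthly mean temp in deg C)
stations = [
    (40.0, -105.0, 10.0),
    (41.0, -104.0, 9.0),
    (39.0, -103.0, 11.5),
    (40.5, -106.0, 8.5),
    (39.5, -104.5, 10.5),
]

def idw_estimate(lat, lon, data, power=2.0):
    """Inverse-distance-weighted estimate at an unmeasured point."""
    num = den = 0.0
    for slat, slon, t in data:
        d = math.hypot(lat - slat, lon - slon)
        if d < 1e-9:
            return t                  # we are standing on a station
        w = 1.0 / d ** power
        num += w * t
        den += w
    return num / den

# The game: hide each station in turn and predict it from the rest.
for i, (lat, lon, actual) in enumerate(stations):
    rest = stations[:i] + stations[i + 1:]
    est = idw_estimate(lat, lon, rest)
    print(f"station {i}: actual {actual:.1f} C, estimated {est:.1f} C")
```

The held-out errors are what you would tune against; Mosher’s point is that the “average” is really this kind of field estimate.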

January 7, 2013 7:30 pm

“If you read the pdf you will discover that one aspect of the CRN network is to provide long term, un-interrupted sites. However, they have planned for possible relocations due to owners requirements, failures etc…”
Yes, one such move will supply data on the effect that nearby roads have on temperature measurements. So instead of speculation (roads will corrupt the data) you’ll actually have data and magnitudes and all sorts of science, as opposed to speculation.

AndyG55
January 7, 2013 7:47 pm

“[“a furphy to.” ??? Mod]”
too !!!
sorry !, me bad typist 🙁

john
January 7, 2013 7:47 pm

MangoChutney says:
January 7, 2013 at 12:21 pm
CONUS or CON US?
—————————
E.Pluribus fool em.
john from DB

AndyG55
January 7, 2013 7:47 pm

and with a comma in front, tooo

January 7, 2013 7:51 pm

EM
“@Mosher:
So can you point me at where GIStemp does their lapse rate adjustment?
Didn’t the folks at NCDC say it doesn’t matter if stations come and go? Where is their lapse rate adjustment?
###########################
EM
1. Remember the concern about the loss of thermometers. What was your concern? Loss of high-altitude thermometers. Why? Because they tend to be colder.
2. When you work in ABSOLUTE temperature then you have to take care to adjust for lapse rate. This is why GISS works with anomalies.
Let’s do a little example.
We are going to average two stations. One station is at sea level; the other station is
1 km above it.
Ready. We will do 10 years of data. Station A is at sea level and station B is at 1000 meters.
A) 6 6 6 6 6 6 6 6 6 6
B) 0 0 0 0 0 0 0 0 0 0
See. Now we made that all simple because all the data is there and it’s always there. So we can just sum every month and divide by 2, and presto, the average is 3.
Now let’s have a data drop out.
A) 6 6 6 6 6 6 6 6 6 6
B) 0 0 0 0 0 0 0 0 NA NA
And we average… and oops! The average goes from 3C to 6C. What the hell!
So. You have two options.
Option 1. Use anomalies.
Option 2. Adjust for lapse rate.
Since GISS uses anomalies they don’t have to, and should not, correct for lapse rate.
So, if you are working in ABSOLUTE temperature and trying to COMPARE your dataset to another dataset in ABSOLUTE temperature, then you must check for altitude differences, or you can just adjust for lapse rate. It’s easy. You can even do it empirically.
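Mosher’s two-station example runs as written. A quick numeric version (mine, not his) shows both the spurious jump in the absolute average and how per-station anomalies remove it:

```python
def mean(xs):
    """Mean of the non-missing values."""
    xs = [x for x in xs if x is not None]
    return sum(xs) / len(xs)

a = [6.0] * 10                    # station A: sea level, always 6 C
b = [0.0] * 8 + [None, None]      # station B: 1000 m, drops out at the end

# Absolute-temperature average: jumps from 3 C to 6 C when B goes missing.
absolute = [mean([x, y]) for x, y in zip(a, b)]
print(absolute)                   # eight 3.0s, then two 6.0s

# Anomaly average: each station relative to its own baseline first.
base_a, base_b = mean(a), mean(b)
anoms = [mean([x - base_a, y - base_b if y is not None else None])
         for x, y in zip(a, b)]
print(anoms)                      # 0.0 throughout; the dropout vanishes
```

The dropout artifact lives entirely in the absolute series; the anomaly series never sees it.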

January 7, 2013 7:53 pm

E.M.Smith says:
January 7, 2013 at 12:08 pm (Edit)
A man with a watch knows what time it is.
A man with two watches is never sure….
So at a minimum this says that “station selection” has a 2 F variation in it. So much for the assertion that “station dropout” doesn’t matter…
############
EM, it’s already been proven that station dropout doesn’t matter, because we put the stations back in and the answer didn’t change. Anomalies. Lovely thing.

bw
January 7, 2013 7:58 pm

As others have stated, crosspatch points out what needs to be stressed.
Any point averages need to be justified. Alaska, Hawaii, Oregon, Kansas and Florida have independent climates that are not related to each other. This is basic science. There is no physical meaning to averaging tropical data with polar data.
There may be some justification for averages, but not in this case.
Also, there is danger in averaging time data. Why are you averaging July with January? Does averaging July with January have physical meaning?
I’ve plotted monthly temperatures of 15 CRN sites with 10 full years of data. None show any significant trend in temperature since those stations began operating.
The best climate site that addresses the AGW issue is http://www.surfacestations.org/
When the average reader sees photos of thermometers next to air conditioners, they understand what is going on.
Another important story to me is what GISS is doing to their “data” on a monthly basis.
Keep up the good work.

January 7, 2013 8:01 pm

Steven Mosher says:
“Since Giss uses Anomalies they dont have to and should not correct for lapse rate… its already been proven that station drop out doesnt matter, cause we put the stations back in and the answer didnt change. Anomalies. Lovely thing.”
But, some things DO change at GISS:
http://jonova.s3.amazonaws.com/graphs/giss/hansen-giss-1940-1980.gif

mpaul
January 7, 2013 8:03 pm

Anthony, your headline reads:

Announcing the first ever CONUS yearly average temperature from the Climate Reference Network

As your PR adviser, might I suggest that this headline will likely not get much attention with the MSN.
Rather, you should consider:

2012 hottest year ever recorded in the CONUS yearly average temperature index

See how much better that is?

January 7, 2013 8:11 pm

climatebeagle says:
January 7, 2013 at 11:17 am (Edit)
On the hourly numbers I calculated an USCRN 2012 yearly average of 12.1°C or 53.7°F, but I think my list of USCRN stations is different since I have 124. Probably at least because I’m using all the USCRN stations, thus not the same as CONUS.
######################
There are some additional stations beyond those that Anthony talks about; these are regional networks. Also excellent stations. To get the correct CRN set you need to download the metadata.
Also, folks should realize that not all of the CRN stations are actually commissioned and operational; some are experimental. This status is in a different file that I have around here somewhere. Last I looked I had a count of around 108 that were actually commissioned and non-experimental. Then of course you should drop those that are actually in built-up areas or have concrete around them (30-meter NLCD data can help you spot that in a jiffy).

January 7, 2013 8:16 pm

EM
““Why? because the airports happen to be at higher colder elevations.”
Must not fly much… FYI Airports are built where there is a lot of flat land, typically. As often as possible down in the valley floors or even next to water (so a long approach can be made with a low flat surface). Examples? SFO approach over the bay. Moffett Field approach over the bay. SJC San Jose approach over the bay. Not one of them up in the surrounding hills. ORD Chicago on flat land (as is all of Chicago near the lake). Denver down on the flatter part down slope from downtown. Etc. etc. etc.”
#################
Sorry, I wasn’t clear.
Here is the example I am thinking of.
You have a CRN station at 5 meters elevation, and 10 km away you have an airport at
200 meters above sea level.
Guess what?
With a lapse rate of 6.5C per 1000 meters, how much cooler is an airport at 200 meters versus a station at sea level?
Get it?
Lapse rate, it will get you every time if you are not careful. Perhaps a few examples of perfect stations that are warmer than their neighbors simply because they differ in altitude by 200 or 300 meters. 6.5C per 1000 meters in altitude. Guess you don’t fly much.
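For what it’s worth, the arithmetic in that example, using the 6.5 C/km figure Mosher cites (which AndyG55 argues above varies with moisture, so treat the constant as an assumption):

```python
LAPSE_C_PER_M = 6.5 / 1000.0   # standard environmental lapse rate (disputed above)

def lapse_offset_c(dz_m, rate=LAPSE_C_PER_M):
    """Expected cooling (deg C) for a height gain of dz_m metres."""
    return dz_m * rate

print(round(lapse_offset_c(200), 2))        # 1.3 C cooler at the 200 m airport
print(round(lapse_offset_c(200) * 1.8, 2))  # 2.34 F
```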

climatebeagle
January 7, 2013 8:22 pm

Still having trouble resolving the station count.
The USCRN list here has 133 stations.
http://www.ncdc.noaa.gov/isis/stationlist.htm?networkid=1
Three do not appear in the hourly data for 2012:
TN Oakridge 0 N
SA TIKSI 4 N
VA Sterling 0 N
Thus 130 stations in the USCRN that reported hourly data in 2012:
For CONUS we can exclude:
AK – 12 stations
HI – 2 stations
ON – 1 station
Leaving: 115.
But Anthony says he is using 117 USCRN CONUS stations, with one skipped for missing data.
TN Oakridge, SA TIKSI and VA Sterling also seem to be missing from the monthly data.
So 115 vs. 117?
Here’s the list I manually created from the USCRN list, mapping file names to the WBANNO numbers (thus it has 130, excluding TN Oakridge, SA TIKSI and VA Sterling):
(Mods feel free to remove this if it’s too long)
AK_Barrow_4_ENE,27516,USCRN
AK_Fairbanks_11_NE,26494,USCRN
AK_Gustavus_2_NE,25380,USCRN
AK_Kenai_29_ENE,26563,USCRN
AK_King_Salmon_42_SE,25522,USCRN
AK_Metlakatla_6_S,25381,USCRN
AK_Port_Alsworth_1_SW,26562,USCRN
AK_Red_Dog_Mine_3_SSW,26655,USCRN
AK_Sand_Point_1_ENE,25630,USCRN
AK_Sitka_1_NE,25379,USCRN
AK_St._Paul_4_NE,25711,USCRN
AK_Tok_70_SE,96404,USCRN
AL_Fairhope_3_NE,63869,USCRN
AL_Gadsden_19_N,63857,USCRN
AL_Selma_13_WNW,63858,USCRN
AR_Batesville_8_WNW,23904,USCRN
AZ_Elgin_5_S,53132,USCRN
AZ_Tucson_11_W,53131,USCRN
AZ_Williams_35_NNW,53155,USCRN
AZ_Yuma_27_ENE,53154,USCRN
CA_Bodega_6_WSW,93245,USCRN
CA_Fallbrook_5_NE,53151,USCRN
CA_Merced_23_WSW,93243,USCRN
CA_Redding_12_WNW,04222,USCRN
CA_Santa_Barbara_11_W,53152,USCRN
CA_Stovepipe_Wells_1_SW,53139,USCRN
CA_Yosemite_Village_12_W,53150,USCRN
CO_Boulder_14_W,94075,USCRN
CO_Cortez_8_SE,03061,USCRN
CO_Dinosaur_2_E,94082,USCRN
CO_La_Junta_17_WSW,03063,USCRN
CO_Montrose_11_ENE,03060,USCRN
CO_Nunn_7_NNE,94074,USCRN
FL_Everglades_City_5_NE,92826,USCRN
FL_Sebring_23_SSE,92827,USCRN
FL_Titusville_7_E,92821,USCRN
GA_Brunswick_23_S,63856,USCRN
GA_Newton_11_SW,63829,USCRN
GA_Newton_8_W,63828,USCRN
GA_Watkinsville_5_SSE,63850,USCRN
HI_Hilo_5_S,21515,USCRN
HI_Mauna_Loa_5_NNE,21514,USCRN
IA_Des_Moines_17_E,54902,USCRN
ID_Arco_17_SW,04126,USCRN
ID_Murphy_10_W,04127,USCRN
IL_Champaign_9_SW,54808,USCRN
IL_Shabbona_5_NNE,54811,USCRN
IN_Bedford_5_WNW,63898,USCRN
KS_Manhattan_6_SSW,53974,USCRN
KS_Oakley_19_SSW,03067,USCRN
KY_Bowling_Green_21_NNE,63849,USCRN
KY_Versailles_3_NNW,63838,USCRN
LA_Lafayette_13_SE,53960,USCRN
LA_Monroe_26_N,53961,USCRN
ME_Limestone_4_NNW,94645,USCRN
ME_Old_Town_2_W,94644,USCRN
MI_Chatham_1_SE,54810,USCRN
MI_Gaylord_9_SSW,54854,USCRN
MN_Goodridge_12_NNW,04994,USCRN
MN_Sandstone_6_W,54932,USCRN
MO_Chillicothe_22_ENE,13301,USCRN
MO_Joplin_24_N,23908,USCRN
MO_Salem_10_W,23909,USCRN
MS_Holly_Springs_4_N,23803,USCRN
MS_Newton_5_ENE,63831,USCRN
MT_Dillon_18_WSW,04137,USCRN
MT_Lewistown_42_WSW,04140,USCRN
MT_St._Mary_1_SSW,04130,USCRN
MT_Wolf_Point_29_ENE,94060,USCRN
MT_Wolf_Point_34_NE,94059,USCRN
NC_Asheville_13_S,53878,USCRN
NC_Asheville_8_SSW,53877,USCRN
NC_Durham_11_W,03758,USCRN
ND_Jamestown_38_WSW,54937,USCRN
ND_Medora_7_E,94080,USCRN
ND_Northgate_5_ESE,94084,USCRN
NE_Harrison_20_SSE,94077,USCRN
NE_Lincoln_11_SW,94996,USCRN
NE_Lincoln_8_ENE,94995,USCRN
NE_Whitman_5_ENE,94079,USCRN
NH_Durham_2_N,54794,USCRN
NH_Durham_2_SSW,54795,USCRN
NM_Las_Cruces_20_N,03074,USCRN
NM_Los_Alamos_13_W,03062,USCRN
NM_Socorro_20_N,03048,USCRN
NV_Baker_5_W,53138,USCRN
NV_Denio_52_WSW,04139,USCRN
NV_Mercury_3_SSW,53136,USCRN
NY_Ithaca_13_E,64758,USCRN
NY_Millbrook_3_W,64756,USCRN
OH_Coshocton_8_NNE,54851,USCRN
OK_Goodwell_2_E,03055,USCRN
OK_Goodwell_2_SE,53182,USCRN
OK_Stillwater_2_W,53926,USCRN
OK_Stillwater_5_WNW,53927,USCRN
ON_Egbert_1_W,64757,USCRN
OR_Coos_Bay_8_SW,04141,USCRN
OR_Corvallis_10_SSW,04236,USCRN
OR_John_Day_35_WNW,04125,USCRN
OR_Riley_10_WSW,04128,USCRN
PA_Avondale_2_N,03761,USCRN
RI_Kingston_1_NW,54796,USCRN
RI_Kingston_1_W,54797,USCRN
SC_Blackville_3_W,63826,USCRN
SC_McClellanville_7_NE,03728,USCRN
SD_Aberdeen_35_WNW,54933,USCRN
SD_Buffalo_13_ESE,94081,USCRN
SD_Pierre_24_S,94085,USCRN
SD_Sioux_Falls_14_NNE,04990,USCRN
TN_Crossville_7_NW,63855,USCRN
TX_Austin_33_NW,23907,USCRN
TX_Bronte_11_NNE,03072,USCRN
TX_Edinburg_17_NNE,12987,USCRN
TX_Monahans_6_ENE,03047,USCRN
TX_Muleshoe_19_S,03054,USCRN
TX_Palestine_6_WNW,53968,USCRN
TX_Panther_Junction_2_N,22016,USCRN
TX_Port_Aransas_32_NNE,23906,USCRN
UT_Brigham_City_28_WNW,04138,USCRN
UT_Torrey_7_E,53149,USCRN
VA_Cape_Charles_5_ENE,03739,USCRN
VA_Charlottesville_2_SSE,03759,USCRN
WA_Darrington_21_NNE,04223,USCRN
WA_Quinault_4_NE,04237,USCRN
WA_Spokane_17_SSW,04136,USCRN
WI_Necedah_5_WNW,54903,USCRN
WV_Elkins_21_ENE,03733,USCRN
WY_Lander_11_SSE,94078,USCRN
WY_Moose_1_NNE,04131,USCRN
WY_Sundance_8_NNW,94088,USCRN
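climatebeagle’s CONUS count is just a prefix filter over that list. A sketch against a few sample lines (the full 130-station file would slot in the same way):

```python
# Sample lines in the same "STATE_Name,WBANNO,network" format as the list above:
stations = """\
AK_Barrow_4_ENE,27516,USCRN
AL_Fairhope_3_NE,63869,USCRN
HI_Hilo_5_S,21515,USCRN
ON_Egbert_1_W,64757,USCRN
CA_Bodega_6_WSW,93245,USCRN
""".splitlines()

NON_CONUS = ("AK", "HI", "ON")   # Alaska, Hawaii, and the Ontario station

conus = [line for line in stations
         if line.split("_", 1)[0] not in NON_CONUS]
print(len(conus), "CONUS stations of", len(stations))   # 2 of 5 here
```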

mpainter
January 7, 2013 8:23 pm

D Boehm Stealey says: January 7, 2013 at 8:01 pm
=================
Has Hansen ever been called to account concerning this apparent fabrication?

D Böehm
January 7, 2013 8:38 pm

mpainter,
Not that I know of. Nor for his regular lawbreaking, nor for his using his GISS position for politics, etc.
BTW, there are more examples of GISS “adjusting” the temperature record. Their adjustments take one of two forms: either lowering past temperatures in order to show more rapid warming, or adjusting current temperatures upward. The result is always more alarming than reality.

Jeff Alberts
January 7, 2013 8:46 pm

Does this “CONUS Average temperature” have any real physical meaning?

January 7, 2013 8:53 pm

Well, after all is said and done, at least we’ve provided Tamino with his next post.
So we might as well wait for the world’s best statistician to make his comments.
Problem is, he won’t post his results here, and you won’t be allowed to comment there.

bw
January 7, 2013 9:02 pm

D Boehm, I’ve seen the same trend and agree fully on your GISS observations. However, recently I’ve found one station (Yakutat) where old temp data have been increased, resulting in a decadal cooling trend when compared to data I saved six months ago.
I still have the six-month-old Yakutat text file with that data, along with about a dozen other key stations that I’ve been monitoring for years. So far, four Antarctic stations (Amundsen-Scott, Vostok, Halley and Davis) have not been altered. All show zero temp change since their data began in the 1950s.

bw
January 7, 2013 9:08 pm

As for past temp changes, I forgot to add this link,
http://stevengoddard.wordpress.com/data-tampering-at-ushcngiss/
There are others who are monitoring what NOAA is doing to historical “data”

theduke
January 7, 2013 9:27 pm

Year after year, my admiration for Anthony and his dedication to the science of meteorology and climate continues to grow. The world owes him a debt of gratitude for his efforts to promote understanding of these complicated issues.

January 7, 2013 9:34 pm

I expect a warm US and a cool world for December. Most of December was very warm in the US plains and east and they didn’t get a cold blast until late in the month. They will be getting another cold blast in the last two weeks of January, too. This is one reason why I tend to like seasonal averages rather than monthly averages. You can have an unusually cold 4 week period during a season that is split between two different months. The seasonal average might be a more accurate measure of climate.

mpainter
January 7, 2013 9:40 pm

bw says: January 7, 2013 at 9:08 pm
As for past temp changes, I forgot to add this link,
http://stevengoddard.wordpress.com/data-tampering-at-ushcngiss/
There are others who are monitoring what NOAA is doing to historical “data”
========================
The blink graph was damning. I think it is appropriate to draw conclusions about this temperature record business. I am glad to see that we have a network of individuals who are dedicated to keeping up with all of this. Some day, I feel, they will be called to give testimony.

davidmhoffer
January 7, 2013 9:48 pm

Stephen Mosher;
Now lets have a data drop out.
A) 6 6 6 6 6 6 6 6 6 6
B) 0 0 0 0 0 0 0 0 NA NA
And we average.. and Opps! the average goes from 3C to 6C what the hell!
So. You have two options.
Option 1. Use anomalies
Option 2. Adjust for lapse rate.
>>>>>>>>>>>>>>>>
For starters neither temperature nor anomalies vary directly with w/m2. Averaging them results in under representing changes in warm areas and over representing them in cold areas. But put aside the completely absurd notion of averaging things that ought not be averaged in the first place, and let’s consider your approach. There are more options than the ones you have listed.
If the above example were done with some dose of reality, there would be thousands of data series, not two. The first step would be to simply drop incomplete data series altogether, and only average the series that are complete. I know, I know, we don’t have diddly squat for weather station data that is complete from one end of the record to the other. So what to do? The answer is to follow a process similar to what Leif Svalgaard described for normalizing sunspot counts: you take series that overlap, determine which ones vary together in the overlap period, and apply a compensation factor that lets series B stand in for series A after the data from A ends. Complicated? Not really. Just a gawd-awful number of calculations, which is why we invented computers. I can think of some problems with this approach too, and I can also think of other approaches. But my main point is that the options aren’t limited to the two you listed.
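The overlap-and-compensate approach described above can be sketched in a few lines. This is only an illustration of the concept, with made-up station values and the simplest possible constant-offset calibration, not anyone's production method:

```python
def splice(series_a, series_b):
    """Extend series_a using series_b, offset-matched over their overlap.

    Both inputs map year -> temperature (deg C); returns a dict covering
    the union of years, with B's values shifted onto A's scale."""
    overlap = sorted(set(series_a) & set(series_b))
    if not overlap:
        raise ValueError("no overlap period to calibrate against")
    # Mean offset of A relative to B during the common years.
    offset = sum(series_a[y] - series_b[y] for y in overlap) / len(overlap)
    merged = dict(series_a)
    for year, temp in series_b.items():
        merged.setdefault(year, temp + offset)
    return merged

# Hypothetical stations: A ends in 2002, B starts in 2001 and runs on.
a = {2000: 10.0, 2001: 10.2, 2002: 10.1}
b = {2001: 12.2, 2002: 12.1, 2003: 12.4}
merged = splice(a, b)
```

With these hypothetical series the 2001–2002 overlap gives an offset of -2.0C, so B's 2003 value of 12.4 is carried over as 10.4 on A's scale. A real implementation would also have to check that the two series actually vary together over the overlap, not just match their means.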

Frank K.
January 7, 2013 10:07 pm

Anthony Watts says:
January 7, 2013 at 8:18 pm
Hi Anthony – I know I’m violating my New Year’s resolution, but I have to address the silly notion that someone has asserted here that you need to correct temperatures for lapse rate before averaging. The answer is, of course, NO! Temperatures are being measured at climate stations a specified height above the Earth’s surface, so the Earth’s surface-averaged temperature will not care what altitude it’s at. Temperature is temperature. You can carry out a spatial average on the absolute temperatures and they can be just as meaningful as any other (arbitrary) averaging scheme (perhaps more meaningful). To put it another way, if it’s 50 degrees F in Denver, CO, will I feel colder/warmer in Denver than I would if I were exposed to 50 degree air in Charleston, SC? Of course not! What WILL change between the two locations is the air density simply due to the change in static air pressure with altitude.

Kevin Hilde
January 7, 2013 10:54 pm

Furphy … hmmm … okay
http://en.wikipedia.org/wiki/Furphy

Lance Wallace
January 7, 2013 10:59 pm

@Climate Beagle
The link, if I did it right, should lead you to an Excel file in Dropbox with a list of 125 sites in the USCRN network. The list includes lat/long and altitude data. There are 118 locations, but 5 locations have 2 associated sites and 2 locations have 3 sites. One location is in Canada (Ontario), leaving 117 in the “US” part of the USCRN. I count 8 locations in Alaska (not your 12) and two in Hawaii, leaving 107 separate locations and 114 sites in CONUS. I never ran across TN Oak Ridge or VA Sterling in this dataset. What the heck is SA TIKSI? Somewhere in Russia? SA is not a US state abbreviation but might be the Sakha Republic in Siberia.
The data are from a monthly file created by NOAA back in about August of this year, so should be up to date.
The mean values I provided in my table earlier were for the 125 sites in the USCRN network. This included the 8 sites in Alaska, the one in Canada, and the two in Hawaii, so no doubt were a bit lower than one would get for a continental US average.
https://www.dropbox.com/s/rjx767suajwqr41/CRN%20list%20of%20locations%20and%20sites.xlsx

Billy
January 7, 2013 11:25 pm

Steven Mosher says:
January 7, 2013 at 1:09 pm
Problem is you cant compare the averages without correcting for altitude differences.
———————————————————————————————
Are you saying that the continent is rising and subsiding by 100M frequently? Or do you want to adjust the readings so they are all the same?
You could just use one thermometer for the whole USA to achieve that end. Temps for any location could be created by adjustments. Good for research grants and eliminates the need to go outdoors.
not really sarc/

Lance Wallace
January 7, 2013 11:29 pm

Anthony
Your updated list includes the two Hawaii stations

Lance Wallace
January 7, 2013 11:37 pm

My earlier table above was for the full USCRN dataset (125 stations, including 8 in AK, 2 in HI, and 1 in Ontario CAN). This table is for the continental US (114 stations, except about two fewer in 2008 and 2009). Presumably if Anthony removes the two Hawaiian stations his overall average will drop from 55 to a bit closer to the values of about 53 listed here.
year Mean SE
2008 52.8 0.50
2009 52.6 0.49
2010 53.1 0.50
2011 53.3 0.50

pete
January 7, 2013 11:38 pm

Gawd Mosher you really are getting on my nerves.
1. Spatial averages of temperature are meaningless in reality. You claim to want to know what the temperature is in a region that is not being measured? Fine, you estimate temperature using the usual algorithms, but that is not the same as claiming you know the temperature or its response/history over time, it should merely give you a rough figure to work with. Such a figure would merely be used prior to the temperature actually being measured (you would only want to know the temperature prior to something being done at that location) . It serves no purpose otherwise, and it certainly does not serve the purpose it is being used for (to comment on the state of the climate)! Looking at the change in spatial averages over time is not meaningful because nothing on the planet responds to such an average; it merely allows for statistical masturbation.
2. Meaningful comparison for climate science purposes can be made by looking at the rate of change in the temperature readings of single stations over time. If those stations are well sited it significantly reduces or eliminates most of the siting biases. More specifically, as the climate and weather patterns are driven by differences in temperature across the globe examining those changes is far more meaningful than some pointless averages. Given this, there is zero need for altitude/lapse rate correction as this is constant. Why are you trying to compare an airport with a sea-level site in any case? There is no need to do such a thing!
3. Core samples, tree rings etc. are useful in identifying localised climatic conditions only. Averaging these to come up with a spatial average is even more meaningless than doing the same with thermometers. The correct usage of such data would be to compare various points on the planet over time, look at the changes in climate and deduce whether various events were localised or global in nature. Beyond that you are trying to coax information from whence none exists.
Let’s get real about what you should and shouldn’t be doing here.

wayne
January 8, 2013 12:06 am

Anthony, a smaller (XX.XX °C XXX.XX K) somewhere sure would be appreciated by many I bet and let the user round to whatever precision they feel is proper. Now that would be fast, easy and very useful for everyone without a calculator wristwatch. 😉

Lance Wallace
January 8, 2013 12:24 am

The remaining discrepancy between my list and Anthony’s is a single station at Goodwell OK. My list includes only one station at Goodwell (at the Panhandle Center) and Anthony’s list (also Climate Beagle’s) includes two. I expect his list of 115 sites (dropping the two in Hawaii) is correct. I have matched the other 113 sites with Anthony’s in the Excel file in Dropbox.
OK Goodwell 2 E 20040227 Panhandle Research & Extn. Center (Native Grassland Site)
OK Goodwell 2 SE 20110618 Oklahoma Panhandle State Univ., School of Agriculture
https://www.dropbox.com/s/js1p7tns1gyepp9/CRN%20list%20of%20locations%20and%20sites%20compared%20to%20Watts%20CONUS%20list.xlsx?m

Lance Wallace
January 8, 2013 12:26 am

Trying that dropbox link again to the Excel file with Anthony’s list of 115 sites.
https://www.dropbox.com/s/js1p7tns1gyepp9/CRN%20list%20of%20locations%20and%20sites%20compared%20to%20Watts%20CONUS%20list.xlsx

Bob Koss
January 8, 2013 1:03 am

Anthony,
I haven’t read all the comments yet, so I’ll post what I’ve found even if it has been posted already.
There are some problems with the list of stations you are using. The USCRN folder of site data has many more stations in it than are part of the network. Maybe you could get a concise list from someone at USCRN. A BAMS article recently put up on the USCRN site says:

260 WHERE ARE THE USCRN STATIONS? The number of USCRN stations distributed across
261 the CONUS is 114, consisting of 7 paired sites and 100 single sites, or 107 total sites that are
262 fully instrumented; resulting in an effective national average spacing of approximately 265 km.

http://journals.ametsoc.org/doi/abs/10.1175/BAMS-D-12-00170
I noticed site number 64757 CRNM0101-ON_Egbert_1_W.txt is located north of Toronto. Also there are 7 pair sites of which I found you used them all except number 53927 CRNM0101-OK_Stillwater_5_WNW.txt. Since those 14 sites are so close to each other, I wonder if it might be appropriate to average their data.
Here is the pair list I came up with by looking at the map in this photo file.
http://www1.ncdc.noaa.gov/pub/data/uscrn/documentation/site/photos/stationsbystate_lores.pdf
54796, 54797 RI 1 km apart.
54794, 54795 NH 7 km apart.
53877, 53878 NC 9 km apart.
63828, 63829 GA 13 km apart.
53926, 53927 OK 2 km apart.
94995, 94996 NE 29 km apart.
94059, 94060 MT 21 km apart.

jonny old boy
January 8, 2013 1:35 am

I emailed the Met Office recently and asked: if there is a trend (as they claim) for rising average temperatures across the UK/US etc., then why is there not a “trend-towards-now” for absolute temperature records being broken? I stated that surely this illustrates that their data/assumptions cannot be correct, because basic (and I mean VERY basic) statistical theory says that extreme records should be tumbling left, right and centre, but they are not. Needless to say, the answer I got back was unscientific gibberish and counter-logical. I also asked someone recently how the surface temperature measurements taken on the QUEEN MARY 2 are calibrated and verified, since they are used to calibrate and verify satellite temperature measurements. No one seems to know! I also asked if the ship’s own “heat island” was isolated from any measurements; no one seemed to know or care. So maybe this could be another area for Anthony to delve into: is there a floating heat island taking dubious measurements on a daily basis, as well as probably consistently serving Champagne at the incorrect temperature?

January 8, 2013 1:54 am

Mosh
Could you please clarify where you get your figure of a 6.5C change in temperature per 1000 metres of altitude? Thanks
Tonyb

AndyG55
January 8, 2013 4:00 am

tonyb
The dry adiabatic lapse rate is 9.8C/km, but this is heavily affected by H2O in its various states. The saturated (100% humidity) lapse rate is about 4.5C/km (iirc), and since there is nearly always H2O in the atmosphere, a generalised average value of 6.5C/km is taken in lieu of other information.
In other words, at any time, the lapse rate could be anywhere between 4.5C/km and 9.8C/km.
So really it’s just another way to introduce further “variables” for temperature “adjustments” 😉
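To put numbers on that spread, here is a one-function sketch (hypothetical station reading, not any agency's actual procedure) of "reducing" a reading to sea level under the three rates mentioned above:

```python
def to_sea_level(temp_c, altitude_m, lapse_c_per_km):
    """Extrapolate a station temperature down to sea level
    using a constant lapse rate."""
    return temp_c + lapse_c_per_km * altitude_m / 1000.0

# A hypothetical 10C reading at 1500 m, adjusted under the
# saturated (~4.5 C/km), standard-average (6.5) and
# dry-adiabatic (9.8) rates:
adjusted = {rate: to_sea_level(10.0, 1500.0, rate) for rate in (4.5, 6.5, 9.8)}
# The choice of rate alone moves the "adjusted" value by about 8C.
```

At 1500 m the three rates give sea-level values of roughly 16.8C, 19.75C and 24.7C for the same thermometer reading, which is the extra degree of freedom being objected to here.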

AndyG55
January 8, 2013 4:13 am

ps, I can just see what Hansen would do with lapse rate adjustments.

Adjust using 6.5C/km…… oh, the reading is still too cold,
must have been dry, so I’ll adjust using 8.2C/km instead, spurious reason given..
that looks better.. maybe try 9.2 and get it even warmer at sea level.

They have currently nearly run out of adjustments they can use, so the global urban average temp has levelled off..
let’s not give them another variable to mess with !!!!!

January 8, 2013 4:26 am

AndyG55
Thanks for that.
It would be interesting to see Moshs BEST work with the full range of possible adjustments instead of using his 6.5C figure.
Mind you, to have any belief in the accuracy of your data you would first need to check each temperature point, instead of averaging one historic, probably incorrect, one-off figure with thousands of other probably incorrect one-off figures in the belief that averaging somehow makes lots of wrongs right.
tony

Frank K.
January 8, 2013 5:33 am

AndyG55 says:
January 8, 2013 at 4:13 am
ps, I can just see what Hansen would do with lapse rate adjustments.

Adjust using 6.5C/km…… oh, the reading is still too cold,
must have been dry, so I’ll adjust using 8.2C/km instead, spurious reason given..
that looks better.. maybe try 9.2 and get it even warmer at sea level.

AndyG55 – The whole temperature “correction” for lapse rate idea is just silly. Why on Earth would anyone do this? It appears that people who are motivated towards this correction think that an average “temperature” that is adjusted so that every point on the planet is at some effective “sea level” is somehow meaningful. And as you point out, the actual lapse rate depends on the local weather, and so introduces even more complications.

AndyG55
January 8, 2013 5:35 am

The whole idea is a nonsense anyway. They are effectively assigning the same temperature to a very large area, without knowing if it’s in any way applicable to that area.
In the old network, UHI-increased urban temps are applied over large areas of countryside which are in no way affected by urban heat. Nearby readings that might be unaffected by urban expansion are homogenised so that they are.
The whole issue of a global average land temperature is a joke and a farce.. !

AndyG55
January 8, 2013 5:37 am

“in the belief that averaging somehow makes lots of wrongs right”
did you mention climate models ???? 😉

Frank K.
January 8, 2013 5:51 am

Meanwhile…
Cold wave unabated in North India, 24 more die.
Plummeting mercury, coupled with thick fog cover, threw normal life out of gear in the entire North India on Monday, with 24 more people succumbing to the cold wave in various parts of the region.

I suppose these poor people didn’t realize that their lapse-rate corrected temperature was really warmer than what the thermometer was telling them, so they should not have succumbed to the cold.
/climate-science

Richie D
January 8, 2013 5:58 am

Steve Mosher wrote: “Lapse rate, it will get you every time if you are not careful.”
I believe it’s got Steve this time. Here’s anecdotal evidence why you can’t apply lapse rate (even if you wanted to). Here in Austin, TX, the new international airport east of town is sited on a plain several hundred meters lower than the western half of the city proper, which occupies a hilly area. So the airport temps should be warmer than temps in west Austin due to the lapse rate. But in fact the airport is almost never warmer. Reason? Urban heat island effect. Temps at higher elevations in Austin run up to 10 degrees warmer than those at the lower-level airport.
Taking the UHI effect into account, it seems to me there is no valid way to make a blanket adjustment of temperature, and it all points to the absurdity of trying to find a valid “average” temp. Reminds me of my days at a major Texas daily newspaper, arguing with a business editor against his plan to average 10 economic forecasts to arrive at a prediction of future growth — to two decimal places!

Pamela Gray
January 8, 2013 6:09 am

Using anomaly or absolute temperature data statistical analysis results to prove or refute AGW, is simply numerology, the lowest form of scientific observation and discussion. Let me state it another way. Regardless of the cause, small scale weather pattern variation trends cause temperature variation trends at sensor level. But importantly, larger scale oceanic and atmospheric parameters remain in control of resultant weather pattern variation trends. Therefore, to prove or disprove anthropogenic cause, you must look at large scale oceanic and atmospheric parameter trends, not temperature trends. Hansen and his ilk prefer to dabble in low hanging fruit (temperature trends), hoping gullible sheeple will eat it all up and lick the plate.
As long as those who seriously doubt this AGW fad continue to argue over low hanging fruit, we will get nowhere fast.
What we really need is the data related to large oceanic and atmospheric parameters (i.e. semi-permanent pressure systems, global cloud cover data, smaller pressure-system tracks, etc.), over at least a 60 to 100 year span of time, and ignore temperature altogether. Why? For humans to definitively cause large-scale temperature change, what we are supposedly doing must first definitively affect large-scale climate and temperature drivers beyond their normal random walk.

Tom in Florida
January 8, 2013 6:09 am

Frank K. says:
January 7, 2013 at 10:07 pm
“… To put it another way, if it’s 50 degrees F in Denver, CO, will I feel colder/warmer in Denver than I would if I were exposed to 50 degree air in Charleston, SC? Of course not!”
Frank you probably realized after you posted this how it is too simplistic. You probably realized that you must also consider wind, humidity and insolation on exposed skin when saying “will I feel colder/warmer” .

January 8, 2013 7:01 am

Anthony
Unless Mosh knows the daily weather conditions at the time of the readings of each of the instrumental records he used for BEST, surely his rough rule of thumb for the lapse rate is so approximate as to destroy the idea of a robust data base?
tonyb

climatebeagle
January 8, 2013 7:10 am

@Lance Wallace
Thanks for also looking into this, but where did your 125 station list with 8 in AK come from?
I’ve worked off the NOAA list here:
http://www.ncdc.noaa.gov/isis/stationlist.htm?networkid=1
which I assumed was the official list. The list I gave earlier is manually derived from that list.
It has 12 AK stations.
It’s a pity NOAA doesn’t seem to have an easily consumable version of their metadata, e.g. a csv file, for all the reference stations.
@ Anthony
Thanks for the list, I also took the approach of looking at the mismatches in WBANNO numbers from your 116 stations (from your one of your monthly files) and my CONUS list, which then confused me even more.
Your monthly file included these that were not in my list.
NM_Santa_Fe_20WNW,03087
CO_Colorado_Springs_23_NW,53007
UT_Blanding_26_SSW,53012
AL_Selma_6_SSE,63897
Note that these four are not USCRN stations according to:
http://www.ncdc.noaa.gov/isis/stationlist.htm?networkid=1
You monthly file also included
ON_Egbert_1_W,64757,USCRN
My list included these that were not on your list.
NM_Los_Alamos_13_W,03062,USCRN
PA_Avondale_2_N,03761,USCRN
CA_Santa_Barbara_11_W,53152,USCRN
OK_Stillwater_5_WNW,53927,USCRN
REPLY: To satisfy your request while traveling, I recreated the list from scratch last night in my hotel room, and obviously failed. Let that be a lesson not to do detailed work after a full day of driving. I’ll have my office forward the correct list (which I don’t have on my laptop) later today and then I’ll post it. Until then, please just stop speculating. – Anthony

Richard M
January 8, 2013 7:11 am

I have commented recently on solar threads about the possibility that there has been no real warming, which fits very nicely with Leif’s new sunspot count. That is, if all the warming due to UHI and adjustments is removed, the real temperature of the planet has changed very little over the last 150 years. This works very nicely with a sunspot count that also hasn’t changed very much.
The primary changes in temperature would be the variation due to ENSO and the AMO (although other minor factors exist). These variations can lead to melting ice caps, glaciers, etc. But that will soon stop now that the PDO has flipped, and the AMO will flip in the not too distant future. Add to this a real solar cooling due to the L-P effect and it is likely we will see cooling over the next few decades.
One thing to beware of is any theory that uses the temperature record in any manner. Even if it is skeptical in nature it may very well be correlating to bad data.
What does this mean for the GHE? Why isn’t it warming like it should? I’ve stated my opinion in the past. The GHE is only one of the effects of adding GHGs to the atmosphere. There are other effects and some of them cool the planet. When all effects of GHGs are taken into consideration they more or less cancel out at the current concentrations and temperatures.

beng
January 8, 2013 7:15 am

Mosher puts his foot in his mouth once again. Just can’t resist tampering with records, eh?

Frank K.
January 8, 2013 7:42 am

Tom in Florida says:
January 8, 2013 at 6:09 am
Frank K. says:
January 7, 2013 at 10:07 pm
“… To put it another way, if it’s 50 degrees F in Denver, CO, will I feel colder/warmer in Denver than I would if I were exposed to 50 degree air in Charleston, SC? Of course not!”
Frank you probably realized after you posted this how it is too simplistic. You probably realized that you must also consider wind, humidity and insolation on exposed skin when saying “will I feel colder/warmer” .

Hi Tom – 50 deg F will feel the same to me provided all other variables are the same (as you say wind, humidity) – it won’t matter if I’m in Denver or Miami 🙂 That’s because my reference point is my body temperature (98.6 F). My point is that lapse rate corrections for spatial temperature averages are a silly idea.
Anthony Watts says:
January 8, 2013 at 6:54 am
“Comparing station to station data at different altitudes nearby, yes you need a lapse rate correction, but you also need the other variables (baro, DP) for it to be accurate.”
My point Anthony is that you really don’t need any correction at all, no matter the altitude. If it’s 50 F on top of a local mountain and 60 F in a nearby valley, I would be fine with averaging the two readings together. The reference point for temperature climatology is the 2 meter height of your thermometers from the ** surface of the Earth **.
Here’s another example – if today ALL thermometers in the CONUS had a reading of 50 F (all stations identical, no regard to altitude), would the CONUS average temperature be 50 F or some different lapse-rate adjusted temperature?

January 8, 2013 8:18 am

You have it right Anthony. Adjusting data is no simple task.
It was illustrated today during a morning when we were 10F in Nashua NH (119 feet elevation) at 1015UTC with a steep, very low inversion. Jaffrey NH airport at 1013 feet was at 18F, Worcester MA at 1009 feet was at 29F at the same time, and Boston at sea level was at 33F. Not sure how one would even attempt to ‘adjust’.
Lapse rates, as you stated, vary considerably – from sharp inversions to superadiabatic – depending on surface factors like snow cover, humidity, saturated soils vs dry ground, vegetation type and state, air masses and fronts, time of day and year, clouds, winds and microclimate factors. Pressure adjustments are more straightforward, but even those have issues, relying on a standard atmosphere when the atmosphere is rarely standard, in much the same way as temperatures are rarely at the ‘average’.

climatebeagle
January 8, 2013 8:37 am

Anthony, I’m not speculating, I’m trying to get to a point to reproduce your results. From the list of WBANNO numbers in your monthly files you seem to include three non-USCRN stations and the Ontario station, while excluding three USCRN stations.
It’s also me trying to understand what exactly is the defined set of USCRN stations and their CONUS subset; in this posting we have three people trying to define that list, and each seems to have a different list and probably a different source for it. Meanwhile NOAA seems to say that USCRN has 114 stations but lists 130, while Mosher says only “around 108” are active.
For a reference network it seems NOAA has already lost control of it.
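Pinning down exactly which WBANNO numbers differ between two candidate lists is a one-line set comparison. The IDs below are the mismatches quoted earlier in this thread, plus one station (53926) that both lists appear to share, included just to show the intersection:

```python
# WBANNO IDs from the list comparison above: Anthony's monthly file
# vs the NOAA USCRN station list (53926 assumed common to both).
monthly_file_ids = {"03087", "53007", "53012", "63897", "64757", "53926"}
uscrn_list_ids = {"03062", "03761", "53152", "53927", "53926"}

in_monthly_only = monthly_file_ids - uscrn_list_ids   # set difference
in_uscrn_only = uscrn_list_ids - monthly_file_ids
in_both = monthly_file_ids & uscrn_list_ids           # intersection
```

Run against the full lists, the two difference sets are exactly the "included but not USCRN" and "USCRN but excluded" groups being discussed.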

mpainter
January 8, 2013 8:57 am

Let the data be furnished raw. Let scientific investigators perform the studies and argue the value of lapse rate, homogenization, etc. etc. etc., but let the data be furnished raw by these %8**^3^*#!! lest they succumb to the temptation to adulterate the record for ideological advantage, as they apparently already have. The integrity of NOAA is no more.

Matt G
January 8, 2013 10:11 am

Whether it is absolute temperature or anomaly, it doesn’t matter, because both are the same using fixed data points. What we want to see is how the actual data changes, not how adjustments change the data. A lot of poor adjustments, especially with Arctic data, are caused by weighting, where just one station can represent 10 percent of the total. One station’s location never reflects the temperature of a whole 1200 km area; there is too much weather variance within an area that size to claim that one station represents it.
1) 10C + 15C + 20C (Station A 1500 m, Station B 800 m, Station C 100 m)
Total 45C, mean = 15C
2) Mean = 15C, so this = 0.0C anomaly
Anomaly:
-5.0C + 0C + 5C = 0C
0C anomaly = 15C absolute, so it is the same.
Where this does matter is when the altitudes change for the station location and they are comparing the same location over years with a different altitude. The surface stations do this with no adjustment for it anywhere. It is the reason why higher regions have been replaced with low ones to introduce warming bias not corrected for. The biggest issue I have with anomalies is the way it hides what has been changed when. Absolute values often point out the error, where anomalies aren’t so obvious.
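Matt G's three-station arithmetic can be checked directly: for a fixed set of stations, the anomaly mean and the absolute mean carry the same information, differing only by the baseline constant.

```python
# The three hypothetical stations from the comment above.
absolute = [10.0, 15.0, 20.0]  # deg C at 1500 m, 800 m, 100 m
baseline = sum(absolute) / len(absolute)          # 15.0 C
anomalies = [t - baseline for t in absolute]      # [-5.0, 0.0, 5.0]

mean_absolute = sum(absolute) / len(absolute)     # 15.0 C
mean_anomaly = sum(anomalies) / len(anomalies)    # 0.0 C

# Adding the baseline back recovers the absolute mean exactly,
# which only holds while the station set (and altitudes) stay fixed.
assert mean_anomaly + baseline == mean_absolute
```

The equivalence breaks exactly in the case the comment goes on to describe: if a station is moved or swapped for one at a different altitude, the baseline changes and the two views no longer agree.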

Kelvin Vaughan
January 8, 2013 10:26 am

Just plotted it against Central England 2012 and it looks like the USA is about six weeks ahead of the UK.
USA peaks in July and the UK in August. The USA minimum is in December and the UK in February.

Lance Wallace
January 8, 2013 10:48 am

Climate Beagle–
Agreed that it is difficult to get a final list. It’s a moving target. The link is to a file on Dropbox, which is my most complete metadata file, obtained from several files at NCDC in about August of 2012. It has 142 sites, including 125 labelled “USCRN” and another 17 in Alabama labelled USCRN-Regional or something like that. The 125 sites include 7 “paired” sites. There are 8 sites in Alaska. 2 in Hawaii, and one in Ontario CN. So to get the “contiguous” US, or CONUS, there would be 125-11 = 114 sites. These match Anthony’s 115 sites exactly with the exception of an extra site that Anthony has in Goodwell OK. This site is described in the recent 10-year review article mentioned above. Apparently it involved a move and some overlapping measurements, so it probably was not available in August 2012. Although now it has a total of about 18 months of data from June of 2011.
Sites are being added in Alaska, so I expect your value of 12 rather than 8 is probably indicative of later additions. But these would have only 1 or 2 years of data or less, so would not be too useful for a while.
https://www.dropbox.com/s/x59zbn16ze3qcdq/CRN%20METADATA%20142%20sites%20A.xls

Lance Wallace
January 8, 2013 11:04 am

Climate beagle–forgot to include the link to the site that I have used to download the data.
ftp://ftp.ncdc.noaa.gov/pub/data/uscrn/products/monthly01

Hoser
January 8, 2013 1:05 pm

They’ve had to add a new color to their Oz temperature maps.
http://www.guardian.co.uk/environment/damian-carrington-blog/2013/jan/08/australia-bush-fires-heatwave-temperature-scale
They claim it is a sign of things to come due to global warming. The article talks about an average temp for Oz, and it makes me wonder about the quality of their met sites, especially because of what you report here.

climatebeagle
January 8, 2013 1:33 pm

Thanks for updating the station master list.
Do you have any additional information on your selection criteria?
From my comment at January 8, 2013 at 7:10 am:
You have included these stations which are not in USCRN:
NM_Santa_Fe_20WNW,03087
CO_Colorado_Springs_23_NW,53007
UT_Blanding_26_SSW,53012
AL_Selma_6_SSE,63897
But excluded these stations which are in USCRN:
NM_Los_Alamos_13_W,03062,USCRN
PA_Avondale_2_N,03761,USCRN
CA_Santa_Barbara_11_W,53152,USCRN
OK_Stillwater_5_WNW,53927,USCRN
PA_Avondale_2_N has missing data for 12/2012 & you stated the reason for omitting that,
but the other three all have data.
My criteria has been the list at:
http://www.ncdc.noaa.gov/isis/stationlist.htm?networkid=1
I just found an station metadata excel at: ftp://ftp.ncdc.noaa.gov/pub/data/uscrn/products/
That seems to be in sync with the NOAA link, ie. the additional four are the regional network, and the excluded four are USCRN.
Is there a different NOAA source for your station list?
REPLY: Our flags show them as USCRN, but there may be some issue with the definitions in metadata we have vs what you have. They very well might be USRCRN http://www.ncdc.noaa.gov/crn/usrcrn/ which is a regional equivalent.
Will look into it when I get back into the office in a couple of days. As we have seen, things change with NOAA rapidly, so we may be victims of versioning differences. -Anthony

January 8, 2013 1:47 pm

http://www.usatoday.com/story/weather/2013/01/08/record-warm-year-2012/1817841/
“It’s official: 2012 marked the warmest year on record for the contiguous USA, scientists from the National Climatic Data Center in Asheville, N.C., announced Tuesday. The past year smashed the previous record for the warmest year, which was 1998.”
I can almost hear the collective deep breath for the start of the alarm

tomharrisicsc
January 8, 2013 6:53 pm

Does anyone have a plot of yearly averages from the US CRN? If it shows no rise since the network started, shouldn’t we be pointing this out to everyone?

izen
January 8, 2013 7:50 pm

@- Pamela Gray
” Regardless of the cause, small scale weather pattern variation trends cause temperature variation trends at sensor level. But importantly, larger scale oceanic and atmospheric parameters remain in control of resultant weather pattern variation trends. Therefore, to prove or disprove anthropogenic cause, you must look at large scale oceanic and atmospheric parameter trends, not temperature trends. ”
Correct.
Most of the energy in the climate system is in water, which, with its high thermal capacity and the large energy changes accompanying phase changes between solid, liquid and vapor, is the dominant component.
The obvious macro scale measures of the energy in the climate system are sea level, ocean heat content, land ice mass balance and atmospheric moisture content.
They reveal how much energy is increasing in the system from the shift from ice to vapor and thermal expansion of the liquid phase.
The implication of the changes seen in these values is obvious.
The confirmation from the NCDC that the CRN shows 2012 was the warmest measured year by 1 deg F is just the result of the macro-scale influence of the oceanic and atmospheric parameter trends.

Trudy Cashel
January 8, 2013 11:21 pm

Good one. Something powerful to link people to when dealing with misleading climate/temperature articles.

climatebeagle
January 10, 2013 9:05 pm

I finally got around to calculating the CONUS 2012 average temperature using USCRN hourly readings with a calculated result of 12.99°C or 55.38°F.
That compares to:
NOAA (COOP?) : 12.96°C or 55.32°F. [1]
Anthony USCRN: 55.25°F. [this post]
I know there are two process differences between mine & Anthony’s, which are both using USCRN:
a) Anthony is using the monthly means, I’m averaging the hourly figures.
b) Anthony is using an earlier USCRN list, I’m using the one from Sep 2012 (see earlier comments). Not really sure why the USCRN network is changing its station list at this point in time.
Also, just to note with the hourly figures:
There are over one million records for 115 USCRN stations for a single year.
In 2012 I calculated 0.37% of data was missing, which seemed to be:
0.10% data entries missing from input files
0.27% data entries with missing data values (e.g. -9999.0).
[1] http://www.ncdc.noaa.gov/temp-and-precip/ranks.php?periods%5B%5D=ytd&parameter=tmp&year=2012&month=12&state=110&div=0
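For anyone wanting to reproduce that kind of tally, the bookkeeping is straightforward. The sketch below assumes a simplified two-column 'timestamp temperature' layout with -9999.0 as the missing-value sentinel; the real USCRN hourly files use a documented multi-column fixed-width format, so the field extraction would differ.

```python
def average_hourly(lines, missing_sentinel=-9999.0):
    """Mean of hourly temperatures, plus percent of records missing.

    Each line is assumed to end with a temperature field; values equal
    to the sentinel are counted as missing rather than averaged."""
    values, missing = [], 0
    for line in lines:
        temp = float(line.split()[-1])
        if temp == missing_sentinel:
            missing += 1
        else:
            values.append(temp)
    total = len(values) + missing
    mean = sum(values) / len(values) if values else float("nan")
    missing_pct = 100.0 * missing / total if total else 0.0
    return mean, missing_pct

# Hypothetical three-record sample: one missing hour out of three.
sample = ["2012010100 12.5", "2012010101 -9999.0", "2012010102 13.1"]
mean, pct = average_hourly(sample)
```

Over a real year this would run over the million-plus records mentioned above, one station file at a time, and the two missing categories (absent lines vs sentinel values) would need separate counters.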