An 'inconvenient result' – July 2012 not a record breaker according to data from the new NOAA/NCDC U.S. Climate Reference Network

I decided to do something myself that so far NOAA has refused to do: calculate a CONUS average temperature for the United States from the new ‘state of the art’ United States Climate Reference Network (USCRN). After spending millions of dollars to deploy this new network from 2002 to 2008, they are still giving us data from the old one when they report a U.S. national average temperature. As readers may recall, I have demonstrated that the old COOP/USHCN network used to monitor U.S. climate is a mishmash of urban, semi-urban, rural, airport and non-airport stations, some of which are sited precariously in observers’ backyards, parking lots, near air conditioner vents, on airport tarmac, and in urban heat islands. This is backed up by the 2011 GAO report spurred by my work.

Here is today’s press release from NOAA, “State of the Climate” for July 2012, where they say:

The average temperature for the contiguous U.S. during July was 77.6°F, 3.3°F above the 20th century average, marking the hottest July and the hottest month on record for the nation. The previous warmest July for the nation was July 1936 when the average U.S. temperature was 77.4°F. The warm July temperatures contributed to a record-warm first seven months of the year and the warmest 12-month period the nation has experienced since recordkeeping began in 1895.

OK, that average temperature for the contiguous U.S. during July is easy enough to calculate independently using NOAA’s USCRN network of stations, shown below:

Map of the 114 climate stations in the USCRN; note the even distribution.
In case you aren’t familiar with this network and why it exists, let me cite NOAA/NCDC’s reasoning for its creation. From the USCRN overview page:

The U.S. Climate Reference Network (USCRN) consists of 114 stations developed, deployed, managed, and maintained by the National Oceanic and Atmospheric Administration (NOAA) in the continental United States for the express purpose of detecting the national signal of climate change. The vision of the USCRN program is to maintain a sustainable high-quality climate observation network that 50 years from now can with the highest degree of confidence answer the question: How has the climate of the nation changed over the past 50 years? These stations were designed with climate science in mind. Three independent measurements of temperature and precipitation are made at each station, insuring continuity of record and maintenance of well-calibrated and highly accurate observations. The stations are placed in pristine environments expected to be free of development for many decades. Stations are monitored and maintained to high standards, and are calibrated on an annual basis. In addition to temperature and precipitation, these stations also measure solar radiation, surface skin temperature, and surface winds, and are being expanded to include triplicate measurements of soil moisture and soil temperature at five depths, as well as atmospheric relative humidity. Experimental stations have been located in Alaska since 2002 and Hawaii since 2005, providing network experience in polar and tropical regions. Deployment of a complete 29 station USCRN network into Alaska began in 2009. This project is managed by NOAA’s National Climatic Data Center and operated in partnership with NOAA’s Atmospheric Turbulence and Diffusion Division.

So clearly, USCRN is an official effort, sanctioned, endorsed, and accepted by NOAA, and is of the highest quality possible. Here is what a typical USCRN station looks like:

USCRN Station at the Stroud Water Research Center, Avondale, PA

A few other points about the USCRN:

  • Temperature is measured with triple redundant air aspirated sensors (Platinum Resistance Thermometers) and averaged between all three sensors. The air aspirated shield exposure system is the best available.
  • Temperature is measured continuously and logged every 5 minutes, ensuring a true capture of Tmax/Tmin
  • All stations were sited per Leroy 1999 siting specs, and are Class 1 or Class 2 stations by that siting standard. (see section 2.2.1 here of the USCRN handbook PDF)
  • The data goes through quality control, to ensure an errant sensor hasn’t biased the values, but is otherwise unchanged.
  • No stations are near any cities, nor do they have local biases of any kind that I have observed in any of my visits to them.
  • Unlike the COOP/USHCN network where they fought me tooth and nail, NOAA provided station photographs up front to prove the “pristine” nature of the siting environment.
  • All data is transmitted digitally via satellite uplink direct from the station.

So this means that:

  1. There are no observer or transcription errors to correct.
  2. There is no time of observation bias, nor need for correction of it.
  3. There is no broad scale missing data, requiring filling in data from potentially bad surrounding stations. (FILNET)
  4. There is no need for bias adjustments for equipment types, since all equipment is identical.
  5. There is no need for urbanization adjustments, since all stations are rural and well sited.
  6. There are no regular sensor errors, thanks to air aspiration and triple-redundant lab-grade sensors. Any error detected in one sensor is identified and managed by the other two, ensuring quality data (see the sketch after this list).
  7. Due to the near perfect geospatial distribution of stations in the USA, there isn’t a need for gridding to get a national average temperature.
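
For the curious, here is a minimal sketch (Python) of how a triple-redundant measurement might be combined with an outlier check before averaging. The pairwise agreement threshold of 0.3°C is my assumption for illustration only; it is not NOAA's documented quality-control algorithm, which lives in the USCRN handbook.

```python
# Hypothetical sketch of a triple-redundant sensor sanity check.
# The 0.3 C pairwise agreement threshold is an illustrative
# assumption, not the documented USCRN algorithm.
def combine_triplet(t1, t2, t3, tolerance=0.3):
    readings = [t1, t2, t3]
    good = []
    for i, r in enumerate(readings):
        others = [x for j, x in enumerate(readings) if j != i]
        # keep a reading only if it agrees with at least one sibling
        if any(abs(r - o) <= tolerance for o in others):
            good.append(r)
    if not good:
        return None  # no two sensors agree: flag for quality control
    return sum(good) / len(good)

print(combine_triplet(25.01, 25.03, 27.90))  # -> 25.02, outlier discarded
```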

Knowing this, I wondered why NOAA has never offered a CONUS monthly temperature from this new network. So, I decided that I’d calculate one myself.

The procedure for a CONUS monthly average temperature from USCRN (a code sketch of this procedure follows the list):

  1. Download each station data set from here: USCRN Quality Controlled Datasets.
  2. Exclude stations that are part of the USHCN-M (modernized USHCN) or USRCRN-Lite networks, which are not part of the 114-station USCRN master set.
  3. Exclude stations that are not part of the CONUS (HI and AK).
  4. Load all July USCRN 114-station data into an Excel spreadsheet, available here: CRN_CONUS_stations_July2012_V1.2
  5. Note stations that have missing monthly data. There were three in July 2012: Elgin, AZ (4 missing days); Avondale, PA (5 missing days); and McClellanville, SC (7 missing days). Set their data aside to be dealt with separately.
  6. Sum and calculate CONUS area averages from the Tmax, Tmin, Tavg and Tmean data provided for each station.
  7. Do a separate calculation to see how much difference the stations with missing/partial data make for the entire CONUS.
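
For readers who prefer code to spreadsheets, here is a minimal Python sketch of steps 4 and 6, assuming the USCRN monthly01 files (one text file per station, CONUS stations only) have been downloaded. The fixed-width columns for the temperature fields come from the monthly01 readme quoted in the comments below; the file location, date-field position, and -9999 missing-value sentinel are my assumptions for illustration, not NOAA's documented procedure.

```python
# Minimal sketch of the CONUS averaging step. Assumes the USCRN
# monthly01 files (one per station, CONUS only) are in ./crn_monthly/.
# Column positions for T_MONTHLY_MEAN and T_MONTHLY_AVG follow the
# monthly01 readme; the YYYYMM field position and the -9999 missing
# sentinel are assumptions.
import glob

MISSING = -9999.0
t_means, t_avgs = [], []

for path in glob.glob("./crn_monthly/*.txt"):
    with open(path) as f:
        for line in f:
            fields = line.split()
            if len(fields) < 2 or fields[1] != "201207":
                continue  # keep only the July 2012 record
            t_mean = float(line[56:63])  # cols 57-63: T_MONTHLY_MEAN, deg C
            t_avg = float(line[64:71])   # cols 65-71: T_MONTHLY_AVG, deg C
            if t_mean != MISSING:
                t_means.append(t_mean * 9.0 / 5.0 + 32.0)  # C -> F
            if t_avg != MISSING:
                t_avgs.append(t_avg * 9.0 / 5.0 + 32.0)

print("CONUS monthly mean: %.2f F" % (sum(t_means) / len(t_means)))
print("CONUS monthly avg:  %.2f F" % (sum(t_avgs) / len(t_avgs)))
```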

Here are the results:

USA Monthly Mean for July 2012: 75.72°F (111 stations)

USA Monthly Average for July 2012: 75.51°F (111 stations)

USA Monthly Mean for July 2012: 75.74°F (114 stations, 3 with partial missing data; difference 0.02°F)

USA Monthly Average for July 2012: 75.55°F (114 stations, 3 with partial missing data; difference 0.04°F)

============================

Comparison to NOAA’s announcement today:

Using the old network, NOAA says the USA Average Temperature for July 2012 is: 77.6°F

Using the NOAA USCRN data, the USA Average Temperature for July 2012 is: 75.5°F

The new USCRN comes in 2.1°F cooler than the old, problematic network.

This puts July 2012, according to the best official climate monitoring network in the USA, at 1.9°F below the 77.4°F July 1936 USA average temperature cited in today’s NOAA press release: not a record by any measure. Dr. Roy Spencer suggested earlier today that he didn’t think so either, saying:

So, all things considered (including unresolved issues about urban heat island effects and other large corrections made to the USHCN data), I would say July was unusually warm. But the long-term integrity of the USHCN dataset depends upon so many uncertain factors, I would say it’s a stretch to call July 2012 a “record”.

This result also strongly suggests that a well-sited network of stations, as the USCRN was designed from inception to be, is totally free of the errors, biases, adjustments, siting issues, equipment issues, and UHI effects that plague the older COOP/USHCN network, a mishmash of problems the new USCRN was designed to solve.

It suggests Watts et al 2012 is on the right track in pointing out the temperature measurement differences between stations with and without such problems. I don’t claim that my method is a perfect comparison to the older COOP/USHCN network, but the fact that my numbers come close, within the bounds of the positive temperature bias errors noted in Leroy 1999, and that the more “pristine” USCRN network runs cooler in absolute monthly temperatures (as would be expected), suggests my numbers aren’t an unreasonable comparison.

NOAA never mentions this new pristine USCRN network in any press releases on climate records or trends, nor do they calculate and display a CONUS value for it. Now we know why. The new “pristine” data it produces is just way too cool for them.

Look for a regular monthly feature using the USCRN data at WUWT. Perhaps NOAA will then be motivated to produce their own monthly CONUS Tavg values from this new network. They’ve had four years to do so since it was completed.

UPDATE: Some people have asked what the difference is between the mean and average temperature values. In the monthly data files from USCRN, there are these two values:

T_MONTHLY_MEAN

T_MONTHLY_AVG

http://www.ncdc.noaa.gov/crn/qcdatasets.html

The mean is the monthly (max+min)/2, and the average is the average of all the daily averages.
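
In code terms, with invented daily values for a three-day “month”, the two quantities look like this:

```python
# Illustrative daily data for a short "month" of three days
# (the values are invented for demonstration).
daily = [
    {"max": 95.0, "min": 65.0, "avg": 78.0},
    {"max": 99.0, "min": 70.0, "avg": 84.5},
    {"max": 91.0, "min": 63.0, "avg": 75.5},
]

monthly_max = sum(d["max"] for d in daily) / len(daily)
monthly_min = sum(d["min"] for d in daily) / len(daily)

t_monthly_mean = (monthly_max + monthly_min) / 2           # (max+min)/2
t_monthly_avg = sum(d["avg"] for d in daily) / len(daily)  # mean of daily averages

print(t_monthly_mean, t_monthly_avg)  # 80.5 vs ~79.33: they generally differ
```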

UPDATE2: I’ve just sent this letter to NCDC – to ncdc.info@ncdc.noaa.gov

Hello,

I apologize for not providing a proper name in the salutation, but none was given on the contact section of the referring web page.

I am attempting to replicate the CONUS  temperature average of 77.6 degrees Fahrenheit for July 2012, listed in the August 8th 2012, State of the Climate Report here: http://www.ncdc.noaa.gov/sotc/

Pursuant to that, would you please provide the following:

1. The data source of the surface temperature record used.

2. The list of stations used from that surface temperature record, including any exclusions and reasons for exclusions.

3. The method used to determine the CONUS average temperature, such as simple area average, gridded average, altitude corrections, bias corrections, etc. Essentially what I’m requesting is the method that can be used to replicate the resultant 77.6F CONUS average value.

4. A flowchart of the procedures in step 3 if available.

5. Any other information you deem relevant to the replication process.

Thank you sincerely for your consideration.

Best Regards,

Anthony Watts

===================================================

Below is the response I got after writing to the email address provided in the SOTC release; some email addresses are redacted to prevent spamming.

===================================================

—–Original Message—–
From: mailer-daemon@xxxx.xxxx.xxx
Date: Thursday, August 09, 2012 3:22 PM
To: awatts@xxxxxxx.xxx
Subject: Undeliverable: request for methods used in SOTC press release
Your message did not reach some or all of the intended recipients.
   Sent: Thu, 9 Aug 2012 15:22:43 -0700
   Subject: request for methods used in SOTC press release
The following recipient(s) could not be reached:
ncdc.info@ncdc.noaa.gov
   Error Type: SMTP
   Error Description: No mail servers appear to exists for the recipients address.
   Additional information: Please check that you have not misspelled the recipients email address.
hMailServer

===============================

UPDATE3: 8/10/2012. This may put to rest the issue of straight averaging vs. some corrected method. From http://www.ncdc.noaa.gov/temp-and-precip/us-climate-divisions.php

It seems they are using TCDD (simple average) still. I’ve sent an email to verify…hopefully they get it.


Traditional Climate Divisional Database

Traditionally, climate division values have been computed by averaging the monthly values of all the Cooperative Observer Network (COOP) stations in each division to produce divisional monthly temperature and precipitation averages/totals. This is valid for values computed from 1931 to the present. For the 1895-1930 period, statewide values were computed directly from stations within each state. Divisional values for this early period were computed using a regression technique against the statewide values (Guttman and Quayle, 1996). These values make up the traditional climate division database (TCDD).


Gridded Divisional Database

The GHCN-D 5km gridded divisional dataset (GrDD) is based on a similar station inventory as the TCDD; however, new methodologies are used to compute temperature, precipitation, and drought for United States climate divisions. These new methodologies include the transition to a grid-based calculation, the inclusion of many more stations from the pre-1930s, and the use of NCDC’s modern array of quality control algorithms. These are expected to improve the data coverage and the quality of the dataset, while maintaining the current product stream.

The GrDD is designed to address the following general issues inherent in the TCDD:

  1. For the TCDD, each divisional value from 1931-present is simply the arithmetic average of the station data within it, a computational practice that results in a bias when a division is spatially undersampled in a month (e.g., because some stations did not report) or is climatologically inhomogeneous in general (e.g., due to large variations in topography).
  2. For the TCDD, all divisional values before 1931 stem from state averages published by the U.S. Department of Agriculture (USDA) rather than from actual station observations, producing an artificial discontinuity in both the mean and variance for 1895-1930 (Guttman and Quayle, 1996).
  3. In the TCDD, many divisions experienced a systematic change in average station location and elevation during the 20th Century, resulting in spurious historical trends in some regions (Keim et al., 2003; Keim et al., 2005; Allard et al., 2009).
  4. Finally, none of the TCDD’s station-based temperature records contain adjustments for historical changes in observation time, station location, or temperature instrumentation, inhomogeneities which further bias temporal trends (Peterson et al., 1998).

The GrDD’s initial (and more straightforward) improvement is to the underlying network, which now includes additional station records and contemporary bias adjustments (i.e., those used in the U.S. Historical Climatology Network version 2; Menne et al., 2009).

The second (and far more extensive) improvement is to the computational methodology, which now addresses topographic and network variability via climatologically aided interpolation (Willmott and Robeson, 1995). The outcome of these improvements is a new divisional dataset that maintains the strengths of its predecessor while providing more robust estimates of areal averages and long-term trends.

The NCDC’s Climate Monitoring Branch plans to transition from the TCDD to the more modern GrDD by 2013. While this transition will not disrupt the current product stream, some variances in temperature and precipitation values may be observed throughout the data record. For example, in general, climate divisions with extensive topography above the average station elevation will be reflected as cooler climatology. A preliminary assessment of the major impacts of this transition can be found in Fenimore et al., 2011.
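
To illustrate the difference the gridding makes, here is a minimal sketch with invented station values: three stations cluster in one warm grid cell and a single station sits alone in a cooler cell. A TCDD-style simple average lets the cluster dominate; a GrDD-style grid average weights each cell equally. (Real climatologically aided interpolation is far more sophisticated; this shows only the arithmetic idea.)

```python
# Made-up stations: (x, y, temperature F). Three cluster in a warm
# valley cell; one sits alone in a cooler mountain cell. All values
# are invented for illustration only.
stations = [(0.2, 0.3, 78.0), (0.4, 0.1, 79.0), (0.3, 0.4, 77.5),
            (1.7, 1.8, 65.0)]

# Simple (TCDD-style) average: every station counts equally.
simple = sum(t for _, _, t in stations) / len(stations)

# Gridded (GrDD-style) average: average stations within each cell
# first, then average the cells, so the cluster can't dominate.
cells = {}
for x, y, t in stations:
    cells.setdefault((int(x), int(y)), []).append(t)
cell_means = [sum(v) / len(v) for v in cells.values()]
gridded = sum(cell_means) / len(cell_means)

print("simple: %.2f F, gridded: %.2f F" % (simple, gridded))
# simple: 74.88 F, gridded: 71.58 F -- the undersampled cool cell
# gets equal weight under gridding
```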

260 Comments
John F. Hultquist
August 9, 2012 7:44 am

Nick Stokes says 2:48 and 5:24 regarding elevations
at a lapse rate of 6 °C/km, . . .
Seems this is an average of averages of lapse rates, both unsaturated (dry) and saturated (wet). I don’t know what else one would do with this issue, so I’ve no complaint. However, this just adds to the uncertainty of knowing what average temperature is. Also, thanks for the contribution and the update.
—————————
Pedantic stuff: The symbolism of six degrees Celsius should not look the same as six Celsius degrees. The first is a temperature, the second is a change. That most everyone does this incorrectly does not make it right. The same type of thing happens with A.M. and P.M. being used with 12:00 o’clock. But who cares about such truffles? Oops! Wrong word.

John@EF
August 9, 2012 7:44 am

REPLY: I understand exactly how anomalies work. Different baselines give different offsets for anomaly values. – Anthony
====
Seriously …. So what? Sus is exactly right, yours is a non-point. The only impact of baseline selection is the scale on the Y axis. When you compare two temperature series with different base periods, one simply converts them to a common base period. There’s no impact on the data line shape or trend.
Stokes’ point regarding station altitude seems valid too, at least must be considered before asserting claims. If you’re accusing him of “guessing”, you certainly are as well.
REPLY: And all that is fine, because we aren’t talking about anomalies, nor trends, but absolute temperatures for ONE MONTH. So it’s all just pointless distraction. Since NOAA doesn’t tell us what method they use to calculate the US monthly CONUS average temp, we are all guessing at that. The most important point here is that they aren’t using this network to try to provide any sort of sanity check to the poorly sited, mishmashed, highly adjusted train wreck that is the COOP/USHCN/GHCN networks – Anthony

David C. Greene
August 9, 2012 7:45 am

Great contribution, Anthony! Nick Stokes’ (corrected) comment about differences in station altitude and lapse rate makes sense to me. The paucity of quality sites at lower altitude can be verified using Google Earth. My favorite measure of long term trend is ocean heat content as obtained from the Argo buoys. Even there, (long term) time-averaging is necessary because of the annual variation in the “global” average heat content. Last I looked Loehle found recent cooling, while the Argo project’s Josh found no significant change in ocean heat content.
I am looking forward to publication of the Watts et al paper showing the sliced and diced data from the contiguous US surface stations.

RobertInAz
August 9, 2012 8:02 am

Looking at the NCDC Map here,
http://www.ncdc.noaa.gov/oa/climate/research/cag3/cag3.html
I see “hotspots” over St. Louis, Sioux Falls, Montgomery. Other “hotspots” are not so closely tied to cities. One wonders about the sensors in these areas.

Rattus Norvegicus
August 9, 2012 8:09 am

Tony, you might try clicking on the “datasets” link from the NCDC page you linked to:
“Temperature – USHCN Version 2 Adjusted Data”
Nick is right; you are wrong.
REPLY: My name is Anthony.
Show me how NOAA calculates the value in the press release, how they deal with the fact that weather will change the adiabatic lapse rate for each station on a daily basis, and how they account for that. If NOAA shows how they calculate the Tavg for the CONUS, I can duplicate it. But they aren’t showing it, so neither you nor Nick knows the answer. That’s the point of this whole exercise: to get that answer and apply it to the USCRN.
No curiosity with either of you gents, just tribal derision. You aren’t even happy at the prospect that the temperature might not be as bad as proclaimed. – Anthony

Bob Koss
August 9, 2012 8:11 am

Alan D McIntire and other queries on Tmean & Taverage,
Here is how they define the CRN values.
cols 41 — 47 [7 chars] T_MONTHLY_MAX
The maximum air temperature, in degrees C, for the month. This maximum
is calculated as the average of all available day-maximums. To be
valid there must be less than 4 consecutive day maximums missing,
and no more than 5 total day maximums missing.
cols 49 — 55 [7 chars] T_MONTHLY_MIN
The minimum air temperature, in degrees C, for the month. This minimum
is calculated as the average of all available day-minimums. To be
valid there must be less than 4 consecutive day minimums missing,
and no more than 5 total day minimums missing.
cols 57 — 63 [7 chars] T_MONTHLY_MEAN
The mean temperature, in degrees C, calculated using the typical
historical approach of (T_MONTHLY_MAX + T_MONTHLY_MIN) / 2
cols 65 — 71 [7 chars] T_MONTHLY_AVG
The average air temperature, in degrees C, for the month. This average
is calculated using all available day-averages, each derived from
24 one-hour averages. To be valid there must be less than 4 consecutive
day averages missing, and no more than 5 total day averages missing.
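
[Editor's aside: a minimal Python sketch of my reading of that validity rule, with an invented day list, looks like this.]

```python
# Validity check per the readme: a monthly value is valid only if
# fewer than 4 consecutive daily values are missing and no more
# than 5 are missing in total. 'days' uses None for a missing
# daily value; the example data is invented.
def month_is_valid(days):
    if sum(1 for d in days if d is None) > 5:
        return False
    run = longest = 0
    for d in days:
        run = run + 1 if d is None else 0
        longest = max(longest, run)
    return longest < 4

july = [80.0] * 31
july[10:13] = [None, None, None]       # 3 consecutive missing -> still valid
print(month_is_valid(july))            # True
july[20] = july[25] = july[28] = None  # now 6 missing in total
print(month_is_valid(july))            # False
```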

John@EF
August 9, 2012 8:18 am

“REPLY: And all that is fine, because we aren’t talking about anomalies, nor trends, but absolute temperatures for ONE MONTH.”
===
The point being, baseline selection has no impact whatsoever on the data no matter how it’s used – you appear to believe differently.
***
“… So its all just pointless distraction. …”
===
Exactly my sentiment, although applied a bit differently.

August 9, 2012 8:20 am

Nick, what stations were in use in 1936 compared to 2012 for NOAA and what was the altitude difference?

HowardG
August 9, 2012 8:20 am

RE: Mean and Average data (are they not the same?).
AW averages the data as provided. See how the source defines their data.
From the USCRN/USRCRN FTP MONTHLY STREAM
ftp://ftp.ncdc.noaa.gov/pub/data/uscrn/products/monthly01
cols 41 — 47 [7 chars] T_MONTHLY_MAX
The maximum air temperature, in degrees C, for the month. This maximum
is calculated as the average of all available day-maximums. To be
valid there must be less than 4 consecutive day maximums missing,
and no more than 5 total day maximums missing.
cols 49 — 55 [7 chars] T_MONTHLY_MIN
The minimum air temperature, in degrees C, for the month. This minimum
is calculated as the average of all available day-minimums. To be
valid there must be less than 4 consecutive day minimums missing,
and no more than 5 total day minimums missing.
cols 57 — 63 [7 chars] T_MONTHLY_MEAN
The mean temperature, in degrees C, calculated using the typical
historical approach of (T_MONTHLY_MAX + T_MONTHLY_MIN) / 2
cols 65 — 71 [7 chars] T_MONTHLY_AVG
The average air temperature, in degrees C, for the month. This average
is calculated using all available day-averages, each derived from
24 one-hour averages. To be valid there must be less than 4 consecutive
day averages missing, and no more than 5 total day averages missing.

August 9, 2012 8:22 am

Excellent, Anthony. Thanks!
When the political climate cools in the near future we’ll get the USCRN data published?

chris y
August 9, 2012 8:38 am

Spence_UK says-
“Which is why I’ve argued that the REAL confidence intervals – even for the anomalies – is some large fraction of that 2.1 F – probably of the order of 1F or so (and I mean 1-sigma here).”
I absolutely agree. Every adjustment applied to the raw data *widens* the Confidence Interval. The list includes TOBS, altitude, UHI, equipment changes, location shifts, etc.
For example, Steven Goddard has shown an adjustment of 3.1 F between raw and twiddled July temperatures for USHCN.
All of these temperature adjustments provide fertile ground for confirmation bias at NOAA.
The claim made is that July 2012 is the hottest based on an absolute temperature, but defining ‘the temperature of what?’ is an unresolvable problem that climate science brushes under the table.
The situation is quite sad. The USHCN raw data has been adjusted by far more than the purported CACC warming.
Compared to other countries, the US has a gold standard temperature network.
Compared to ocean measurements, global land data is the gold standard.
Compared to paleo reconstructions of temperature, global land and ocean data is the gold standard.
Yet we have climate activists claiming astonishing accuracy when comparing temperatures over millennial periods.

JanF
August 9, 2012 8:41 am

Temperature is measured with triple redundant air aspirated sensors (Platinum Resistance Thermometers) and averaged between all three sensors. The air aspirated shield exposure system is the best available.

Are they just averaging the 3 sensors, or only when the difference is limited? When one sensor is faulty it can ruin the average.
REPLY: There’s a sanity check on each sensor, to prevent just that – Anthony

August 9, 2012 8:44 am

“…I wondered why NOAA has never offered a CONUS monthly temperature from this new network….”
I think they’ve mentioned it a bit:
“…The vision of the USCRN program is to maintain a sustainable high-quality climate observation network that 50 years from now can with the highest degree of confidence answer the question: How has the climate of the nation changed over the past 50 years?…”
So, 50 years from now, they’ll look into it.
Also, most databases use that “30 year point” to define an averaging period, with a 5 year running average.
Maybe, since this is the “climate observation network”, there hasn’t been enough time to observe the climate. Anything less than 30 years is just weather.
REPLY: Exactly, which is why you can use it on a one month scale – Anthony

Owen in Ga
August 9, 2012 8:46 am

Curiosity here: Are they calculating average station temperature as a mean of all 288 recorded daily measurements? That seems like a much better number than the old (Tmin+Tmax)/2, and would seem to track the energy budget somewhat better. Are they recording solar insolation on 5 minute intervals as well? Of course logic would indicate that on average 144 (fewer in summer, more in winter) of those should be very near 0. Seems like a fairly robust network, but still a little sparse in some places where a number of microclimates can occur in relatively short distances. (Do we really need data on every little microclimate, or just areas free of UHI and land use issues to establish baseline temperatures?) On the twinned stations, are they using the second station as a check on the first, or are they oversampling that location (i.e., do the paired stations report independently or as a single site)? What kind of security do they have at the site? I ask this because down here we had one of the old weather stations quit reporting, and when the folks went out to find out what broke, they discovered the whole station had been destroyed/removed by scrap metal scavengers. Any explanation as to why those few stations had gaps? Any technology can break, so I am not implying foul play, just wondering if they established a mean time between failure for these things.

Rattus Norvegicus
August 9, 2012 8:55 am

Anthony,
Click on the national bit in the image map and then select First Year 1930, Last Year 2012, Period July, Mean temperature, Table, Sort By Rank.
Then look at the data. Seems to agree with the article, modulo the rounding, which doesn’t make any difference to the rank.
REPLY: A link to which page you are referring to would be helpful, and we aren’t talking about rank, but absolute temperature for the month of July 2012 – Anthony

August 9, 2012 8:58 am

Nick Stokes brings up a problem that I have had with the old network for a while, namely that the stations are predominantly in the lower parts of the individual states. The average elevation in the United States is 2,500 ft, which lies above both of the averages that he quotes.
Incidentally, I did look at the variation in temperature with elevation for each of the states at Bit Tooth Energy (see for example Michigan), and it varies quite considerably, since I suspect that the changing elevations also bring other factors into play in controlling the resulting relationship. On average it was 0.02 deg F per ft, much higher than the rate that Nick quotes, but I suspect that this has much to do with the fact that the stations with the lower elevations are also those with the larger populations.
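
[Editor's aside: for readers following the elevation thread, a minimal sketch of what a fixed lapse-rate adjustment to a common reference elevation would look like, using the 6°C/km figure cited above. Stations and values are invented.]

```python
# Adjust station temperatures to a common reference elevation using
# a fixed environmental lapse rate. Stations and values are invented;
# 6.0 C/km is the rate cited in the comment thread.
LAPSE_C_PER_M = 6.0 / 1000.0

stations = [
    ("valley",   300.0, 26.0),  # name, elevation m, temp C
    ("plateau", 1500.0, 19.0),
]

ref_elev = 700.0  # reference elevation in m (an arbitrary choice)
for name, elev, temp in stations:
    adjusted = temp + (elev - ref_elev) * LAPSE_C_PER_M
    print("%-8s %.1f C at %.0f m -> %.1f C at %.0f m"
          % (name, temp, elev, adjusted, ref_elev))
# valley   26.0 C at 300 m -> 23.6 C at 700 m
# plateau  19.0 C at 1500 m -> 23.8 C at 700 m
```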

Editor
August 9, 2012 9:02 am

Amidst all the discussion about anomalies/trends/absolute values, we are forgetting a key point.
There was a simple reason why USCRN was introduced – the old system was not of a good enough quality. We don’t appear to have enough data from USCRN yet to see if trends are different to USHCN, but it must be clear that the accuracy of the latter is simply not good enough to compare current temperatures with those of 80 years ago to tenths of a degree.

Rattus Norvegicus
August 9, 2012 9:02 am

Anthony, you provided the link in your previous post on this subject. The table provides both rank and absolute temperature (my reference to the rounding…).
REPLY: I provided several links, would it be too much trouble to point to the exact one you are referring to and what image map you are speaking of? – Anthony

Owen in Ga
August 9, 2012 9:10 am

I see a lot of hand waving about “comparing trends between unlike networks.” My strong contention is that that argument undermines and invalidates the very networks the CAGW folks putting it forward treat as gospel, because the old COOP network of 1936 is not even close to being the “same” COOP network of 2012. Find the same stations, read the same way, on the same equipment, with the same surrounding population densities, with the same surrounding land use, measured at the same times of day between 1936 and 2012, and you can compare them accurately. Anything else is rife with expectation bias and guesstimate adjustments.

Editor
August 9, 2012 9:14 am

Interestingly, Virginia is the only state to post a record temperature in July.
And what does the NCDC summary say?
The temperature trend for the period of record (1895 to present) is 0.0 degrees Fahrenheit per decade.
http://www.ncdc.noaa.gov/oa/climate/research/cag3/va.html

Pamela Gray
August 9, 2012 9:15 am

But, the AGW crowd will say, the high quality set all set records!
Duh. Newly installed sensors set records all the time.

August 9, 2012 9:17 am

Oops! Sorry, the result that I quoted should have read 0.02 degF/m, not per foot; that’s the trouble with doing things in a hurry. This translates into about double the value that Nick quotes, but it varies quite considerably from state to state.

JJ
August 9, 2012 9:17 am

The headline of this article is in no way supported by the content of the article.
For July 2012 to not be a record breaker according to the USCRN dataset, July 2012 would have to not be the warmest July in the USCRN dataset. Is that the case? Given that USCRN only goes back a few years, I doubt it.
NOAA’s claim is that July 2012 is hotter than July 1936. You can’t refute that claim by comparing just NOAA 2012 to USCRN 2012. USCRN 2012 is 2F cooler than NOAA’s 2012? So what? If USCRN 1936 were also 2F cooler than NOAA’s 2012, then NOAA’s claim would still be valid. You can’t refute a comparison between apples with a single orange.
Moot question of course, given that there is no USCRN 1936. You don’t have the data you need to say what you want to say, but you are saying it anyway.
Leave that crap to the Team.

Bill Taylor
August 9, 2012 9:18 am

a layman’s attempt to explain anomalies: anomalies are variances from the normal. Since our climate is and always has been in a constant state of CHANGE, there is NO starting point to have an anomaly from; there is NO “normal”, it constantly changes.
Since there is no normal, there is no baseline, and any claim of anomaly is based on NOTHING in reality except for the person’s attempt to paint a false picture by just selecting something and calling it the baseline.
Mark Twain said long ago that there are liars, damned liars, and statistics… the use of anomalies in the discussion of weather is an example of statistical LYING!

Jim G
August 9, 2012 9:22 am

James McCauley says:
August 9, 2012 at 12:06 am
“Proud to be a WUWTer! This post will be shared with my Senators R. Portman and (aah-hemm) Sherrod Brown, etc. Someone needs to forward to Inhofe, et al.”
Don’t waste your time on Sherrod Brown. In a town hall meeting which I attended in 1993 in Medina, OH, when I believe he was running for Congress, he indicated that the crime problems in our country would not be corrected “until there was a more equal distribution of wealth”, right out of Karl Marx (also a favorite theme of Barack Obama aka Barry Soetoro), take your pick. Since the real goal of today’s “climate science” is government control and directing tax dollars to administration friends, as with Obamacare, people like Sherrod Brown are not open to facts or real science, only what furthers their power and control.
