An 'inconvenient result' – July 2012 not a record breaker according to data from the new NOAA/NCDC U.S. Climate Reference Network

I decided to do something myself that so far NOAA has refused to do: give a CONUS average temperature for the United States from the new ‘state of the art’ United States Climate Reference Network (USCRN). After spending millions of dollars to put in this new network from 2002 to 2008, they are still giving us data from the old one when they report a U.S. national average temperature. As readers may recall, I have demonstrated that the old COOP/USHCN network used to monitor U.S. climate is a mishmash of urban, semi-urban, rural, airport and non-airport stations, some of which are sited precariously in observers’ backyards, in parking lots, near air conditioner vents, on airport tarmac, and in urban heat islands. This is backed up by the 2011 GAO report spurred by my work.

Here is today’s press release from NOAA, “State of the Climate” for July 2012 where they say:

The average temperature for the contiguous U.S. during July was 77.6°F, 3.3°F above the 20th century average, marking the hottest July and the hottest month on record for the nation. The previous warmest July for the nation was July 1936 when the average U.S. temperature was 77.4°F. The warm July temperatures contributed to a record-warm first seven months of the year and the warmest 12-month period the nation has experienced since recordkeeping began in 1895.

OK, that average temperature for the contiguous U.S. during July is easy to replicate and calculate using NOAA’s USCRN network of stations, shown below:

Map of the 114 climate stations in the USCRN; note the even distribution.
In case you aren’t familiar with this network and why it exists, let me cite NOAA/NCDC’s reasoning for its creation. From the USCRN overview page:

The U.S. Climate Reference Network (USCRN) consists of 114 stations developed, deployed, managed, and maintained by the National Oceanic and Atmospheric Administration (NOAA) in the continental United States for the express purpose of detecting the national signal of climate change. The vision of the USCRN program is to maintain a sustainable high-quality climate observation network that 50 years from now can with the highest degree of confidence answer the question: How has the climate of the nation changed over the past 50 years? These stations were designed with climate science in mind. Three independent measurements of temperature and precipitation are made at each station, insuring continuity of record and maintenance of well-calibrated and highly accurate observations. The stations are placed in pristine environments expected to be free of development for many decades. Stations are monitored and maintained to high standards, and are calibrated on an annual basis. In addition to temperature and precipitation, these stations also measure solar radiation, surface skin temperature, and surface winds, and are being expanded to include triplicate measurements of soil moisture and soil temperature at five depths, as well as atmospheric relative humidity. Experimental stations have been located in Alaska since 2002 and Hawaii since 2005, providing network experience in polar and tropical regions. Deployment of a complete 29 station USCRN network into Alaska began in 2009. This project is managed by NOAA’s National Climatic Data Center and operated in partnership with NOAA’s Atmospheric Turbulence and Diffusion Division.

So clearly, USCRN is an official effort, sanctioned, endorsed, and accepted by NOAA, and is of the highest quality possible. Here is what a typical USCRN station looks like:

USCRN Station at the Stroud Water Research Center, Avondale, PA

A few other points about the USCRN:

  • Temperature is measured with triple redundant air aspirated sensors (Platinum Resistance Thermometers) and averaged between all three sensors. The air aspirated shield exposure system is the best available.
  • Temperature is measured continuously and logged every 5 minutes, ensuring a true capture of Tmax/Tmin
  • All stations were sited per Leroy 1999 siting specs, and are Class 1 or Class 2 stations by that siting standard. (see section 2.2.1 here of the USCRN handbook PDF)
  • The data goes through quality control, to ensure an errant sensor hasn’t biased the values, but is otherwise unchanged.
  • No stations are near any cities, nor do they have local biases of any kind that I have observed in any of my visits to them.
  • Unlike the COOP/USHCN network where they fought me tooth and nail, NOAA provided station photographs up front to prove the “pristine” nature of the siting environment.
  • All data is transmitted digitally via satellite uplink direct from the station.
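
To illustrate how triple redundancy can be used in practice, here is a minimal sketch in Python. This is a hypothetical approach, not NOAA's actual QC algorithm, and the 0.3°C tolerance is my own assumption: a sensor that disagrees with both of its siblings is treated as errant and excluded before averaging.

```python
def aggregate_triplet(readings, tolerance=0.3):
    """Combine three redundant PRT readings (deg C) into one value.

    A sensor that disagrees with BOTH of its siblings by more than
    `tolerance` is treated as errant and excluded; the survivors are
    averaged.  Illustrative only -- not NOAA's actual QC procedure,
    and the tolerance value is an assumption.
    """
    good = []
    for i, r in enumerate(readings):
        siblings = [x for j, x in enumerate(readings) if j != i]
        # keep a sensor only if it agrees with at least one sibling
        if any(abs(r - s) <= tolerance for s in siblings):
            good.append(r)
    if not good:  # all three disagree: no trustworthy value this interval
        return None
    return sum(good) / len(good)
```

For example, `aggregate_triplet([20.0, 20.1, 25.0])` drops the errant third reading and averages the first two.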

So this means that:

  1. There are no observer or transcription errors to correct.
  2. There is no time of observation bias, nor need for correction of it.
  3. There is no broad scale missing data, requiring filling in data from potentially bad surrounding stations. (FILNET)
  4. There is no need for bias adjustments for equipment types, since all equipment is identical.
  5. There is no need for urbanization adjustments, since all stations are rural and well sited.
  6. There are no regular sensor errors, thanks to air aspiration and triple-redundant, lab-grade sensors. An errant reading from one sensor is identified and handled by comparison with the other two, ensuring quality data.
  7. Due to the near perfect geospatial distribution of stations in the USA, there isn’t a need for gridding to get a national average temperature.

Knowing this, I wondered why NOAA has never offered a CONUS monthly temperature from this new network. So, I decided that I’d calculate one myself.

The procedure for a CONUS monthly average temperature from USCRN:

  1. Download each station data set from here: USCRN Quality Controlled Datasets.
  2. Exclude USHCN-M (modernized USHCN) and USRCRN-Lite stations, which are not part of the 114 station USCRN master set.
  3. Exclude stations that are not in the CONUS (HI and AK).
  4. Load all July USCRN 114 station data into an Excel Spreadsheet, available here: CRN_CONUS_stations_July2012_V1.2
  5. Note stations that have missing days in their monthly data. There were three in July 2012: Elgin, AZ (4 missing days); Avondale, PA (5 missing days); and McClellanville, SC (7 missing days). Set their data aside to be dealt with separately.
  6. Do sums and calculate CONUS area averages from the Tmax, Tmin, Tavg and Tmean data provided for each station.
  7. Do a separate calculation to see how much difference the stations with missing/partial data make for the entire CONUS.
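
The core of steps 4 through 6 can be sketched in a few lines of code. This is a minimal sketch with hypothetical station values, not the actual spreadsheet; the USCRN monthly files report temperatures in degrees Celsius, so a conversion to Fahrenheit is included:

```python
def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9.0 / 5.0 + 32.0

def conus_average(station_values):
    """Straight (unweighted) average over per-station monthly values
    given in deg C, returned in deg F.  Stations with a missing value
    (None) are set aside, mirroring step 5 above."""
    vals = [v for v in station_values.values() if v is not None]
    return c_to_f(sum(vals) / len(vals))

# Hypothetical per-station monthly values in deg C
# (the real spreadsheet has one row per station, 114 in all)
july = {"Station A": 25.0, "Station B": 23.0, "Station C": None}
```

With these made-up numbers, `conus_average(july)` works out to about 75.2°F; the real calculation simply repeats this over all 114 stations for each of Tmax, Tmin, Tavg and Tmean.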

Here are the results:

USA Monthly Mean for July 2012: 75.72°F (111 stations)

USA Monthly Average for July 2012: 75.51°F (111 stations)

USA Monthly Mean for July 2012: 75.74°F (114 stations, 3 w/ partial missing data, difference 0.02)

USA Monthly Average for July 2012: 75.55°F (114 stations, 3 w/ partial missing data, difference 0.04)

============================

Comparison to NOAA’s announcement today:

Using the old network, NOAA says the USA Average Temperature for July 2012 is: 77.6°F

Using the NOAA USCRN data, the USA Average Temperature for July 2012 is: 75.5°F

The new USCRN comes in 2.1°F cooler than the old problematic network.

This puts July 2012, according to the best official climate monitoring network in the USA, at 1.9°F below the 77.4°F July 1936 USA average temperature cited in NOAA’s press release today: not a record by any measure. Dr. Roy Spencer suggested earlier today that he didn’t think so either, saying:

So, all things considered (including unresolved issues about urban heat island effects and other large corrections made to the USHCN data), I would say July was unusually warm. But the long-term integrity of the USHCN dataset depends upon so many uncertain factors, I would say it’s a stretch to call July 2012 a “record”.

This result also strongly suggests that a well sited network of stations, as the USCRN was designed from its inception to be, is free of the errors, biases, adjustments, siting issues, equipment issues, and UHI effects that plague the older COOP/USHCN network, a mishmash of problems the new USCRN was designed to solve.

It suggests Watts et al 2012 is on the right track when it comes to pointing out the temperature measurement differences between stations with and without such problems. I don’t claim that my method is a perfect comparison to the older COOP/USHCN network, but my numbers come close, within the bounds of the positive temperature bias errors noted in Leroy 1999, and the more “pristine” USCRN network is cooler for absolute monthly temperatures, as would be expected. That suggests my comparison isn’t an unreasonable one.

NOAA never mentions this new pristine USCRN network in any press releases on climate records or trends, nor do they calculate and display a CONUS value for it. Now we know why. The new “pristine” data it produces is just way too cool for them.

Look for a regular monthly feature using the USCRN data at WUWT. Perhaps NOAA will then be motivated to produce their own monthly CONUS Tavg values from this new network. They’ve had four years to do so since it was completed.

UPDATE: Some people asked what the difference is between the mean and average temperature values. In the monthly data files from USCRN, there are these two values:

T_MONTHLY_MEAN

T_MONTHLY_AVG

http://www.ncdc.noaa.gov/crn/qcdatasets.html

The mean is the monthly (max+min)/2, and the average is the average of all the daily averages.
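
The distinction is easy to see with a hypothetical day whose temperature rises quickly and falls slowly; the (max+min)/2 midrange and the average of all readings only agree when the diurnal curve is symmetric. A sketch (made-up readings, not USCRN data):

```python
def t_mean(readings):
    """Midrange: (max + min) / 2, the quantity behind T_MONTHLY_MEAN."""
    return (max(readings) + min(readings)) / 2.0

def t_avg(readings):
    """Average of all readings, analogous to T_MONTHLY_AVG (which is
    built from the daily averages of the 5-minute data)."""
    return sum(readings) / len(readings)

# A hypothetical day that warms quickly and cools slowly (deg F)
day = [60, 61, 64, 70, 78, 84, 86, 84, 80, 76, 72, 68]
```

For this asymmetric day `t_mean(day)` is 73.0 while `t_avg(day)` is about 73.6, which is why the two monthly columns generally differ.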

UPDATE2: I’ve just sent this letter to NCDC – to ncdc.info@ncdc.noaa.gov

Hello,

I apologize for not providing a proper name in the salutation, but none was given on the contact section of the referring web page.

I am attempting to replicate the CONUS  temperature average of 77.6 degrees Fahrenheit for July 2012, listed in the August 8th 2012, State of the Climate Report here: http://www.ncdc.noaa.gov/sotc/

Pursuant to that, would you please provide the following:

1. The data source of the surface temperature record used.

2. The list of stations used from that surface temperature record, including any exclusions and reasons for exclusions.

3. The method used to determine the CONUS average temperature, such as simple area average, gridded average, altitude corrections, bias corrections, etc. Essentially what I’m requesting is the method that can be used to replicate the resultant 77.6F CONUS average value.

4. A flowchart of the procedures in step 3 if available.

5. Any other information you deem relevant to the replication process.

Thank you sincerely for your consideration.

Best Regards,

Anthony Watts

===================================================

Below is the response I got to the email address provided in the SOTC release, some email addresses redacted to prevent spamming.

===================================================

—–Original Message—–
From: mailer-daemon@xxxx.xxxx.xxx
Date: Thursday, August 09, 2012 3:22 PM
To: awatts@xxxxxxx.xxx
Subject: Undeliverable: request for methods used in SOTC press release
Your message did not reach some or all of the intended recipients.
   Sent: Thu, 9 Aug 2012 15:22:43 -0700
   Subject: request for methods used in SOTC press release
The following recipient(s) could not be reached:
ncdc.info@ncdc.noaa.gov
   Error Type: SMTP
   Error Description: No mail servers appear to exists for the recipients address.
   Additional information: Please check that you have not misspelled the recipients email address.
hMailServer

===============================

UPDATE3: 8/10/2012. This may put the issue to rest about straight averaging vs. some corrected method. From http://www.ncdc.noaa.gov/temp-and-precip/us-climate-divisions.php

It seems they are using TCDD (simple average) still. I’ve sent an email to verify…hopefully they get it.


Traditional Climate Divisional Database

Traditionally, climate division values have been computed by averaging the monthly values for all of the Cooperative Observer Network (COOP) stations in each division to produce divisional monthly temperature and precipitation averages/totals. This is valid for values computed from 1931 to the present. For the 1895-1930 period, statewide values were computed directly from stations within each state. Divisional values for this early period were computed using a regression technique against the statewide values (Guttman and Quayle, 1996). These values make up the traditional climate division database (TCDD).


Gridded Divisional Database

The GHCN-D 5km gridded divisional dataset (GrDD) is based on a similar station inventory as the TCDD; however, new methodologies are used to compute temperature, precipitation, and drought for United States climate divisions. These new methodologies include the transition to a grid-based calculation, the inclusion of many more stations from the pre-1930s, and the use of NCDC’s modern array of quality control algorithms. These are expected to improve the data coverage and the quality of the dataset, while maintaining the current product stream.

The GrDD is designed to address the following general issues inherent in the TCDD:

  1. For the TCDD, each divisional value from 1931-present is simply the arithmetic average of the station data within it, a computational practice that results in a bias when a division is spatially undersampled in a month (e.g., because some stations did not report) or is climatologically inhomogeneous in general (e.g., due to large variations in topography).
  2. For the TCDD, all divisional values before 1931 stem from state averages published by the U.S. Department of Agriculture (USDA) rather than from actual station observations, producing an artificial discontinuity in both the mean and variance for 1895-1930 (Guttman and Quayle, 1996).
  3. In the TCDD, many divisions experienced a systematic change in average station location and elevation during the 20th Century, resulting in spurious historical trends in some regions (Keim et al., 2003; Keim et al., 2005; Allard et al., 2009).
  4. Finally, none of the TCDD’s station-based temperature records contain adjustments for historical changes in observation time, station location, or temperature instrumentation, inhomogeneities which further bias temporal trends (Peterson et al., 1998).
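
Issue 1 above is easy to demonstrate with a toy division (hypothetical numbers): when a cool high-elevation station fails to report in a month, the straight arithmetic average jumps warm even though no actual warming occurred.

```python
def tcdd_division_value(stations):
    """TCDD-style divisional value: straight arithmetic average of
    whatever stations reported that month (None = did not report)."""
    vals = [t for t in stations.values() if t is not None]
    return sum(vals) / len(vals)

# Hypothetical division: two warm valley stations, one cool mountain one
full = tcdd_division_value(
    {"valley_a": 78.0, "valley_b": 77.0, "mountain": 65.0})   # ~73.3 F
undersampled = tcdd_division_value(
    {"valley_a": 78.0, "valley_b": 77.0, "mountain": None})   # 77.5 F
```

The division appears to "warm" by more than 4°F purely because the cool station went silent, which is the spatial-undersampling bias the gridded approach is meant to address.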

The GrDD’s initial (and more straightforward) improvement is to the underlying network, which now includes additional station records and contemporary bias adjustments (i.e., those used in the U.S. Historical Climatology Network version 2; Menne et al., 2009).

The second (and far more extensive) improvement is to the computational methodology, which now addresses topographic and network variability via climatologically aided interpolation (Willmott and Robeson, 1995). The outcome of these improvements is a new divisional dataset that maintains the strengths of its predecessor while providing more robust estimates of areal averages and long-term trends.

The NCDC’s Climate Monitoring Branch plans to transition from the TCDD to the more modern GrDD by 2013. While this transition will not disrupt the current product stream, some variances in temperature and precipitation values may be observed throughout the data record. For example, in general, climate divisions with extensive topography above the average station elevation will be reflected as cooler climatology. A preliminary assessment of the major impacts of this transition can be found in Fenimore et al., 2011.

Jim G
August 9, 2012 9:30 am

JJ says:
August 9, 2012 at 9:17 am
The headline of this article is in no way supported by the content of the article.
“For July 2012 to not be a record breaker according to the USCRN dataset, July 2012 would have to not be the warmest July in the USCRN dataset. Is that the case? Given that USCRN only goes back a few years, I doubt it.
NOAA’s claim is that July 2012 is hotter than July 1936. You can’t refute that claim by comparing just NOAA 2012 to USCRN 2012. USCRN 2012 is 2F cooler than NOAA’s 2012? So what? If USCRN 1936 were also 2F cooler than NOAA’s 2012, then NOAA’s claim would still be valid. You can’t refute a comparison between apples with a single orange.
Moot question of course, given that there is no USCRN 1936. You dont have the data you need to say what you want to say, but you are saying it anyways.
Leave that crap to the Team.”
Though the article does point out a few other issues, like the non-use/publication of higher quality data, sadly, I must agree with your comment. Let’s not play their games by over-attribution of relevance to non-comparable data sets.

DR
August 9, 2012 9:31 am

Anthony,
Is USCRN the same as USHCN-M ?

REPLY:
No, they have a different sensor deployment. – Anthony

Jim G
August 9, 2012 9:34 am

Owen in Ga says:
August 9, 2012 at 9:10 am
“I see a lot of hand waving about “comparing trends between unlike networks.” My strong contention is that that argument undermines and invalidates the old networks the CAGW folks putting those arguments forward consider as gospel, because the old COOP network in 1936 is not even close to being the “same” COOP network of 2012. Find the same stations, read the same way, on the same equipment, with the same surrounding population densities, with the same surrounding land use, measured at the same times of day between 1936 and 2012 and you can compare them accurately. Anything else is rife with expectation bias and guesstimate adjustments.”
True! But we should not play their game, as JJ said.

Man Bearpig
August 9, 2012 9:35 am

It seems quite obvious that NOAA should not get any funding for their temperature data anymore. They can not even perform simple statistics and prefer to use the system that gives them the result they want rather than report the truth. Shame on them. Lobby your representatives, senators, whatever.

Dave_G
August 9, 2012 9:38 am

Do comparisons of previous, recent monthly records, show the same differences?

Elmer
August 9, 2012 9:54 am

It is not really the average temperature unless the slope from minimum to high is linear which it never is.

Climate Refugee
August 9, 2012 10:03 am

The GISS global temperature anomalies explain 33% (adjusted r-squared) of the US HCN anomalies, 1895-2011. Figures likely to be similar for other comparisons.

August 9, 2012 10:16 am

My curiosity got the better of me, and so I plotted, using individual state values for the temperature decline rate and the average elevation, to see how they changed across the country.
I can’t post that plot here, but put it into a short post on Bit Tooth Energy . It turns out that the rate of temperature change is itself sensitive to elevation with higher elevation impacts occurring in those states near sea level, again providing evidence of the impact of sea temperatures on the nearby land values.

August 9, 2012 10:20 am

Michel says:
August 9, 2012 at 12:32 am
It’s difficult to understand what is an average temperature: if you put one hand in a bucket of hot water and the other one in ice cold water will an average temperature mean anything?
If you mix 1 kg of boiling water with 1 kg of ice cold water you may expect a resulting temperature of 50 °C. But you have to mix it. So far, we cannot mix Houston with Minneapolis (any way you can’t mess with Texas)!
If a measuring station is at sea level (Tampa, FL) and another one at an altitude of one mile (Denver, CO) what is the significance of a temperature averaged between these two?
Temperature anomalies (the difference between a measured temperature and a time average for the same station) are understandable, and these differences can be treated as a cohort for statistical analysis. That’s what we see on all hockey stick or flat diagrams.
REPLY: Probably the best question to ask is this – how does NOAA calculate the area average absolute temperatures (not anomalies) for the CONUS they use in those press releases, combining that mishmash of dissimilar COOP stations? As far as I can tell they have not published the exact method they use for those. – Anthony

RE-REPLY
I understand your curiosity. But even if it’s made by NOAA it does not make any sense to average
an intensive property like temperature between different sites. A whole discussion about a wrong approach – Michel

Crispin in Waterloo
August 9, 2012 10:23 am

The Canadian CBC reported the record July heat in the US as fact.
BTW a large solar panel like that next to the CONUS temperature measurement station produces a lot of (solar) heat that would otherwise not be there. They are nearly flat black and turn reflective grass into a >2 kW heat source.

David Meriwether
August 9, 2012 10:24 am

I’m curious if the old COOP/USHCN stations could be filtered down to the set that were there in July 1936 and still there in an essentially unchanged condition in July 2012, to calculate a direct comparison between the two points in time. There would probably be too many uncertainties and potential anomalies to know for sure if the results are valid. There could be too few stations too. But it would be interesting to see.

Bill Parsons
August 9, 2012 10:31 am

“That’s why climate scientists generally prefer to deal with anomalies”
This is really the problem. So long as the “climate scientists” get to define what an anomaly is (not to mention the freedom to fudge the records of absolute temps on which “anomalies” are based), they have complete control of the reality of temperature reporting. Why should we believe their reports of anomalies if they won’t even address the critical siting and time-of-observation issues recently raised by numerous people?

JC
August 9, 2012 10:33 am

The lapse rate information of any given location is meaningless when considering ground temperatures without also considering geographical differences. Altitude alone does not control temperature. The city where I live is 500 feet lower in altitude than the city where I work. They are only 40 miles apart yet where I work is consistently 2 to 5 degrees warmer. Both cities are about the same distance inland. Just averaging altitude will tell you nothing.

TC in the OC
August 9, 2012 10:34 am

Tom in Florida says:
August 9, 2012 at 5:29 am
Stokes does make a valid point. Comparing raw data from different sets is not correct when looking for changes.
Tom brings up a good point about keeping to the same data set when comparing US temps and I will tell you why.
I like and greatly appreciate Anthony’s work on pointing out all of the hi jinks that NOAA does to the data but on this one issue of July 2012 being the hottest month ever I think everyone is approaching this in the wrong way. We need to use the same data that the warmists are using so there is no way they can wriggle out of this.
In the 76 years since July 1936 we have been told over and over that CO2 has increased exponentially and this has caused “runaway” global warming that if not stopped will cause the end of the world or something like that.
I say we embrace this and shout it from the roof tops…the US average temp has increased 0.2° F in 76 years.
OMG…really…0.2° F in 76 years!!! Where is the runaway global warming?
We need to tell everyone this shocking truth and then ask why are we so freaking worried about 0.2° F rise in temperature in 76 years. And then ask “why was it so hot in 1936 if CO2 is the climate driver.”
0.2° F in 76 years…really!!!

August 9, 2012 10:41 am

Just looking at the frequency distribution of Northern Hemisphere temperatures in GHCNv3, I cannot see a drastic change in the shape & location of the distribution other than the fact that the number of observations varies. See Dude, don’t tell me it’s raining!

JJ
August 9, 2012 10:46 am

Owen in Ga says:
I see a lot of hand waving about “comparing trends between unlike networks.” My strong contention is that that argument undermines and invalidates the old networks the CAGW folks putting those arguments forward consider as gospel, because the old COOP network in 1936 is not even close to being the “same” COOP network of 2012.

Very good point.
Stokes’ red herring about anomalies ignores the fact that simply using anomalies does not necessarily provide for legitimate cross-comparison between networks. The most common example of a failure in that vein is the use of different base periods for the anomalies, either for comparisons between two datasets or two periods within one dataset. But if the network (or simply the properties thereof) changes from the base period to the period of analysis within a single dataset, the effect could be very similar…

cms
August 9, 2012 10:47 am

Jim G. it is Karl Marx not Carl and he was not really interested in the distribution of wealth. That is more a socialist bugaboo. His analysis was a lot more sophisticated than that. It was the distribution of power exemplified by ownership of capital that was his major concern.

Bill Parsons
August 9, 2012 10:56 am

One media outlet should find the “story” in Anthony’s blog releases. Amazingly enough, Wall Street Journal, maintains a split editorial “personality” on this issue:
Today:
http://online.wsj.com/article/SB10000872396390443991704577577242186369820.html?mod=googlenews_wsj Max Taves: “July Was Hottest Month on Record”
And 5 days ago:
http://online.wsj.com/article/SB10000872396390444405804577558973445002552.html Matt Ridley on “How Bias Heats up the Warming Debate”
From Ridley’s three-part article:

I argued last week that the way to combat confirmation bias—the tendency to behave like a defense attorney rather than a judge when assessing a theory in science—is to avoid monopoly. So long as there are competing scientific centers, some will prick the bubbles of theory reinforcement in which other scientists live.
For constructive critics, this is the problem with modern climate science. They don’t think it’s a conspiracy theory, but a monopoly that clings to one hypothesis (that carbon dioxide will cause dangerous global warming) and brooks less and less dissent. Again and again, climate skeptics are told they should respect the consensus, an admonition wholly against the tradition of science.

joshv
August 9, 2012 10:59 am

Sorry Anthony, I love you man, but you are wrong here. The high quality network does have different average altitude than the full network, and thus absolute temperatures are not comparable.
Your complaints about not knowing if the high quality network is adjusted are just backpedaling. They aren’t adjusting the high quality site averages down to sea level – they just aren’t. So there is some altitude effect in your average. Nick Stokes might be wrong that it’s 2 deg F – but there is an effect, and scientifically it means that the two temperatures just aren’t comparable.
From your quote:
“The vision of the USCRN program is to maintain a sustainable high-quality climate observation network that 50 years from now can with the highest degree of confidence answer the question: How has the climate of the nation changed over the past 50 years? ”
The high quality network is not meant to be used the way you are using it. It is meant to observe long term trends in a pristine environment. When NOAA observes some long term trends using this network, I assume that they will then publish some results.
REPLY: if you can show me how NOAA derives the CONUS Tavg used in the press release, I can duplicate it with CRN. So far nobody has been able to show how NOAA arrives at that result, and how/if they compensate for altitude. – Anthony

Rattus Norvegicus
August 9, 2012 11:22 am

My best guess is that it is a straight average. BTW, is 2012 a record in the CRN record, because that is the proper comparison.

davidmhoffer
August 9, 2012 11:24 am

Bill Parsons says:
August 9, 2012 at 10:31 am
“That’s why climate scientists generally prefer to deal with anomalies”
This is really the problem. So long as the “climate scientists” get to define what an anomaly is (not to mention the freedom to fudge the records of absolute temps on which “anomalies” are based), they have complete control of the reality of temperature reporting.
>>>>>>>>>>>>>>>>>>>>>>
It is SO much worse than that!
What we’re supposedly trying to figure out is if CO2 increases cause an energy imbalance at earth surface. We measure CO2’s effects in w/m2 which does NOT have a linear relationship with temperature (see Stefan-Boltzmann Law in any physics text if you want the explanation).
So…
an anomaly of +1C at -40 = 2.9 w/m2
an anomaly of +1C at 0 = 4.6 w/m2
an anomaly of +1C at +40 = 7.0 w/m2
Comparing anomalies from very different temperature ranges tells us pretty much nothing about CO2’s supposed effects at earth surface. Consider for example 2 degrees of warming at -40 which happens at the same time as 1 degree of cooling at +40. According to the “average temperature” we’d conclude that the earth was warmer by 1/2 degree. But based on w/m2, we’d actually be at a LOWER energy level.
Anomalies distract us from the real issue…. which is w/m2.
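
For what it's worth, those flux figures check out against the Stefan-Boltzmann law. A quick sketch (blackbody assumption, emissivity of 1):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def flux_change(base_c, anomaly=1.0):
    """Change in blackbody radiative flux (W/m^2) for a temperature
    anomaly (in K) on top of a base temperature given in deg C."""
    t = base_c + 273.15  # convert to kelvin
    return SIGMA * ((t + anomaly) ** 4 - t ** 4)
```

`flux_change(-40)`, `flux_change(0)` and `flux_change(40)` come out to roughly 2.9, 4.6 and 7.0 W/m2 respectively, matching the figures in the comment: the same 1°C anomaly represents very different energy changes at different base temperatures.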

Frank K.
August 9, 2012 11:31 am

From NOAA’s own press release, the difference between 2012 (77.6 F) and 1936 (77.4 F) is minuscule and well within anyone’s error estimate for this kind of temperature measurement. So all we can really say is that the “average” temperature (as defined non-uniquely by NOAA algorithms) for July 2012 is comparable to (really the same as) July 1936 in the continental United States. That’s it.
Of course, please remember (as we are reminded whenever we cite any climate records for the U.S.) that the U.S. represents only about 2% of the Earth’s surface. Is this year remarkably warm globally? Not really…

Mindbuilder
August 9, 2012 11:34 am

Anthony, you need to update your original post right away with the lapse rate calculation prominently at the top. Don’t wait to do the calculations yourself; state Nick Stokes’ numbers as speculative until you can finish your calculations. If you don’t do this as quickly as possible, you will be even more guilty than those who decided to “hide the decline”. You need to correct your mistakes, if any, as quickly as possible. If you delay, it makes you look like an oil company shill.
REPLY: Oh, please. People call me an ‘oil shill’ and bunches of other derogatory names no matter what I do or say. Typically that’s anonymous cowards like yourself. First, show me how NOAA calculated the national average. Did they do a straight average, or an altitude weighted one? Until we know, it’s all just speculation about what the correct method to match theirs is. For all we know, they may just use a straight area average. Nick Stokes’ method doesn’t deal with weather variation, which on the short time scale (1 month) may over/under correct. My point in this post, lost on you and Stokesy in the attempts to play “gotcha”, is that NOAA has this new network that they could use to correctly weight the problems with the old troublesome uber-adjusted network, or to publish a new result outright. After four years of it being complete, they don’t say a peep. That’s the point. Be as upset as you wish. – Anthony

Mindbuilder
August 9, 2012 11:38 am

They really should put in some of the old temperature measuring devices along side the modern ones in the CRN. Perhaps just with digital thermometers. They also need to expand the network globally.

Paul K2
August 9, 2012 11:40 am

Anthony, How did you grid the USCRN station data? TIA.
