Despite attempts to erase it globally, "the pause" still exists in pristine US surface temperature data

As readers know, the recent paper Karl et al. 2015, written by the head of the National Climatic Data Center (now NCEI), went to great lengths to try to erase “the pause” from the surface temperature record using a series of adjustments. Those adjustments have been deemed unacceptable and criticized by some climate scientists, such as Dr. Richard Lindzen, Dr. Chip Knappenberger, and Dr. Pat Michaels, who recently wrote:

In addition, the authors’ treatment of buoy sea-surface temperature (SST) data was guaranteed to create a warming trend. The data were adjusted upward by 0.12°C to make them “homogeneous” with the longer-running temperature records taken from engine intake channels in marine vessels.

As has been acknowledged by numerous scientists, the engine intake data are clearly contaminated by heat conduction from the structure, and as such, never intended for scientific use. On the other hand, environmental monitoring is the specific purpose of the buoys. Adjusting good data upward to match bad data seems questionable, and the fact that the buoy network becomes increasingly dense in the last two decades means that this adjustment must put a warming trend in the data.

Dr. Judith Curry added:

My bottom line assessment is this.  I think that uncertainties in global surface temperature anomalies is substantially understated.  The surface temperature data sets that I have confidence in are the UK group and also Berkeley Earth.  This short paper in Science is not adequate to explain and explore the very large changes that have been made to the NOAA data set.   The global surface temperature datasets are clearly a moving target.  So while I’m sure this latest analysis from NOAA will be regarded as politically useful for the Obama administration, I don’t regard it as a particularly useful contribution to our scientific understanding of what is going on.

Large adjustments accounted for the change, but one really should go back to the definition of “adjustments” to understand the true meaning and effect:

adjustment

But what if there were a temperature dataset so well done, so scientifically accurate, and so completely free of bias that, by its design, there would never be any need for, or justification of, any adjustments to the data?

Such a temperature record exists. It is called the U.S. Climate Reference Network (USCRN), and it too is operated under NOAA/NCDC’s (NCEI) head administrator, Tom Karl:

Data from NOAA’s premiere surface reference network. The contiguous U.S. network of 114 stations was completed in 2008. There are two USCRN stations in Hawaii and deployment of a network of 29 stations in Alaska continues. The vision of the USCRN program is to maintain a sustainable high-quality climate observation network that 50 years from now can with the highest degree of confidence answer the question: How has the climate of the Nation changed over the past 50 years?

These stations were designed with climate science in mind. Three independent measurements of temperature and precipitation are made at each station, insuring continuity of record and maintenance of well-calibrated and highly accurate observations. The stations are placed in pristine environments expected to be free of development for many decades. Stations are monitored and maintained to high standards and are calibrated on an annual basis. In addition to temperature and precipitation, these stations also measure solar radiation, surface skin temperature, and surface winds. They also include triplicate measurements of soil moisture and soil temperature at five depths, as well as atmospheric relative humidity for most of the 114 contiguous U.S. stations. Stations in Alaska and Hawaii provide network experience and observations in polar and tropical regions. Deployment of a complete 29-station USCRN network in Alaska began in 2009. This project is managed by NOAA’s National Climatic Data Center and operated in partnership with NOAA’s Atmospheric Turbulence and Diffusion Division.

Yes, the USCRN is state of the art, and it was signed off on by Tom Karl here:

USCRN-paper
Source: https://ams.confex.com/ams/pdfpapers/71748.pdf

So, since this state of the art network requires no adjustment, we can most surely trust the data presented by it. Right?

While we seldom, if ever, see the USCRN mentioned in NOAA’s monthly and annual “State of the Climate” reports to the U.S. public, the data are there, buried in the depths of the NCDC website, where one can access them and have them plotted. We now have 10 years, a full decade, of good data from this network, so we are able to plot it. I’ve done so here, using a tool provided for that very purpose by NOAA/NCDC/NCEI:

USCRN-CONUS-PLOT-10YEARS

Note the NOAA watermark in the plot above.

Source: http://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/time-series?datasets[]=uscrn&parameter=anom-tavg&time_scale=p12&begyear=2005&endyear=2015&month=12

NOAA helpfully provides that data in a comma-separated values (CSV) file, which I have converted into Excel: USCRN-CONUS-time-series

Plotting that USCRN data reproduces the NOAA/NCDC/NCEI plot above, but it also allows the trend to be plotted. I’ve done so in the DPlot program, using the actual data NOAA/NCDC/NCEI provides at the source link above:

USCRN-trend-plot-from-NCDC-data
USCRN monthly CONUS data with a polynomial “least squares fit, order of 1” trend line, done in the DPlot program

Clearly, a “pause” or “hiatus” exists in this most pristine climate data. In fact, a very slight cooling trend appears. But don’t take my word for it; you can replicate the plot above yourself using the links, the free trial program, and the USCRN data I provided from NOAA/NCDC/NCEI.
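For readers who would rather check the fit in code than in DPlot or Excel, here is a minimal sketch (not the method used above, just an independent check). It assumes the NOAA CSV has been saved locally with two columns, a YYYYMM date code and the anomaly in °F, and no header row; the file name is a placeholder.

# Minimal sketch: reproduce the order-1 least-squares trend on the USCRN
# CONUS anomaly series. Assumes a two-column CSV (YYYYMM, anomaly in deg F)
# with no header row; adjust read_csv if your download differs.
import numpy as np
import pandas as pd

df = pd.read_csv("USCRN-CONUS-time-series.csv", header=None,
                 names=["yyyymm", "anom_F"])

months = np.arange(len(df))              # months since the first data point
slope, intercept = np.polyfit(months, df["anom_F"], 1)

print(f"trend: {slope:+.6f} degF/month "
      f"({slope * 12:+.4f} degF/yr, {slope * 12 / 1.8:+.4f} degC/yr)")

Because the x-axis is in months since the start of the record, the slope comes out in °F per month; multiplying by 12 gives °F per year.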

Let’s hope that Mr. Karl doesn’t see the need to write a future paper “adjusting” this data to make the last decade of no temperature trend in the contiguous USA disappear. That would be a travesty.

225 Comments
June 14, 2015 9:51 am

I would love to see a global chart of USCRN & the two satellite datasets for the same period. These three sources give us the clearest views of the natural climate changes through time.

Reply to  Jack H Barnes
June 14, 2015 9:58 am

You can go to woodfortrees and plot the two sat datasets for yourself for the same time period. I don’t think you can single out the US data from the sat data though…

Mike
Reply to  Jack H Barnes
June 14, 2015 11:12 am

That was my first thought, but UAH don’t give a US cut-out of the data. NH TLT is only similar in slope; in detail, not so much, which is probably to be expected.

Reply to  Mike
June 14, 2015 11:26 am

Actually, in version 6 of UAH there are both US48 and US49.

June 14, 2015 9:57 am

Excellent report Anthony.
I wonder if it’s going to result in this data getting adjusted and/or harder to locate? /sarc

Stephen Richards
Reply to  kramer
June 14, 2015 10:07 am

That’s not /sarc. You have every right to be very concerned, going by recent behaviour.

Reply to  Stephen Richards
June 14, 2015 11:19 am

I was going to say the same thing. That record is going to meet with an unfortunate accident some time soon.

Tom J
Reply to  Stephen Richards
June 14, 2015 12:20 pm

I understand that data was involved in a mysterious single-vehicle accident at a sharp turn on a mountain road. Apparently the data and vehicle were largely burned up in the fire resulting from the accident, and the remains are too difficult to recover from the steep embankment. One official who began to investigate the accident resigned under accusations of marital infidelity.

Reply to  Stephen Richards
June 14, 2015 1:28 pm

I know it’s not /sarc. I believe there is a very high probability that the data will either get adjusted or become harder to find. Probably should have used a wink instead of the sarc.

Reply to  Stephen Richards
June 14, 2015 1:32 pm

Hey Michael Palmer, you mean like hard drives crashing à la the IRS Tea Party BS?
Somebody should have a contest on the most probable fate this data is going to meet…

Ted G
Reply to  Stephen Richards
June 14, 2015 2:11 pm

Climate Adjustment Bureau 101 – Winston Smith.
George Orwell had his finger on the pulse in 1949:
“Political language is designed to make lies sound truthful, murder respectable, and to give an appearance of solidity to pure wind.”
If George Orwell were alive, would he be a global warming skeptic?
“There are notions so foolish that only an intellectual and useful tools will believe them,”
Orwell would be sceptical of the non-sceptics and sadly would be all too familiar with some of the current behaviours of Big Government.

ferdberple
Reply to  Stephen Richards
June 14, 2015 3:52 pm

The data is safe. It is stored on Hillary’s private server.

cnxtim
Reply to  kramer
June 14, 2015 12:53 pm

“Pause”? “Hiatus”?
Is it not just as well described as the top of the curve in “Peak Surface Temperature”?
Time for the public-purse trough dwellers to start tracking the downward curve for AGC(ooling)…

Dawtgtomis
Reply to  cnxtim
June 14, 2015 3:00 pm

If you can come up with a believable reason that CO2 can explain the recent cessation and apparent reversal of the previously evident warming, you will also win a Nobel Prize (or maybe they’ll transfer Al’s). They can then continue their conquest of controlling the world’s resources, unfettered by reality.

Ian W
Reply to  kramer
June 14, 2015 2:26 pm

Ask Phil Jones; he managed to lose all the original data from the University of East Anglia Hadley Center.

ferdberple
Reply to  Ian W
June 14, 2015 3:57 pm

from: Phil Jones
subject: Re: For your eyes only
to: “Michael E. Mann”
Mike,
….
If they ever hear there is a Freedom of Information Act now in the UK, I think I’ll delete the file rather than send to anyone.

Patrick
Reply to  Ian W
June 14, 2015 10:22 pm

The excuse was that the raw data was lost in office moves in the mid-1990s. Shonky!

DD More
Reply to  kramer
June 15, 2015 9:50 am

Or just call it something else.
Steven Mosher | June 28, 2014 at 12:16 pm |
“One example of one of the problems can be seen on the BEST site at station 166900–not some poorly sited USHCN station, rather the Amundsen research base at the south pole, where 26 lows were ‘corrected up to regional climatology’ (which could only mean the coastal Antarctic research stations or a model) creating a slight warming trend at the south pole when the actual data shows none, as computed by BEST and posted as part of the station record.”
The lows are not Corrected UP to the regional climatology.
There are two data sets. You are free to use either.
You can use the raw data
You can use the EXPECTED data.

http://judithcurry.com/2014/06/28/skeptical-of-skeptics-is-steve-goddard-right/
See how easy it is.

Reply to  DD More
June 15, 2015 9:54 am

DD More commented

The lows are not Corrected UP to the regional climatology.
There are two data sets. You are free to use either.
You can use the raw data
You can use the EXPECTED data.

Why would even the expected data be corrected up? Doesn’t that all by itself indicate something is wrong?

June 14, 2015 10:02 am

Excellent analysis Anthony. Thank you!

Charlie
June 14, 2015 10:03 am

It would be nice if there were a similarly thorough, global, independent ground data set. I don’t know if such a thing exists. I have heard of independent sea level assessments, but not ground data.

Steve Clauter
June 14, 2015 10:09 am

Thank you for your efforts to share good data and the tools to use it, so we can understand more clearly and then share our understanding with others.

Alan Robertson
June 14, 2015 10:10 am

Please allow me to mimic a response from the Climate Fearosphere: “Oh, but the US isn’t the world”.
Did I get that about right?
/s

Reply to  Alan Robertson
June 14, 2015 10:15 am

That’s what they consistently say about the 1930s.

TYoke
Reply to  Alan Robertson
June 14, 2015 7:22 pm

The Fearosphere do dismiss the USCRN for precisely that reason, right before telling us that Hurricane Sandy or the floods in Texas, or the California drought, or earlier springs in Massachusetts are all due to “Climate Change”.

Cashman
Reply to  Alan Robertson
June 14, 2015 7:25 pm

Easy retort, but the U.S. is part of the world. We don’t call it AGW-the-US.
If the entire globe is warming, that would include the U.S. Or does our military might somehow stop heat?

KTM
Reply to  Alan Robertson
June 15, 2015 10:26 am

Bingo. This is precisely why skeptics must be vigilant in responding to those who use GLOBAL temperature data to assert all kinds of climatic catastrophes onto Americans. Like President Obama, who says that his daughters’ asthma makes him especially concerned about the impact of climate change on American health.
No mention that the USCRN shows clearly that there has been no warming at all in the US over the last 10 years. No mention that he was a two-packs-a-day smoker during his daughter’s young childhood. No, he instead claims that all Americans are at much greater risk of asthma because you use electricity or drive a car.
Every time an American politician or media pundit talks about the growing impacts of Climate Change on America or Americans, skeptics should loudly denounce them with the unassailable fact that America IS NOT WARMING according to the pristine USCRN data.

old construction worker
June 14, 2015 10:10 am

Maybe more data points would help?
The World in 2100, According to NASA’s New Big Dataset
‘The predictions shown in this daily max temperature map come from a new NASA dataset released to the public on June 9th, one that collates historical records and climate models to produce high-resolution forecasts for the end of the century.’
“NASA is in the business of taking what we’ve learned about our planet from space and creating new products that help us all safeguard our future,” said Ellen Stofan, NASA chief scientist. “With this new global dataset, people around the world have a valuable new tool to use in planning how to cope with a warming planet.”
According to NASA:
This NASA dataset integrates actual measurements from around the world with data from climate simulations created by the international Fifth Coupled Model Intercomparison Project. These climate simulations used the best physical models of the climate system available to provide forecasts of what the global climate might look like under two different greenhouse gas emissions scenarios: a “business as usual” scenario based on current trends and an “extreme case” with a significant increase in emissions.
The NASA climate projections provide a detailed view of future temperature and precipitation patterns around the world at a 15.5 mile (25 kilometer) resolution, covering the time period from 1950 to 2100. The 11-terabyte dataset provides daily estimates of maximum and minimum temperatures and precipitation over the entire globe.
http://gizmodo.com/the-world-in-2100-according-to-nasas-new-big-dataset-1710798646

zemlik
Reply to  old construction worker
June 14, 2015 10:56 am

I read that as “international Fifth Column Model Intercomparison Project.”

Dawtgtomis
Reply to  old construction worker
June 14, 2015 3:37 pm

That might be fun for comparing against the Farmer’s Almanac, O.C.W.!
I’ll still bet on the almanac, myself…

Jeff Alberts
Reply to  old construction worker
June 14, 2015 7:33 pm

More data points of intensive properties don’t make a “global average” any more meaningful.

steverichards1984
June 14, 2015 10:11 am

So Karl, who just published a paper showing a rising temperature, or lack of a pause, is also jointly responsible for the most accurate terrestrial US temperature measurement system, which is displaying a gentle fall in temperature.
Oh dear, who or what to believe!

Reply to  steverichards1984
June 14, 2015 12:09 pm

The questions have been raised over how Karl’s 2015 paper treated buoys and ship data.
There are remarkably few buoys and ships sailing across the continent of North America.
They lack wheels.

James stockton
June 14, 2015 10:13 am

Where did the y=0.6186x -0.002678 come from? I got y=-0.0024x +0.6042.
The plot shows a negative slope on the trendline, yet the equation shows positive.

herkimer
Reply to  Anthony Watts
June 15, 2015 8:16 am

NOAA’s own interactive CLIMATE AT A GLANCE web page calculates the trend of the CONTIGUOUS US ANNUAL TEMPERATURE ANOMALY between 2005 and 2015 to be -0.69 F/decade, using the base period of 2005-2015. The similar figure for 1998-2015 is -0.48 F/decade.

Piotr
Reply to  James stockton
June 14, 2015 10:35 am

Dear James, you misread the equation on the plot, it is y=0.6186-0.002678x.

Reply to  James stockton
June 14, 2015 10:59 am

I get the same exact answer that Anthony has. I think the confusion stems from Anthony’s choice of units on the x-axis. Note that his x-axis is in units of months relative to the first data point.
So if you want the trend in degC/year, multiply by 12/1.8 to get -0.0179 degC/yr,
which is also -1.78 degC/century.
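To spell out the conversion described above, a short sketch (the slope value is simply read off the order-1 fit in the plot; it is not an independent calculation):

# Convert the fitted slope from degF per month to degC per year and per century.
slope_F_per_month = -0.002678            # from the order-1 fit in the plot above
slope_C_per_year = slope_F_per_month * 12 / 1.8
print(round(slope_C_per_year, 4))        # about -0.0179 degC/yr
print(round(slope_C_per_year * 100, 2))  # roughly -1.8 degC/century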

James stockton
Reply to  wxobserver
June 14, 2015 11:13 am

I usually present it in the form y=mx+b.
The graph is in the form b+mx. I added a column and used months = 0–124. The data was in degrees F.

Mike
Reply to  James stockton
June 14, 2015 11:09 am

“yet the equation shows positive”
You misread it, as I did initially; it’s not too clear. You are expecting the conventional y=m*x+c, and it’s printed as y=c+m*x.
You both have a negative slope, with a small difference in fitted values.

AndyG55
Reply to  James stockton
June 14, 2015 3:29 pm

I get a trend of -0.032ºF/year… or since our data is now a decade long, -0.32ºF/decade
If you go to http://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/time-series?datasets%5B%5D=uscrn&parameter=anom-tavg&time_scale=p12&begyear=2005&endyear=2015&month=6
You can also turn on USHCN or ClimDiv. These are very obviously being adjusted to match USCRN.
I don’t know if the USHCN data on that page is what gets fed into GISS or HadCRUT, but from the manic adjustment happening to data elsewhere in the world, it could be… they have to create their warming from somewhere!!!

ferdberple
Reply to  AndyG55
June 14, 2015 4:07 pm

-0.32ºF/decade
=========
Yeow, that is – 3.2F a century for the continental US. That is -1.8 C. Dangerously close to the 2 C that the IPCC says will wipe out all life on earth. How will the US survive the next century as temperatures dip ever closer to the dreaded Hansen tipping point?

June 14, 2015 10:13 am

My Excel Spreadsheet says the trend since January 2005 is -.00036°F per year.

Reply to  Anthony Watts
June 14, 2015 12:13 pm

=SLOPE(B3:B$127,A3:A$127)
Copied all the way back UP from
=SLOPE(B126:B$127,A126:A$127)
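For anyone replicating the spreadsheet approach above outside Excel, here is a rough Python equivalent: the slope of the least-squares line from each successive start month through the last month. The array names are hypothetical, standing in for columns A and B of the sheet.

import numpy as np

def slopes_from_each_start(months, anoms):
    # Mirrors =SLOPE(B3:B$127, A3:A$127) copied down the sheet: the trend
    # from every start index through the final data point.
    return [np.polyfit(months[i:], anoms[i:], 1)[0]
            for i in range(len(months) - 1)]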

AndyG55
Reply to  Steve Case
June 14, 2015 3:30 pm

one too many placement zeros, Steve.

Old'un
June 14, 2015 10:14 am

I wonder what the total cost to U.S. taxpayers is of installing and maintaining this network, the output of which is seemingly ignored by its ‘owners’ because it doesn’t support the party line?
Great job Anthony. I hope that your charts are widely published.

June 14, 2015 10:17 am

I would ask Karl if he would make those same adjustments if they erased increasing temperatures when the raw data actually showed increasing temperatures.

indefatigablefrog
June 14, 2015 10:18 am

Can anyone explain to me why the astonishing and worrying rise in global surface temps has seemingly had absolutely zero effect on global precipitation averages?
At least, an internet search on this topic returns various graphs from various sources, some dating back as far as the start of the last century, but all unified by their distinctly flat and unchanging trends.
Should not the dramatic warming of planet earth have driven some shift in the overall rate at which water is evaporated from and then showered back onto the surface?
Or is that just some silly idea that I got from Blade Runner?
Here is a typical example of a flat precipitation trend. How is this sort of graph explained away in light of the recent much trumpeted warming of planet earth?
Is this not a puzzle?
http://www.aqua.nasa.gov/meso/index.php?section=19

Billy Liar
Reply to  indefatigablefrog
June 14, 2015 3:58 pm

Yes, it’s a puzzle but since it doesn’t fit the narrative it will be studiously ignored. Sorry about that. Normal programming will resume forthwith …
PS from your link. Since the global precipitation doesn’t change, the pink area in the last image must balance the bluer areas.

Reply to  indefatigablefrog
June 15, 2015 12:56 am

If the air is warmer, does it not “hold” more? Therefore extra evaporation would increase humidity, but not necessarily rainfall.
Until, of course, the more humid air hits a cold spot.

AndyE
June 14, 2015 10:22 am

You wait, they will accuse you of cherry-picking those 10 years of results, Anthony!!!!

JohnWho
Reply to  Anthony Watts
June 14, 2015 11:29 am

Made me laugh – the “at least not honestly” part.
As if that would stop them.
LOL

Eugene WR Gallun
Reply to  AndyE
June 14, 2015 12:35 pm

AndyE — Found myself laughing out loud — Eugene WR Gallun

Latitude
June 14, 2015 10:26 am

“And some important information is missing that should be plainly listed. NCDC is doing an anomaly calculation on USCRN data, but as we know, there is only 9 years and 4 months of data. So, what period are they using for their baseline data to calculate the anomaly?”
http://wattsupwiththat.com/2014/06/07/noaa-shows-the-pause-in-the-u-s-surface-temperature-record-over-nearly-a-decade/

Latitude
Reply to  Latitude
June 14, 2015 10:58 am

Anthony, did you ever figure this out??

AndyG55
Reply to  Latitude
June 14, 2015 3:33 pm

If you go here:
http://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/time-series?datasets%5B%5D=uscrn&parameter=anom-tavg&time_scale=p12&begyear=2005&endyear=2015&month=6
and turn on the USHCN, you can see that they are using the USHCN anomaly as a starting point, then adjusting USHCN to match USCRN.

Reply to  AndyG55
June 15, 2015 6:48 am

So, what period are they using for their baseline data to calculate the anomaly?

http://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/background:

The 30 years from 1981 through 2010 provide the basis for the normal period for each month and network. Data exist for nClimDiv from 1895 to present, so a normal is simply the 30-year average of the gridded data. USCRN observations since commissioning to the present (4-9 years) were used to find relationships to nearby COOP stations and estimate normals at the USCRN sites using the normals at the surrounding COOP sites derived from full 1981-2010 records (Sun and Peterson, 2005). The normal values for each month are then subtracted from each monthly temperature average to create anomalies or departures from normal that can be compared between networks over time. In the final step, the station data are interpolated to 0.25° latitude by 0.25° longitude grids over the lower 48 states and then combined in an area weighted average for the whole U.S.
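In code terms, the anomaly step NOAA describes above boils down to subtracting a station’s 1981-2010 monthly normal from each monthly mean before gridding. A hedged sketch of just that subtraction (the normal values below are placeholder numbers, not NOAA’s estimates):

# Sketch of the anomaly step: monthly mean minus the 1981-2010 monthly
# normal. The normals here are hypothetical placeholder values.
monthly_normals_F = {1: 30.1, 2: 33.8, 3: 42.0}   # keyed by month number

def anomaly_F(yyyymm, monthly_mean_F):
    month = int(str(yyyymm)[4:6])
    return monthly_mean_F - monthly_normals_F[month]

print(round(anomaly_F(200501, 31.85), 2))   # degF above the (hypothetical) January normal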

J
June 14, 2015 10:36 am

Archive that data, and make sure there are no changes to past data!
This trend should be publicized far and wide.
Yeah, the USA is only 4% of the world, but it is suspicious that all the warming (in the adjusted data sets) is alleged to be in far-off, hard-to-get-to places (like the Arctic).

AndyG55
Reply to  J
June 14, 2015 5:05 pm

I have the USCRN data in a spreadsheet; I check for changes each time they add a new month.
It’s a matter of trust 😉

Jason Calley
Reply to  J
June 15, 2015 8:10 am

Hey J, one of the wonderfully elegant things about the “excess heat is hiding deep in the ocean” lie is that it was impossible for an average person to go check.
Of course, any parent has seen this behavior with very small children.
“Mommy! there is a monster hiding under my bed!”
“No, there is no monster. I just looked. No monster.”
“There is too a monster! It is an invisible monster!”

Mike
June 14, 2015 10:37 am

Great post, such an obvious thing to check. It’s only US land temps, but it tells the story.
Now look at the data they provide:
200501,1.75
200502,2.50
200503,-0.88
200504,0.41
etc.
With a billion-dollar budget, that’s the best they can do? I now have to get out my calculator if I want to know what this is in degrees C. What a web site; they couldn’t manage a button for centigrade? And look at the dates: 200502. WTF, so now I have to mess around splitting this into months and years. Cool.
I suppose three simple columns of data would have been too obvious.
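For what it’s worth, splitting the YYYYMM field and converting the anomalies to °C takes only a few lines; a sketch assuming the two-column format shown above (file name hypothetical). Anomalies, being differences, convert with the 5/9 scale factor only, with no 32 °F offset:

import pandas as pd

# Parse the (YYYYMM, anomaly in degF) format shown above.
df = pd.read_csv("USCRN-CONUS-time-series.csv", header=None,
                 names=["yyyymm", "anom_F"])
df["year"] = df["yyyymm"] // 100
df["month"] = df["yyyymm"] % 100
df["anom_C"] = df["anom_F"] / 1.8     # 5/9 scaling; no offset for anomalies
print(df.head())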

Editor
Reply to  Mike
June 14, 2015 12:13 pm

Perhaps you could find data more to your liking at http://www.ncdc.noaa.gov/crn/qcdatasets.html

Quality Controlled Datasets
Selected subsets of monthly, daily, hourly and sub-hourly (5-minute) USCRN/USRCRN data are available as text files for easy access by users ranging from the general public to science experts. The most useful variables, including air temperature, precipitation, solar radiation, surface temperature, soil moisture and soil temperature data, are available. Files contain raw observations with quality flags, as well as calculated variables. Calculated data are shown only if a sufficient amount of source data passes its quality control tests; otherwise these values are set to a missing value.
For the daily, hourly and sub-hourly data, users can retrieve the entire archive of data by downloading the most recent file in the snapshots subdirectory. The updates subdirectory in the daily and hourly directories contain a record of the real-time daily/hourly data transmitted over NOAAPORT. Detailed descriptions of the files and variables are available in the documentation links listed below.

The sub-hourly data is every 5 minutes. See ftp://ftp.ncdc.noaa.gov/pub/data/uscrn/products/subhourly01/README.txt
Remember, this is the NCDC. They don’t seem to have a goal of making data easy to find, though in this case it’s actually pretty easy starting from their home page.
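If you want to work with the station-level files described above, a rough sketch of reading one quality-controlled daily file follows. The file name is a placeholder, and the field layout is documented in the README linked above, so treat the column indices here as illustrative only:

# Rough sketch: read one downloaded quality-controlled daily USCRN file.
# Fields are whitespace-delimited; consult the README above for the actual
# column meanings before relying on any index.
path = "CRND0103-2014-NH_Durham_2_SSW.txt"   # hypothetical local file

with open(path) as f:
    for line in f:
        fields = line.split()
        print(fields[:3])   # e.g. leading identifier/date columns, per the README
        break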

bw
June 14, 2015 10:44 am

The USCRN was created to provide “good” surface data for the USA. In this case, good means that the entire system was designed to follow methodical scientific standards of measurement.
The reason for the creation of the USCRN was the total lack of such good standards in the maintenance of most “weather” stations, which were intended for local weather forecasting and aircraft safety.
That means that temperature data prior to the USCRN can NOT be trusted, although some individual stations may be good for some periods of time.
BTW, some USCRN stations came online in 2002, with stations added every year since. The website provides the basic data organized by individual station. Looking at the individual stations that started in 2002, it is clear that there is NO trend in the temp data. No warming, no cooling anywhere. Some stations do show that 2012 was clearly warmer than any other year by a few degrees, with 2013 returning to “normal”.

Evan Jones
Editor
Reply to  bw
June 14, 2015 7:41 pm

What makes you say the USHCN is “good”? Microsite is bad.
CRN, OTOH, is a thing of beauty. Triple-redundant sensors and the dozen or so I looked at were so Class 1 it hurt.

June 14, 2015 10:46 am

Anthony, when are you going to set up a mirror to archive all this data before it gets adjusted?
I have some fairly technical suggestions about how to preserve the authenticity of the data (such as providing and independently archiving secure hash checksums). Email me for details if you want, but here are some ideas:
Here’s a summary paper on the topic:
https://www.utica.edu/academic/institutes/ecii/publications/articles/9C4EBC25-B4A3-6584-C38C511467A6B862.pdf
In the old world of “first to invent” patent rights, we used a service that would securely checksum the evidence and publish the checksum in a reputable place, thereby archiving the time and place of invention with a mere 256 bits of data.
IMHO these techniques should be required for your open atmosphere society publishing guidelines.
Peter
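A minimal sketch of the checksumming idea Peter describes, using only the Python standard library (the file name is a placeholder; publishing or independently timestamping the digest is a separate step):

import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    # Return the SHA-256 hex digest of a file, read in chunks so large
    # archives don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Record this digest alongside the archived dataset; any later change to
# the file changes the digest.
print(sha256_of_file("USCRN-CONUS-time-series.csv"))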

Reply to  Peter Sable
June 14, 2015 10:48 am

Timestamping service: https://www.digistamp.com/
(I have no affiliation)

Editor
Reply to  Peter Sable
June 14, 2015 11:37 am

Sounds to me as though you just volunteered. 🙂

Reply to  Ric Werme
June 14, 2015 2:29 pm

I can help with the design, but infrastructure requires money and organization…

June 14, 2015 10:46 am

The cool thing is that the Gold standard MATCHES the “bad data” from nearby sites!!!
DOH!
further:
1. The US is not the world
2. The Land is not the ocean.
[3. and Mosher’s drive by opinions aren’t the final word on anything -Anthony]

Gary Hladik
Reply to  Steven Mosher
June 14, 2015 11:30 am

“2. The Land is not the ocean”
Aren’t land temperatures ultimately driven by the oceans?

JohnWho
Reply to  Steven Mosher
June 14, 2015 11:31 am

Uh, nearby properly sited stations or nearby stations that are not properly sited?

Reply to  JohnWho
June 14, 2015 1:04 pm

Read Renne’s comment on Venema’s blog. The USHCN comparators are all within 500 meters of a pristine rural CRN station. That they are similar is unsurprising, and says nothing about the over 1000 other, less well sited USHCN stations. surface.org showed that most of those are problematic.

Gonzo
Reply to  Steven Mosher
June 14, 2015 11:37 am

[The US is not the world] Talk about beating a dead horse. Neither is one tree in Yamal, or an ice core. On the other hand, the US has the best network of weather/climate data stations in the world, and therefore a very good proxy for, wait for it……….. the world! You like proxies, right?

Grant
Reply to  Steven Mosher
June 14, 2015 11:52 am

If all the heating or cooling takes place in the oceans, then there would not be any long term heating or cooling over land.

Harold
Reply to  Steven Mosher
June 14, 2015 11:54 am

If you can’t capitalize properly in your drive-bys, maybe you shouldn’t be texting while you drive-by. That’s dangerous.

rokshox
Reply to  Steven Mosher
June 14, 2015 12:07 pm

What do you mean by “bad data”? The unadjusted data?

Jquip
Reply to  rokshox
June 15, 2015 12:24 am

“I am adjusting the data. Pray I don’t adjust it further” — Darth Vader

rw
Reply to  Steven Mosher
June 14, 2015 12:30 pm

Of all the things to take a shot at: an analysis that is about as close to reciting the multiplication tables as you can get with empirical evidence. This rejoinder is more Islamist fire-away-and-if-Allah-wills-it-will-hit-the-target than it is American Sniper.

u.k.(us)
Reply to  Steven Mosher
June 14, 2015 12:49 pm

I love it when Mosher deigns to punctuate his sentences.

Pamela Gray
Reply to  Steven Mosher
June 14, 2015 1:05 pm

1. + 2. The US sits next to the largest ocean body and is directly affected by it. It is nicely populated close to the average for any one regional area on Earth (within the range of not populated all the way to heavily populated) and burns lots of fossil fuels. Meaning that if human-sourced CO2 were causing either the air over land to heat, or the oceans to heat beyond natural variation, you would think you would notice it as a positive trend on these surface temperature stations. I think Mosher understands this very well and I consider his above argument to be quite hollow, not to mention his disregard for well maintained research plots. I would also be willing to bet that he believes human heat to be hiding in the deep oceans and actively searches for papers that purport to have found exactly how that heat transfers magically from the air to the deep without leaving a trail.
(note: to head off a counter-argument, to say that this dangerous heat is absorbed into a much bigger space and would thus be impossible to detect as it transfers to the deep, also means that even if humans were causing this heat, it is essentially harmless since it fails to rise above natural variation.)

carbon bigfoot
Reply to  Pamela Gray
June 14, 2015 5:31 pm

Pamela your Intelligence is soo sexy.

Reply to  Steven Mosher
June 14, 2015 1:15 pm

The really cool thing about Mr Karl’s dubious data adjustments to erase a pause is that the resulting trends are still around 1.1 deg C/century. And that is still a GCM fail.

ferdberple
Reply to  Steven Mosher
June 14, 2015 4:15 pm

1. The US is not the world
===============
so if the rest of the world is boiling and the US is freezing, the duty of the US president is to:
1. make the US warmer
2. make the US colder
3. make the world warmer
4. make the world colder.
I expect there is a very large disconnect between the current administration and the people on the correct answer to this question.

ferdberple
Reply to  ferdberple
June 14, 2015 4:17 pm

What actually happens:
5. have NOAA adjust the records, warming the US and cooling the rest of the planet.

Reply to  ferdberple
June 14, 2015 5:05 pm

The current prez. would surely choose #3.

Reply to  Steven Mosher
June 15, 2015 4:06 am

It seems such a waste to maintain so many expensive sites when NOAA could rely on 3 or 4, spaced with the standard 1200 km radius, and just infill the rest. /sarc off

Jolan
June 14, 2015 10:51 am

Anybody happen to know where the 114 stations are situated? And what is the significance of the ‘high’ in March 2012?

Latitude
Reply to  Jolan
June 14, 2015 11:48 am

comment image

Jolan
Reply to  Latitude
June 14, 2015 11:55 am

Thanks for that. How about the ‘high’ in March 2012? Surely it’s the wrong time of the year, or am I just thick?

Editor
Reply to  Jolan
June 14, 2015 12:24 pm

The graph with the high in March is of temperature anomalies – that is, how much warmer the temperature is relative to the long-term average. Negative amounts reflect times when the country’s temperature was colder than average.
A graph of the average US temperature would have a much wider swing, so anomaly plots are much more useful than actual temperature. (Or graphs that start with Y = 0K by people who claim that better reflects reality.)

Reply to  Jolan
June 14, 2015 1:16 pm

Jolan,
they are anomalies. Not actual temps.

Jolan
Reply to  Joel O’Bryan
June 14, 2015 2:28 pm

Thanks to all you guys who answered my dumb question. I should have realised, just did not look hard enough. Interestingly, ‘Latitude’ has supplied a ‘wiki’ lead which shows a widespread heat wave at the time, which I was not aware of.

Reply to  Joel O’Bryan
June 14, 2015 8:23 pm

My high school and college aged kids were/are not allowed to use wiki as a citation/reference in any research paper. I have personally found it to be badly inaccurate in some subject areas where I believe I am an expert, and especially if the subject has a political aspect. You might want to reconsider this source, and avoid it. There seem to be too many biased liberal wiki editors who apparently use their parents as their source of income and their parents’ basement as their abode (i.e. way more time to keep the message matching their world view than everyone else has to try to correct the record).

Chuck Wiese
June 14, 2015 11:00 am

Anthony: Good work here. The fact that this lack of warming exists over a land mass as large as the whole USA presents a severe problem to NOAA, given the contradiction of getting a warming result when including the oceans. The physics of that doesn’t make sense. The specific heat of the continent is LOWER than that of the oceans, so if CO2 warming were real, the anomaly should emerge in this data set FIRST.
The fact that it doesn’t, and that the satellite record conforms to the land USA’s zero trend and slight cooling, indicates a flawed and suspect manipulation in the NOAA treatment. If these guys were doing science, they should have realized this as soon as they obtained the result that they did.
Taxpayers have a right to expect that the billions spent on NOAA per year go toward the promotion and preservation of the true ideas of atmospheric science and meteorology, not this asinine power grab that appears to have happened through the Obama administration, which is politicizing every agency it can get its hands on and furthering the gross fraud that is CO2 AGW by abusing these agencies and soliciting their agents to help.
Chuck Wiese
Meteorologist

JohnWho
Reply to  Chuck Wiese
June 14, 2015 11:40 am

Chuck Wiese –
you bring up a good point.
Wonder how the satellite records of just the US match with the 10 years of USCRN?

Editor
Reply to  Chuck Wiese
June 14, 2015 11:41 am

Anthony: Good work here. The fact that this lack of warming exists over a large land continent like the whole USA

Well, Canada is larger than the US. Central America is smaller.
I wouldn’t call the whole USA a continent….

Reply to  Chuck Wiese
June 14, 2015 8:42 pm

If you compare the average of today’s warming to tonight’s cooling for all NCDC GSOD listed stations since 1940 that have more than 360 samples per year (a total of about 69 million daily samples), 50 of the last 74 years are slightly negative, 30 of the last 34 years are negative, and the overall average is slightly negative.
Nightly cooling exceeds daytime warming.

JohnWho
June 14, 2015 11:33 am

“So, since this state of the art network requires no adjustment, we can most surely trust the data presented by it. Right?”
[CAGW]Only if it matches our preconceived notions.[/CAGW]
/grin

JohnWho
June 14, 2015 11:38 am

Seriously, there was a time when NOAA had some form of a disclaimer saying something along the lines of “our data isn’t reliable, so we have formed the U.S. Climate Reference Network (USCRN).” Overall, it was an admission that the long-term data collected was very much suspect.
Therefore, it is reasonable for one to wonder why the USCRN is not now the basis for NOAA’s monthly reports.

Editor
Reply to  JohnWho
June 14, 2015 11:55 am

Part of the reason, I’m sure, is that the CRN doesn’t have a full three-decade record to fit the WMO range used to determine a normal climate.
The three-decade convention has a questionable heritage. From these two snippets, you can deduce the WMO means any 30-year period, three full decades (I think Roy Spencer is using that), or non-overlapping 30-year periods (i.e. 1961-1990 will be followed by 1991-2020).
I’ve also seen claims that 30 years is used because that’s what conveniently fit in the ledger books used in the pre-computer era.
Of course, Karl & co. are quite happy to ignore the CRN data, as its short length encourages looking at other surface records, which show the peak temperature was around 2005/2006 and that it’s been (slowly) downhill since then.
From http://www.wmo.int/pages/prog/wcp/ccl/faqs.php :

What is Climate?
Climate, sometimes understood as the “average weather,” is defined as the measurement of the mean and variability of relevant quantities of certain variables (such as temperature, precipitation or wind) over a period of time, ranging from months to thousands or millions of years.
The classical period is 30 years, as defined by the World Meteorological Organization (WMO). Climate in a wider sense is the state, including a statistical description, of the climate system.

From http://www.wmo.int/pages/themes/climate/climate_data_and_products.php :

Climate Normals
Climate “normals” are reference points used by climatologists to compare current climatological trends to that of the past or what is considered “normal”. A Normal is defined as the arithmetic average of a climate element (e.g. temperature) over a 30-year period. A 30 year period is used, as it is long enough to filter out any interannual variation or anomalies, but also short enough to be able to show longer climatic trends. The current climate normal period is calculated from 1 January 1961 to 31 December 1990.

Reply to  Ric Werme
June 14, 2015 2:34 pm

The three-decade convention has a questionable heritage.
If there are 60-year climate cycles (e.g. the PDO), then by Nyquist it requires 120 years of observation.
30 years is definitely bad, as that is 1/2 a cycle, so you’ll be optimistic for one 30-year period and pessimistic for another 30-year period… I’d rather go with 60 years, as long as you know the exact 1-2 years of the start or end of the multi-decadal cycles.

Rob
June 14, 2015 11:45 am

Finally. 10 years of “good” data.
