As readers know, the recent paper Karl et al. 2015, written by the head of the National Climatic Data Center (now NCEI), went to great lengths to try to erase “the pause” from the surface temperature record using a series of adjustments. Those adjustments have been criticized as unacceptable by some climate scientists, such as Dr. Richard Lindzen, Dr. Chip Knappenberger, and Dr. Pat Michaels, who recently wrote:
In addition, the authors’ treatment of buoy sea-surface temperature (SST) data was guaranteed to create a warming trend. The data were adjusted upward by 0.12°C to make them “homogeneous” with the longer-running temperature records taken from engine intake channels in marine vessels.
As has been acknowledged by numerous scientists, the engine intake data are clearly contaminated by heat conduction from the structure, and as such, never intended for scientific use. On the other hand, environmental monitoring is the specific purpose of the buoys. Adjusting good data upward to match bad data seems questionable, and the fact that the buoy network becomes increasingly dense in the last two decades means that this adjustment must put a warming trend in the data.
Dr. Judith Curry added:
My bottom line assessment is this. I think that uncertainties in global surface temperature anomalies is substantially understated. The surface temperature data sets that I have confidence in are the UK group and also Berkeley Earth. This short paper in Science is not adequate to explain and explore the very large changes that have been made to the NOAA data set. The global surface temperature datasets are clearly a moving target. So while I’m sure this latest analysis from NOAA will be regarded as politically useful for the Obama administration, I don’t regard it as a particularly useful contribution to our scientific understanding of what is going on.
Large adjustments accounted for the change, but one really should go back to the definition of “adjustments” to understand the true meaning and effect:
But, what if there were a dataset of temperature that was so well done, so scientifically accurate, and so completely free of bias that by its design, there would never be any need nor justification for any adjustments to the data?
Such a temperature record exists: it is called the U.S. Climate Reference Network (USCRN), and it is also operated by NOAA/NCDC’s (NCEI) head administrator, Tom Karl:
These stations were designed with climate science in mind. Three independent measurements of temperature and precipitation are made at each station, insuring continuity of record and maintenance of well-calibrated and highly accurate observations. The stations are placed in pristine environments expected to be free of development for many decades. Stations are monitored and maintained to high standards and are calibrated on an annual basis. In addition to temperature and precipitation, these stations also measure solar radiation, surface skin temperature, and surface winds. They also include triplicate measurements of soil moisture and soil temperature at five depths, as well as atmospheric relative humidity for most of the 114 contiguous U.S. stations. Stations in Alaska and Hawaii provide network experience and observations in polar and tropical regions. Deployment of a complete 29-station USCRN network in Alaska began in 2009. This project is managed by NOAA’s National Climatic Data Center and operated in partnership with NOAA’s Atmospheric Turbulence and Diffusion Division.
Yes, the USCRN is state of the art, and it was signed off on by Tom Karl here:
Source: https://ams.confex.com/ams/pdfpapers/71748.pdf
So, since this state of the art network requires no adjustment, we can most surely trust the data presented by it. Right?
While we seldom if ever see the USCRN mentioned in NOAA’s monthly and annual “State of the Climate” reports to the U.S. public, the data can be accessed and plotted from a page buried in the depths of the NCDC website. We now have 10 years, a full decade, of good data from this network, and we are able to plot it. I’ve done so here, using a tool provided for that very purpose by NOAA/NCDC/NCEI:
Note the NOAA watermark in the plot above.
Source: http://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/time-series?datasets[]=uscrn&parameter=anom-tavg&time_scale=p12&begyear=2005&endyear=2015&month=12
NOAA helpfully provides that data in a comma separated values (CSV) file, which I have converted into Excel: USCRN-CONUS-time-series
Plotting that USCRN data reproduces the NOAA/NCDC/NCEI plot above, but also allows plotting the trend. I’ve done so in the DPlot program, using the actual data NOAA/NCDC/NCEI provided at the source link above:

Clearly, a “pause” or “hiatus” exists in this most pristine climate data. In fact, a very slight cooling trend appears. But don’t take my word for it, you can replicate the plot above yourself using the links, free trial program, and USCRN data I provided from NOAA/NCDC/NCEI.
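For readers who prefer a script to DPlot, the same trend fit can be sketched in a few lines of Python. The filename is hypothetical, and the two-column “YYYYMM,anomaly in °F” layout is an assumption based on the CSV described below, not NOAA’s documented schema:

```python
# Sketch: fit a least-squares linear trend to the USCRN CONUS monthly
# anomaly series, assuming rows like "200501,1.75" (stamp, deg F anomaly).
import numpy as np

def monthly_trend(path):
    """Return (slope, intercept) of the anomaly series in deg F per month."""
    months, anoms = [], []
    with open(path) as f:
        for line in f:
            if not line.strip():
                continue
            stamp, value = line.strip().split(",")
            months.append(len(months))   # months elapsed since first record
            anoms.append(float(value))
    slope, intercept = np.polyfit(months, anoms, 1)
    return slope, intercept

# A small negative slope (times 12 for deg F/year) would reproduce the
# slight cooling trend described above.
```

Multiply the returned slope by 12 to express it per year, or by 120 to express it per decade.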
Let’s hope that Mr. Karl doesn’t see the need to write a future paper “adjusting” this data to make the last decade of no temperature trend in the contiguous USA disappear. That would be a travesty.
I would love to see a global chart of USCRN & the two Sat data for the same period. These three sources give us the clearest views of the natural climate changes through time.
You can go to woodfortrees and plot the two sat datasets for yourself for the same time period. I don’t think you can single out the US data from the sat data though…
That was my first thought, but UAH don’t give a US cut-out of the data. NH TLT is only similar in slope; detail, not so much, which is probably to be expected.
Actually, in version 6 of UAH there is both US48 and US49.
Excellent report Anthony.
I wonder if it’s going to result in this data getting adjusted and/or harder to locate? /sarc
That’s not /sarc. You have every right to be very concerned going by recent behaviour
I was going to say the same thing. That record is going to meet with an unfortunate accident some time soon.
I understand that data was involved in a mysterious single vehicle accident at a sharp turn on a mountain road. Apparently the data and vehicle were largely burned up in the fire resulting from the accident and the remains are too difficult to recover from the steep embankment. One official who began to investigate the accident resigned under accusations of marital infidelity.
I know it’s not /sarc. I believe there is a very high probability that the data will either get adjusted or become harder to find. Probably should have used a wink instead of the sarc.
Hey Michael Palmer, you mean like hard drives crashing ala the IRS Tea Party BS?
Somebody should have a contest on what is the most probable fate this data is going to meet…
Climate Adjustment Bureau – 101 – Winston Smith.
George Orwell – had his finger on the pulse in 1949:
Political language is designed to make lies sound truthful, murder respectable, and to give an appearance of solidity to pure wind.
If George Orwell were alive, would he be a global warming skeptic?
“There are notions so foolish that only an intellectual and useful tools will believe them,”
Orwell would be sceptical of the non-sceptics and sadly would be all too familiar with some of the current behaviours of Big Government.
The data is safe. It is stored on Hillary’s private server.
“Pause”? “Hiatus”?
Is it not just as well described, at the top of the curve, as “Peak Surface Temperature”?
Time for the public purse trough dwellers to start tracking the downward curve for AGC(ooling)…
If you can come up with a believable reason that CO2 can explain the recent cessation and apparent reversal of the previously evident warming, you will also win a Nobel Prize (or maybe they’ll transfer Al’s). They can then continue their conquest of controlling the world’s resources, unfettered by reality.
Ask Phil Jones; he managed to lose all the original data from the Climatic Research Unit at the University of East Anglia.
from: Phil Jones
subject: Re: For your eyes only
to: “Michael E. Mann”
Mike,
….
If they ever hear there is a Freedom of Information Act now in the UK, I think I’ll delete the file rather than send to anyone.
The excuse was that the raw data was lost in office moves in the mid 1990’s. Shonky!
Or just call it something else.
Steven Mosher | June 28, 2014 at 12:16 pm |
“One example of one of the problems can be seen on the BEST site at station 166900 – not some poorly sited USHCN station, rather the Amundsen research base at the south pole, where 26 lows were ‘corrected up to regional climatology’ (which could only mean the coastal Antarctic research stations or a model), creating a slight warming trend at the south pole when the actual data shows none – as computed by BEST and posted as part of the station record.”
The lows are not Corrected UP to the regional climatology.
There are two data sets. You are free to use either.
You can use the raw data
You can use the EXPECTED data.
http://judithcurry.com/2014/06/28/skeptical-of-skeptics-is-steve-goddard-right/
See how easy it is.
DD More commented
Why would even the expected data be corrected up? Doesn’t that all by itself indicate something is wrong?
Excellent analysis Anthony. Thank you!
It would be nice if there were a thorough, global, independent ground data set. I don’t know if such a thing exists. I have heard of independent sea level assessments but not ground data.
Thank you for your efforts to share good data and the tools to use so we can understand more clearly. Then share our understanding with others.
Please allow me to mimic a response from the Climate Fearosphere: “Oh, but the US isn’t the world”.
Did I get that about right?
/s
That’s what they consistently say about the 1930s.
The Fearosphere do dismiss the USCRN for precisely that reason, right before telling us that Hurricane Sandy or the floods in Texas, or the California drought, or earlier springs in Massachusetts are all due to “Climate Change”.
Easy retort, but the U.S. is part of the world. We don’t call it AGW-the-US.
If the entire globe is warming, that would include the U.S. Or does our military might somehow stop heat?
Bingo. This is precisely why skeptics must be vigilant in responding to those who use GLOBAL temperature data to assert all kinds of climatic catastrophes onto Americans. Like President Obama, who says that his daughters’ asthma makes him especially concerned about the impact of climate change on American health.
No mention that the USCRN shows clearly that there has been no warming at all in the US over the last 10 years. No mention that he was a two-packs-a-day smoker during his daughter’s young childhood. No, he instead claims that all Americans are at much greater risk of asthma because you use electricity or drive a car.
Every time an American politician or media pundit talks about the growing impacts of Climate Change on America or Americans, skeptics should loudly denounce them with the unassailable fact that America IS NOT WARMING according to the pristine USCRN data.
Maybe more data points would help?
The World in 2100, According to NASA’s New Big Dataset
‘The predictions shown in this daily max temperature map come from a new NASA dataset released to the public on June 9th, one that collates historical records and climate models to produce high-resolution forecasts for the end of the century.’
“NASA is in the business of taking what we’ve learned about our planet from space and creating new products that help us all safeguard our future,” said Ellen Stofan, NASA chief scientist. “With this new global dataset, people around the world have a valuable new tool to use in planning how to cope with a warming planet.”
According to NASA:
This NASA dataset integrates actual measurements from around the world with data from climate simulations created by the international Fifth Coupled Model Intercomparison Project. These climate simulations used the best physical models of the climate system available to provide forecasts of what the global climate might look like under two different greenhouse gas emissions scenarios: a “business as usual” scenario based on current trends and an “extreme case” with a significant increase in emissions.
The NASA climate projections provide a detailed view of future temperature and precipitation patterns around the world at a 15.5 mile (25 kilometer) resolution, covering the time period from 1950 to 2100. The 11-terabyte dataset provides daily estimates of maximum and minimum temperatures and precipitation over the entire globe.
http://gizmodo.com/the-world-in-2100-according-to-nasas-new-big-dataset-1710798646
I read that as “international Fifth Column Model Intercomparison Project.”
That might be fun for comparing against the Farmer’s Almanac, O.C.W.!
I’ll still bet on the almanac, myself…
More data points of intensive properties don’t make a “global average” any more meaningful.
So Karl, who just published a paper showing a rising temperature or lack of a pause, is also jointly responsible for the most accurate terrestrial US temperature measurement system which is displaying a gentle fall in temperature.
Oh dear, who or what to believe!
The questions have been raised over how Karl’s 2015 paper treated buoys and ship data.
There are remarkably few buoys and ships sailing across the continent of North America.
They lack wheels.
Where did the y=0.6186x -0.002678 come from? I got y=-0.0024x +0.6042
The plot shows a negative slope on the trendline yet the equation shows positive.
No, the plot I have shows a negative trend, note the -0.002678.
I have no idea what you are doing, or with what, so I can’t comment on your results since you don’t show your work.
The NOAA’s own interactive CLIMATE AT A GLANCE web page calculates the trend of the CONTIGUOUS US ANNUAL TEMPERATURE ANOMALY between 2005 and 2015 to be -0.69 F/decade using the base period of 2005-2015. The similar figure for 1998-2015 is -0.48 F/decade.
Dear James, you misread the equation on the plot, it is y=0.6186-0.002678x.
I get the same exact answer that Anthony has. I think the confusion stems from Anthony’s choice of units on the x-axis. Note that his x-axis is in units of months relative to the first data point.
So if you want the trend in degC/year, multiply by 12/1.8 to get -0.0179 degC/yr,
which is also -1.78 degC/century.
I usually present it in form y=mx+b.
The graph is b+mx. I added a column and used months = 0–124. Data was in degrees F.
” yet the equation shows positive”
you misread, as I did initially, it’s not too clear. You are expecting the conventional y=m*x+c and it’s printed y=c+m*x
You both have neg. slope, small difference in fitted values.
I get a trend of -0.032ºF/year… or since our data is now a decade long, -0.32ºF/decade
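The unit conversions traded back and forth in this sub-thread are easy to fumble, so here is a quick arithmetic check of the plotted slope, assuming (as established above) that the trendline equation is in °F per month:

```python
# Convert the trendline slope quoted above (-0.002678 deg F per month)
# into the other units mentioned in the thread.
slope_f_per_month = -0.002678

slope_f_per_year    = slope_f_per_month * 12    # about -0.032 deg F/yr
slope_f_per_decade  = slope_f_per_year * 10     # about -0.32 deg F/decade
slope_c_per_year    = slope_f_per_year / 1.8    # about -0.0179 deg C/yr
slope_c_per_century = slope_c_per_year * 100    # about -1.79 deg C/century
```

This reproduces both the -0.0179 °C/yr figure and the roughly -0.32 °F/decade figure given by the commenters.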
If you go to http://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/time-series?datasets%5B%5D=uscrn&parameter=anom-tavg&time_scale=p12&begyear=2005&endyear=2015&month=6
You can also turn on USHCN or ClimDiv. These are very obviously being adjusted to match USCRN.
I don’t know if the USHCN data on that page is what gets fed into Giss or HadCrut, but from the manic adjustment happening to data elsewhere in the world, it could be.. they have to create their warming from somewhere !!!
-0.32ºF/decade
=========
Yeow, that is -3.2 F a century for the continental US. That is -1.8 C. Dangerously close to the 2 C that the IPCC says will wipe out all life on earth. How will the US survive the next century as temperatures dip ever closer to the dreaded Hansen tipping point?
My Excel Spreadsheet says the trend since January 2005 is -.00036°F per year.
There, replicated using a different technique than mine, slightly different result, but still a slight cooling.
=SLOPE(B3:B$127,A3:A$127)
Copied all the way back UP from
=SLOPE(B126:B$127,A126:A$127)
one too many placement zeros, Steve.
I wonder what is the total cost to the U.S.Taxpayers of installing and maintaining this network, the output of which is seemingly ignored by its ‘owners’ because it doesn’t support the party line?
Great job Anthony. I hope that your charts are widely published.
I would ask Karl if he would make those same adjustments if they erased increasing temperatures when the raw data actually showed increasing temperatures.
Can anyone explain to me why the astonishing and worrying rise in global surface temps has seemingly had absolutely zero effect on the global precipitation averages.
At least, an internet search on this topic returns various graphs from various sources, some dating back as far as the start of the last century, but all unified by their distinctly flat and unchanging trends.
Should not the dramatic warming of planet earth have driven some shift in the overall rate at which water is evaporated from and then showered back onto the surface?
Or is that just some silly idea that I got from Blade Runner?
Here is a typical example of a flat precipitation trend. How is this sort of graph explained away in light of the recent much trumpeted warming of planet earth?
Is this not a puzzle?
http://www.aqua.nasa.gov/meso/index.php?section=19
Yes, it’s a puzzle but since it doesn’t fit the narrative it will be studiously ignored. Sorry about that. Normal programming will resume forthwith …
PS from your link. Since the global precipitation doesn’t change, the pink area in the last image must balance the bluer areas.
If the air is warmer, does it not “hold” more? Therefore extra evaporation would increase humidity, but not necessarily rainfall.
Until, of course, the more humid air hits a cold spot.
You wait, they will accuse you of cherry-picking that 10 years results, Anthony!!!!
They can’t (at least not honestly). I plotted the entire dataset available. No choice was made of any kind.
Made me laugh – the “at least not honestly” part.
As if that would stop them.
LOL
AndyE — Found myself laughing outloud — Eugene WR Gallun
“And some important information is missing that should be plainly listed. NCDC is doing an anomaly calculation on USCRN data, but as we know, there is only 9 years and 4 months of data. So, what period are they using for their baseline data to calculate the anomaly?”
http://wattsupwiththat.com/2014/06/07/noaa-shows-the-pause-in-the-u-s-surface-temperature-record-over-nearly-a-decade/
Anthony, did you ever figure this out??
If you go to here.
http://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/time-series?datasets%5B%5D=uscrn&parameter=anom-tavg&time_scale=p12&begyear=2005&endyear=2015&month=6
and turn on the USHCN, you can see that they are using USHCN anomaly for a starting point, then adjusting USHCN to match USCRN.
http://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/background:
Archive that data, and make sure there are no changes to past data !
This trend should be publicized far and wide.
Yea, the USA is only 4% of the world, but it is suspicious that all the warming (in the adjusted data sets) is alleged to be in far off hard to get to places (like the arctic).
I have the USCRN data in a spreadsheet, I check for changes each time they add a new month.
It’s a matter of trust 😉
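The month-over-month change check described above can be automated; here is a minimal sketch, assuming two archived copies of the CSV in the “YYYYMM,value” format NOAA serves (the function names and file handling are mine):

```python
# Sketch: flag silent revisions to already-published months by comparing
# an archived copy of the anomaly CSV against a fresh download.

def load(path):
    """Parse a 'YYYYMM,value' file into a {stamp: value} dict."""
    rows = {}
    with open(path) as f:
        for line in f:
            if not line.strip():
                continue
            stamp, value = line.strip().split(",")
            rows[stamp] = float(value)
    return rows

def changed_months(old_path, new_path):
    """Stamps present in both files whose values differ (revisions,
    not additions - a genuinely new month only appears in the new file)."""
    old, new = load(old_path), load(new_path)
    return sorted(m for m in old if m in new and old[m] != new[m])
```

Running this each time a new month is posted would catch any retroactive change to past data.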
Hey J, one of the wonderfully elegant things about the “excess heat is hiding deep in the ocean” lie is that it was impossible for an average person to go check.
Of course, any parent has seen this behavior with very small children.
“Mommy! there is a monster hiding under my bed!”
“No, there is no monster. I just looked. No monster.”
“There is too a monster! It is an invisible monster!”
Great post, such an obvious thing to check. It’s only US land temps but it tells the story.
Now look at the data they provide:
200501,1.75
200502,2.50
200503,-0.88
200504,0.41
etc.
With a billion-dollar budget, that’s the best they can do? I now have to get out my calculator if I want to know what this is in degrees C?? What a web site, they couldn’t manage a button for centigrade? And look at the dates, 200502, WTF, so now I have to mess around splitting this into months and years, cool.
I suppose three simple columns of data would have been too obvious.
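For what it’s worth, splitting the fused “YYYYMM” stamps and converting to centigrade takes only a few lines; a sketch assuming the row format shown above (note that an anomaly converts by the 1.8 factor alone, with no 32-degree offset, since it is a difference of temperatures):

```python
# Sketch: turn a "200501,1.75" row into (year, month, anomaly in deg C).

def parse_row(line):
    stamp, value = line.strip().split(",")
    year, month = int(stamp[:4]), int(stamp[4:])   # split YYYYMM
    anomaly_c = float(value) / 1.8                 # deg F anomaly -> deg C
    return year, month, anomaly_c
```

So `parse_row("200502,2.50")` yields February 2005 with the anomaly expressed in degrees C.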
Perhaps you could find data more to your liking at http://www.ncdc.noaa.gov/crn/qcdatasets.html
The sub-hourly data is every 5 minutes. See ftp://ftp.ncdc.noaa.gov/pub/data/uscrn/products/subhourly01/README.txt
Remember, this is the NCDC. They don’t seem to have a goal of making data easy to find, though in this case it’s actually pretty easy starting from their home page.
The USCRN was created to provide “good” surface data for the USA. In this case, good means that the entire system was designed to follow methodical scientific standards of measurement.
The reason for the creation of the USCRN was the total lack of such good standards in the maintenance of most “weather” stations, which were intended for local weather forecasting and aircraft safety.
That means that temperature data prior to the USCRN can NOT be trusted, although some individual stations may be good for some periods of time.
BTW, some USCRN stations came online in 2002, with stations added every year since. The website provides the basic data organized by individual station. Looking at the individual stations that started in 2002, it is clear that there is NO trend in the temp data. No warming, no cooling anywhere. Some stations do show that 2012 was clearly warmer than any other by a few degrees, with 2013 returning to “normal”.
What makes you say the USHCN is “good”? Microsite quality is bad.
CRN, OTOH, is a thing of beauty. Triple-redundant sensors and the dozen or so I looked at were so Class 1 it hurt.
Anthony, when are you going to set up a mirror to archive all this data before it gets adjusted?
I have some fairly technical suggestions about how to preserve the authenticity of the data (such as providing and independently archiving secure hash checksums). Email me for details if you want, but here are some ideas:
Here’s a summary paper on the topic:
https://www.utica.edu/academic/institutes/ecii/publications/articles/9C4EBC25-B4A3-6584-C38C511467A6B862.pdf
In the old world of “first to invent” patent rights, we used a service that would securely checksum the evidence and publish the checksum in a reputable place, thereby archiving the time and place of invention with a mere 256 bits of data.
IMHO these techniques should be required for your open atmosphere society publishing guidelines.
Peter
Timestamping service: https://www.digistamp.com/
(I have no affiliation)
Sounds to me as though you just volunteered. 🙂
I can help with the design, but infrastructure requires money and organization…
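A minimal sketch of the checksum idea Peter describes, using SHA-256 from the Python standard library; publishing the resulting digest somewhere reputable is the part a timestamping service would add:

```python
# Sketch: compute a SHA-256 digest of an archived data file so that any
# later tampering with the file is detectable. 256 bits of digest stand
# in for the whole file, as with the patent-evidence service mentioned.
import hashlib

def file_digest(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large archives don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

If the digest recorded at archive time no longer matches the file, the data has changed.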
The cool thing is that the Gold standard MATCHES the “bad data” from nearby sites!!!
DOH!
further:
1. The US is not the world
2. The Land is not the ocean.
[3. and Mosher’s drive by opinions aren’t the final word on anything -Anthony]
“2. The Land is not the ocean”
Aren’t land temperatures ultimately driven by the oceans?
Uh, nearby properly sited stations or nearby stations that are not properly sited?
Read Renne’s comment on Venema’s blog. The USHCN comparators are all within 500 meters of a pristine rural CRN. That they are similar is unsurprising, and says nothing about the more than 1,000 other, less well sited USHCN stations. surfacestations.org showed that most of those are problematic.
dont believe Rud
http://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/time-series?datasets%5B%5D=cmbushcn&parameter=anom-tavg&time_scale=p12&begyear=2005&endyear=2015&month=12
[The US is not the world] Talk about beating a dead horse. Neither is one tree in Yamal or an ice core. On the other hand, the US has the best network of weather/climate data stations in the world, therefore a very good proxy for, wait for it……….. the world! You like proxies, right?
If all the heating or cooling takes place in the oceans, then there would not be any long term heating or cooling over land.
If you can’t capitalize properly in your drive-bys, maybe you shouldn’t be texting while you drive-by. That’s dangerous.
What do you mean by “bad data”? The unadjusted data?
“I am adjusting the data. Pray I don’t adjust it further” — Darth Vader
Of all the things to take a shot at: an analysis that is about as close to reciting the multiplication tables as you can get with empirical evidence. This rejoinder is more Islamist fire-away-and-if-Allah-wills-it-will-hit-the-target than it is American Sniper.
I love it when Mosher deigns to punctuate his sentences.
1. + 2. The US sits next to the largest ocean body and is directly affected by it. It is nicely populated close to the average for any one regional area on Earth (within the range of not populated all the way to heavily populated) and burns lots of fossil fuels. Meaning that if human-sourced CO2 were causing either the air over land to heat, or the oceans to heat beyond natural variation, you think you would notice it as a positive trend on these surface temperature stations. I think Mosher understands this very well and I consider his above argument to be quite hollow, not to mention his disregard for well maintained research plots. I would also be willing to bet that he believes human heat to be hiding in the deep oceans and actively searches for papers that purport to have found exactly how that heat transfers magically from the air to the deep without leaving a trail.
(note: to head off a counter-argument, to say that this dangerous heat is absorbed into a much bigger space and would thus be impossible to detect as it transfers to the deep, also means that even if humans were causing this heat, it is essentially harmless since it fails to rise above natural variation.)
Pamela your Intelligence is soo sexy.
The really cool thing about Mr Karl’s dubious data adjustments to erase a pause is that resulting trends are still around 1.1 deg C/century. And that is still GCM fail.
1. The US is not the world
===============
so if the rest of the world is boiling and the US is freezing, the duty of the US president is to:
1. make the US warmer
2. make the US colder
3. make the world warmer
4. make the world colder.
I expect there is a very large disconnect between the current administration and the people on the correct answer to this question.
What actually happens:
5. have NOAA adjust the records, warming the US and cooling the rest of the planet.
The current prez. would surely choose #3.
It seems such a waste to maintain so many expensive sites when NOAA could rely on 3 or 4 spaced with the standard 1200 km radius and just infill the rest. /sarc off
Anybody happen to know where the 114 stations are situated? And what is the significance of the ‘high’ in March 2012?
Thanks for that. How about the ‘high’ in March 2012? Surely it’s the wrong time of the year, or am I just thick?
https://en.wikipedia.org/wiki/March_2012_North_American_heat_wave
The graph with the high in March is of temperature anomalies – how much warmer the temperature is relative to the long-term average. Negative amounts reflect times when the country’s temperature was colder than average.
A graph of the average US temperature would have a much wider swing, so anomaly plots are much more useful than actual temperature. (Or graphs that start with Y = 0K by people who claim that better reflects reality.)
Jolan,
they are anomalies. Not actual temps.
Thanks to all you guys who answered my dumb question. I should have realised, just did not look hard enough. Interestingly, ‘Latitude’ has supplied a ‘wiki’ lead which shows a widespread heat wave at the time, which I was not aware of.
My high school and college aged kids were/are not allowed to use wiki as a citation/reference in any research paper. I have personally found it to be badly inaccurate in some subject areas where I believe I am an expert, and especially if the subject has a political aspect. You might want to reconsider this source, and avoid it. There seem to be too many biased liberal wiki editors who apparently use their parents as their source of income and their parents’ basement as their abode (i.e. way more time to keep the message matching their world view than everyone else has to try to correct the record).
Anthony: Good work here. The fact that this lack of warming exists over a large land continent like the whole USA presents a severe problem to NOAA in the contradiction of getting a warming result when including the oceans. The physics of that doesn’t make sense. The specific heat of the continent is LOWER than that of the oceans, so if CO2 warming were real, the anomaly must emerge in this data set FIRST.
The fact that it doesn’t, and the satellite record conforms to the land USA zero trend and slight cooling, indicates a flawed and suspect manipulation of the NOAA treatment. If these guys were doing science, they should have realized this as soon as they obtained the result that they did.
Taxpayers have a right to expect that the billions being spent on NOAA per year should be for the promotion and preservation of the true ideas of atmospheric science and meteorology, not this asinine power grab that appears to have happened through the Obama administration, which is just politicizing every agency it seems to be able to get its hands on and furthering the gross fraud that is CO2 AGW by abusing these agencies and soliciting their agents to help them.
Chuck Wiese
Meteorologist
Chuck Wiese –
you bring up a good point.
Wonder how the satellite records of just the US match with the 10 years of USCRN?
Well, Canada is larger than the US. Central America is smaller.
I wouldn’t call the whole USA a continent….
If you compare the average of today’s warming to tonight’s cooling for all NCDC GSOD listed stations since 1940 that have more than 360 samples per year (a total of about 69 million daily samples), 50 of the last 74 years are slightly negative, 30 of the last 34 years are negative, and the overall average is slightly negative.
Nightly cooling exceeds day time warming.
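One plausible reading of this day-warming vs. night-cooling comparison, sketched in code; the pairing of each afternoon maximum with the adjacent morning minima is my assumption about the method, not something stated in the comment:

```python
# Sketch: compare day warming against the following night's cooling for a
# daily series of morning minima and afternoon maxima (deg F or deg C).

def mean_day_minus_night(morning_mins, afternoon_maxs):
    """Mean of (day warming - night cooling); negative means nights
    cool more than days warm, as the comment asserts."""
    diffs = [(afternoon_maxs[i] - morning_mins[i])          # day warming
             - (afternoon_maxs[i] - morning_mins[i + 1])    # night cooling
             for i in range(len(morning_mins) - 1)]
    return sum(diffs) / len(diffs)
```

Note that each term telescopes to the change in successive morning minima, so a slightly negative average is equivalent to the morning lows drifting slowly downward.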
“So, since this state of the art network requires no adjustment, we can most surely trust the data presented by it. Right?”
[CAGW]Only if it matches our preconceived notions.[/CAGW]
/grin
Seriously, there was a time when NOAA had some form of a disclaimer saying something along the lines of “our data isn’t reliable, so we have formed the U.S. Climate Reference Network, (USCRN).” Overall, it was an admission that long term data collected was very much suspect.
Therefore, it is reasonable for one to wonder why the USCRN is not now the basis for NOAA’s monthly reports?
Part of the reason, I’m sure, is that the CRN doesn’t have a full three decade record to fit the WMO range used to determine a normal climate.
The three-decade period has a questionable heritage. From these two snippets, you can deduce the WMO means any 30 year period, three full decades (I think Roy Spencer is using that), or non-overlapping periods (i.e. 1961-1990 will be followed by 1991-2020).
I’ve also seen claims that 30 years is used because that’s what conveniently fit in the ledger books used in the pre-computer era.
Of course, Karl & co. are quite happy to ignore the CRN data as its short length encourages looking at other surface records that show the peak temperature was around 2005/2006 and it’s been (slowly) downhill since then.
From http://www.wmo.int/pages/prog/wcp/ccl/faqs.php :
From http://www.wmo.int/pages/themes/climate/climate_data_and_products.php :
The three-decade period has a questionable heritage.
If there are 60 year climate cycles (e.g. PDO), then by Nyquist it requires 120 years of observation.
30 years is definitely bad, as that is 1/2 a cycle, so you’ll be optimistic for one 30 year period and pessimistic for another 30 year period… I’d rather go with 60 years, as long as you know the exact 1-2 years of the start or end of the multi-decadal cycles.
Finally. 10-years of “good” data.