Today, there’s all sorts of caterwauling over the NYT headline by Justin Gillis that made it above the fold in all caps, no less: FOR THIRD YEAR, THE EARTH IN 2016 HIT RECORD HEAT.
I’m truly surprised they didn’t add an exclamation point too. (h/t to Ken Caldeira for the photo)
Much of that “record heat” is based on interpolation of data in the Arctic, such as BEST has done. For example:
A different view of the record #Arctic warmth in 2016, which contributes to the ongoing decline in #seaice. pic.twitter.com/m1vt4k1wNo
— Dr. Robert Rohde (@RARohde) January 18, 2017
Since 1970, #globalwarming has continued at a furious pace, and disproportionately impacted continents and the Arctic. pic.twitter.com/HOBAZuirJA
— Dr. Robert Rohde (@RARohde) January 18, 2017
But in reality, there’s just not much data at the poles: there are no permanent thermometers at the North Pole, since sea ice drifts, is unstable, and melts in the summer as it has for millennia. Weather stations can’t be permanently sited in the Arctic Ocean, so the data are often interpolated from the nearest land-based thermometers.
To show this, look at how NASA GISS presents the data with and without interpolation out to the North Pole:
WITH 1200 kilometer interpolation:

WITHOUT 1200 kilometer interpolation:

Here is the polar view:
WITH 1200 kilometer interpolation:
WITHOUT 1200 kilometer interpolation:
Source: https://data.giss.nasa.gov/gistemp/maps/
Grey areas in the maps indicate missing data.
What a difference that interpolation makes.
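To see what that 1200 km smoothing does in practice, here is a minimal sketch of the idea (my own illustration, not GISS code): each grid point gets a weighted average of station anomalies within 1200 km, with the weight falling linearly to zero at that radius, which is the taper described in the GISTEMP documentation. The station list is invented for the example.

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    cos_angle = (math.sin(p1) * math.sin(p2) +
                 math.cos(p1) * math.cos(p2) * math.cos(dlon))
    return r * math.acos(max(-1.0, min(1.0, cos_angle)))

def interpolate_anomaly(grid_lat, grid_lon, stations, radius_km=1200.0):
    """Weighted mean of station anomalies within radius_km of a grid point.

    The weight falls off linearly from 1 at the grid point to 0 at radius_km,
    the taper the GISTEMP documentation describes.  Returns None when no
    station is within range (a grey cell on the GISS maps)."""
    num = den = 0.0
    for lat, lon, anom in stations:
        d = great_circle_km(grid_lat, grid_lon, lat, lon)
        if d < radius_km:
            w = 1.0 - d / radius_km
            num += w * anom
            den += w
    return num / den if den > 0.0 else None

# Invented land stations ringing the Arctic: (lat, lon, anomaly in deg C).
stations = [(82.5, -62.3, 3.1), (80.6, 58.1, 2.4), (78.2, 15.6, 1.8)]
print(interpolate_anomaly(90.0, 0.0, stations))                  # North Pole, filled from land
print(interpolate_anomaly(90.0, 0.0, stations, radius_km=250.0)) # None: would be a grey cell
```

With the 1200 km radius the pole gets a value taken entirely from distant coastal land stations; with a short radius it stays grey, which is exactly the difference the two maps above show.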
So you can see that much of the claim of “global record heat” hinges on interpolating Arctic temperatures where there are no data. For example, look at this map of Global Historical Climatology Network (GHCN) coverage:
As for the Continental USA, which has fantastically dense thermometer coverage as seen above, we were not even close to a record year according to NOAA’s own data. Annotations are mine on the NOAA-generated image:
- NOAA National Centers for Environmental information, Climate at a Glance: U.S. Time Series, Average Temperature, published January 2017, retrieved on January 19, 2017 from http://www.ncdc.noaa.gov/cag/
That plot was done using NOAA’s own plotter, which you can replicate using the link above. Note that 2012 was warmer than 2016, even though 2016 coincided with the most recent big El Niño. That’s using all of the thermometers in the USA that NOAA manages and utilizes, both good and bad.
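If you would rather compute the ranking than eyeball the plot, here is a minimal sketch for the CSV that the Climate at a Glance page lets you download. The filename and column names ("Date", "Value") are assumptions, so adjust them to whatever the export actually contains.

```python
import csv

# Assumed export from the Climate at a Glance plotter: one row per year with
# columns "Date" (e.g. 201612) and "Value" (annual CONUS mean temperature, deg F).
# Both the filename and the column names are placeholders.
rows = []
with open("conus_annual_tavg.csv") as f:
    for rec in csv.DictReader(f):
        rows.append((int(str(rec["Date"])[:4]), float(rec["Value"])))

for rank, (year, value) in enumerate(sorted(rows, key=lambda r: r[1], reverse=True), start=1):
    if year in (2012, 2016):
        print(f"{year}: rank {rank}, {value:.2f} F")
```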
What happens if we select the state-of-the-art pristine U.S. Climate Reference Network data?
Same answer – 2016 was not a record warm year in the USA, 2012 was:
Interestingly enough, if we plot the monthly USCRN data, we see sharp cooling in the last data point, which dips below the zero-anomaly line:
Cool times ahead!
Added: In the USCRN annual (January to December) graph above, note that the last three years in the USA were not record high temperature years either.
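For anyone who wants to redraw the monthly USCRN plot themselves rather than take mine on trust, here is a minimal sketch; the filename and column names are placeholders for whatever monthly anomaly export you use.

```python
import csv
import matplotlib.pyplot as plt

# Placeholder filename and column names: a monthly CONUS USCRN anomaly export,
# one row per month with "Date" (YYYYMM) and "Anomaly" (deg F).
dates, anoms = [], []
with open("uscrn_monthly_anomaly.csv") as f:
    for rec in csv.DictReader(f):
        yyyymm = str(rec["Date"])
        dates.append(int(yyyymm[:4]) + (int(yyyymm[4:6]) - 0.5) / 12.0)
        anoms.append(float(rec["Anomaly"]))

plt.plot(dates, anoms, lw=0.8)
plt.axhline(0.0, color="k", lw=0.5)            # the zero-anomaly line
plt.scatter(dates[-1:], anoms[-1:], color="r") # highlight the latest month
plt.ylabel("USCRN CONUS anomaly (deg F)")
plt.show()
```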
Added2: AndyG55 adds this analysis and graph in comments
When we graph USCRN with RSS and UAH over the US, we see that USCRN responds slightly more to warming surges.
As it is, the trends for all are basically identical and basically ZERO. (The USCRN trend was exactly parallel to RSS and UAH – all zero trend – before the slight El Niño surge starting mid-2015.)
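For what it’s worth, the trend comparison AndyG55 describes is straightforward to redo once you have the three US-average series; here is a minimal sketch, with placeholder filenames and a common 2005–2015 window chosen to stop before the El Niño surge he mentions.

```python
import numpy as np

def trend_per_decade(t_years, anomalies):
    """Ordinary least-squares slope of anomaly vs time, in degrees per decade."""
    slope, _intercept = np.polyfit(t_years, anomalies, 1)
    return 10.0 * slope

# Placeholder filenames: each file holds "decimal_year,anomaly" pairs for the
# US average of that dataset.  The 2005-2015 window stops before the El Nino surge.
for name, path in [("USCRN", "uscrn_us.csv"), ("UAH", "uah_usa48.csv"), ("RSS", "rss_usa.csv")]:
    t, a = np.loadtxt(path, delimiter=",", unpack=True)
    keep = (t >= 2005.0) & (t < 2015.0)
    print(f"{name}: {trend_per_decade(t[keep], a[keep]):+.3f} deg/decade")
```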

1. You make the mistake of focusing on CONUS. If you include the entire US, you have a record.
For reference, here is every country:
http://berkeleyearth.org/wp-content/uploads/2017/01/2016_CountryRankMap-1024×571.png
2. You make the mistake of using GHCN version 2!!!! That data is deprecated. There are plenty of stations near the Arctic IF you:
A) Use the latest NOAA data
B) Use ALL the stations
C) Use NON-NOAA stations. Early on we discovered that there are many stations that NOAA doesn’t have in its GHCN daily datasets. In fact, we’ve done NOAA versus NON-NOAA studies (guess what).
3. The Arctic was particularly warm, and it shows the flaws in interpolation methods that are area based. CRU, for example, interpolates only within cell boundaries. A simple example will illustrate the problem.
Suppose you have a station at 84°N, 152.5°W. In the CRU approach, that will be interpolated 1 degree to the north (about 110 km) and 4 degrees to the south, and it will be interpolated 2.5 degrees to the east and west. But at that latitude, 2.5 degrees east and west is a very short distance; at the equator it would be interpolated ~260 km east and west. So the amount of interpolation they do is a function of latitude, not strictly speaking of distance. Another way to look at it is this: if they centered their grid differently, that point at 84°N would be interpolated 2.5 degrees north and 2.5 degrees south. In other words, the weight of their stations changes with latitude and with the gridding method. In a spatial-statistics approach you don’t have this problem, as the temperature is modelled as a function of latitude and you grid afterwards.
Normally the CRU approach is OK; the only issue arises when the areas they don’t cover are warming faster than the rest of the planet.
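The latitude effect described above is easy to put numbers on: one degree of longitude spans roughly 111 km × cos(latitude), so the same 2.5° east–west reach shrinks dramatically toward the pole. A quick sketch:

```python
import math

def lon_span_km(deg_lon, lat_deg):
    """East-west distance covered by deg_lon degrees of longitude at a given latitude."""
    return deg_lon * 111.32 * math.cos(math.radians(lat_deg))

for lat in (0, 45, 84):
    print(f"2.5 deg of longitude at {lat:2d}N spans ~{lon_span_km(2.5, lat):5.0f} km")
# ~278 km at the equator, ~197 km at 45N, but only ~29 km at 84N,
# while 2.5 deg of latitude is ~278 km at any latitude.
```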
To get a sense of how warm the Arctic is, we can use a favorite chart here at WUWT. Note this is based on the approach that Judith Curry prefers for the Arctic: physics-based estimation.
So ya, it’s warmer in the Arctic. CRU miss that because, in their approach, they assume that the Arctic warms as fast as the rest of the planet.
Mosh …. Why are you using an Arctic temperature chart from March-April last year?
R
Got the link from the WUWT reference page.
Consider the source.
For the mathematically inclined: how many stations do you need?
http://journals.ametsoc.org/doi/abs/10.1175/1520-0442(1994)007%3C1999%3ASATOEO%3E2.0.CO%3B2
” AndyG55 adds this analysis and graph in comments”
And Donald Klipstein pointed out that the USCRN trend, as marked on the plot, is 4.86°C/Century, a very high warming rate indeed. And about three times the satellite rates, which were also far from low. So I don’t see how that helps the argument.
Eh, go pull the data instead of eyeballing it. The sum of the twelve monthly 2005 anomalies is +9.34 F. The average of those monthly 2005 anomalies is +0.778 F.
Look at the full dataset: the sum over the 11 years is +100 F (roughly the same as 11 repeats of 2005), and the overall average anomaly is +0.701 F, below the same value for 2005 alone.
When you realize that they started the USCRN record at an anomaly of +1.75 (+0.8 C), having the one-year and 11-year average anomalies come in at +0.778 and +0.701 doesn’t indicate rapid warming as you suggest.
I didn’t eyeball the data. DK did, and figured 0.46 °C/decade. I just read the annotation toward the top right: 0.0468 °C/yr, or 0.468 °C/decade. I’m impressed by DK’s eyeball. And that is fast warming. There has been a lot of “if you take this out, then…” lately. But that is what it was. And it had a spike, not a dip, at the start.
It didn’t support any of DK’s claims. It doesn’t match the satellites at all. And it is very far from zero (the satellites aren’t close either).
“It didn’t support any of DK’s claims”
Sorry, I meant AndyG’s claims.
Please note that in the USCRN data, the ” true zero point” is really +0.8, since that is where they start the data set.
So when December dips below “0 anomaly”, it is actually closer to -1.0 true anomaly from the actual starting point of the data series.
Yet another example of the Warmists polluting the most pristine data with garbage data, by using less pristine stations to set the starting point.
NYT – “All the news that’s fit to print” – and then some!
I still think all this BS was already in the works to give HRC a kick-start towards guilting the nation into submission to UN domination. Now it has become their last-ditch effort to retain whatever portion of the populace remains entranced by the fantasy of human climate culpability.
Science is about observation, experimentation and replication.
The easiest way to settle matters would be to select, in each country, say a dozen or twenty of the best-sited stations (rural, free from encroachment of UHI) with no station moves and with the best record-keeping standards, and then to retrofit these stations with the same LIG thermometers as were used in the late 1930s/early 1940s.
One would then replicate by observing, for the next few years, using the same TOB as used in the 1930s/1940s and taking readings in Fahrenheit using the same LIG thermometers.
In this manner there would be no need for any adjustments. Just use the raw data from the late 1930s/early 1940s, and compare that to the data collected between say 2017 to 2022.
One would not make a global temperature set. Just a different set for each location, and see what has happened at each of those locations.
Within 5 years we would have a very good insight into how much warmer temperatures have become since the highs of the late 1930s/early 1940s.
TOB only applies in the US and a couple of other countries.
Next, even though satellites have gone through more changes than surface observing systems, it’s funny that you don’t suggest launching new satellites with old instruments.
We already have 10+ years of pristine stations (CRN) to compare with the “bad” data.
Guess what: the “bad” stations are good.
[so you say – mod]
Reblogged this on WeatherAction News and commented:
As Bill Illis writes in the comments:
The two most northerly stations are Eureka Canada (84N) and Svalbard (78N).
They both had very warm years, about 6 °C above normal in 2016 (yes, I checked). Probably a fluke more than anything else, but they also have very variable year-by-year records, just like every station; ±6.0 °C is not that unusual for these two stations.
BUT, this does not mean the entire Arctic Ocean was 6 °C above normal in 2016. If that were the case, ALL of the sea ice would have melted out this summer. At best, the Arctic Ocean was 1.0 °C above normal, probably just 0.5 °C.
This extrapolation technique across the polar oceans is complete BS.
There are physical signs that have to be evident to show any ocean area being so far above normal.
Meanwhile I’m left wondering how much of this heat (record or not) was vented into space?
Taking data from Wunderground for a selection of some of the longest-running stations I can find, I get:
2016 was…
Brampton village: 9th out of the last 17 years
Manchester city: 12th out of the last 16 years
Bedford rural: 10th out of the last 12 years
Derbyshire rural: 10th out of the last 14 years
Suffolk rural: 6th out of the last 12 years
Taunton town: 5th out of the last 15 years
Lancashire coastal: 8th out of the last 15 years
No record heat in England – apart from when a 747 warmed up the thermometer at Heathrow during the summer – and all trending downwards
Warmest year for most of them was 2006
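For anyone who wants to repeat that kind of ranking for a station of their own, here is a minimal sketch of the calculation; the filename and column names are placeholders for whatever a daily station export gives you.

```python
import csv
from collections import defaultdict

# Placeholder export: one row per day with "date" (YYYY-MM-DD) and "tmean" (deg C).
yearly = defaultdict(list)
with open("station_daily.csv") as f:
    for rec in csv.DictReader(f):
        yearly[int(rec["date"][:4])].append(float(rec["tmean"]))

# Annual means, skipping years with too many missing days.
annual = {yr: sum(vals) / len(vals) for yr, vals in yearly.items() if len(vals) >= 330}
ranked = sorted(annual, key=annual.get, reverse=True)
print("Warmest year:", ranked[0])
print("2016 rank:", ranked.index(2016) + 1, "out of", len(ranked), "years")
```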
“THAT’s an analysis of data. Bravo.”
So what did you learn from it?
According to the widely quoted Central England Temperature (CET), 2014 was slightly warmer than 2006, with 10.93 degrees, against 10.82.
2016 came in at 10.31, the 9th warmest out of the last 16, and was unremarkable throughout. Three months had temperatures lower than the 1981-2010 mean (March, April and November).
The most recent 10-year mean is 10.08 degrees. The 10-year mean has been above 10 degrees since 1988–97, and never before that since the series began in 1659; it peaked at 10.46 in 1997–2006.
The 1930s had an average temperature of 9.62 degrees C.
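Those decadal figures are straightforward to reproduce from the published annual CET series; here is a minimal sketch, assuming you have the annual means saved as simple year/temperature pairs (the filename is a placeholder).

```python
import numpy as np

# Placeholder file: annual CET means as "year temperature" pairs, e.g. copied
# from the Met Office Hadley Centre CET series.
years, temps = np.loadtxt("cet_annual.txt", unpack=True)

# Trailing 10-year means ending in the years quoted above.
for end_year in (1997, 2006, 2016):
    mask = (years > end_year - 10) & (years <= end_year)
    print(f"{end_year - 9}-{end_year}: {temps[mask].mean():.2f} C")
```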
This morning the BBC reported the temperature in Farnborough (near London, in the south of England) as -6 °C, while in Edinburgh, Scotland, it was +6 °C – a difference of 12 °C.
The distance between those two places is 550 km. Now explain to me the justification of interpolating data out to 1200 km.
Every winter where I live, I see a 12-degree-Fahrenheit difference (7 degrees Celsius) after driving a measly mile from the center of a tiny little 6,000-person town to just outside of town. UHI is massive in the winter in tiny little rural towns in the USA. But these all-knowing scientists just smear data over 1200 km and tell us that rural places hardly see any UHI at all.
Look at the difference between the UK and Spain – say Madrid, which is no more than about 1200 km away. They have very different temperatures.
Of course, the warmists argue that the anomalies are similar over these distances; however, whilst the anomalies may well be more similar than the absolute temperatures, geography and topography no doubt play an important role.
If a station is influenced by its position relative to the oceans, or by winds/weather fronts coming mainly from one direction, it is very unlikely to have similar anomalies to stations which are not impacted by their position relative to oceans, or where winds/weather fronts usually come from a different direction.
Infilling definitely gives rise to potentially wider error margins.
90+% of the variance in monthly temperature is explained by the LATITUDE and elevation of the station.
Check the latitude of Spain and the UK.
Duh.
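For what it’s worth, that variance-explained claim is easy to test yourself with an ordinary least-squares fit of monthly mean temperature on latitude and elevation; here is a minimal sketch with made-up stations standing in for real station metadata.

```python
import numpy as np

# Made-up illustrative stations: (latitude degN, elevation m, monthly mean degC).
# Swap in real station metadata and monthly means to test the claim properly.
data = np.array([
    [ 0.0,  10.0, 27.0],
    [35.0, 200.0, 16.0],
    [52.0,  50.0,  9.0],
    [65.0, 100.0,  1.0],
    [70.0, 500.0, -6.0],
])
lat, elev, temp = data.T

# Ordinary least squares: temp ~ a*lat + b*elev + c
X = np.column_stack([lat, elev, np.ones_like(lat)])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
resid = temp - X @ coef
r2 = 1.0 - np.sum(resid ** 2) / np.sum((temp - temp.mean()) ** 2)
print("coefficients (lat, elev, intercept):", np.round(coef, 3))
print("variance explained (R^2):", round(r2, 3))
```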
Steven
What are the latitudes of the stations that infill the Arctic?
Forrest Gardener on January 20, 2017 at 12:43 pm
Could you PLEASE stop writing about your stupid Rutherglen?
It started in 1903 and ended in… 1921.
That’s nearly a century ago.
richard verney on January 20, 2017 at 2:21 pm
What are the latitudes of the stations that infill the Arctic?
Why do you expect other people to do your job? Look at
ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v3/
The three northernmost GHCN stations (ID, latitude, longitude, elevation in m, name) are:
22220046000  80.6200   58.0500  20.0  GMO IM.E.T.
40371082000  82.5000  -62.3300  66.0  ALERT,N.W.T.
43104312000  81.6000  -16.6700  34.0  NORD ADS
Averaged trend: 0.7 °C / decade
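If you would rather check that list than take it on faith, the GHCN-M v3 inventory linked above is plain whitespace-separated text with station ID, latitude and longitude as the first three fields; here is a minimal sketch (the local filename is a placeholder for whatever the unpacked .inv file is called).

```python
# Minimal sketch: list the northernmost stations in the GHCN-M v3 inventory.
# The filename is a placeholder for whatever the unpacked .inv file from the
# FTP directory above is called; the first three whitespace-separated fields
# of each line are station ID, latitude and longitude.
stations = []
with open("ghcnm.tavg.qcu.inv") as f:
    for line in f:
        parts = line.split()
        if len(parts) >= 3:
            stations.append((float(parts[1]), line.rstrip()))

for lat, line in sorted(stations, reverse=True)[:3]:
    print(line)
```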
Regarding Steve Mosher’s response:
“90+ % of the variance in monthly temperature is explained by LATITUDE and Elevation of the station.”
Err… in the example I gave, the two locations are at pretty much the same elevation. As regards latitude, UK geography may not be a strong point for US-based posters, but I can assure you that Scotland is further north than southern England.
Edinburgh +6 degC Latitude 56 degN
Farnborough -6 degC Latitude 51 degN
You got it. If the globe is warming, then it shouldn’t take more than a few thermometers around the globe to show it. Using fake, interpolated, imputed data in order to obtain a “global” figure is way overdoing it.
Hotttttttessssssssst innnnnnnnnnnnnnnn onnnnnnnnne hunnnnnnnnnnnnndred twennnnnnnnty thouuuuuuuuusand yearrrrrrrrrrrrrrrs!
Where could I find a table for the last 15 years showing world temperatures – also the hottest values and minimums, please? Satellite and land-based.
Surface data, for example, here:
ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v3/
There you will find 2 x 3 zipped files:
– qcu = unadjusted, qca = adjusted
– tavg, tmin, tmax
All zips contain 2 files each
– data
– station list
Satellite data (tavg):
http://www.nsstc.uah.edu/data/msu/v6.0/tlt/uahncdc_lt_6.0.txt
Sure, they have tmin and tmax as well, but I do not know where that data is.
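The UAH file linked above is plain whitespace-delimited text, so annual global averages are easy to pull out; here is a minimal sketch (it assumes the global anomaly is the third column and that trailer lines repeat the header, so check the file’s own header before relying on it).

```python
import urllib.request
from collections import defaultdict

URL = "http://www.nsstc.uah.edu/data/msu/v6.0/tlt/uahncdc_lt_6.0.txt"
text = urllib.request.urlopen(URL).read().decode()

# Assumed layout: one header row, then one row per month starting with year,
# month and the global anomaly, then trailer lines that repeat the header.
# Parsing stops at the first line whose first field is not a number.
annual = defaultdict(list)
for line in text.splitlines()[1:]:
    parts = line.split()
    if not parts or not parts[0].lstrip("-").isdigit():
        break
    year, globe = int(parts[0]), float(parts[2])
    annual[year].append(globe)

for year in sorted(annual):
    if len(annual[year]) == 12:          # complete years only
        print(year, round(sum(annual[year]) / 12.0, 3))
```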
Not sure they do, Bindidon. Stations report Tmin and Tmax, but satellites give general coverage of the atmosphere rather than specific locations. Global temps don’t have a diurnal cycle.
Well, they do have a diurnal cycle, and the data are there, and they show that a small forcing increasing minimum temps is not what is happening.
And I can show why. This is the surface effect that supports the satellite data paper Willis and Watts just published.
https://micro6500blog.wordpress.com/2016/12/01/observational-evidence-for-a-nonlinear-night-time-cooling-mechanism/
Every day, thousands of surface stations capture the dynamic response of the atmosphere to a solar cycle.
It’s all the ocean cycles.
I don’t think you’ll find a diurnal range for satellite temps. Your point is a different subject.
The latest UAH satellite anomalies for the Arctic region show extremely high temperatures.
For the UAH satellite record, 2016 was definitely the warmest annual average, and it also had the two warmest months in the record, January and October. October was the warmest, and they are the only two months in that data series where the anomaly exceeded 2 °C above the UAH baseline.
Satellite data have the Arctic region warming the strongest – more than twice as much as global – so it appears that the surface records are not inventing warm blobs through interpolation, but rather are corroborated by the satellite record.