UPDATE: Corrected the typo in Figure 3. 1988 now correctly reads 1989.
# # #
There’s lots of blogosphere chatter about the warm temperatures in Russia in November 2013. In their global State of the Climate Report this month, NOAA stated:
According to Roshydromet, Russia observed its warmest November since national records began in 1891. Some areas of the Urals, Siberia, south of the Far East region, and on the Arctic islands in the Kara Sea had temperatures that were more than 8°C (14°F) higher than the monthly average.
NOAA even discussed the record warm temperatures on their global map here.
It might be true that Russian land surface air temperatures were at record levels for the month of November, but NOAA failed to present something that’s blatantly obvious in the data. In 1988, surface air temperature anomalies for much of Russia shifted upwards by more than 1 deg C.
The Russian “hotspot” stands out very clearly in the NOAA map presented in Figure 1. Based on it, I’ve used the coordinates 50N-70N, 30E-140E for the NOAA NCDC data and the climate model outputs presented in the following graphs. That region covers a major portion of Russia.
Figure 1
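For readers who want to reproduce the regional average themselves rather than let the KNMI Climate Explorer do the averaging, here is a minimal sketch of an area-weighted mean over that 50N-70N, 30E-140E box. The file name, the variable name, and the use of xarray are placeholders for illustration only, not necessarily how the data behind these graphs were prepared.

```python
# Minimal sketch: area-weighted average of gridded land surface air
# temperature anomalies over the Russian "hotspot" (50N-70N, 30E-140E).
# "anomalies.nc" and "tempanomaly" are placeholder names; substitute
# whatever the downloaded file actually uses.
import numpy as np
import xarray as xr

ds = xr.open_dataset("anomalies.nc")
box = ds["tempanomaly"].sel(lat=slice(50, 70), lon=slice(30, 140))

# Weight each grid cell by the cosine of its latitude so the smaller
# high-latitude cells do not count the same as the larger ones.
weights = np.cos(np.deg2rad(box["lat"]))
hotspot = box.weighted(weights).mean(dim=("lat", "lon"))

print(hotspot.sel(time="2013-11"))  # the November 2013 anomaly
```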
Figure 2 presents the NCDC land surface air temperature anomalies for the Russian “hotspot” for the period of January 1920 to November 2013. I’ve highlighted approximately when the shift occurred. Before that shift, surface temperatures there warmed very little, if at all. And after it, surface temperatures appear to have warmed, but not at an excessive rate. We’ll confirm that later.
Figure 2
The shift is much easier to see if we smooth the data with a 13-month filter, minimizing the visual impact of the monthly variations. In fact, with the aid of period average temperatures (the horizontal lines) and with some color-coding, the shift in 1988 becomes obvious. See Figure 3. Based on the period-average temperatures before and after 1988, that climate shift raised Russian “hotspot” surface temperatures by about 1.1 deg C.
Figure 3
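Here is a minimal sketch of the smoothing and the period averages, assuming the regional anomalies are held in a monthly pandas Series. The name hotspot and the exact break month are placeholders; the post only identifies the shift as occurring in 1988.

```python
# Sketch of the 13-month smoothing and the before/after period means.
# "hotspot" is assumed to be a monthly pandas Series of anomalies indexed
# by date (e.g. converted from the regional mean above); the break month
# used here is illustrative only.
import pandas as pd

smoothed = hotspot.rolling(window=13, center=True).mean()  # Figure 3-style smoothing

before = hotspot.loc["1920-01":"1987-12"].mean()  # pre-shift period average
after = hotspot.loc["1989-01":"2013-11"].mean()   # post-shift period average
print(f"shift estimate: {after - before:.2f} deg C")  # the post reports about 1.1 deg C
```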
MODEL-DATA COMPARISON BEFORE AND AFTER THE 1988 SHIFT
Figure 4 is a model-data comparison graph for the surface air temperature anomalies of the Russian “hotspot” for the period of January 1920 through December 1987. Both the NCDC surface temperature data and the climate model outputs have been smoothed with 13-month running-average filters. The climate models are represented by the multi-model ensemble mean of the models stored in the CMIP5 archive, using the historical and RCP6.0 scenarios. The CMIP5 archive, as you’ll recall, was used by the IPCC for their 5th Assessment Report. And we discussed why we use the model mean in the post here.
Figure 4
NOTE: The trends in Figures 4 and 5 are based on the “raw” data and model outputs, not the smoothed versions.
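For completeness, here is a minimal sketch of how such trends could be computed from the unsmoothed monthly values. The simple least-squares fit and the series name are assumptions for illustration; they are not necessarily the exact method behind the figures.

```python
# Sketch: least-squares linear trends from the *unsmoothed* monthly values,
# per the note above. "hotspot" is the same placeholder monthly Series.
import numpy as np

def trend_per_decade(series):
    """Ordinary least-squares slope of a monthly series, in deg C per decade."""
    series = series.dropna()
    months = np.arange(len(series))
    slope_per_month = np.polyfit(months, series.to_numpy(), 1)[0]
    return slope_per_month * 120.0  # 120 months per decade

print(trend_per_decade(hotspot.loc["1920-01":"1987-12"]))  # Figure 4 period
print(trend_per_decade(hotspot.loc["1989-01":"2013-11"]))  # Figure 5 period
```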
The models did a reasonable job of simulating the warming rate from 1920 to 1987. Over those 68 years, they overestimated the warming by only about 0.23 deg C. But the models perform quite poorly for the period from January 1989 to November 2013. See Figure 5. During this much shorter 25-year period, the models overestimated the warming by more than 1.1 deg C.
Figure 5
Let’s state that again: the models overestimated the warming by more than 1.1 deg C over the most recent 25-year period.
Climate model failings at the regional levels are not unusual. We discussed those failings in numerous posts over the past year and in my book Climate Models Fail.
WHAT CAUSED THE SHIFT?
The timing of the shift in the Russian surface temperatures is similar to the shift in Scandinavian surface air temperatures. See the post here. There we discussed that the shift in surface temperature was possibly a response to a shift in the sea level pressure and interrelated wind patterns associated with the Arctic Oscillation.
Additionally, see de Laat and Crok (2013), A Late 20th Century European Climate Shift: Fingerprint of Regional Brightening? The authors argue that a shift in the North Atlantic Oscillation (similar to the Arctic Oscillation) in the late 1980s allowed more sunlight to reach the surface, producing an apparent shift in European surface temperatures. I would suspect that something similar occurred over Russia at that time as well.
CLOSING
As in other regions, a climate shift, not the long-term effects of manmade greenhouse gases, is responsible for a major portion of the warming that occurred over much of Russia.
And, of course, climate models performed poorly when attempting to simulate the warming that occurred there since the 1988 shift, overestimating the warming by a large amount. So what else is new?
SOURCE
The NCDC surface temperature data and the CMIP5-archived climate model outputs are available through the KNMI Climate Explorer.





We often refer to and use the AO (Arctic Oscillation), and there are probably some who don’t know exactly what that is. This site does a great job explaining it:
http://www.nc-climate.ncsu.edu/climate/patterns/NAO.html
You can find a chart of the great dying of Russian thermometers here: http://s1244.photobucket.com/user/stanrobertson/media/tempstations.jpg.html?sort=3&o=94
The irony is that Mosher’s BEST product is the one data set designed to simply remove such data jumps, algorithmically, so if it’s still there in their final plot, they must have had to tweak the overall parameter knobs and gizmos to obtain their global hockey stick, Russia be damned.
The Arctic Oscillation shift from 1988 is well apparent in CET and UK temperatures:
http://www.metoffice.gov.uk/climate/uk/actualmonthly/17/Tmax/UK.gif
http://wattsupwiththat.com/2013/10/30/how-long-before-we-reach-the-catastrophic-2c-warming/#comment-1464723
Could this have something to do with “The Great Thermometer Cull” from the ’90s?
I was born and raised during the Cold War. (Not the Warm War today’s kids are in.) Of course the Cold War was political. The USSR was very secretive.
I’ve read that the US Midwest was once an inland sea.
Perhaps Russia has a secret inland ocean and that’s where the heat has been hiding?
(Long way around to make a joke.)
izen says:
“The climate shift seen in Russia coincides with the rapid rate of ice loss in the Arctic that started in the late eighties as well as warming in Scandinavia.”
The accelerated ice loss is from increasingly *negative* AO/NAO episodes from 1996 onwards:
http://www.cpc.ncep.noaa.gov/products/precip/CWlink/pna/nao.timeseries.gif
http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/seaice.anomaly.arctic.png
MarkB, Curt, and Steve from Rockwood,
Thanks a lot for the responses.
I actually was aware of the symmetry issue. And, if anomaly use suppressed all of the 1 / year component, there would be no preference for 13 months over any other period. So I wouldn’t have thought those reasons explain the high popularity.
But your answers suggest to me that there’s likely no better justification, so they probably do.
Bob, concerning the model/data comparison (Figures 4 & 5) and the associated discussion, you haven’t accounted for the presumed “shift” in your accounting of the measured data trend.
To wit, Figure 4 shows 88 years of 0.005 degree/decade or about 0.044 degrees over the interval. Figure 5 shows 25 years of 0.079 degree/decade or about 0.198 degrees, so a total of 0.242 degrees over your data set. In contrast, Figure 3 suggests something more like 1 degree over your data set so you’ve lost something on the order of 0.75 degrees in your accounting.
Why is it that when a region is warmer than normal it’s supposed proof of Global Warming (i.e., Climate Change), but when a region is cooler than normal it is discounted as “just weather”?
So the takeaway I am getting is: using a 13-month moving average is OK because the results are only a little wrong, because we are using residuals, it is only 1/13 of the result, etc. Whatever the excuse for this practice, the fact remains it re-introduces a periodicity and hence defeats the purpose of a moving average, which is to smooth out periodicities. Yes, yes, it isn’t huge: about 7% in the data set I showed, which varies seasonally from 5 to 10. If those seasonal variations are in the residual rather than the raw, it is still a 7% difference. For a time series with a periodicity over an even number of time units, there is no justification for using an odd number of time units, except that you then don’t have to shift your graph by 1/2 unit. There is no mathematical justification for using 13 other than it makes the point on the graph “fall on the line” rather than “between the lines”. Aesthetically pleasing is not a mathematical argument I’m familiar with.
Following on from fredberple’s notes, McKitrick & Michaels (2004) show an order-of-magnitude increase in empty temperature data cells in the USSR after 1990 compared to before:
A test of corrections for extraneous signals in gridded surface temperature data
In considering averaging over 12 months or 13 months, I have suggested that a much better length 13 filter is:
h13modified=[ 1/2 1 1 1 1 1 1 1 1 1 1 1 1/2 ] / 12
as compared to
h13=[1 1 1 1 1 1 1 1 1 1 1 1 1] / 13
We can also compare this to length 12:
h12 = [1 1 1 1 1 1 1 1 1 1 1 1] / 12
Indeed, h13modified, like h12, completely rejects a frequency of 1 (one cycle per year) where the “sampling frequency” is 12/year. The h13 filter, in contrast, lets through about 7.7% at that frequency.
This is just simple digital filtering theory.
It may not show on graphs as much difference, but h13 is clearly subject to question, as Joe Born originally suspected.
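For anyone who wants to verify those numbers, here is a minimal sketch (in Python rather than the Matlab freqz mentioned later in the thread) that evaluates each filter’s magnitude response at one cycle per year, i.e. normalized frequency 1/12 for monthly data:

```python
# Numerical check of the claims above: the magnitude response of each filter
# at one cycle per year (normalized frequency f = 1/12 for monthly samples).
import numpy as np

h13 = np.ones(13) / 13.0                              # plain 13-month boxcar
h12 = np.ones(12) / 12.0                              # plain 12-month boxcar
h13mod = np.array([0.5] + [1.0] * 11 + [0.5]) / 12.0  # half-weighted end points

def gain(h, f):
    """|H(f)| of an FIR filter h at normalized frequency f (cycles/sample)."""
    n = np.arange(len(h))
    return abs(np.sum(h * np.exp(-2j * np.pi * f * n)))

for name, h in [("h13", h13), ("h12", h12), ("h13modified", h13mod)]:
    print(name, round(gain(h, 1.0 / 12.0), 4))
# h13 passes about 0.0769 (1/13) of the annual cycle; h12 and h13modified null it.
```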
==================================================================
Without the myth that what Man does is controlling the weather then there is no excuse for the measures implemented for controlling Man.
To John Eggert at 11:05 AM Dec. 20, 2013:
Indeed. I agree that having the output shifted 1/2 sample (half a month), zero phase with respect to the center of symmetry, is far preferable to non-zero magnitude error. I calculated that error using freqz in Matlab and got 0.07692307692308. And you then just say it’s 1/13! Sure is. Nice – very nice.
Folks who have studied the 1/2-sample shift associated with the four cases of even/odd symmetry and even/odd lengths of linear-phase FIR digital filters won’t bat an eye at this. But if someone has not done this, and found it natural, perhaps it makes some uneasy.
Why do we expect every day, every month, every year, everywhere to be average?
Russia did experience a very mild November in 2013. See: http://weatherspark.com/history/33893/2013/Moscow-Moskovskaya-oblast-Russian-Federation
They had October temps in November. This was a regional, not global, event… the kind of thing that happens somewhere every year. Thus the phrases: “early spring”, “late spring”, “Indian summer”….. We are supposed to freak out because a small region of the planet had 1 month of pleasant weather?
Obviously, highs of 1045 hPa in November over Siberia must be warm air according to the surface temperature grid… LOL
Once again, without an accompanying analysis of lower-tropospheric circulation, these monthly statistics are meaningless on a climatological level. For instance, during the first 10 days of that month, a very cold 1045 hPa high over central Siberia advected warm, moist air over western Russia, just as is happening today on the West Coast, in a clockwise vortex in the wake of the cold air pushing southward.
NikFromNYC says:
December 20, 2013 at 10:10 am
The irony is that Mosher’s BEST product is the one data set designed to simply remove such data jumps, algorithmically, so if it’s still there in their final plot, they must have had to tweak the overall parameter knobs and gizmos to obtain their global hockey stick, Russia be damned.
——————————-
On the contrary, Nik. The BEST algorithm favors “up” jumps and will then add that to any nearby stations that don’t have the “up” jump through the “regional expectations” filter. It’s the “down” jumps that are filtered out.
It’s not that anyone has a copy of the actual algorithm showing this, but it is inevitable from the math in how the raw data turns into such a large increase in the temperature trends.
I just ran two statistical tests, one on GHCN Tavg data bounded by lat/lon that Tisdale identified, another on BEST Russian Tavg data.
Both show a shift upwards of about 1 degree in 1989. BEST is particularly troublesome since their scalpel method is supposed to fix such things (assuming it isn’t a natural event). The stats test I used has a peer-reviewed provenance, so it isn’t just some Mannian made-up methodology.
I will have a detailed post on this coming up this weekend.
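The comment above does not name the specific test, so the following is only a sketch of one generic way to check for a mean shift: a Welch two-sample t-test on annual means either side of a candidate break year. The function name and the choice of test are illustrative assumptions, not the method actually used.

```python
# Illustrative only: one common way to test for a mean shift around 1988/89.
# Annual means reduce, but do not remove, the autocorrelation that the
# t-test formally assumes away.
import numpy as np
from scipy import stats

def mean_shift(annual_means, years, break_year=1989):
    """Estimated step size and Welch t-test p-value for a break at break_year."""
    annual_means = np.asarray(annual_means, dtype=float)
    years = np.asarray(years)
    before = annual_means[years < break_year]
    after = annual_means[years >= break_year]
    t_stat, p_value = stats.ttest_ind(after, before, equal_var=False)
    return after.mean() - before.mean(), p_value
```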
John Eggert says:
December 20, 2013 at 11:05 am
Bernie Hutchins says:
December 20, 2013 at 12:11 pm
—————————————————————-
Look at the raw data. Hadcrut3 from 1850 to present shows monthly changes of up to +/-0.9 while the difference between a 13 point smoothed time series using equal coefficients and one with half end points shows a peak variation of +/- 0.025 degrees. Is this something you really want to quibble over?
Try looking at the unfiltered time series of monthly temperature data (e.g. Hadcrut3). The monthly variation is up to +/- 0.9 degrees. Now filter the data with a 13 point equal weighted box car filter. Monthly variation drops to +/- 0.07. Now filter the original data with your half-end-weighted (“quibble”) filter and the variation drops to +/-0.05. Is this difference meaningful given that the accuracy of the original data is likely +/- 0.5 degrees (25 times greater than the difference in the filtering methods).
@Bernie. 7% may sound like a lot but it isn’t. Different filters often show different results relative to each other, but when you compare the filtered data to the raw original data, the variation in filter response is well below the noise level of the original time series. You are left defending signal variations due to filtering differences that are not even measurable.
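A minimal sketch of the comparison described above: filter the same monthly series with both 13-point kernels and look at the size of the difference. The input array name is a placeholder; the numbers quoted above came from HadCRUT3, but any unfiltered monthly series will do.

```python
# Sketch of the time-domain comparison described above. "monthly" is a
# placeholder for a 1-D numpy array of unfiltered monthly anomalies.
import numpy as np

h13 = np.ones(13) / 13.0                              # equal-weight 13-point filter
h13mod = np.array([0.5] + [1.0] * 11 + [0.5]) / 12.0  # half-weighted end points

equal_weight = np.convolve(monthly, h13, mode="valid")
half_end = np.convolve(monthly, h13mod, mode="valid")
print("largest difference between the two smoothed series:",
      np.max(np.abs(equal_weight - half_end)))
```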
One physics phact to keep in mind is that condensation and freezing of water involve a massive “swap” of heat energy as it changes state. That is, the air must absorb all the latent heat that is removed from the water as it goes “downscale” from vapour to liquid to solid.
AFAIK, Anth.y discovered that the pristine rural cool stations were rejected as “outliers” and homogenized with the nice warm urban ones, systematically.
Same “outlier” trick that was pulled with ARGO, coincidentally. Cold buoys’ data stripped from the “raw” records. Same with the 1990 “Great Dying of the Thermometers”. Inconvenient Andes stations’ data replaced with the average of equidistant coastal and jungle ones. Duh!
Steve from Rockwood on December 20, 2013 at 1:44 pm said in part:
“@Bernie. 7% may sound like a lot but it isn’t.”
Really? 7% may not be a lot compared to 100%, but here we are saying that it is 7% compared to a desired 0%. We are supposed to have, and can get, a null there – that was the purpose. Are you happy to let 7% of what is exactly the (yearly) component through? That’s why our first priority is to place zeros (nulls) on components to be rejected.
And I showed you how to get a good null AND avoid the shift of 1/2 a month.
Is there a reason for intentionally doing it in a way you know is not the best?