Pielke Sr. on the 30-year random walk in the surface temperature record

First, some background for readers who may not be familiar with the term “random walk”.

See: http://en.wikipedia.org/wiki/Random_walk

[Figure from Wikipedia: eight random walks in one dimension starting at 0; the plot shows position on the line (vertical axis) versus time step (horizontal axis).]
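For readers who want to experiment, a walk like those in the figure takes only a few lines to simulate. Here is a minimal sketch in Python (using NumPy and Matplotlib; the eight-walk, 1000-step setup simply mirrors the Wikipedia figure):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)    # fixed seed so the picture is reproducible
n_walks, n_steps = 8, 1000         # eight walks, as in the Wikipedia figure

# Each step is +1 or -1 with equal probability; the walk is the running sum.
steps = rng.choice([-1, 1], size=(n_walks, n_steps))
walks = np.cumsum(steps, axis=1)

plt.plot(walks.T, lw=0.8)
plt.xlabel("time step")
plt.ylabel("position")
plt.title("Eight one-dimensional random walks starting at 0")
plt.show()
```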
============================================================

New Paper “Random Walk Lengths Of About 30 Years In Global Climate” By Bye Et Al 2011

There is a new paper [h/t to Ryan Maue and Anthony Watts] titled

Bye, J., K. Fraedrich, E. Kirk, S. Schubert, and X. Zhu (2011), Random walk lengths of about 30 years in global climate, Geophys. Res. Lett., doi:10.1029/2010GL046333, in press. (accepted 7 February 2011)

The abstract reads [highlight added]

“We have applied the relation for the mean of the expected values of the maximum excursion in a bounded random walk to estimate the random walk length from time series of eight independent global mean quantities (temperature maximum, summer lag, temperature minimum and winter lag over the land and in the ocean) derived from the NCEP twentieth century reanalysis (V2) (1871-2008) and the ECHAM5 IPCC AR4 twentieth century run for 1860-2100, and also the Millennium 3100 yr control run mil01, which was segmented into records of specified period. The results for NCEP, ECHAM5 and mil01 (mean of thirty 100 yr segments) are very similar and indicate a random walk length on land of 24 yr and over the ocean of 20 yr. Using three 1000 yr segments from mil01, the random walk lengths increased to 37 yr on land and 33 yr over the ocean. This result indicates that the shorter records may not totally capture the random variability of climate relevant on the time scale of civilizations, for which the random walk length is likely to be about 30 years. For this random walk length, the observed standard deviations of maximum temperature and minimum temperature yield respective expected maximum excursions on land of 1.4 and 0.5 C and over the ocean of 2.3 and 0.7 C, which are substantial fractions of the global warming signal.”
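The quantity at the heart of the paper, the expected maximum excursion of a random walk of a given length, is easy to approximate by Monte Carlo. The sketch below illustrates the concept only; it is not the authors' method (they invert an analytic relation to infer walk length from observed excursions), and the Gaussian steps, trial count and walk lengths are arbitrary choices for illustration:

```python
import numpy as np

def expected_max_excursion(n_steps, sigma=1.0, n_trials=20000, seed=0):
    """Monte Carlo estimate of E[max_k |S_k|] for a Gaussian random walk
    with step standard deviation sigma."""
    rng = np.random.default_rng(seed)
    steps = rng.normal(0.0, sigma, size=(n_trials, n_steps))
    walks = np.cumsum(steps, axis=1)
    return np.abs(walks).max(axis=1).mean()

# The expected excursion grows roughly like sqrt(n_steps):
for n in (20, 24, 30, 37):
    print(n, round(expected_max_excursion(n), 2))
```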

The text starts with

“The annual cycle is the largest climate signal, however its variability has often been overlooked as a climate diagnostic, even though global climate has received intensive study in recent times, e.g. IPCC (2007), with a primary aim of accurate prediction under global warming.”

We agree with the authors of the paper on this statement. This is one of the reasons we completed the paper

Herman, B.M., M.A. Brunke, R.A. Pielke Sr., J.R. Christy, and R.T. McNider, 2010: Global and hemispheric lower tropospheric temperature trends. Remote Sensing, 2, 2561-2570; doi:10.3390/rs2112561

where our abstract reads

“Previous analyses of the Earth’s annual cycle and its trends have utilized surface temperature data sets. Here we introduce a new analysis of the global and hemispheric annual cycle using a satellite remote sensing derived data set during the period 1979–2009, as determined from the lower tropospheric (LT) channel of the MSU satellite. While the surface annual cycle is tied directly to the heating and cooling of the land areas, the tropospheric annual cycle involves additionally the gain or loss of heat between the surface and atmosphere. The peak in the global tropospheric temperature in the 30 year period occurs on 10 July and the minimum on 9 February in response to the larger land mass in the Northern Hemisphere. The actual dates of the hemispheric maxima and minima are a complex function of many variables which can change from year to year thereby altering these dates.

Here we examine the time of occurrence of the global and hemispheric maxima and minima lower tropospheric temperatures, the values of the annual maxima and minima, and the slopes and significance of the changes in these metrics. The statistically significant trends are all relatively small. The values of the global annual maximum and minimum showed a small, but significant trend. Northern and Southern Hemisphere maxima and minima show a slight trend toward occurring later in the year. Most recent analyses of trends in the global annual cycle using observed surface data have indicated a trend toward earlier maxima and minima.”

The conclusion of the 2011 Bye et al. GRL paper reads

“In 1935, the International Meteorological Organisation confirmed that ‘climate is the average weather’ and adopted the years 1901-1930 as the ‘climate normal period’. Subsequently a period of thirty years has been retained as the classical period of averaging (IPCC 2007). Our analysis suggests that this administrative decision was an inspired guess. Random walks of length about 30 years within natural variability are an ‘inconvenient truth’ which must be taken into account in the global warming debate. This is particularly true when the causes of trends in the temperature record are under consideration.”

This paper is yet another significant contribution that raises further issues on the use of multi-decadal linear surface temperature trends to diagnose climate change.


107 Comments
February 14, 2011 9:13 am

Yes, 2 is a substantial fraction. Especially when you place it over a 1.

Douglas DC
February 14, 2011 9:13 am

Hmm. How about 30-60 year ocean cycles? Another inconvenient factor?

David Larsen
February 14, 2011 9:19 am

How about interglacial periods of 50-100 thousand years. Oscillations during interglacials. Another inconvenient truth.

reliapundit
February 14, 2011 9:33 am

how about they select any period which helps them prove their case against industrialism and helps them advocate socialism?

February 14, 2011 9:39 am

Chaos and randomness are in the mind of the beholders. Chaos and randomness are always politically convenient.

John Blake
February 14, 2011 9:43 am

The paper ought to state that a 30-component set is the threshold for computing a statistically “normal distribution.” Though 95% confidence limits require 120 components, 30 units suffice to plot a Gaussian “normal curve.”
The fact that 30-year climate cycles tend to stabilize at “normal” intervals during this relatively quiescent Holocene Interglacial Epoch indicates a climate-dynamic akin to “random walks” exhibited by broad-market equities as measured by the classic 30 Dow Industrials, representing a disparate amalgam of economic entities related only by large-scale capitalization.
To the extent 30-year climate epochs reflect both land- and sea-surface temperatures, generated for whatever reason, recurrent random-action makes nonsense of linear extrapolations (functions in any case of corrupt data-manipulation, skewed by agenda-driven parties careless of objective fact). As Edward Lorenz showed c. 1960, complex dynamic systems exhibit non-random but indeterminate patterns keyed to “strange attractors”– mathematical artifacts which nonetheless incorporate inescapable features of the real world.
Quite likely, the Green Gang of climate hysterics will (as ever) belittle and bemoan such findings as reported here. Alas for Luddite Thanatists such as Ehrlich, Holdren, lately Keith Farnish, nature takes her course regardless of their vicious prejudices. Meantime, Science as a philosophy, a method, and a practice distinguishes fact from fiction in ways that Briffa, Hansen, Jones, Mann, Trenberth and Steig et al. will not be able to evade for long.

February 14, 2011 9:45 am

If I understand the drift of this work correctly, it goes to the heart of what I have been arguing for a while: time and location have an important impact when daily and annual variation in insolation, non-cloud-based albedo differences and cloud type/amounts are taken into account. The averaging of these things insufficiently reflects the impacts the short-term variations have on global heating and cooling. This results in regional differences that in a sense distort the global record on the one hand, and give an appearance to a minor variation in input-output that can be discounted within even a decadal-long period. Small differences add up or add down in a non-random manner within time periods we don’t understand. Feedbacks of a natural nature amplify these perturbations. Going into and out of a LIA could be a manifestation of such a combination as a “random walk”. Certainly the period from 1965 to today could be such a thing.
A gentle global mist gives you the same statistical precipitation as a major cyclone over northeast Australia, but they do not have the same impact or place in the records.

mikep
February 14, 2011 9:46 am

You might also like to look at the article by T C Mills in the Journal of Cosmology 2010, here
http://journalofcosmology.com/ClimateChange112.html
He finds the temperature history to be best described as the combination of a cyclic component with a random element, and a random walk. He speculates that the
“rather volatile behaviour of the cyclical component is also apparent, with the random innovations continually shocking the component away from a smooth oscillation, although a stochastic cycle with an average period of approximately six years is still evident, perhaps reflecting El Niño induced temperature movements.”
The pre-print linked to has some omissions, but the basic idea is clear enough. From the abstract
“This paper considers fitting a flexible model, known as the structural model, to global hemispheric temperature series for the period 1850 to 2009 with the intention of assessing whether a global warming signal can be detected from the trend component of the model. For all three series this trend component is found to be a simple driftless random walk, so that all trend movements are attributed to natural variation and the optimal forecast of the long term trend is the current estimate of the trend, thus ruling out continued global warming and, indeed, global cooling. Polynomial trend functions, being special cases of the general structural model, are found to be statistically invalid, implying that to establish a significant warming signal in the temperature record, some other form of nonlinear trend must be considered.”
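For the curious, a structural model of this general type (a random-walk trend plus a stochastic cycle) can be fitted with off-the-shelf tools. A minimal sketch using statsmodels follows; the specification is a plausible stand-in for Mills' setup rather than his exact model, and the synthetic series is a placeholder for whichever temperature record you load:

```python
import numpy as np
import statsmodels.api as sm

# Placeholder data so the sketch runs standalone: a random-walk-like series
# standing in for an annual temperature-anomaly record (e.g. HadCRUT).
rng = np.random.default_rng(1)
temps = np.cumsum(rng.normal(0.0, 0.1, 160))

# Local level (driftless random-walk trend) plus a stochastic cycle.
model = sm.tsa.UnobservedComponents(
    temps, level="local level", cycle=True, stochastic_cycle=True)
result = model.fit(disp=False)
print(result.summary())
```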

Slabadang
February 14, 2011 9:51 am

This is to test the basics… and this is back to what climate science should have started with… but ignored.

Gary Pearse
February 14, 2011 9:55 am

I know what random walk stats are but still don’t see very clearly what is being pointed out here; I’m a bit stupid today it seems. Does its use not imply that temp fluctuations are natural variations?

A year or so ago, in a comment on major flooding of the Red River of the North on WUWT, I used permutation analysis of the flood record going back 150 years (I believe) to show that the frequency of new records in flooding (assuming that year 1 was a record) roughly equalled the function ln(n), where n is the number of years considered (150). Ln 150 = 5, which means that if flooding is a random event there will be 4 new records established after year one. As an aside, this also means, if changes are random, that there should be about 4 new snowfall records (season), rainfall records (annual period), and temperature records (annual average) at a given location. Note also that the nature of the function is such that the records get ever more widely spaced in time. E.g. ln(400) is 6, meaning that there would only be 5 new records set in 400 years. It’s fun to play with this simple function in putting records into perspective and in judging whether “weather” variation is random or not. Check out CET by making year 1 a record and then counting the number of successively higher temp records there are over 350 years – I haven’t done this, honest, but it should be 4 or 5 if random. If much more than this, then the warming is real.
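Gary's ln(n) rule is easy to check by simulation. For i.i.d. random data the exact expectation is the harmonic number H(n), roughly ln(n) + 0.577, so slightly above the bare logarithm. A quick sketch:

```python
import numpy as np

def count_records(series):
    """Count running maxima, treating the first value as a record."""
    best, records = -np.inf, 0
    for x in series:
        if x > best:
            best, records = x, records + 1
    return records

rng = np.random.default_rng(0)
n_years, n_trials = 150, 10000
counts = [count_records(rng.normal(size=n_years)) for _ in range(n_trials)]
print("mean records:", np.mean(counts))             # about 5.6
print("ln(n) + 0.577:", np.log(n_years) + 0.5772)   # about 5.6
```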

RiHo08
February 14, 2011 9:56 am

Temperature time-series correlation was assessed in March 2010 by VS on Bart Verheggen’s blog, using 1880 to 2008 data, and he concluded that all temperatures fell within normal variance parameters. I described and commented upon the process and conclusions on Real Climate and the usual vitriolic discussion ensued, including comments from Bart V. and VS. Again, the data that is available does not show a signal apart from normal variance. Over the last year there have been at least 3 informed discussions regarding time-series temperature correlations demonstrating temperatures having characteristics of random walks. This recent paper adds to the increasingly cloudy picture of climate science and the gathering storm of public awareness.

Brian H
February 14, 2011 10:04 am

Random walks are often staggering. By definition.
😉

Brian H
February 14, 2011 10:06 am

Oops, meant to say “drunkards’ walks”. Oh, well.
Is there, btw, a “technical” distinction between “random” and “drunkards'”, anyone? Inebriated minds want to know.

art johnson
February 14, 2011 10:06 am

reliapundit says:
“how about they select any period which helps them prove their case against industrialism and helps them advocate socialism?”
Of course everyone’s entitled to their point of view. My point of view is that I really dislike this sort of comment. The notion that alarmist scientists are somehow all conspiring to destroy capitalism is just hogwash. More importantly, it feeds into certain stereotypes that do the skeptics no good. I wish people would stick to the science, or lack thereof.

ge0050
February 14, 2011 10:19 am

It is amazing the number of climate measures that are integral harmonics of the orbital periods of the planets. It is quite something that the climate on earth, from its position at the center of the solar system, is able to drive the orbit of the planets in this fashion.

Tenuc
February 14, 2011 10:31 am

John Blake says:
February 14, 2011 at 9:43 am
“…To the extent 30-year climate epochs reflect both land- and sea-surface temperatures, generated for whatever reason, recurrent random-action makes nonsense of linear extrapolations (functions in any case of corrupt data-manipulation, skewed by agenda-driven parties careless of objective fact). As Edward Lorenz showed c. 1960, complex dynamic systems exhibit non-random but indeterminate patterns keyed to “strange attractors”– mathematical artefacts which nonetheless incorporate inescapable features of the real world…”
Thanks for bringing this one up, John, and it horrifies me that the IPCC cabal of climate scientists still try to ignore deterministic chaos and rely on linear trends to try to ‘prove’ their case.
Understanding chaos and how complex dynamic systems, like climate, can change behaviour due to strange attractors and the law of MEP (maximum entropy production) is fundamental to being able to understand what the future will bring. It would seem climate scientists today understand less about climate than Lorenz did back in the ’60s – go figure!

Mohib
February 14, 2011 10:39 am

On my timeline, “ClimateGate: 30 Years in the Making” (http://joannenova.com.au/global-warming/climategate-30-year-timeline/), I made the following entry in 2001 about a little-known 2001 report by the UN FAO. It is quite ironic that one side of the UN was making this report while the other side was producing the IPCC reports:
UN FAO: EARTH WARMS AND COOLS EVERY 30 YEARS
The UN Food and Agriculture Organization sought to understand the effect of climate change on long-term fluctuations of commercial catches and reported that several independent measures showed “a clear 55-65 year periodicity” (i.e. approx 30 year warming then cooling) over both short terms (150 years) and long terms (1500 years). [146:1] The report also highlighted that the current “‘latitudinal’ … epoch of the 1970-1990s” is in its final stage and a “‘meridional’ epoch … is now in its initial stage.” [146:2] Latitudinal circulations have corresponded to warm periods and meridional circulations to cool ones.
Abstract [146:1] from the paper:
The main objective of this study was to develop a predictive model based on the observable correlation between well-known climate indices and fish production, and forecast the dynamics of the main commercial fish stocks for 5–15 years ahead. Spectral analysis of the time series of the global air surface temperature anomaly (dT), the Atmospheric Circulation Index (ACI), and Length Of Day (LOD) estimated from direct observations (110-150 years) showed a clear 55-65 year periodicity. Spectral analysis also showed similar periodicity for a reconstructed time series of the air surface temperatures for the last 1500 years, a 1600 years long reconstructed time series of sardine and anchovy biomass in Californian upwelling areas, and catch statistics for the main commercial species during the last 50-100 years. These relationships are used as a basis for a stochastic model intended to forecast the long-term fluctuations of catches of the 12 major commercial species for up to 30 years ahead. According to model calculations, total catch of Atlantic and Pacific herring, Atlantic cod, South African sardine, and Peruvian and Japanese anchovy for the period 2000–2015 will increase by approximately two million tons, and will then decrease. During the same period, total catch of Japanese, Peruvian, Californian and European sardine, Pacific salmon, Alaska pollock and Chilean jack mackerel is predicted to decrease by about 4 million tons, and then increase. The probable scenario of climate and biota changes for next 50-60 years is considered.
Here are the references:
[146]
1. Leonid Klyashtorin, “Climate Change and Long-Term Fluctuations…” (abstract), Food and Agriculture Organization of the UN, Rome, 2001
http://www.fao.org/documents/pub_dett.asp?pub_id=61004&lang=en
2. “Chapter 2: Dynamics Of Climatic And Geophysical Indices”
http://www.fao.org/docrep/005/Y2787E/y2787e03.htm#bm03
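The spectral-analysis step Klyashtorin describes can be illustrated with standard tools. The sketch below runs SciPy's periodogram on a synthetic annual series containing a 60-year cycle; the data are fabricated purely to show the method, not a reconstruction of the FAO series:

```python
import numpy as np
from scipy.signal import periodogram

# Synthetic annual series: a 60-year cycle buried in noise (illustration only).
rng = np.random.default_rng(3)
years = np.arange(1500)
series = np.sin(2 * np.pi * years / 60) + rng.normal(0.0, 1.0, years.size)

freqs, power = periodogram(series, fs=1.0)       # fs = one sample per year
peak = freqs[np.argmax(power[1:]) + 1]           # skip the zero frequency
print(f"dominant period: {1 / peak:.0f} years")  # prints ~60
```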

Dave Springer
February 14, 2011 10:42 am

A truly random walk might not exist in the real world. The debate over whether the universe is deterministic or not has not been settled. In the meantime there are plenty of things that appear random but in reality are simply too complex for practical prediction or things that appear random but in reality have deterministic influences too subtle to detect over short periods of time.

ge0050
February 14, 2011 10:43 am

>>Gary Pearse says:
February 14, 2011 at 9:55 am
Check out CET by making year 1 a record and then counting the number of successively higher temp records there are over 350 years – I haven’t done this, honest, but it should be 4 or 5 if random. If much more than this, then the warming is real.<<
From my eyeball check of the CET currently on the WUWT home page, the number of record highs and lows over the past 350 years appears pretty well matched at around 5. There are a couple of entries that are so close that you really would need to call them tied, given the likely size of the error bars.

A C Osborn
February 14, 2011 10:47 am

Is one of the co-authors VS from Bart Verheggen’s blog?

Solomon Green
February 14, 2011 10:48 am

For a walk to be truly random the size and direction of each step must be independent of those of its predecessors. Where the size of the steps is predetermined the directions must still be random. Little in his book “Higgledy Piggledy Growth” supposed that share price movements in the stock market were independent and that therefore a random walk could be assumed.
Forty years ago Granger, who later went on to gain a Nobel in economics, proved that this supposition did not hold in the UK stock market, and the supposition has also been shown to be invalid in the US and Australian stock markets. In all three markets there was a small but significant positive serial correlation between share price movements. Mandelbrot, at about the same time, also discovered that the random walk theory did not hold in stock markets, as recounted, for example, in his book “The (Mis)Behaviour of Markets”. Incidentally, as Mandelbrot also pointed out, dependence can exist without correlation.

February 14, 2011 11:00 am

Terence Mills also has a very important paper on representation of trends in climate data series in a recent volume of Climatic Change.
Since the late 1980s there has been a steady stream of papers by econometricians and time series analysts looking at whether temperature is a random walk. It is possible to find conclusions on both sides, as well as the intermediate cases of fractional integration, though my impression is that the longer the data sets get, the more likely it is to find RW behaviour. Mills’ conclusion is that the current data sets are best represented by a combination of RW, trend and cyclical components, but a researcher still has some leeway to specify the trend model. Nonetheless, the model getting the most support in the data indicates RW behaviour and yields a contemporary trend component well below GCM forecasts.
This is an area where it is difficult to fully convey how important the underlying question is. The qualitative difference between a data series that contains a random walk and one that doesn’t is enormous. It is completely routine in economics to test for RW behaviour, or as it is known more formally, unit root processes. If your data contain a unit root it simply comes from another planet than non-unit root data, and you have to use completely different analytical methods both to estimate trends and to fit explanatory models. Conventional methods are built on the assumption of stationary processes that converge to Gaussian distributions in the limit. But unit roots are non-stationary and they do not converge to Gaussian limits. They also imply some fundamental differences about the underlying phenomena, namely that means, variances and covariances vary over time, so any talk about (for example) detecting climate change is very problematic if the underlying system is nonstationary. If the mean is variable over time, observing a change in the mean is no longer evidence that the system itself has changed.
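As a concrete illustration of what testing for a unit root involves, here is a minimal sketch using the augmented Dickey-Fuller test from statsmodels, applied to synthetic series; a real analysis of a temperature record would also attend to lag selection and deterministic terms:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(7)
random_walk = np.cumsum(rng.normal(size=500))   # true unit-root process
white_noise = rng.normal(size=500)              # stationary, no unit root

for name, series in [("random walk", random_walk), ("white noise", white_noise)]:
    stat, pvalue = adfuller(series)[:2]
    # A large p-value means the unit-root null cannot be rejected,
    # so conventional trend inference is unsafe for that series.
    print(f"{name}: ADF stat = {stat:.2f}, p = {pvalue:.3f}")
```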
If the IPCC knew enough about this issue to deal with it properly they would have an entire chapter, if not an entire special report, devoted to the question of whether climate is nonstationary and over what time scales. Until this is known there is a very high probability that a lot of statistical analysis of climate is illusory. If that seems harsh, ask any macroeconomist about the value that can be placed on empirical work in macro prior to the 1987 work of Engle and Granger on spurious regression. They wiped out more than a generation’s worth of empirical work, and got a Nobel Prize for their efforts.
If the IPCC were willing or able to deal with this issue, there are a lot of qualified people who could contribute–people of considerable expertise and goodwill. Unfortunately the only mention of the stationarity issue in the AR4 was a brief, indirect comment inserted in the second draft in response to review comments:

Determining the statistical significance of a trend line in geophysical data is difficult, and many oversimplified techniques will tend to overstate the significance. Zheng and Basher (1999), Cohn and Lins (2005) and others have used time series methods to show that failure to properly treat the pervasive forms of long-term persistence and autocorrelation in trend residuals can make erroneous detection of trends a typical outcome in climatic data analysis.

There was also a comment in the chapter appendix cautioning that their “linear trend statistical significances are likely to be overestimated”.
Sometime after the close of IPCC peer review in summer 2006 the above paragraph was deleted, the cautionary statement in the Appendix was muted, and new text was added that claimed IPCC trend estimation methods “accommodated” the persistence of the error term. We know on other grounds that their method was flawed–they applied a DW test to residuals after correcting for AR1, which is an elementary error. The email traffic pertaining to the AR4 Ch 3 review process (not in the Climategate archive, but public nonetheless, if you know where to look) shows that the Ch 3 lead authors knew they were in over their heads. But rather than get help from actual experts, they just made up their own methods, and then after the review process was over they scrubbed the text to remove the caveats inserted during the review process.

Regg
February 14, 2011 11:11 am

30 years of data is an “inconvenient truth”. Mmmmm.
There’s only 30 years of satellite data in that paper. What’s up, Dr. Pielke?

Horace the Grump
February 14, 2011 11:14 am

Mmmm… Brownian motion…

rbateman
February 14, 2011 11:17 am

‘climate is the average weather’
For you C++ fans out there, weather is an instance of the class climate.
You will need many climates to make up the next class. Then there is the added complexity of regions, which can and do act independently and oppositely of neighboring regions, as well as sympathetically.
