
Forecasting the Earth’s Temperature
by David Whitehouse via Benny Peiser’s CCnet
The recent spate of scientific papers attempting to predict what the earth's temperature might be in the coming decades, and to explain the current global temperature standstill, is very interesting, both for the methods used to analyse temperature variations and for the way it illustrates the limitations of our knowledge.
Recall that only one or two annual data points ago many scientists, as well as the most vocal ‘campaigners,’ dismissed the very idea that the world’s average annual temperature had not changed in the past decade. Today it is an observational fact that can no longer be ignored. We should also not forget that nobody anticipated it. Now, post facto, scientists are looking for an explanation, and in doing so we are seeing AGW in a new light.
The main conclusion to be drawn about what will happen to global temperatures, and perhaps it is no surprise, is that nobody knows.
The other conclusion to be drawn is that, without exception, the papers assume a constantly increasing AGW in line with the increase of CO2. This means that any forecast will ultimately lead to rising temperatures, as AGW is forever upward while natural variations have their limits. But there is another way of looking at the data. Instead of assuming an increasing AGW, why not look for evidence of it in the actual data? In other words, let the data have primacy over the theory.
Lean and Rind try to isolate and analyse the various factors that affect decadal changes in the temperature record: El Nino, volcanic aerosols, solar irradiance and AGW. The formula that links these factors together into a time series is quite simple (indeed there is nothing complicated about any of the papers looking at future temperature trends), though the research paper itself does not contain enough information to follow their calculations through completely.
El Nino typically produces 0.2 deg C of warming, volcanic aerosols 0.3 deg C of cooling on short timescales, and solar irradiance 0.1 deg C (I will come back to this figure in a subsequent post), while the IPCC estimate of AGW is 0.1 deg C per decade.
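As a rough illustration of what such a simple multi-factor fit looks like, here is a minimal sketch in Python. The predictor names, the single linear anthropogenic term and the absence of lags are simplifying assumptions of mine; this is not a reproduction of Lean and Rind's actual model.

```python
import numpy as np

def fit_factors(temp, enso, aerosol, tsi, years):
    """Least-squares fit of a temperature anomaly series to an ENSO
    index, a volcanic aerosol index, solar irradiance and a linear
    trend standing in for AGW. All inputs are 1-D arrays of equal
    length; the predictors are placeholders, not Lean and Rind's
    actual (lagged) inputs."""
    A = np.column_stack([
        np.ones_like(years),   # baseline offset
        enso,                  # El Nino / Southern Oscillation index
        aerosol,               # volcanic aerosol optical depth
        tsi,                   # total solar irradiance
        years - years[0],      # linear AGW-like trend (deg C per year)
    ])
    coef, *_ = np.linalg.lstsq(A, temp, rcond=None)
    return coef, A @ coef      # coefficients and the fitted series
```

The fitted coefficients then give the per-factor contributions whose rough magnitudes are quoted above.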
It should also be noted that natural forces are able to produce a 0.5 deg C increase, although over a longer period. The 0.5 deg C warming observed between, say, 1850 and 1940 is not due to AGW.
The temperature increase since 1980, approximately 0.4 deg C, is in fact smaller than the rise seen between 1850 and 1940. It took place in less than two decades and was followed by the current standstill. A fact often overlooked is that this recent increase was much greater than the postulated AGW effect (0.1 deg C per decade) can account for; it must have included natural increases of greater magnitude.
This is curious. If the recent temperature standstill, 2002–2008, is due to natural factors counteracting AGW, and AGW was only a minor component of the 1980–1998 temperature rise, then one could just as logically take the view that a conspiracy of natural factors forced the temperature up during 1980–1998 as that natural factors have held it down since 2002. One cannot have one rule for 2002–2008 and another for 1980–1998!
Lean and Rind estimate that 73% of the temperature variability observed in recent decades is natural. However, looking at the observed range of natural variations, and their uncertainties, one could make a case that the AGW component, which has only possibly shown itself between 1980 and 1998, is not a required part of the dataset. Indeed, if one did not have in the back of one's mind the rising CO2 concentration and the physics of the greenhouse effect, one could make a good case for reproducing the post-1980 temperature dataset with no AGW at all!
Natural variations dominate any supposed AGW component over timescales of three to four decades. If that is so, how should we regard 18 years of warming, and decades of standstill or cooling, in an AGW context? At what point do we question the hypothesis of CO2-induced warming?
Lean and Rind (2009) look at the various factors known to cause variability in the earth's temperature over decadal timescales. They conclude that between 2009 and 2014 global temperatures will rise quickly, by 0.15 deg C, faster than the 0.1 deg C per decade attributed to AGW by the IPCC. Then, in the period 2014–2019, there will be only a 0.03 deg C increase, chiefly because of the effect of solar irradiance changes over the solar cycle. Lean and Rind see the 2014–2019 period as similar to the 2002–2008 temperature standstill, which they say has been caused by a decline in solar irradiance counteracting AGW.
This should cause some of the more strident commentators to reflect. Many papers have been published dismissing the sun as a significant factor in recent warming. The gist of them is that solar effects dominated up to 1950 but have since been swamped by AGW. Now, however, we see that the previously dismissed tiny solar effect is able to hold AGW in check for well over a decade, in fact forcing a temperature standstill of duration comparable to the recent warming spell.
At least the predictions from the various papers are testable. Lean and Rind (2009) predict rapid warming. Looking at the other forecasts for near-future temperature changes we have Smith et al (2007) predicting warming, and Keenlyside et al (2008) predicting cooling.
At this point I am reminded that James Hansen ‘raised the alarm’ about global warming in 1988 when he had less than a decade of noisy global warming data on which to base his concern. The amount of warming he observed between 1980 and 1988 was far smaller than known natural variations and far larger than the IPCC would go on to say was due to AGW during that period. So whatever the eventual outcome of the AGW debate, logically Hansen had no scientific case.
There are considerable uncertainties in our understanding of natural factors that affect the earth’s temperature record. Given the IPCC’s estimate of the strength of the postulated AGW warming, it is clear that those uncertainties are larger than the AGW effect that may have been observed.
References:
Lean, J. L. and Rind, D. H., 2009. Geophys. Res. Lett., 36, L15708.
Smith, D. M. et al., 2007. Science, 317, 796–799.
Keenlyside, N. S. et al., 2008. Nature, 453, 84–88.
“And EMD most certainly does produce useful results, as its adoption in extracting information from seismic and medical data processing can demonstrate.”
Maybe. My hasty evaluation necessarily may not have taken in all the implications. But it appears, prima facie, very ad hoc. I have seen dozens of analytical techniques become faddish and recede over my career (OK, that's hyperbole; maybe 5 or 6 in my specific milieu), but the ones which are fundamental have more staying power. As I said, we tend to decompose time series according to those functions which are typically seen in nature, and for which solid reasons for expecting them to appear exist.
As I stated previously, there is a reason we choose trig functions and polynomials as bases. It is because that is the form a vast array of physical processes assume, because of natural integrations and projections, and the regularity of time as we define it.
Bart,
Most real-world time series are non-stationary: their statistics depend on time. Fourier techniques find such signals very difficult to deal with.
The Queen of Hearts is a very apt analogy. It is indeed execution followed by trial if you first force the time series to be stationary before taking the frequency transform. Empirical mode decomposition, in contrast, does not follow such a retrograde route to a verdict, and it clearly extracts the dominant long-term rising temperature trend.
“Here, for example, is a 400 year cycle.”
I’m afraid the link doesn’t appear to work.
“I’m just saying, there is a lot more going on than the rising CO2 narrative which we are being spoon fed by the promoters of AGW.”
Quantitatively, not a lot more. About 80% of the content of the temperature signal is a rising trend well correlated with the CO2 concentration. I’m happy to discuss the other 20%, but like tallboy you appear to be ignoring the elephant in the room.
Tom – I would argue the opposite. Most real-world random processes are stationary, which is why Fourier analysis has proven so successful over time. And, the most common non-stationary signals are martingales (see Donsker's theorem). Deterministic signals are neither stationary nor non-stationary, as stationarity is a probabilistic concept.
You can deal with non-stationarity in Fourier analysis simply by deriving the expectation of the operation. In this way, for example, we can find that the expected PSD of a Wiener process with autocorrelation E{x(t1)x(t2)} = K*min(t1,t2) is approximately 2*K/omega^2 at non-zero frequencies below the Nyquist rate (if you look up the literature, you may find that many people miss the factor of two by making a false analogy with an Ornstein-Uhlenbeck process with infinite time constant). There is a qualitative difference in that the frequency samples are highly correlated, whereas with a stationary process they are essentially independent. This makes it difficult to get very good quantitative estimates from the PSD for such a process, but it is still an excellent tool for qualitative analysis.
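For anyone who wants to check that factor of two numerically, here is a small Monte Carlo sketch (the parameter values are arbitrary): it averages periodograms of simulated Wiener paths and compares them against 2*K/omega^2 at low, non-zero frequencies.

```python
import numpy as np

# Average periodograms of simulated Wiener paths with
# E{x(t1)x(t2)} = K*min(t1,t2) and compare to 2*K/omega^2.
K, dt, N, trials = 1.0, 0.01, 4096, 1000   # arbitrary test parameters
rng = np.random.default_rng(0)
T = N * dt
omega = 2.0 * np.pi * np.fft.rfftfreq(N, d=dt)   # rad per unit time

psd = np.zeros(len(omega))
for _ in range(trials):
    incr = rng.normal(0.0, np.sqrt(K * dt), N)   # Wiener increments
    x = np.cumsum(incr)                          # one sample path
    X = dt * np.fft.rfft(x)                      # continuous-FT estimate
    psd += np.abs(X) ** 2 / T                    # periodogram |X|^2 / T
psd /= trials

low = slice(1, 50)   # non-zero frequencies well below Nyquist
print(np.mean(psd[low] * omega[low] ** 2 / (2.0 * K)))   # should be ~1
```

The per-bin estimates scatter widely, reflecting the correlated frequency samples mentioned above, but the ensemble mean settles close to 2*K/omega^2.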
“About 80% of the content of the temperature signal is a rising trend well correlated with the CO2 concentration.”
And, you have no information to confirm whether it is a rising trend, or simply another cyclical component. Here is another attempt at that link, but if it doesn’t go through, try some googling of your own on long term periodic climate influences. I think you will be surprised what you find.
“Empirical mode decomposition in contrast does not follow such a retrograde method for reaching a verdict and clearly extracts the dominant long-term rising temperature trend.”
This is absurd. You are merely decomposing the signal another way, according to (I’ll take your word for it) another (arbitrary) orthogonal basis which gives you little insight into the actual physical processes. Let me try one more time to make the point: we generally assume trigonometric and polynomial bases because that is the form which most real world processes assume. Go and perform your EMD, then produce for me your bases, and tell me how such forms might arise physically. Generally, I expect you would not be able to, because you have become unmoored from physicality.
By pulling out a monotonic trend, you are making the assumption that such a trend exists in the large (beyond the boundaries of your data set). But, that may not be the case, and you do not have enough information to determine whether the assumption is true.
J. Bob – I am not ignoring you, it is just that a full response would take rather longer than responding to Tom's questions, and unfortunately, I have a job. Briefly, let me say that, no matter how you do the analysis, your end points are going to be, in some way, an extrapolation, and therefore uncertain. In my suggested method of fitting periodic terms through least squares, I am not saying I can divine the “truth”. It would be more along the lines of demonstrating another plausible interpretation of where the data are going.
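For concreteness, here is a bare-bones version of the least-squares fit of periodic terms I have in mind. The trial periods are purely illustrative guesses, not values established anywhere in this thread.

```python
import numpy as np

def fit_periodic(t, y, periods=(11.0, 22.0, 60.0)):
    """Least-squares fit of a mean plus sinusoids at fixed trial
    periods (same units as t). The default periods are illustrative
    placeholders only."""
    cols = [np.ones_like(t)]            # mean level
    for P in periods:
        w = 2.0 * np.pi / P
        cols += [np.cos(w * t), np.sin(w * t)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef, A @ coef               # coefficients and fitted series
```

Evaluating the fitted sinusoids beyond the span of the data is, of course, exactly the extrapolation caveat above.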
Would it be possible to post an ASCII printout of the data you are analyzing to a page, as you do for your plots, so that I could perhaps generate some plots for you?
Tom –
I think I have said more or less all I can say, and at some point we will have to agree to disagree. But, let me take note of this plot:
http://img9.imageshack.us/img9/1994/glaciervsinstrumental.png
Direct temperature records, of course, only go back to the late 19th century, and even those are questionable given the sparsity of coverage, the sensor siting, etc., all of which have been covered on this blog better than just about anywhere.
But, let’s assume the data are reliable and valid. If we as a species perceived time in reverse in 1850, we might well believe the world was rushing headlong into a global freeze. Year before year, we are pumping less and less CO2 into the atmosphere, and the temperature is plummeting. Remember, we have no perception of what has come before, and it will only be revealed to us in retrograde time. Clearly, time is running in, and we must take action!
You may not see the symmetry in the arguments, but they are wholly the same. Right now, there are those who believe the natural carbon sinks cannot handle the marginal increase in CO2 we are adding to the atmosphere each year. But, in the year 1850, going backward in time, they could as easily be concerned that the sinks are too aggressive, and that by failing to pump more carbon into the atmosphere, we are ensuring it will all be sucked out. If your reaction is skeptical, it is only because you have foreknowledge of what unhappened in this event, whereas you have no such insight into our future.
Bart,
“Most real world random processes are stationary, which is why Fourier analysis has proven so successful over time.”
There’s no doubt about its success and I use it all the time. But most real-world processes involve elements of drift and random walk, and so are not stationary. Often drift can be treated as a spurious signal and removed by detrending before Fourier analysis, especially in engineering applications, but if this is applied to real-world data you may well be removing a major component of the signal.
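A toy example of the point (synthetic numbers, nothing to do with the actual temperature record): a linear drift superimposed on a cycle simply vanishes from the spectrum once you detrend, whether the drift was spurious or a real part of the signal.

```python
import numpy as np
from scipy.signal import detrend

t = np.arange(0.0, 200.0, 0.5)
y = 0.05 * t + np.sin(2.0 * np.pi * t / 11.0)  # linear drift + 11-unit cycle
spec_raw = np.abs(np.fft.rfft(y))              # drift leaks into the low bins
spec_det = np.abs(np.fft.rfft(detrend(y)))     # drift gone, real or not
```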
“Deterministic signals are neither stationary nor non-stationary, as this is a probabilistic concept.”
Correct, but I’m sure we agree that what drives temperatures on Earth isn’t a purely deterministic process. Hence there will be a stochastic component to any temperature signal and the statistics of the signal are important. Indeed you have been implicitly treating the signal as stochastic by using filtering to remove the random noise in your analysis above.
Hence the temperature history can be considered as either stationary or non-stationary, and all the evidence is that it is the latter. For instance, the mean temperature of the Earth has varied on huge time scales, and when the literal end of the world comes as the sun blows up, it will rise monotonically. As you have acknowledged, it is problematic to extract quantitative results from such data using Fourier techniques.
Yes, the link now works, but it says something a little different from what you perhaps intended: “global-scale temperature only shows a minor response” to the solar forcing discussed in their work. Of course there are long-term periodicities associated with Milankovitch cycles, but I still await any evidence of a 400-year forcing of temperatures, and even that cycle would be inconsistent with the glacier record.
“Go and perform your EMD, then produce for me your bases, and tell me how such forms might arise physically.”
I’ve already done this! The plots of the decomposition of the past temperature and CO2 history are above for you to see. I’d prefer you criticised rather than ignored them! You might want to read Arrhenius for a first-order explanation.
“You may not see the symmetry in the arguments, but they are wholly the same…. in the year 1850 going backward in time, they could be as easily concerned that the sinks are too aggressive, and that by failing to pump more carbon into the atmosphere, we are ensuring it will all be sucked out.”
There are some severe physical and societal contradictions in assuming such a symmetry. But even ignoring these, your analysis breaks down. By 1850 such a temperature change in reverse would have decreased steadily over the years, and by then the changes would be minor compared with the dramatic drops experienced a century and a half earlier. I imagine any panic would have subsided. I’m somewhat at a loss to understand what you are trying to demonstrate here.
Bart, here is a site for my composite temperature.
http://www.4shared.com/file/132702881/dc4fb18c/Ave14.html
The columns are: Year, Hadcrut, Ave14 & the number of sites for that year.
On the Fourier filtering, I would guess that if the computational method could re-create the original to the accuracy it did (echo1 & 2), the end points should be pretty close to being right. Anyway, have fun.
I would like to add my pennyworth to the discussion of Empirical Mode Decomposition (EMD) and the matter of trends and randomness.
I suggest EMD is a statistically valid methodology. It is worth careful study, as are the papers by its originator, Dr Huang, especially those on why it is superior to Fourier methods.
The time series of almost all processes relevant to climate dynamics are non-stationary, in that the measures within each are interrelated; they are also non-linear and contain elements of randomness. EMD, unlike most statistical methodologies for analysing time series, makes no assumptions about the linearity or stationarity of a time series. EMD lets the data speak more directly, revealing their intrinsic functional structure more clearly. It does not have the restrictive assumptions of linearity and stationarity that the familiar Fourier-based techniques have, because it uses Hilbert, not Fourier, transforms.
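For readers who want to see the mechanics, here is a minimal sketch of the sifting procedure at the core of EMD. It is a toy illustration only, without the stopping criteria and end-effect handling of Huang's published algorithm.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift(x, t, n_sift=10):
    """Extract one intrinsic mode function (IMF): repeatedly subtract
    the mean of cubic-spline envelopes through the local extrema."""
    h = x.copy()
    for _ in range(n_sift):
        mx = argrelextrema(h, np.greater)[0]
        mn = argrelextrema(h, np.less)[0]
        if len(mx) < 4 or len(mn) < 4:   # too few extrema for envelopes
            break
        upper = CubicSpline(t[mx], h[mx])(t)
        lower = CubicSpline(t[mn], h[mn])(t)
        h = h - 0.5 * (upper + lower)
    return h

def emd(x, t, max_imfs=8):
    """Peel IMFs off one at a time; what remains is the residual,
    which for a record like GMSTA would be the long-term trend."""
    imfs, res = [], x.copy()
    for _ in range(max_imfs):
        if len(argrelextrema(res, np.greater)[0]) < 4:
            break   # residual is (near-)monotonic: the extracted trend
        imf = sift(res, t)
        imfs.append(imf)
        res = res - imf
    return imfs, res
```

Unlike a Fourier decomposition, the "basis" here is not fixed in advance; each IMF is derived from the data themselves.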
I suggest that the EMD analysis of GMSTA data warrants close study, as it is telling us what is happening in the GMSTA time series and, hopefully, in the world in which we live (I think it does). The paper is full of interesting insights, such as (page 14892):
“Finally, it is noted that the global temperature anomalies with respect to the sum of the overall EMD trend and the multidecadal variability appear to be quite stationary in the whole data span, indicating that the higher frequency part of the record in recent years is not more variable than that in the 1800s. The extreme temperature records in the 1990s stand out mainly because the general global warming trend over the whole data length coincides with the warming phase of the 65-year cycle.”
Murray C. Peel, Senior Research Fellow in the Department of Civil and Environmental Engineering, Centre for Environmental Applied Hydrology, University of Melbourne, is an expert on EMD (see http://www.civenv.unimelb.edu.au/~mpeel ). He has, with his colleagues, written several papers about EMD and reporting applications of it.
One of the most interesting is Peel, M. C. and McMahon, T. A., 2006. Recent frequency component changes in interannual climate variability. Geophysical Research Letters, 33, L16810, doi:10.1029/2006GL025670.
Peel and McMahon demonstrated that randomness in the climate system has been on the rise since the 1950s. The authors used the EMD time series analysis technique to quantify the proportion of variation in the annual temperature and rainfall time series that resulted from fluctuations at different time scales. They applied EMD to annual data for 1,524 temperature and 2,814 rainfall stations from the Global Historical Climatology Network.
Peel and McMahon found that the proportion of variance due to inter-decadal fluctuations has been decreasing since the 1950s for rainfall and since the 1970s for temperature. They argue that this means the long-term memory of the climate system is shortening, thus increasing the degree of randomness in the system.
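In code terms, the quantity Peel and McMahon track is essentially the per-component share of variance, which could be computed from an EMD such as the toy sketch above. Note that IMFs are only approximately orthogonal, so the shares are approximate.

```python
import numpy as np

def variance_fractions(imfs, residual):
    """Fraction of total variance carried by each IMF and the residual.
    Because IMFs are only approximately orthogonal, these fractions
    need not sum exactly to the variance of the raw series."""
    comps = list(imfs) + [residual]
    v = np.array([np.var(c) for c in comps])
    return v / v.sum()
```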
Regardless of EMD, I suggest that the global authority on trend analysis and on the analysis of non-linear, non-stationary time series containing elements of randomness is Demetris Koutsoyiannis, Professor of Hydrology and Analysis of Hydrosystems at the National Technical University of Athens, Professor of Hydraulics in the Hellenic Army's Postgraduate School of Technical Education of Officer Engineers, and Editor of Hydrological Sciences Journal. Here is his home page: http://www.itia.ntua.gr/dk . He argues for a different approach based on the Hurst phenomenon. He would not use EMD, nor advocate its use.
There are many papers on his website which keen students of the time-series analysis of non-linear, non-stationary, trendy data with elements of randomness could fruitfully spend months studying. Cohn and Lins have, of course, shown that Nature is naturally trendy (Cohn, T. A. and Lins, H. F., 2005. Nature's style: Naturally trendy. Geophysical Research Letters, 32, L23402).
Bart, forgot to put in the links to echo1 & 2.
http://www.imagenerd.com/uploads/ave14-raw-fft-echo1-2Tgav.gif
http://www.imagenerd.com/uploads/ave14-raw-fft-echo2-p9mBY.gif
Let me know if you got the text file.
Richard Mackey,
Thanks for the summary on EMD.
What is your explanation for the general warming trend, the most dominant mode of the temperature record as decomposed by EMD?