A new paper published today by the Global Warming Policy Foundation explains how statistical forecasting methods can provide an important contrast to climate model-based predictions of future global warming. The repeated failures of economic models to generate accurate predictions have taught many economists a healthy scepticism about the ability of their own models, regardless of how complex, to provide reliable forecasts. Statistical forecasting has proven in many cases to be a superior alternative. Like the economy, the climate is a deeply complex system that defies simple representation. Climate modelling thus faces similar problems. —Global Warming Policy Foundation, 23 February 2016
The global average temperature is likely to remain unchanged by the end of the century, contrary to predictions by climate scientists that it could rise by more than 4°C, according to a leading statistician. British winters will be slightly warmer but there will be no change in summer, Terence Mills, Professor of Applied Statistics at Loughborough University, said in a paper published by the Global Warming Policy Foundation. He found that the average temperature had fluctuated over the past 160 years, with long periods of cooling after decades of warming. Professor Mills said scientists who argued that global warming was an acute risk to the planet tended to focus on the period from 1975 to 1998, when the temperature rose by about 0.5°C. He said that his analysis, unlike the computer models used by the IPCC to forecast climate change, did not include assumptions about the rate of warming caused by rising emissions. “It’s extremely difficult to isolate a relationship between temperatures and carbon dioxide emissions,” he said. –Ben Webster, The Times, 23 February 2016
Bishop Hill reports:
GWPF have released a very interesting report about stochastic modelling by Terence Mills, professor of applied statistics and econometrics at Loughborough University. This is a bit of a new venture for Benny and the team because it’s written with a technical audience in mind and there is lots of maths to wade through. But even from the introduction, you can see that Mills is making a very interesting point:
The analysis and interpretation of temperature data is clearly of central importance to debates about anthropogenic global warming (AGW). Climatologists currently rely on large-scale general circulation models to project temperature trends over the coming years and decades. Economists used to rely on large-scale macroeconomic models for forecasting, but in the 1970s an increasing divergence between models and reality led practitioners to move away from such macro modelling in favour of relatively simple statistical time-series forecasting tools, which were proving to be more accurate.

In a possible parallel, recent years have seen growing interest in the application of statistical and econometric methods to climatology. This report provides an explanation of the fundamental building blocks of so-called ‘ARIMA’ models, which are widely used for forecasting economic and financial time series. It then shows how they, and various extensions, can be applied to climatological data. An emphasis throughout is that many different forms of a model might be fitted to the same data set, with each one implying different forecasts or uncertainty levels, so readers should understand the intuition behind the modelling methods. Model selection by the researcher needs to be based on objective grounds.
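To make the ARIMA idea concrete, here is a minimal sketch (in Python with statsmodels; this is not code from Mills’ report) of fitting an ARIMA model to an annual temperature-anomaly series and producing forecasts with uncertainty bands. The data and the (p, d, q) order below are illustrative assumptions only:

```python
# Minimal ARIMA forecasting sketch. The series here is a placeholder
# random walk standing in for ~160 years of annual temperature anomalies;
# in practice one would use a real series such as HadCRUT annual means.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
anomalies = np.cumsum(rng.normal(0.0, 0.1, size=160))  # illustrative data

# Fit an ARIMA(p, d, q) model; the (1, 1, 1) order is an assumption
# for this sketch, not a recommendation from the report.
model = ARIMA(anomalies, order=(1, 1, 1))
fitted = model.fit()

# Forecast the next 20 "years" with 95% prediction intervals.
forecast = fitted.get_forecast(steps=20)
print(forecast.predicted_mean)        # point forecasts
print(forecast.conf_int(alpha=0.05))  # lower/upper 95% bounds
```

Refitting with a different order, say (0, 1, 2), will generally shift both the point forecasts and the width of the prediction intervals, which is precisely the sensitivity to model selection that the report’s introduction flags.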
There is an article (£) in the Times about the paper.
I think it’s fair to say that the climatological community is not going to take kindly to these ideas. Even the normally mild-mannered Richard Betts seems to have got a bit hot under the collar.
[Embedded tweet from Richard Betts (@richardabetts), 23 February 2016]
My response to “clickbait”: