Still chasing consensus – on building a climate consensus model

Statistics research could build consensus around climate predictions

Philadelphia, PA—Vast amounts of data related to climate change are being compiled by research groups all over the world. Data from these many and varied sources result in different climate projections; hence the need to combine information across data sets to arrive at a consensus regarding future climate estimates.

In a paper published last December in the SIAM/ASA Journal on Uncertainty Quantification, authors Matthew Heaton, Tamara Greasby, and Stephan Sain propose a statistical hierarchical Bayesian model that consolidates climate change information from observation-based data sets and climate models.

“The vast array of climate data—from reconstructions of historic temperatures and modern observational temperature measurements to climate model projections of future climate—seems to agree that global temperatures are changing,” says author Matthew Heaton. “Where these data sources disagree, however, is by how much temperatures have changed and are expected to change in the future.  Our research seeks to combine many different sources of climate data, in a statistically rigorous way, to determine a consensus on how much temperatures are changing.”

Using a hierarchical model, the authors combine information from these various sources to obtain an ensemble estimate of current and future climate along with an associated measure of uncertainty. “Each climate data source provides us with an estimate of how much temperatures are changing.  But, each data source also has a degree of uncertainty in its climate projection,” says Heaton. “Statistical modeling is a tool to not only get a consensus estimate of temperature change but also an estimate of our uncertainty about this temperature change.”

The approach proposed in the paper combines information from observation-based data, general circulation models (GCMs) and regional climate models (RCMs).

Regional analysis for climate change assessment. Image credit: Melissa Bukovsky, National Center for Atmospheric Research (NCAR/IMAGe)

Observation-based data sets, which focus mainly on local and regional climate, are obtained by taking raw measurements from weather stations and interpolating them onto a grid defined over the globe. This allows the final data product to provide an aggregate measure of climate rather than being restricted to individual weather stations. Such data sets cover only current and historical time periods. A related source of information is reanalysis data sets, in which numerical model forecasts and weather station observations are combined into a single gridded reconstruction of climate over the globe.
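As a rough sketch of that gridding step (real data products use far more careful interpolation and quality control, and the paper does not prescribe any particular method), the following bins hypothetical station readings onto a regular latitude-longitude grid and averages within each cell; all station values and the 5-degree resolution are invented for the example:

```python
import numpy as np

def grid_station_data(lats, lons, temps, cell_deg=5.0):
    """Average raw station readings onto a regular lat/lon grid.

    lats, lons, temps: 1-D arrays of station latitudes, longitudes,
    and temperature readings. Cells with no stations come back NaN.
    """
    lat_edges = np.arange(-90.0, 90.0 + cell_deg, cell_deg)
    lon_edges = np.arange(-180.0, 180.0 + cell_deg, cell_deg)
    # Per-cell sum of readings and per-cell station count.
    total, _, _ = np.histogram2d(lats, lons, bins=[lat_edges, lon_edges],
                                 weights=temps)
    count, _, _ = np.histogram2d(lats, lons, bins=[lat_edges, lon_edges])
    with np.errstate(invalid="ignore"):
        grid = total / count  # NaN wherever count == 0
    return grid, lat_edges, lon_edges

# Three hypothetical stations falling in the same 5-degree cell.
grid, _, _ = grid_station_data(np.array([40.1, 41.2, 42.7]),
                               np.array([-104.3, -104.8, -103.9]),
                               np.array([21.4, 20.9, 22.1]))
print(np.nanmax(grid))  # ~21.47, the cell's aggregate temperature
```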

GCMs are computer models that capture the physical processes governing the atmosphere and oceans to simulate the response of temperature, precipitation, and other meteorological variables under different scenarios. While a GCM's portrayal of temperature would not be accurate for a given day, these models give fairly good estimates of long-term average temperatures, such as 30-year means, which closely match observed data. A big advantage of GCMs over observed and reanalysis data is that GCMs can simulate the climate system into the future.
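A toy calculation illustrates why a 30-year average is far more stable than any single simulated day: averaging shrinks independent daily noise by roughly the square root of the number of days. The numbers below are invented, and real daily weather is autocorrelated, which this sketch ignores:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical climate: a fixed 15 C long-term mean with large
# day-to-day "weather" noise of 5 C standard deviation.
true_mean, daily_sd, n_days = 15.0, 5.0, 30 * 365

daily = true_mean + daily_sd * rng.normal(size=n_days)

print(abs(daily[0] - true_mean))      # one day: typically off by degrees
print(abs(daily.mean() - true_mean))  # 30-year mean: off by ~5/sqrt(10950) C
```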

RCMs are used to simulate climate over a specific region, as opposed to the global simulations produced by GCMs. Since climate in a specific region is affected by the rest of the earth, atmospheric conditions such as temperature and moisture at the region's boundary are estimated from other sources, such as GCMs or reanalysis data.

By combining information from multiple observation-based data sets, GCMs, and RCMs, the model obtains an estimate, and a measure of uncertainty, for the average temperature, the temporal trend, and the variability of seasonal average temperatures. The model was used to analyze average summer and winter temperatures for the Pacific Southwest, Prairie, and North Atlantic regions (seen in the image above)—regions that represent three distinct climates, where the climate models would be expected to behave differently. The model was fit to each region separately.
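The three per-region summaries the model targets (an average, a temporal trend, and interannual variability) can be mimicked with a plain least-squares sketch; the paper's actual estimates are Bayesian and carry full uncertainty, so treat this only as a reading aid. The function name and all data are invented:

```python
import numpy as np

def summarize_region(years, seasonal_avg_temp):
    """Fit T_t = mean + trend * (t - t0) + noise by least squares.

    Returns the fitted mean, the linear trend (degrees per year), and
    the residual standard deviation (interannual variability).
    """
    t = years - years.mean()                  # center so intercept = mean
    X = np.column_stack([np.ones_like(t), t])
    beta, _, _, _ = np.linalg.lstsq(X, seasonal_avg_temp, rcond=None)
    resid = seasonal_avg_temp - X @ beta
    return beta[0], beta[1], resid.std(ddof=2)

# Hypothetical summer averages for one region, 1980-2009.
yrs = np.arange(1980, 2010, dtype=float)
temps = 24.0 + 0.03 * (yrs - yrs.mean()) \
        + np.random.default_rng(1).normal(0, 0.5, yrs.size)
mean, trend, iav = summarize_region(yrs, temps)
print(f"mean={mean:.2f} C, trend={trend:.3f} C/yr, interannual sd={iav:.2f} C")
```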

“Our understanding of how much temperatures are changing is reflected in all the data available to us,” says Heaton. “For example, one data source might suggest that temperatures are increasing by 2 degrees Celsius while another source suggests temperatures are increasing by 4 degrees. So, do we believe a 2-degree increase or a 4-degree increase? The answer is probably ‘neither,’ because combining data sources together suggests that increases would likely be somewhere between 2 and 4 degrees. The point is that no single data source has all the answers. And, only by combining many different sources of climate data are we really able to quantify how much we think temperatures are changing.”
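Heaton's 2-versus-4-degree example can be made concrete with inverse-variance weighting, the simplest statistically rigorous way to combine estimates that each carry an uncertainty. The standard deviations below are illustrative, not values from the paper:

```python
import numpy as np

# Two hypothetical sources: +2 C (sd 0.8) and +4 C (sd 1.2).
estimates = np.array([2.0, 4.0])
sds = np.array([0.8, 1.2])

w = 1.0 / sds**2                       # precision weights
consensus = np.sum(w * estimates) / np.sum(w)
consensus_sd = np.sqrt(1.0 / np.sum(w))
print(f"consensus = {consensus:.2f} C +/- {consensus_sd:.2f}")  # ~2.62 +/- 0.67
```

The combined value lands between the two sources, closer to the more certain one, and its uncertainty is smaller than either source's alone.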

While most previous work of this kind focuses on mean or average values, the authors acknowledge that climate in the broader sense encompasses averages, trends, variation between years, and extreme events. Hence the hierarchical Bayesian model used here simultaneously considers the average, the linear trend, and the interannual variability (variation between years). Many previous models also assume independence between climate models, whereas this paper accounts for commonalities shared by the various models—such as common physical equations or fluid dynamics—and the resulting correlations between data sets.
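To give a flavor of what a hierarchical Bayesian consensus model does (the published model is far richer, with spatial structure and explicit cross-model correlation), here is a minimal conjugate-normal sketch: each source reports a change estimate with a known standard error, the source-level "true" changes are assumed to scatter around a shared consensus mean, and a short Gibbs sampler recovers that mean along with its uncertainty. All numbers, including the fixed between-source spread tau, are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical warming estimates (C) and their standard errors from
# several sources (observations, GCMs, RCMs) -- illustrative numbers.
y = np.array([2.1, 2.8, 3.6, 4.0, 2.5])
s = np.array([0.5, 0.7, 0.9, 1.1, 0.6])
tau = 0.8            # between-source spread, fixed here for simplicity

J, n_iter = y.size, 5000
theta = y.copy()     # source-level "true" changes
mu_draws = np.empty(n_iter)

for i in range(n_iter):
    # mu | theta  (flat prior on the consensus mean)
    mu = rng.normal(theta.mean(), tau / np.sqrt(J))
    # theta_j | mu, y_j  (conjugate normal update, shrinks toward mu)
    prec = 1.0 / s**2 + 1.0 / tau**2
    mean = (y / s**2 + mu / tau**2) / prec
    theta = rng.normal(mean, np.sqrt(1.0 / prec))
    mu_draws[i] = mu

burn = mu_draws[1000:]  # discard burn-in draws
print(f"consensus change: {burn.mean():.2f} C +/- {burn.std():.2f} C")
```

The shrinkage of each source toward the shared mean is the hierarchical analogue of the weighted average above, with the uncertainty of the consensus falling out of the posterior draws rather than being bolted on.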

“While our work is a good first step in combining many different sources of climate information, we still fall short in that we still leave out many viable sources of climate information,” says Heaton. “Furthermore, our work focuses on increases/decreases in temperatures, but similar analyses are needed to estimate consensus changes in other meteorological variables such as precipitation. Finally, we hope to expand our analysis from regional temperatures (say, over just a portion of the U.S.) to global temperatures.”

To read other SIAM Nuggets, which explain current high-level research involving applications of mathematics in popular-science terms, go to http://connect.siam.org/category/siam-nuggets/

Source article:

Modeling Uncertainty in Climate Using Ensembles of Regional and Global Climate Models and Multiple Observation-Based Data Sets

Matthew J. Heaton, Tamara A. Greasby, and Stephan R. Sain

SIAM/ASA Journal on Uncertainty Quantification, 1(1), 535–559 (Online publish date: December 17, 2013). The source article is available for free access at the link above through December 31, 2014.

About the authors:

Matthew Heaton is currently an assistant professor in the Department of Statistics at Brigham Young University; the majority of this work was done while he was a postgraduate scientist at the National Center for Atmospheric Research (NCAR). Tamara Greasby is a postdoctoral researcher and Stephan Sain is a scientist at the Institute for Mathematics Applied to Geosciences (IMAGe) at NCAR.

Oh God, have we not already had to put up with industrial-scale stupidity from cartoonist Cook's consensus, without this? There is so much wrong with this approach I can't even be bothered to post a critique. Don't these people realise just how stupid they look to statisticians?

Steven Strittmatter

Since when does consensus mean anything in science?

Peter Miller

Advanced GIGO for geeks.
Like all current climate modelling, it starts with the premise that CAGW theory is correct and global temperatures will rise by 2-4 degrees C over the next century.

Eustace Cranch

Exactly, Peter. Decide the answer ahead of time and find a way to make the models produce it.
Richard Feynman is either screaming or laughing his a– off right now.

This sounds like a movie speech crafted by Mel Brooks.

Steve from Rockwood

So much effort for a 3% gain.

My guess is that they are trying to get everyone to sing from the same page. Currently there are massive contradictions flying about, which rather shoots the notion of consensus in the foot.

M. Hastings

While a GCM's portrayal of temperature would not be accurate for a given day, these models give fairly good estimates of long-term average temperatures, such as 30-year means, which closely match observed data.
Since when have the GCMs closely matched observed data for 30 years?

Gamecock

They can’t even predict the past.

Richard M

Let me help them out. It’s cooling now and will continue to cool for at least another couple of decades.

Alan Robertson

Wait a minute… I thought there already was a consensus.

HGW xx/7

It’s getting to the point where if you wanted me to explain where the environmental establishment went wrong, I wouldn’t even know where to start. It would be like trying to explain the concept of love; it baffles even the most loquacious of us. It has lost any semblance of trying to “save the planet” and now simply exists to continue its existence.
That’s “sustainability” for ya!

Cynical Scientst

Statistics is a tool which one should attempt to use minimally. It is like making sausage. The output may be in more useful form, but it has been altered and adulterated with artificial ingredients. Every step by which data is processed destroys information and adds things to the data that were not originally there.
By the time you get to grand statistical amalgamations of the statistical output of multiple studies and models based on the input of still other studies you have gone way beyond the realm of processed sausage and are eating pure plastic.

“They can’t even predict the past.”
Funniest comment in a while. I’ll be stealing that one! 🙂

RH

And I thought they ran out of ways to fudge data.

“The approach proposed in the paper combines information from observation-based data, general circulation models (GCMs) and regional climate models (RCMs).”
I would like them to focus only on observation-based data. Un-fooled-around-with data would be best: data that does not fill in points that, for whatever reason, were not obtained. I suspect they will get different projections based on different chosen start dates.
Most likely they would discover that their projections would somewhat continue the warming/cooling cycles that we have seen since the end of the LIA.
How can one determine when that cyclic warming will end without knowing all the causes of the cyclic warming?
Also, I have a question:
Aren't models supposed to be based on data/observations? Will their new model(s), as proposed by the paper, be based partly/mostly on current models?

DirkH

“Furthermore, our work focuses on increases/decreases in temperatures, but similar analyses are needed to estimate consensus changes in other meteorological variables such as precipitation.”
This sounds like they don’t trust their own prediction of a wet Saudi Arabia.

The everlasting delusion of statisticians is that they can mine gold from worthless ore simply by applying statistics to a giant boatload of crappy data and model estimates. Bayesian stats are included, no less, which aren't even statistics, simply people giving their own guesses and weighting their belief in same. Maybe someday it will dawn on these innocents that model output and projections are not data.

Sweet Old Bob

NCAR has a fever and it’s contagious?
Symptoms include delusions?

TheLastDemocrat

One of these models is the most accurate. It might be accurate by chance, or because it is fairly well estimating some of the important variables.
Averaging the models is a way to ensure your model is poor by straying away from the best model.

Latitude

Data from these many and varied sources result in different climate projections; hence the need to combine information across data sets to arrive at a consensus regarding future climate estimates…..
You’re going to take bad data….average it….and get what??
“Last fall the Climate Prediction Center of the National Oceanic and Atmospheric Administration predicted that temperatures would be above normal from November through January across much of the Lower 48 states………………”
http://www.businessweek.com/articles/2014-02-18/the-official-forecast-of-the-u-dot-s-dot-government-never-saw-this-winter-coming

David Wells

I needed something to read whilst my graphics driver updates; I wish now that I had gone to the kitchen, found a very large knife and slit my throat. I am having an email and phone debate with Myles Allen and Kevin Anderson about RCPs, which are projections of CO2 emissions made by three different computer models, and currently we are above the highest, RCP 8.5, the worst-case scenario where we all get to fry next Tuesday. No matter how specific I get about the fact that it has not warmed for 17 years and 5 months (it's flat), they tell me I am drawing the line in the wrong place. Kevin Anderson advises Ed Davey, DECC Secretary of State, on mitigation policies, which is what Connie Hedegaard is spending 30 trillion euros on to save 2 hundredths of 1C. Myles Allen conducts jam-jar laboratory experiments on CO2 and makes extrapolations which are then picked up by computer modellers, whose projections are based on those simplistic experiments without any exposure to the real world. I have just read the IPCC pdf, and the time it must have taken to write 17 pages of hyperbole to justify the insignificance of what they are about, and which guys like Kevin Anderson believe, is beyond belief. They just read what other people write, and if it's peer reviewed they accept it as fact; they don't want to look closely or question it, because it could mean they won't have a job in six months. If it is not a conspiracy then it is certainly collusion. I watched a Horizon programme on placebos today: when a consultant held a meeting saying he was going to run trials on the placebo effect, all of the other doctors in the room called him a bastard sceptic and a heretic for daring to challenge the orthodoxy. He was proven right, there is such a thing as the placebo effect, but I don't think I will live long enough to see these green bastards out of a job.

Oldseadog

JohnWho:
Exactly.

Kevin Kilty

Why must we arrive at a consensus? Why wouldn't we be better served to let people do the best job they think possible at analysis and projection, and then plan accordingly? Some may plan poorly while others plan better, but there wouldn't be the risk of everyone planning poorly and identically, ending up like the lemmings.

Alan Robertson says:
February 19, 2014 at 1:28 pm
Wait a minute… I thought there already was a consensus.
*
Yes, but people are getting confused about what the consensus is because it's all getting out of control. Someone must have decided that they needed a consensus of the consensus. And then everyone will know what it is. 🙂

Signal THEORY…there is a point at which NOISE dominates INFORMATION.
I have YET to hear anyone talk about SIGNAL THEORY. (Oh, I got it….too much NOISE out there, no SIGNAL..)

Jeff F

@Cynical Scientst: What I didn't realize, until interacting with many scientists who work in the environmental sciences, is that they firmly believe that cause and effect can be linked through statistics alone. The physics-based connection doesn't seem to be all too important. Worse, if you try to argue the problems associated with a black-box statistical approach, you will be chastised for not knowing anything about statistics. Worse still, if you happen to be an engineer, you will be chastised because, apparently, engineers only understand deterministic systems. Thankfully, I have a PhD in statistical data analysis with an electrical engineering degree, so I simply ignore the ad hominem attacks.
However, there seems to be a problem with how we are educating some students of the sciences. They are abusing the idea that correlation, with all other factors known, is equivalent to causation. This is essentially impossible to establish with complex systems. Worse again, after they mistakenly assume that causation has been established through the use of statistics, everything that they observe a posteriori becomes evidence. Bad news all around.

BioBob

I would truly like to see how they combine data "in a statistically rigorous way" when the data are drawn from a daily sample size of ONE, non-random, non-replicated grab sample!!!
Amaze me….

more soylent green!

Spoiler alert — You don’t get “data” from a climate model! You don’t get data from any model. Model outputs are not facts.

Neil

@Latitude,
Love the article (http://www.businessweek.com/articles/2014-02-18/the-official-forecast-of-the-u-dot-s-dot-government-never-saw-this-winter-coming) and especially love this little quote:
'Climatologists are trying to use their big miss this winter as a learning experience.' Really? There's stuff to learn? But this is Settled Science.

Looks like a long-winded but potentially face-saving exercise for those ready to distance themselves from the increasingly baroque and increasingly embarrassing GCMs. If that is what it will take to calm things down a bit on both the policy and the science fronts, then it is good news.

Dung

Why is it wrong to shoot people? It is blatantly obvious that some people are seriously in need of shooting (twice just to make sure) ^.^

Lloyd Martin Hendaye

Seeking qualitative vs. quantitative rationalization of inherently deficient GCMs is a first-order exercise in turd-polishing. No-one is or can be “expert on the future”, and “error bars” pretending otherwise are naught but academic snares, delusions.
“Climate research” is a classificatory discipline akin to botany. If and when a Mendelian genetic analog appears, benighted catalogers will be the last to know.

Steve from Rockwood

Gamecock says:
February 19, 2014 at 1:20 pm
They can’t even predict the past.
———————————————
Haha. That’s because they keep changing it!

DirkH

Steve from Rockwood says:
February 19, 2014 at 2:28 pm
“Gamecock says:
February 19, 2014 at 1:20 pm
They can’t even predict the past.
———————————————
Haha. That’s because they keep changing it!”
The past is easy to predict. The past always cools over time.

Gail Combs

Science by a committee of politicians and parasites — OH, how wonderful.
Good grief, can't they even READ?
The IPCC actually said in the Science Report in TAR:

“In climate research and modelling, we should recognise that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible”
IPCC 2001 (TAR), Section 14.2.2.2, page 774


So they are going to ignore the fact that climate is ‘a coupled non-linear chaotic system’
Instead:
We’ll toss eye of newt and toe of frog, wool of bat and tongue of dog, adder’s fork and blind-worm’s sting, lizard’s leg and owlet’s wing, into the charmèd computer all shall go, chanting incantations statistical while stirring well. We’ll call it Gore-bull Worming, this charm of trouble, this bubbling broth of he!!.
(Apologies to the Bard.)

Berényi Péter

authors […] propose a statistical hierarchical Bayesian model that consolidates climate change information from observation-based data sets and climate models

Nuff said (“The term hierarchical model is sometimes considered a particular type of Bayesian network, but has no formal definition”). For starters: anything goes.
Just to make it absolutely clear, Bayesian statistics is a way to build a formidable castle on quicksand by formalizing stubborn prejudices as priors, while “hierarchical” is a buzzword making this process as convoluted, opaque and intractable as possible.
On the plus side, the method was developed to handle marketing issues, the class to which “climate change” belongs.

Adam

“statistical hierarchical Bayesian model”
This idea will definitely work, because simple ideas are always closer to nature and there is nothing more simple, easy to understand and less convoluted than a good old “statistical hierarchical Bayesian model”. Honestly guv, it is not BS, it is a really simple way of getting to the predefined truth: that global temperatures have been soaring over the past few decades and urgent action is required NOW to save the planet. For our children! [/sarc]

Curious George

“only by combining many different sources of climate data are we really able to quantify how much we think temperatures are changing.”
Two basic errors here:
1. Not all data sources are equally reliable. You need a good crystal ball to assign weights.
2. Even in the best case, it will only tell us how the temperatures WERE changing.

Londo

Better throw in some climate models or else they might just get it right by accident. We can't have that, can we?
Science by committee, at its worst.

Adam

Gamecock says:
February 19, 2014 at 1:20 pm
They can’t even predict the past.
Nice one!

Gail Combs

BioBob says: @ February 19, 2014 at 2:04 pm
I would truly like to see how they combine data "in a statistically rigorous way" when the data are drawn from a daily sample size of ONE, non-random, non-replicated grab sample!!!
>>>>>>>>>>>>>>>>
Yes one of my favorite nits to pick.

Stuart Elliot

It’s subprime mortgage thinking, applied to climate models. Over here we have a dog turd, but if I put it in a bag with dozens of other dog turds it becomes a reliable low risk investment, er, model.
It will probably turn out the same way as subprime, with perpetrators getting rich and victims still not sure how the people “in charge” let it happen.

knr

It's not unheard of: there were in fact various meetings to decide which gospels should stay in and which stay out of the Bible, and much rewriting of scripture to suit the political purposes of the church. These people are just carrying on in that ‘fine tradition’. We already have St Gore and the ‘prophet’ Mann, whose words should never be questioned, calls for the prosecution of ‘unbelievers’ and hatred for ‘heretics’, so this is hardly a big step. It's just the next logical step.

Londo

Come to think of it, there is this joke about how many Frenchmen it takes to defend Paris; the answer is that the number is not known, because it hasn't been tried. I suspect the remake of this for the future would be how many climate scientists it takes to solve the climate problem, and from this post it very much feels like that number is still not known: it hasn't been tried.
Now the thing is that France had all that modern (military) equipment at the time but didn't use it, and that pretty much resembles climate science. All that stuff is only being used to scare people; beyond that, it's not doing much good.

george e. smith

A wonderful new map of the world, to be sure. Looks like the USA is the actual center of the universe.
I wonder how many orders of magnitude of Nyquist under-sampling THIS landmark study has?
So we have a whole new raft of statistical prestidigitation of climate data that was already known, but that helps nowt in foretelling what happens tomorrow.

K-Bob

This is kind of like putting a bunch of turds in a blender, setting it on puree, and seeing what happens. It'll look a lot like the stuff I get when I eat at a buffet-style all-you-can-eat restaurant. I don't get it: wouldn't this just reproduce the same ensemble of bad models that we have already?

rogerthesurf

I believe that these activities, such as chasing consensus and arguing over theories which are unsupported empirically, are all an orchestrated diversion and smoke screen.
The real danger comes from the United Nations and UN Agenda 21 which is the parent of AGW.
Is the UN a dangerous and malignant bureaucracy? My research points to this.
There is no conspiracy as everything is clearly described and published on official websites etc. All we need to do is read it!
Here is an example on Agenda 21 policy on land.
See my highlights in red on page 8:
http://thedemiseofchristchurch.files.wordpress.com/2014/02/unitednations-conference-on-human-settlements_habitat1.pdf
No wonder property rights are being abused in my city under the guise of an earthquake recovery.
Ever considered the Agenda 21 wetlands policy? It works like this: 1. Persuade a farmer to part with a swampy area of his farm and label it a ‘wetlands restoration’. 2. Do nothing and let the wetland ‘recover’, which means the original drainage will block and the wetland/swamp will spread further onto the farmer's land. 3. If the farmer in desperation clears the drainage problem, then fine or gaol him. 4. Observing that there are now further spreading swamps (‘wetlands’) encroaching on said farmer's land, take that off him too.
http://www.stuff.co.nz/marlborough-express/news/8790787/Farmer-guilty-of-damaging-Rarangi-wetland
Note that this guy once owned the land in question; it is not mentioned how it was transferred to the government. Also, willows are mentioned; willows are not native to New Zealand and used to be regarded as pests because they block waterways.
In my country, maybe more than 50% of farms consist of more than 90% drained swamp.
Check out the warped thinking on page 12, re: educating men or women!
http://thedemiseofchristchurch.files.wordpress.com/2014/02/education-for-sustainable-development-toolkit.pdf
My blog has more real life examples!
Please visit both posts.
http://www.thedemiseofchristchurch.com
Cheers
Roger

george e. smith

I take it that a “Bayesian Model” is a German way to make Aspirin.
Come to think of it, I could use an aspirin right about now after reading this “peer reviewed” (?) paper. I’ll go with the Monsanto cheap aspirin.
And they don’t need to try and convince me that their statistical mathematics is rigorous. I assume that, a priori (Latin words, I don’t understand), about virtually any branch of mathematics. It’s the mischief they get up to with it that grabs my attention.

Adam

Climate models are a good idea and they do work as far as they are designed to.
What is a climate model? An attempt to take the known historical data, put it on a grid, apply all of the physics that we *can compute*, and see whether we can make predictions about the future.
Climate models do work.
We can make very good predictions out to a couple of weeks and excellent 24 hour predictions on a local scale.
BUT.
1. We cannot test predictions on a global scale because we do not even have the global data collection system in place to collect the empirical data that we are trying to predict to anywhere near the precision being spoken about.
2. We cannot (and will never) compute the small scale processes, which are vital and which cannot be ignored and cannot be “averaged”, due to lack of computational power.
3. The equations themselves tell us that predictions beyond a few weeks, on any spatial scale, are COMPLETE NONSENSE because of the error due to the chaotic nature of the nonlinearities. Any real mathematician will tell you that. Just show him the equations and tell him how closely spaced in space and time you have measured the historical data fields. He will tell you that your forward run integrations will get less and less accurate into the future and that after a week (at best!) the uncertainty in your measurement will completely swamp the “signal”.
On point number three: where are all the Professors of Mathematics? Why are they so silent? Why don't they pipe up and say something about the insanity of pumping billions of dollars into climate predictions? Even if you try to predict the end positions of 10 snooker balls which start out spaced by 1 foot in a line, with the first one being struck into the next and so on, you will fail. Why? Because you cannot measure the starting locations of the balls and the strike point of the cue well enough for the error in that measurement not to swamp the “signal”. How can we expect to predict where the energy and moisture concentrations in the Ocean and Atmosphere will end up 20 years from now? How? Why do we think it is even possible?
The thing that has gone wrong in climate science is that the people who are now modelling the climate do not understand the Mathematical facts about the equations they are modelling. The first set of people to write GCM’s did understand. They were Mathematicians who understood chaos theory and the limitations of what could be predicted. I.e. they knew that the best they could achieve were very good weather models.
But now we have people who are called “Climate Scientists” doing the work, and they do not understand the fundamental mathematics of the equations they are modelling. It is beyond their capacity. Because as Professor Lindzen said recently, “Climate science is for second-raters”. They do not understand that it is IMPOSSIBLE to predict global temperature out to many years or decades.
I REPEAT AT THE TOP OF MY VOICE: IT IS IMPOSSIBLE TO PREDICT TEMPERATURE OUT TO MANY YEARS OR DECADES, THE EQUATIONS TELL US THAT.
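The error-growth claim in point 3 is the standard sensitivity-to-initial-conditions argument, and it can be seen in miniature in the classic Lorenz-63 system, a textbook stand-in for atmospheric nonlinearity (this sketch is not tied to the paper or the comment; the parameters are the standard textbook values): two trajectories that start one part in a billion apart diverge until the separation is as large as the attractor itself.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0/3.0):
    """One 4th-order Runge-Kutta step of the Lorenz-63 system."""
    def f(s):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt * (k1 + 2*k2 + 2*k3 + k4) / 6.0

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])   # nearly identical initial conditions

for step in range(1, 5001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 1000 == 0:
        print(step * 0.01, np.linalg.norm(a - b))
# The separation grows roughly exponentially until it saturates at the
# size of the attractor: initial-condition error swamps the forecast.
```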