Statistics research could build consensus around climate predictions
Philadelphia, PA—Vast amounts of data related to climate change are being compiled by research groups all over the world. Data from these many and varied sources results in different climate projections; hence, the need arises to combine information across data sets to arrive at a consensus regarding future climate estimates.
In a paper published last December in the SIAM/ASA Journal on Uncertainty Quantification, authors Matthew Heaton, Tamara Greasby, and Stephan Sain propose a hierarchical Bayesian statistical model that consolidates climate change information from observation-based data sets and climate models.
“The vast array of climate data—from reconstructions of historic temperatures and modern observational temperature measurements to climate model projections of future climate—seems to agree that global temperatures are changing,” says author Matthew Heaton. “Where these data sources disagree, however, is by how much temperatures have changed and are expected to change in the future. Our research seeks to combine many different sources of climate data, in a statistically rigorous way, to determine a consensus on how much temperatures are changing.”
Using a hierarchical model, the authors combine information from these various sources to obtain an ensemble estimate of current and future climate along with an associated measure of uncertainty. “Each climate data source provides us with an estimate of how much temperatures are changing. But, each data source also has a degree of uncertainty in its climate projection,” says Heaton. “Statistical modeling is a tool to not only get a consensus estimate of temperature change but also an estimate of our uncertainty about this temperature change.”
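To make the statistical idea concrete, here is a minimal sketch of precision-weighted pooling under simplified assumptions; it is not the authors' actual hierarchical model, and the estimates, standard errors, and between-source spread (tau) are invented for illustration.

```python
# A minimal sketch of pooling climate estimates, NOT the paper's model.
# Assumptions: each source i reports a temperature-change estimate y_i
# with standard error s_i, all sources share a common true change mu,
# a fixed between-source spread tau captures structural disagreement,
# and mu has a flat prior.
import numpy as np

def pool_estimates(estimates, std_errors, tau=0.3):
    """Return a consensus estimate and its uncertainty (posterior sd)."""
    y = np.asarray(estimates, dtype=float)
    s = np.asarray(std_errors, dtype=float)
    w = 1.0 / (s**2 + tau**2)           # precision weight for each source
    mu_hat = np.sum(w * y) / np.sum(w)  # posterior mean of the true change
    sd_hat = np.sqrt(1.0 / np.sum(w))   # posterior standard deviation
    return mu_hat, sd_hat

# Hypothetical inputs: an observational product, a GCM, and an RCM.
consensus, uncertainty = pool_estimates([2.0, 4.0, 3.1], [0.4, 0.8, 0.6])
print(f"consensus warming: {consensus:.2f} +/- {uncertainty:.2f} deg C")
```

Sources reporting tighter uncertainties pull the consensus toward themselves, and the pooled uncertainty reflects both the individual errors and the spread between sources.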
The approach proposed in the paper combines information from observation-based data, general circulation models (GCMs) and regional climate models (RCMs).
Regional analysis for climate change assessment. Image credit: Melissa Bukovsky, National Center for Atmospheric Research (NCAR/IMAGe)
Observation-based data sets, which focus mainly on local and regional climate, are obtained by taking raw climate measurements from weather stations and applying them to a grid defined over the globe. This allows the final data product to provide an aggregate measure of climate rather than being restricted to individual weather data sets. Such data sets are limited to current and historical time periods. Another source of information related to observation-based data sets is reanalysis data, in which numerical model forecasts and weather station observations are combined into a single gridded reconstruction of climate over the globe.
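As a rough illustration of the gridding step described above, the toy sketch below simply averages station readings that fall into the same latitude-longitude cell; real gridded products involve far more careful interpolation and homogenization, and the station values here are made up.

```python
# Toy cell-averaging of point measurements onto a regular lat/lon grid.
import numpy as np

def grid_average(lats, lons, values, cell=5.0):
    """Average station values onto a grid of cell-degree boxes."""
    ny, nx = int(180 / cell), int(360 / cell)
    total = np.zeros((ny, nx))
    count = np.zeros((ny, nx))
    for lat, lon, val in zip(lats, lons, values):
        i = min(int((lat + 90.0) // cell), ny - 1)   # row of the grid cell
        j = min(int((lon + 180.0) // cell), nx - 1)  # column of the grid cell
        total[i, j] += val
        count[i, j] += 1
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(count > 0, total / count, np.nan)  # NaN where no stations

# Hypothetical station readings: (latitude, longitude, temperature in deg C).
grid = grid_average([40.1, 41.3, 52.5], [-75.2, -74.8, 13.4], [11.2, 10.7, 8.9])
```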
GCMs are computer models that capture the physical processes governing the atmosphere and oceans to simulate the response of temperature, precipitation, and other meteorological variables under different scenarios. While a GCM portrayal of temperature would not be accurate for a given day, these models give fairly good estimates for long-term average temperatures, such as 30-year periods, which closely match observed data. A big advantage of GCMs over observed and reanalysis data is that GCMs are able to simulate climate systems into the future.
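A toy calculation (not a GCM) illustrates why a 30-year average is more trustworthy than any single simulated day: large day-to-day "weather" noise largely cancels in the long-term mean. All numbers below are invented.

```python
# Synthetic daily temperatures: a fixed mean, an annual cycle, and noise.
import numpy as np

rng = np.random.default_rng(0)
days_per_year, n_years = 365, 30
t = np.arange(days_per_year * n_years)
seasonal = 10.0 * np.sin(2 * np.pi * t / days_per_year)  # annual cycle
weather = rng.normal(0.0, 5.0, t.size)                   # daily "weather" noise
daily_temp = 15.0 + seasonal + weather

print("one simulated day:", round(daily_temp[0], 1), "deg C")
print("30-year mean:     ", round(daily_temp.mean(), 2), "deg C (close to 15)")
```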
RCMs are used to simulate climate over a specific region, as opposed to global simulations created by GCMs. Since climate in a specific region is affected by the rest of the earth, atmospheric conditions such as temperature and moisture at the region’s boundary are estimated by using other sources such as GCMs or reanalysis data.
By combining information from multiple observation-based data sets, GCMs and RCMs, the model obtains an estimate and measure of uncertainty for the average temperature, temporal trend, as well as the variability of seasonal average temperatures. The model was used to analyze average summer and winter temperatures for the Pacific Southwest, Prairie and North Atlantic regions (seen in the image above)—regions that represent three distinct climates. The assumption would be that climate models would behave differently for each of these regions. Data from each region was considered individually so that the model could be fit to each region separately.
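The kind of regional summary described above can be pictured with the simplified sketch below, which fits an average level, a linear trend, and a measure of interannual variability to synthetic seasonal-mean temperatures; it is a stand-in for, not a reproduction of, the paper's hierarchical model.

```python
# Fit a mean, a linear trend, and year-to-year variability to synthetic
# summer-mean temperatures for one hypothetical region.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1980, 2010)
summer_means = 24.0 + 0.03 * (years - years[0]) + rng.normal(0.0, 0.5, years.size)

trend, intercept = np.polyfit(years - years[0], summer_means, 1)  # deg C per year
fitted = intercept + trend * (years - years[0])
residual_sd = (summer_means - fitted).std(ddof=1)  # spread of years about the trend

print(f"average level: {summer_means.mean():.2f} deg C")
print(f"linear trend:  {10 * trend:.2f} deg C per decade")
print(f"interannual variability about the trend: {residual_sd:.2f} deg C")
```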
“Our understanding of how much temperatures are changing is reflected in all the data available to us,” says Heaton. “For example, one data source might suggest that temperatures are increasing by 2 degrees Celsius while another source suggests temperatures are increasing by 4 degrees. So, do we believe a 2-degree increase or a 4-degree increase? The answer is probably ‘neither’ because combining data sources together suggests that increases would likely be somewhere between 2 and 4 degrees. The point is that that no single data source has all the answers. And, only by combining many different sources of climate data are we really able to quantify how much we think temperatures are changing.”
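As a back-of-the-envelope illustration of that point (the uncertainties below are invented, not taken from the paper), inverse-variance weighting lands the consensus between the two estimates, closer to the more precise one:

```latex
\hat{\mu} = \frac{y_1/\sigma_1^2 + y_2/\sigma_2^2}{1/\sigma_1^2 + 1/\sigma_2^2}
          = \frac{2/0.5^2 + 4/1.0^2}{1/0.5^2 + 1/1.0^2}
          = \frac{8 + 4}{4 + 1}
          = 2.4\ ^{\circ}\mathrm{C}
```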
While most previous work of this kind focuses on mean or average values, the authors of this paper acknowledge that climate in the broader sense encompasses variations between years, trends, averages, and extreme events. Hence the hierarchical Bayesian model used here simultaneously considers the average, the linear trend, and interannual variability (variation between years). Many previous models also assume independence between climate models, whereas this paper accounts for commonalities shared by various models—such as physical equations or fluid dynamics—and for the correlations between data sets that those commonalities induce.
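One way to see why the independence assumption matters is the small sketch below: if models share errors, averaging them reduces uncertainty less than a naive independent treatment suggests. The correlation value used is purely illustrative and not from the paper.

```python
# Variance of the simple mean of model estimates under different
# assumed pairwise correlations between the models' errors.
import numpy as np

def combined_variance(sd, rho):
    """Variance of the equal-weight mean given per-model sds and correlation rho."""
    n = len(sd)
    cov = np.outer(sd, sd) * (rho + (1 - rho) * np.eye(n))  # shared-error covariance
    weights = np.full(n, 1.0 / n)
    return weights @ cov @ weights

sds = [0.5, 0.5, 0.5]  # hypothetical per-model uncertainties (deg C)
print("assuming independent models:", round(combined_variance(sds, 0.0), 3))
print("assuming correlated models: ", round(combined_variance(sds, 0.6), 3))
```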
“While our work is a good first step in combining many different sources of climate information, we still fall short in that we still leave out many viable sources of climate information,” says Heaton. “Furthermore, our work focuses on increases/decreases in temperatures, but similar analyses are needed to estimate consensus changes in other meteorological variables such as precipitation. Finally, we hope to expand our analysis from regional temperatures (say, over just a portion of the U.S.) to global temperatures.”
To read other SIAM Nuggets, which explain current high-level research involving applications of mathematics in popular-science terms, go to http://connect.siam.org/category/siam-nuggets/.
Source article:
Matthew J. Heaton, Tamara A. Greasby, and Stephan R. Sain
SIAM/ASA Journal on Uncertainty Quantification, 1(1), 535–559 (Online publish date: December 17, 2013). The source article is available for free access at the link above through December 31, 2014.
About the authors:
Matthew Heaton is currently an assistant professor in the department of statistics at Brigham Young University, but the majority of this work was done while the author was a postgraduate scientist at the National Center for Atmospheric Research. Tamara Greasby is a postdoctoral researcher and Stephan Sain is a scientist at the Institute for Mathematics Applied to the Geosciences (IMAGe) at the National Center for Atmospheric Research (NCAR).

Jacques Derrida was a French philosopher best known for developing a form of semiotic analysis known as deconstruction. He is associated with postmodern philosophy, whereby a structure cannot be understood without understanding its genesis. That is, without understanding how the notion of Climate Change was formed, you cannot understand Climate Change. We have here an effort to implant en masse a “consensus” on what climate change is, projecting that this notion will eventually constitute the reality of climate change. Deconstruction exemplified.
Gary Pearse says: February 19, 2014 at 4:18 pm
… Not having the resources to diddle data as much as the US, the 1930s temps still stand north of the border….
>>>>>>>>>>>>>>>>>>>>>>>..
Actually, you can thank your weather historian (no doubt close to retirement) for being a very honest and upright man. And yes, I do know him, but haven’t seen him since I moved south.
Jeff F says:
I don’t think it is so much with the students as the “self-serving bias.” My brother is a research psychologist and he taught me how to do research. He also taught me how to spot the weak points of research and the difference between correlation and causation, which he said (besides fraud) was one of the biggest problems with the current research (albeit 20 years ago).
Based on what he taught me, I immediately found the flaws in climate “science.” I started with the NSA website–they referenced charts in their articles and the charts did not say what they said they said–many read the articles and don’t study the links. I was SURE my brother would see what I saw. But he is a professor and part of the academic bubble, apparently. He scoffed at me when I shared my conclusions–he said “Tim Ball” and rolled his eyes–never heard of Dr. Spencer. Anyone I cited, he said, wasn’t a “real” scientist, and he cited the usual suspects. I went into shock.
He wants to believe the meme and no amount of real science is going to influence him. I asked what it would take for him to reconsider and he answered, “If it doesn’t warm up for 5 more years.” That was 3 years ago. He asked what it would take for me to reconsider: if any of their models could ever get one thing right. I should have said (referring to previous post) “If any of the models could at least predict the past!”
HA HA–great line guys.
I’m thinking of a number from 1 to 100. Their solution to determining the number is to average all the numbers between 1 and 100. Therefore the number I am thinking of MUST be 50. We must act as if it is 50. My denials notwithstanding, it IS 50.
I was thinking of 6, but oh well.
“The approach proposed in the paper combines information from observation-based data, general circulation models (GCMs) and regional climate models (RCMs).”
Hmmmm……
Take one part real data and blend with 2 parts unvalidated climate models.
What do you get?
Science Fiction……
I think that was an interesting paper, but for now what it accomplishes is to provide grist to everyone’s mill. Like, before incorporating GCM results into a Bayesian hierarchical framework, shouldn’t we wait until some of their outputs have been shown to be accurate for 20 years or so without ad hoc modifications? Should we ignore (as the authors do) all the interesting curve-fitting that has been performed to extract possibly quasiperiodic oscillations? Is almost everyone satisfied that the numerous data sets (and the reanalysis data set) have been properly curated? Have the effects of the diverse changes in insolation been properly accounted for?
Here’s a passage from near the end: While many sources of uncertainty are accounted for in this analysis, there are also many sources which are ignored. Notably, uncertainty in initial conditions, imperfect scientific understanding of climate processes, uncertainty in parameterization of processes below grid level, etc. are all unaccounted for in this analysis and could contribute to wider credible intervals than those presented in Figures 4–6.
In order to perform this analysis, certain assumptions had to be made about the biases of climate models in the future. As discussed above, such biases are not identifiable. Hence, comparing the validity of two different sets of assumptions is not possible. In regard to such assumptions, perhaps the important question is not which set of assumptions is “best” but whether different assumptions lead to drastically different results on climate change. That is, are climate change results sensitive to different assumptions about future biases in climate models? Answering this question is an interesting direction for future work.
That’s a good short list of important reasons not to believe any putative “projections” from this method. I can see it being refined by generations of graduate students indefinitely into the future.
“By combining information from multiple observation-based data sets, GCMs and RCMs, the model obtains an estimate and measure of uncertainty for the average temperature, temporal trend, as well as the variability of seasonal average temperatures.”
By combining three different sorts of garbage, we achieve…Der Übergarbage!
pochas says: “Jacques Derrida was a French philosopher best known for developing a form of semiotic analysis known as deconstruction….”
Derrida, by the way, was a Marxist.
Anthony – I came across this .pdf posted at arxiv.org, a paper by Donald C. Morton at the Herzberg Astronomy and Astrophysics Programs, NRC Canada. To my layman’s eye it looks to be a pretty important overview of the potential of solar, interstellar and galactic forcings impacting global climate. Of interest to you for a post?
Abstract
This paper describes some of the astronomical effects that could be important for understanding the ice ages, historic climate changes and the recent global temperature increase. These include changes in the Sun’s luminosity, periodic changes in the Earth’s orbital parameters, the Sun’s orbit around our galaxy, the variability of solar activity, and the anticorrelation of cosmic-ray flux with that activity. Finally, recent trends in solar activity and global temperatures are compared with the predictions of climate models.
http://arxiv.org/ftp/arxiv/papers/1401/1401.8235.pdf
I got the reference via Ben Davidson’s Suspicious0bservers News, posted this morning (the 19th).
As Ben D comments, it seems there’s a ‘sea-change’ in terms of scientists in a lot of disciplines, distancing themselves from the CAGW models.
Thanks, sorry if you’ve seen it already – but I follow WUWT pretty closely and haven’t seen it referenced in postings or guest articles….- David S.
They are jumping onto the airbrushed bandwagon a bit late. People want to see warts and moles these days. The days of an airbrushed-thin Oprah on the cover of her O magazine are over, ladies and gentlemen.
The sad part is that the original hand written logs are gone. Do we not learn? At least let’s keep all the disparate homogenized and sanitized data sets separate. Else we repeat the past: Three data sets disagree. Hey, I have an idea! Why don’t we average them together and let the dog eat the original data sets?
ALL THIS TALK ABOUT CLIMATE CONSENSUS MODELS…
OFFICIAL US FORECAST DIDN’T EVEN SEE THIS WINTER COMING.
http://www.climatedepot.com/2014/02/19/epic-fail-bloomberg-news-the-official-forecast-of-the-u-s-government-never-saw-this-winter-coming-feds-predicted-temperatures-would-be-above-normal-from-november-through-january/
Perhaps OT, but y’all pay attention to this emerging initiative. Kinda fits consensus building, no?
Just sayin’… these folks have a game plan and it is coming from all sides.
http://www.foxnews.com/politics/2014/02/19/fcc-official-others-warn-agency-study-would-squash-news-media-1st-amendment/
The whole PREMISE of the IPCC and the CAGW government grant process is to PROVE an Anthropogenic causal relationship exists for global warming.
When a CO2/global warming correlation ceases to exist (as has been the case for almost 18 years) the secondary purpose of CAGW is to come up with excuses for why this mysterious “anomaly” exists… This isn’t science, it’s propaganda.
The fundamental goal of science is to discover the truth; not to discover ploys to obfuscate the truth and call this hypocrisy science….
http://www.nytimes.com/2014/02/20/business/fcc-to-propose-new-rules-on-open-internet.html?hpw&rref=technology&_r=0
Hummmm, WUWT?
A statistician with an academic title who seems to have forgotten what every student of mathematical statistics is supposed to have learnt, come rain or shine, at least from reading Huff’s How to Lie with Statistics…
Statistics analysed by computer are, by the way, no better than the input data allow them to be. Not to mention that most of the factors that impact climate have never been considered in any of the so-called computer models…
Mike Jonas says:
February 19, 2014 at 4:44 pm
“The argument that we can’t forecast the short term weather so we can’t forecast the long term climate are IMHO logically incorrect (they are different things that may have different primary drivers)”
——————————————————————————————————————–
In these cases the models are not forecasting the long term any more than they are recognizing the pattern of weather changes over a whole year, and a multitude of years. Any fool can forecast that cold weather will begin in the fall, peak in the winter, give way to warmer weather in the spring, and get even warmer in the summer. There is no model made, or that can presently be made, that can forecast the exact temperature (within 0.1 degree C) thirty years out better than even a simple GCM can do for tomorrow. There are simply too many variables and random inputs that will occur the further you travel from your starting date.
@Ossqss: Don’t jump to conclusions. There are two sides to this story. This isn’t a simple situation.
There may be a need for new rules because some internet providers are effective monopolies in their local markets and a few of them seem to be trying very hard to find ways to abuse this monopoly position. As well as charging users for connection they are trying to manipulate things so that they can also charge website operators for allowing people to access their content.
The excuse is that they want to sell the option of providing website owners with a faster connection. The excuse really doesn’t make much sense in terms of the technology, but sounds plausible to legislators. However that is just the foot in the door.
If they get their way then what sites you can see will depend on where you are accessing from and there will be rent collectors at every step in the road from website to user. At the extreme Anthony might be asked to pay all these rent collectors just so that people can see his website.
more soylent green! says:
February 19, 2014 at 2:07 pm
Spoiler alert — …….. You don’t get data from any model……….
++++++++++++++++++++++++++++++++++++++++++++++
36″ 25″ 36″ pick yer own model data.
Reminds one of the multi-proxy temperature reconstruction insanity. Who cares if bristlecone pines are not good thermometers. Who cares if the proxy is upside down. Who cares if the temperature response is not linear. Who cares if one rogue tree has inordinate influence. Throw them all in anyway. The magic of statistical prestidigitation converts the sow’s ear into a silk purse, or so we are told by consensus climate science.
@more soylent green!
There is no difference between Climate and Weather models, just the scales. If you like, Climate Models are Weather Models with coarser grids that cover the entire globe, whereas Weather Models have finer grids and are not global. Due to computing constraints, Climate Models also have much larger time steps, typically a few hours.
Ossqss,
Your post is not off topic. The administration will use this power, via the illegal appointments to the FCC, to require the media to push their agenda, starting with the climate agenda. It will take control of the media and stop all criticism. It is one of many fronts to get control of carbon and your energy usage. Based on the Obamacare debacle, we know how well the government can manage our energy.
“However, one agency commissioner, Ajit Pai, said in a Wall Street Journal op-ed piece Wednesday that the May 2013 proposal would allow researchers to “grill reporters, editors and station owners about how they decide which stories to run.”
He also said he feared the study might stifle the freedom of the press.
“The American people, for their part, disagree about what they want to watch,” wrote Pai, appointed to the FCC’s five-member commission in May 2012 by President Obama. “But everyone should agree on this: The government has no place pressuring media organizations into covering certain stories.”
An Obama administration plan that would get researchers into newsrooms across the country is sparking concern among congressional Republicans and conservative groups.”
“The purpose of the proposed Federal Communications Commission study is to “identify and understand the critical information needs of the American public, with special emphasis on vulnerable-disadvantaged populations,” according to the agency.”
By deciding what critical information must or cannot be disseminated, the media would be required to promote climate change and global warming even though the facts do not substantiate the claims!!
Obviously the Klimate Khange Konsensus folks are at work here.
@Katherine – shouldn’t we just throw out ALL the GCMs as useless for forecasting climate change/weather?
Computer modelling is confirmation bias performed at speeds measured in teraflops.
Science is finding out about things you do not know. You can only put into a computer model things you already know. Therefore computer models are not science.
QED.
@Adam: You are right on the money. In my opinion, the reason that the long-term models fail is the key point. It is a subtle point that was made clear in a post on Dr. Curry’s website about spatio-temporal chaos that was discussed here at another time. The fact is that almost all climate scientists (modelers or otherwise) firmly believe that their system can be modeled as a stochastic system. It is pretty simple to see why they think this way. All of their science training tells them to treat every complex system as a stochastic one. Worse, those in climate science usually have less background in physics (as a meteorologist would have) and more background in large-scale systems theory. As a result, we have bad models and a complete disregard for how to model the behavior of the atmosphere over long time scales (which is, as you point out, currently impossible to do). In the end, it is best to let climate science fail in this endeavor – and that is the one and only thing in climate science that I will attach a high confidence to.
Latitude says:
February 19, 2014 at 1:56 pm
Data from these many and varied sources results in different climate projections; hence, the need arises to combine information across data sets to arrive at a consensus regarding future climate estimates…..
You’re going to take bad data….average it….and get what??
==========================================================
LOL, well yes, they are climate scientists. They take several dozen models that are wrong in one direction (“over warm”), and then project the mean of those dozens of wrong models.
The point of my prior posts on this thread was quite simple and observational.
The recent activity of the FCC seems to follow a pattern. Is it similar to what we have seen previously from others: DOJ, IRS, EPA, NSA, DOE, and the like?
No matter your affiliation, do you feel more freedom from this type of stuff?
Think about it
http://youtu.be/diYAc7gB-0A