At #AGU17, in an attempt to prove ‘climate change creates more severe weather’, scientists resort to ‘p-hacking’

From the “If your experiment needs statistics, you ought to have done a better experiment.” (Ernest Rutherford) department.

From the American Meteorological Society via press release:


AMS TO RELEASE REPORT EXPLAINING EXTREME EVENTS OF 2016

New research links notable weather and climate events to human influence

The American Meteorological Society is releasing the strongest evidence yet that humanity’s long-term release of greenhouse gases has altered recent extreme weather and climate events. In the new collection of peer-reviewed papers analyzing the link between extremes in 2016 and climate change, scientists identified three events that would not have been possible without human-caused changes in the climate.

The linkages are made in a newly released report—Explaining Extreme Events from a Climate Perspective, a supplement to the Bulletin of the American Meteorological Society. This collection of studies from 116 scientists analyzes 21 different events in 2016, from U.S. snowstorms and South African drought to oceanic hot spots and Arctic warmth. Most of the events researchers examined can be attributed, at least in part, to human-caused climate change.

Some of the analyses go beyond atmospheric and oceanic extremes to link societal or ecological impacts, including coral bleaching and crop failures, to human-caused climate change.

The American Meteorological Society will release Explaining Extreme Events in 2016 from a Climate Perspective at the AGU Fall Meeting on Wednesday, December 13, 2017 at 11:30am CDT.

The panel, including editors and authors of the papers, will discuss their findings about natural and human influences on the extreme events, as well as the developing maturity of attribution science.

Reporters are invited to attend the press conference in person at the Morial Convention Center in New Orleans or else via live web streaming offered by the American Geophysical Union.

Events Assessed for 2016 include:

  • The notorious warm “Blob” in the Pacific Ocean
  • Flash Droughts in South Africa
  • Wildfires in North America and Australia
  • Cold Snap in Eastern China
  • Drought in Northeast Brazil

UPDATED: You can download the entire report: http://www.ametsoc.net/eee/2016/2016_bams_eee_low_res.pdf

The press release and links to individual chapters are here:

https://www.ametsoc.org/ams/index.cfm/publications/bulletin-of-the-american-meteorological-society-bams/explaining-extreme-events-from-a-climate-perspective/

From the introduction of the 2016 edition:

As in past years, this sixth edition of Explaining Extreme Events from a Climate Perspective includes studies of extreme events from around the world that did not find a role for climate change in influencing the magnitude or frequency of an event. It is important to note that papers are selected for consideration in this report by reviewing author proposals that do not indicate whether a role for climate change will or will not be found. Thus, there is no selection bias on the part of the editorial team toward one particular conclusion, and this publication prides itself as a venue that accepts papers without consideration for whether a role for climate change is found. This year there may be a slight bias toward events that do not find a signal relative to previous years because the editors have begun to limit the number of heat papers in the report which is the event type where a signal is most commonly found.

Given that the majority of heat papers now use a widely established and accepted methodology, the scientific value of continuing to include a large number of heat studies began to seem limited. Extreme weather event types included in this year’s edition include ocean heat waves, forest fires, snow storms, and frost, as well as heavy precipitation, drought, and extreme heat and cold events over land. A number of papers also look at the impacts of extremes (Fig. 1.1). The Summary of Results Table (Table 1.1) gives readers a general overview of the results. Twenty-one of the 27 papers in this current edition identified climate change as a significant driver of an event, while six did not. Of the 131 papers now examined in this report over the last six years, approximately 65% have identified a role for climate change, while about 35% have not found an appreciable effect.

Last year, the editors called on scientists submitting research proposals to investigate potential links between an extreme event and its subsequent impact, and we were excited to see five research teams take on this challenge in this year’s report. Lewis and Mallela concluded that the risk of the extreme Great Barrier Reef bleaching event was increased through anomalously high sea surface temperature and the accumulation of thermal stress caused by human-caused climate change. Jacox et al. and Brainard et al. both examined how high ocean temperatures caused in part by human-caused climate change impacted living marine resources like coral bleaching, reduced fish stocks, and a decrease in seabird counts in the California Current and the equatorial Pacific, respectively. On land, Sippel et al. found that human-caused climate change is causing warmer winters on the Iberian Peninsula and, when coupled with a wet spring, drove higher ecosystem productivity in the region in 2016. However, these papers represent early approaches, and more work is needed to develop impact attribution methodologies.

As is always the case, we would caution that the results of any single study should not be interpreted as the final word on the matter for that event, nor be generalized to a broader class of extremes. For example, authors of these papers selected specific modeling approaches and made other choices about factors that are important in how the models replicate extreme events, such as terrestrial heat or sea surface temperatures. If other study designs were applied to these events, it is possible a different result would be reached. The importance of the methodological approach in attribution research is further discussed in the summary of this report (Stott et al.).

A big question raised by this collection of research is whether these findings undermine the axiom that “no event is caused by climate change alone and that natural variability always plays some role.” The short answer is no. While several of the studied events were found not to be possible without climate change, natural variability still laid the foundation for the events to occur, and the authors acknowledge this in their papers. Extreme events are always the result of a cumulative set of factors. The building blocks that form the foundation of any event continue to include natural variability, with factors such as El Niño potentially adding to the strength of the event. These temperature-related extremes would likely still have been warm events even without human-caused climate change, but according to these analyses, the events could not have surpassed the extreme warm thresholds that they did without climate change. This was especially the case for the record-setting globally averaged temperature. At the global scale, the natural variations of Earth’s temperature are increasingly seen to pale in comparison to the growing intensity of human-induced warming. Overall, human-caused climate change allowed them to pass a threshold that they could not otherwise have exceeded.


From the introduction in the 2015 edition:

This last year has been exciting for attribution science, as the U.S. National Academy of Sciences released its report on the topic (NAS 2016). To date, it is the most comprehensive look at the state of event attribution science, including how the framing of attribution questions impacts the results. For example, in a complex event such as drought, a study of precipitation versus a study of temperature may yield different results regarding the role of climate change. The report also addresses how attribution results are presented, interpreted, and communicated. It provides the most robust description to date of the various methodologies used in event attribution and addresses the issues around both the confidence of the results and the current capabilities of near-real time attribution. No single methodology exists for the entire field of event attribution, and each event type must be examined individually. Confidence in results of an attribution analysis depends on what has been referred to as the “three pillars” of event attribution: the quality of the observational record, the ability of models to simulate the event, and our understanding of the physical processes that drive the event and how they are being impacted by climate change.

I’m not at all impressed with the “three pillars,” because what typically happens is that if a model doesn’t simulate an event on the first pass, the researchers keep tweaking it until it does. Eventually, they all become “Clever Hans” in being able to respond to the weather events nature provides.

Larry Kummer of Fabius Maximus comments via email to me:

An exercise in data mining.

How many kinds of extreme weather are there? How many of these irregularly defined geographic areas are there? Combine the two into a database. A survey of one year will always find outliers at the 5% level — by chance, because there are so many possibilities. Given that, it is easy for ingenious scientists to link some of them to anthropogenic effects.
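Kummer’s arithmetic can be made concrete. A minimal sketch (assuming, for illustration, that each event analysis is an independent test at the 5% level; the count of 21 is taken from the number of events in the report) of how quickly the chance of at least one spurious “significant” finding grows:

```python
# Probability of at least one false positive (the "family-wise error
# rate") when m independent null hypotheses are each tested at
# significance level alpha.  Independence is an illustrative
# simplification; real event analyses are correlated.

def family_wise_error_rate(m: int, alpha: float = 0.05) -> float:
    """P(at least one of m independent tests is falsely significant)."""
    return 1.0 - (1.0 - alpha) ** m

if __name__ == "__main__":
    for m in (1, 5, 21, 100):
        print(f"m={m:3d}  chance of a spurious hit = "
              f"{family_wise_error_rate(m):.1%}")
```

With 21 tests the chance of at least one chance “finding” is already about two-thirds, before any analyst discretion enters the picture.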

More useful would be to see if the overall class of extreme events showed trends over time, or at least whether some classes of extreme events (temperature, precipitation) did so. Otherwise they have not shown that anything unusual happened in 2016. Just weather.

Literally, all that is going on here is “p-hacking”, and it is well known to have bias problems.

Data dredging (also data fishing, data snooping, and p-hacking) is the use of data mining to uncover patterns in data that can be presented as statistically significant, without first devising a specific hypothesis as to the underlying causality.

After they find a statistically significant set of data, they then use the “three pillars” to assign causality, and that causality is always climate change. The problem is that there’s a built-in bias involved; for example, this 2015 paper in PLOS One explains why (bold mine):

The Extent and Consequences of P-Hacking in Science

Abstract

A focus on novel, confirmatory, and statistically significant results leads to substantial bias in the scientific literature. One type of bias, known as “p-hacking,” occurs when researchers collect or select data or statistical analyses until nonsignificant results become significant. Here, we use text-mining to demonstrate that p-hacking is widespread throughout science. We then illustrate how one can test for p-hacking when performing a meta-analysis and show that, while p-hacking is probably common, its effect seems to be weak relative to the real effect sizes being measured. This result suggests that p-hacking probably does not drastically alter scientific consensuses drawn from meta-analyses.

In Wikipedia, there is this description and a curious graph:

The process of data dredging involves automatically testing huge numbers of hypotheses about a single data set by exhaustively searching — perhaps for combinations of variables that might show a correlation, and perhaps for groups of cases or observations that show differences in their mean or in their breakdown by some other variable. Conventional tests of statistical significance are based on the probability that a particular result would arise if chance alone were at work, and necessarily accept some risk of mistaken conclusions of a certain type (mistaken rejections of the null hypothesis). This level of risk is called the significance. When large numbers of tests are performed, some produce false results of this type, hence 5% of randomly chosen hypotheses turn out to be significant at the 5% level, 1% turn out to be significant at the 1% significance level, and so on, by chance alone. When enough hypotheses are tested, it is virtually certain that some will be statistically significant but misleading, since almost every data set with any degree of randomness is likely to contain (for example) some spurious correlations. If they are not cautious, researchers using data mining techniques can be easily misled by these results.
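The “spurious correlations” claim in that passage is easy to verify by brute force: correlate many pairs of unrelated random series and count how many come out nominally significant. A sketch (the series length and number of pairs are arbitrary illustration values):

```python
# Generate pairs of independent random series and count how many show
# a "statistically significant" Pearson correlation at the 5% level.
# By construction there is no real relationship, so roughly 5% of
# pairs should clear the bar by chance alone.

import math
import random

def pearson_r(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def is_significant(r, n, alpha=0.05):
    """Approximate two-sided test of r = 0, using the normal
    approximation to the t statistic (rough but adequate for n = 50)."""
    t = r * math.sqrt((n - 2) / (1.0 - r * r))
    p = math.erfc(abs(t) / math.sqrt(2.0))
    return p < alpha

if __name__ == "__main__":
    random.seed(0)
    n, pairs = 50, 2000
    hits = sum(
        is_significant(pearson_r([random.gauss(0, 1) for _ in range(n)],
                                 [random.gauss(0, 1) for _ in range(n)]), n)
        for _ in range(pairs))
    print(f"{hits} of {pairs} unrelated pairs are nominally 'significant'")
```

Run over thousands of pairs, a steady few percent of pure-noise relationships look “significant,” which is all the spelling-bee/spider-bite graph below is exploiting.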

There’s this example provided:

An example of data produced by data dredging, apparently showing a close link between the best word winning a spelling bee competition and the number of people in the United States killed by venomous spiders.

There’s also the famous graph showing climate change correlates to the number of pirates.

Larry Kummer adds via email:

In general use, p-hacking can be used to mine datasets for patterns to support a pre-existing hypothesis — such as “AGW is increasing the incidence and magnitude of extreme weather.” Operationally, it can result from scientists each checking their own database (a specific kind of weather in a specific geographical region) for increased extremes — which can be attributed to AGW or CC. Since negative findings are not reported, this inevitably results in “findings.”

This is a classic setup for replication failure, as so many other fields — in both “hard” and “soft” sciences — have discovered. I’ve written about this, starting with this from April 2016: The replication crisis in science has just begun. It will be big.

Scientists have been searching for years for the elusive link between “climate change” and “severe weather.” For example, this editorial in Nature put the onus on them back in 2012:

From Nature: Extreme weather

Better models are needed before exceptional events can be reliably linked to global warming.

As climate change proceeds — which the record summer melt of Arctic sea-ice suggests it is doing at a worrying pace — nations, communities and individual citizens may begin to seek compensation for losses and damage arising from global warming. Climate scientists should be prepared for their skills one day to be probed in court. Whether there is a legal basis for such claims, such as that brought against the energy company ExxonMobil by the remote Alaskan community of Kivalina, which is facing coastal erosion and flooding as the sea ice retreats, is far from certain, however. So lawyers, insurers and climate negotiators are watching with interest the emerging ability, arising from improvements in climate models, to calculate how anthropogenic global warming will change, or has changed, the probability and magnitude of extreme weather and other climate-related events. But to make this emerging science of ‘climate attribution’ fit to inform legal and societal decisions will require enormous research effort.

Attribution is the attempt to deconstruct the causes of observable weather and to understand the physics of why extremes such as floods and heatwaves occur. This is important basic research. Extreme weather and changing weather patterns — the obvious manifestations of global climate change — do not simply reflect easily identifiable changes in Earth’s energy balance such as a rise in atmospheric temperature. They usually have complex causes, involving anomalies in atmospheric circulation, levels of soil moisture and the like. Solid understanding of these factors is crucial if researchers are to improve the performance of, and confidence in, the climate models on which event attribution and longer-term climate projections depend.

Read the full editorial here.

Dr. Roger Pielke Jr. observed then:

The 116 scientists have finally come to a point where they figured out how to justify their claims with “better models” and data mining, but all the correlation in the world does not equate to causation.

Meanwhile, examining one of the most fearful severe weather metrics, tornadoes, doesn’t seem to show a correlation:

But in the case of this recent BAMS special report, the researchers truly believe the correlation must be there, and belief is a powerful motivator. So they set out on a path of self-reinforcing data discovery to prove it, and, just like the spelling-bee words and venomous-spider deaths, they certainly found what they were looking for.

I weep for science.


NOTE: Shortly after publication, several updates and edits were added to improve the article, including additional excerpts from the 2016 report, and a new comment from Larry Kummer.

95 Comments
Gamecock
December 13, 2017 11:31 am

‘In the new collection of peer-reviewed papers analyzing the link between extremes in 2016 and climate change, scientists identified three events that would not have been possible without human-caused changes in the climate.’

A juvenile appeal to ignorance.

jclarke341
Reply to  Gamecock
December 13, 2017 7:03 pm

Yes, Gamecock! It is the same appeal to ignorance that is the foundation of all claims of evidence of AGW in weather events: “It has to be man-made, because we don’t know of any natural variability that could account for it.” Their ignorance is understandable, as they have been working very hard at ignoring natural climate variability, and even denying that some well-known cycles exist (“we have to get rid of the Medieval Warm Period”).

Our ancestors blamed the gods, because our ancestors could not explain things without them. Modern man still blames the gods, but believes he is one of them.

Biggg
December 13, 2017 11:36 am

I have to challenge a massive error in this discussion. The Lack of Pirates Is Causing Global Warming. Jo Nova proved beyond a reasonable doubt that the price of postage stamps is the leading cause of global warming. 🙂

4 Eyes
December 13, 2017 11:55 am

Only when you know what the natural rate of warming is can you say how much warming is anthropogenic, and thereby draw conclusions about the causality of extreme events. This reality is avoided repeatedly, but it is why the historical temperature record is constantly being adjusted. The new breed of young scientists will be unaware that the 1930s were very hot. This is a very sinister game with a serious agenda.

Dave Fair
Reply to  4 Eyes
December 13, 2017 12:49 pm

Mann’s hockey stick worked for years and was foundational in the consensus and governmental policies. At the time, I took it as truth and it helped form the belief on my part that CAGW was real.

Sheri
Reply to  4 Eyes
December 13, 2017 3:54 pm

What if the natural rate of warming is a constantly changing value that cannot be predicted or explained?

Dave Fair
Reply to  Sheri
December 13, 2017 4:25 pm

Then one shouldn’t spend bunches of OPM trying to control it, Sheri.

Svend Ferdinandsen
Reply to  4 Eyes
December 13, 2017 4:29 pm

Who says that extreme events increase with warming?
Reality shows no positive correlation.

Reply to  Svend Ferdinandsen
December 14, 2017 4:06 am

It’s frustrating, isn’t it? Start with the assumption that “warming,” or indeed any effect, is bad, argue as though there is simply no alternative, and wonder whether it will just be bad or really bad. It’s kind of like they’re putting blinkers on the crowd up front.

Ptolemy2
December 13, 2017 12:05 pm

The real story about freak weather is that green-politicised energy policy has rendered the electricity grids in many European countries fragile and vulnerable to extreme weather outbreaks:

http://www.telegraph.co.uk/business/2017/12/12/fears-uk-gas-prices-could-soar-winter-shock-events-hit-supply/

December 13, 2017 12:06 pm

This type of “research” is not about discovery but about persuasion and advocacy. The intended audience is not real skilled scientists who follow the scientific process, but the public, policy makers, and, most especially, funders who can be tricked into paying more for the same nonsense year after year. In medical science the journals are filled with false claims and soon to be refuted discoveries because academia seems to reward this type of nonsense. Unfortunately those who use medical services suffer both through poor care and through higher payments because of the failure to do proper science. In climate science the harm will be larger and have far greater impact as entire economies are damaged.

Mohatdebos
December 13, 2017 12:46 pm

I am surprised they did not attribute record global grain harvests to climate change.

Dave Fair
Reply to  Mohatdebos
December 13, 2017 12:51 pm

Some actually do so! Then you get “just wait; our models show harm in the future.”

December 13, 2017 2:09 pm

For more research about North American wildfires

See this slideshow by Scott St. George, Associate Professor of Geography at the University of Minnesota (University page, his website).

[slideshare id=7516538&doc=class17firehistory-110404213848-phpapp01&w=650&h=500 ]

Reply to  Larry Kummer, Editor
December 14, 2017 8:56 am

The kangaroo rat was common in CA.
Part of its habitat is dry brush.
The k.rat is now endangered, partly through loss of habitat.
Land owners are not allowed to remove dry brush from their property. (Just in case one shows up.)
CA has had a bunch of wildfires.

Conclusion:
Kangaroo Rats were expert fire-fighters.

Gary Kerkin
December 13, 2017 3:24 pm

Post hoc, ergo propter hoc.

Svend Ferdinandsen
December 13, 2017 4:25 pm

“scientists identified three events that would not have been possible without human-caused changes in the climate.”
I ask: out of how many events? Haven’t we been told of all the disasters that have happened and that they are signs of Global Warming? And they only found 3 events!
Yesterday I had 3 leaves fall off my tree out of 1,000, but these 3 were a sure sign of Global Warming.

December 13, 2017 5:00 pm

“If your experiment needs statistics, you ought to have done a better experiment.”

Anyone who believes that doesn’t understand science, and doesn’t understand statistics.

Having said that, it does not surprise me that climate researchers resort to all sorts of fallacies and deceptions. They’ve done it right from the beginning.

gnomish
Reply to  Karim Ghantous
December 13, 2017 5:18 pm

if a proposition does not resolve to true/false, then it is not reasonable and unscientific, mmk?
so i’m not persuaded that you are speaking truth and i’m betting you can’t resolve the truth of your own statement with statistics, amirite?

Reply to  gnomish
December 13, 2017 10:05 pm

Some propositions result in a probability, until they are able to be shown as false (0) or true (1). Sometimes you can’t go beyond probabilities with current methods, but probabilities, where available, are more useful than nothing at all.

Dave Fair
Reply to  Karim Ghantous
December 14, 2017 11:47 am

I still say all statistics are 50/50: Either something happens or it doesn’t.

Gary Kerkin
Reply to  Karim Ghantous
December 13, 2017 5:44 pm

The quote is attributed to Ernest, Lord Rutherford of Nelson. His long-standing reputation as a scientist, especially that gained while head of the Cavendish Laboratory at the University of Cambridge through much of the first half of the 20th century, was established through his ability to devise an experiment and then build the instrumentation that enabled him to measure the outcomes of the experiment. Rutherford was born in New Zealand towards the end of the 19th century, and his particular ability stood him in good stead in his early career. At the time, science in New Zealand lacked instrumentation capable of measuring the results of experiments, and anyone who could devise appropriate instrumentation was bound to succeed; conversely, anyone who couldn’t was doomed to fail. So to say that he didn’t understand science and/or statistics is not helpful. I suggest that his ability to devise such instruments indicates a very, very good understanding of science.

Reply to  Gary Kerkin
December 13, 2017 10:01 pm

“So to say that he didn’t understand science and/or statistics is not helpful.”

I wasn’t directing my criticism towards Rutherford.

Yogi Bear
December 13, 2017 5:10 pm

Increased El Niño frequency and intensity is normal during a solar minimum, because of increased negative NAO/AO. The same goes for the AMO and Arctic warming.

John Bills
December 13, 2017 7:57 pm

I see Nick Stokes didn’t comment so the p-hacking allegation must be true.
Nick, Nick ?

December 13, 2017 9:10 pm

Replace “attribution” with “junk.” Attribution science has decades at least to go before they can even use the word “mature” to describe the field. Self-important liars.

Phoenix44
December 14, 2017 1:33 am

Not long ago one of the medical journals said it would not publish this sort of stuff. Researchers had to state what they were looking for before they started looking, and then say if they had found it, not just claim to have found “something”. Rates of “success” fell from over 50% to around 5%.

Al Saletta
December 14, 2017 11:07 am

Back when Accutron watches were a luxury only the faculty could afford, our grad school statistics teacher rigged his to sound a buzzing alarm in the middle of a guest lecture. He apologized for his “watch” interrupting, but then said, “As long as I have everyone’s attention, I just don’t see a significant result in the data on the chart you are displaying.” “But,” the presenter replied, “it is statistically significant at the 5% level!” “Who cares?” replied our statistics expert. “I don’t see any effect in the chart.”

December 16, 2017 6:24 am

Re: p-hacking 12/13/2017: P-hacking, along with data mining, deep data diving, and HARKing (Hypothesizing After Results are Known), are methods peculiar to Post Modern Science, where they are met with criticism by would-be gatekeepers of that species of science. It’s a bit of philosophical hand-waving, which at its roots is the result of gross failures of PMS to produce reliable scientific models. Indeed, the real gatekeeper of PMS is the American Statistical Association (ASA), which in January last year disapproved of reliance on p-values altogether, following a half century of unreproducible scientific studies. See amstat.tandfonline.com/doi/abs/10.1080/00031305.2016.1154108.

Modern Science has no such restrictions. That species of science grades scientific propositions (models) as conjectures, hypotheses, theories, and laws according to their predictive power. By contrast, MS relies on Neyman-Pearson statistics along with Shannon’s Information Theory.

Science philosopher and champion of PMS, Paul Feyerabend, a student and follower of Popper, taught along with Popper that the scientific method was a fiction. Feyerabend’s problem in particular was that he believed that the method was a recipe, an ordered procedure that arrived at valid models. Instead, the method as it developed in MS is a logical organization. Its elements (language, observations, measurements, prediction, experiment, and validation) can come to pass in any order, including validation first. P-hacking, HARKing, and other such procedures are quite acceptable in MS.

The method called Deep Data Diving illuminates the problem. In DDD an investigator sorts his database, assigning some variables as causes and some of the others as effects, guided by derivatives of Fisherian statistics. Here, rather than postulate a Cause & Effect relationship among the variables to be tested on its prediction, the investigator rests his results on statistical coincidences. After all, the p statistic is a random variable, based solely on the null hypothesis, and the null hypothesis has no relationship to the affirmative (a much better word than alternative) hypothesis which is supposed to have generated the data.

What the AMS, along with the whole of the academic community of scientists, has yet to realize is that when the ASA disapproved reliance on p-values, it also disapproved confidence limits, statistical significance, and anything else based on the null hypothesis alone. It has disapproved Fisherian statistics.

don
December 16, 2017 3:42 pm

What does the past tell us about higher temperatures leading to more extreme Climate change?

Throughout the world’s history, during warm periods, there was less variability to the weather. During cold periods, there were larger variations and more extreme weather. Thus, warm means more stable times; cold means more extreme climate change. For example, storms were worse a few centuries ago during the Little Ice Age, the best we can tell. The climate, overall, has gotten slightly better, or is about the same, on average, for the last 100 years, the best we can tell. Not worse, as pushed by the fear peddlers and their poorly done, biased studies. And this is with 60 or so years of man’s use of fossil fuels, thus confirming CO2 doesn’t make the climate worse.

There have been close to the same number of, or fewer, tornadoes, droughts, and forest fires worldwide recently. Recent years have had the fewest hurricanes worldwide this century, and there has been a steady decline since 2005, other than this year (but this year is only slightly up).

If extreme and severe weather are not increasing, then how can the climate change movement claim that CO2 is causing the climate to get more extreme, when the climate has not been getting more extreme after 60 or so years of already increasing CO2?
– Because their biased computer models tell them things will get worse in the future.
– They cherry-pick certain locations that recently had many extreme weather events.
– Huge monies were poured into poorly done studies, backed by bad, biased data, that anyone with decent knowledge can poke huge holes in (as explained later).
– If you keep repeating and repeating, over and over again, and use the emotion of fear to scare, eventually many will believe that CO2 is causing danger.

The supposed scientific explanation (hypothesis) of the link between global temperature increase and climate change, according to the pro-global-warming side, is a long daisy chain of very unlikely events and ties that may be only slightly correlated, with no causation proven. If true, only a small percentage is responsible for each link. Sometimes the opposite holds, totally breaking the link. It is rarely factually proven (relying instead on computer models’ guesstimates of the future, as if they were fact). And if true, it would take thousands of years to happen anyway. I.e., increased CO2 from fossil fuel use causes “B,” “B” leads to “C,” “C” creates “D,” “D” results in “F,” and “F” creates extreme weather. Also, this is all dependent on their belief that man’s increase of fossil fuels is increasing the temperature of the planet significantly, which was shown to be wrong on pages 11 – 26.

Then add that a less than 1-degree change over a 136-year period doesn’t change much, if anything, anyway. I.e., if the world is warming on average .007 of a degree a year, how will that tiny amount cause something to get a lot more extreme, or even change anything? It will not. It just gets lost in the noise.
And most of the warming wasn’t caused by fossil fuels anyway. So even if CO2 causes an unnoticeable amount of warming, it doesn’t cause the climate to change, and if it does, it is for the better, not worse. And the politicians can’t regulate and tax CO2 to make more than an almost-nothing difference, and the difference would probably be in the opposite of the desired direction anyway.

Rob
December 18, 2017 4:48 am

Degrees of freedom.

The critical feature of a probability analysis is how many different classes an observation can fall into. When people slice and dice data into multiple classes, they should be increasing the DF much more than they do. But of course that would mean the result would not be significant…
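The commenter’s point about paying for the slicing and dicing is essentially the standard multiple-comparisons correction. A minimal sketch of the simplest version, the Bonferroni adjustment (the counts of 21 events and 5 variables are illustrative, chosen to echo the report):

```python
# Bonferroni correction: when m hypotheses are tested together, each
# individual test must clear alpha/m for the family-wise error rate
# to stay at alpha.  Slicing data into more regions and weather types
# raises m, and the per-test threshold shrinks accordingly.

def bonferroni_threshold(alpha: float, m: int) -> float:
    """Per-test significance threshold for m tests at family level alpha."""
    return alpha / m

if __name__ == "__main__":
    for m in (1, 21, 105):  # e.g. 21 events, or 21 events x 5 variables
        print(f"m={m:3d}  per-test alpha = {bonferroni_threshold(0.05, m):.5f}")
```

A result that is “significant at the 5% level” in one of 105 sliced-up tests would need p below roughly 0.0005 to survive this correction.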
