At #AGU17, in an attempt to prove ‘climate change creates more severe weather’, scientists resort to ‘p-hacking’

From the “If your experiment needs statistics, you ought to have done a better experiment” (Ernest Rutherford) department.

From the American Meteorological Society via press release:


New research links notable weather and climate events to human influence

The American Meteorological Society is releasing the strongest evidence yet that humanity’s long-term release of greenhouse gases has altered recent extreme weather and climate events. In the new collection of peer-reviewed papers analyzing the link between extremes in 2016 and climate change, scientists identified three events that would not have been possible without human-caused changes in the climate.

The linkages are made in a newly released report—Explaining Extreme Events from a Climate Perspective, a supplement to the Bulletin of the American Meteorological Society. This collection of studies from 116 scientists analyzes 21 different events in 2016, from U.S. snowstorms and South African drought to oceanic hot spots and Arctic warmth. Most of the events researchers examined can be attributed, at least in part, to human-caused climate change.

Some of the analyses go beyond atmospheric and oceanic extremes to link societal or ecological impacts, including coral bleaching and crop failures, to human-caused climate change.

The American Meteorological Society will release Explaining Extreme Events of 2016 from a Climate Perspective at the AGU Fall Meeting on Wednesday, December 13, 2017 at 11:30am CST.

The panel including editors and authors of the papers will discuss their findings about natural and human influences on the extreme events, as well as the developing maturity of attribution science.

Reporters are invited to attend the press conference in person at the Morial Convention Center in New Orleans or else via live web streaming offered by the American Geophysical Union.

Events Assessed for 2016 include:

  • The notorious warm “Blob” in the Pacific Ocean
  • Flash Droughts in South Africa
  • Wildfires in North America and Australia
  • Cold Snap in Eastern China
  • Drought in Northeast Brazil

UPDATED: You can download the entire report:

The press release and links to individual chapters are here:

From the introduction of the 2016 edition:

As in past years, this sixth edition of Explaining Extreme Events from a Climate Perspective includes studies of extreme events from around the world that did not find a role for climate change in influencing the magnitude or frequency of an event. It is important to note that papers are selected for consideration in this report by reviewing author proposals that do not indicate whether a role for climate change will or will not be found. Thus, there is no selection bias on the part of the editorial team toward one particular conclusion, and this publication prides itself as a venue that accepts papers without consideration for whether a role for climate change is found. This year there may be a slight bias toward events that do not find a signal relative to previous years because the editors have begun to limit the number of heat papers in the report which is the event type where a signal is most commonly found.

Given that the majority of heat papers now use a widely established and accepted methodology, the scientific value of continuing to include a large number of heat studies began to seem limited. Extreme weather event types included in this year’s edition include ocean heat waves, forest fires, snow storms, and frost, as well as heavy precipitation, drought, and extreme heat and cold events over land. A number of papers also look at the impacts of extremes (Fig. 1.1). The Summary of Results Table (Table 1.1) gives readers a general overview of the results. Twenty-one of the 27 papers in this current edition identified climate change as a significant driver of an event, while six did not. Of the 131 papers now examined in this report over the last six years, approximately 65% have identified a role for climate change, while about 35% have not found an appreciable effect.

Last year, the editors called on scientists submitting research proposals to investigate potential links between an extreme event and its subsequent impact, and we were excited to see five research teams take on this challenge in this year’s report. Lewis and Mallela concluded that the risk of the extreme Great Barrier Reef bleaching event was increased through anomalously high sea surface temperature and the accumulation of thermal stress caused by human-caused climate change. Jacox et al. and Brainard et al. both examined how high ocean temperatures caused in part by human-caused climate change impacted living marine resources like coral bleaching, reduced fish stocks, and a decrease in seabird counts in the California current and the equatorial Pacific, respectively. On land, Sippel et al. found that human-caused climate change is causing warmer winters on the Iberian Peninsula and, when coupled with a wet spring, drove higher ecosystem productivity in the region in 2016. However, these papers represent early approaches, and more work is needed to develop impact attribution methodologies.

As is always the case, we would caution that the results of any single study should not be interpreted as the final word on the matter for that event, nor be generalized to a broader class of extremes. For example, authors of these papers selected specific modeling approaches and made other choices about factors that are important in how the models replicate extreme events, such as terrestrial heat or sea surface temperatures. If other study designs were applied to these events, it is possible a different result would be reached. The importance of the methodological approach in attribution research is further discussed in the summary of this report (Stott et al.).

A big question raised by this collection of research is whether these findings undermine the axiom that “no event is caused by climate change alone and that natural variability always plays some role.” The short answer is no. While several of the studied events were found not to be possible without climate change, natural variability still laid the foundation for the events to occur, and the authors acknowledge this in their papers. Extreme events are always the result of a cumulative set of factors. The building blocks that form the foundation of any event continue to include natural variability, with factors such as El Niño potentially adding to the strength of the event. These temperature-related extremes would likely still have been warm events even without human-caused climate change, but according to these analyses, the events could not have surpassed the extreme warm thresholds that they did without climate change. This was especially the case for the record-setting globally averaged temperature. At the global scale, the natural variations of Earth’s temperature are increasingly seen to pale in comparison to the growing intensity of human-induced warming. Overall, human-caused climate change allowed them to pass a threshold that they could not otherwise have exceeded.

From the introduction in the 2015 edition:

This last year has been exciting for attribution science, as the U.S. National Academy of Sciences released its report on the topic (NAS 2016). To date, it is the most comprehensive look at the state of event attribution science, including how the framing of attribution questions impacts the results. For example, in a complex event such as drought, a study of precipitation versus a study of temperature may yield different results regarding the role of climate change. The report also addresses how attribution results are presented, interpreted, and communicated. It provides the most robust description to date of the various methodologies used in event attribution and addresses the issues around both the confidence of the results and the current capabilities of near-real time attribution. No single methodology exists for the entire field of event attribution, and each event type must be examined individually. Confidence in results of an attribution analysis depends on what has been referred to as the “three pillars” of event attribution: the quality of the observational record, the ability of models to simulate the event, and our understanding of the physical processes that drive the event and how they are being impacted by climate change.

I’m not at all impressed with the “three pillars”, because what typically happens is that if a model doesn’t simulate an event on the first pass, the researchers keep tweaking it until it does. Eventually, they all become “Clever Hans” in being able to respond to the weather events nature provides.

Larry Kummer of Fabius Maximus comments via email to me:

An exercise in data mining.

How many kinds of extreme weather are there? How many of these irregularly defined geographic areas are there? Combine the two into a database. A survey of one year will always find outliers at the 5% level — by chance, because there are so many possibilities. Given that, it is easy for ingenious scientists to link some of them to anthropogenic effects.

More useful would be to see if the overall class of extreme events showed trends over time. Or at least whether some classes of extreme events (temp, precipitation) did so. Otherwise they have not shown that anything unusual happened in 2016. Just weather.
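Kummer’s arithmetic can be sketched with a toy simulation (the counts below are hypothetical, not taken from the BAMS report): test many independent null “event type × region” combinations at the 5% level, and spurious hits appear by construction.

```python
import random

random.seed(0)

# Hypothetical grid: 20 extreme-event types x 50 regions, one year of data.
EVENT_TYPES, REGIONS, ALPHA = 20, 50, 0.05

# Under the null hypothesis (no real change) each test statistic is
# roughly N(0, 1); |z| > 1.96 crosses the two-sided 5% threshold by chance.
tests = EVENT_TYPES * REGIONS
false_hits = sum(abs(random.gauss(0, 1)) > 1.96 for _ in range(tests))

print(f"{tests} null tests produced {false_hits} 'significant' results "
      f"(expected ~{ALPHA * tests:.0f} by chance alone)")
```

Even with no real signal anywhere, dozens of combinations pass the conventional significance bar.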

Literally, all that is going on here is “p-hacking”, and it is well known to have bias problems.

Data dredging (also data fishing, data snooping, and p-hacking) is the use of data mining to uncover patterns in data that can be presented as statistically significant, without first devising a specific hypothesis as to the underlying causality.

After they find a statistically significant set of data, they use the “three pillars” to assign causality, and that causality is always climate change. The problem is that there’s a built-in bias involved; for example, this 2015 paper in PLOS One explains why (bold mine):

The Extent and Consequences of P-Hacking in Science


A focus on novel, confirmatory, and statistically significant results leads to substantial bias in the scientific literature. One type of bias, known as “p-hacking,” occurs when researchers collect or select data or statistical analyses until nonsignificant results become significant. Here, we use text-mining to demonstrate that p-hacking is widespread throughout science. We then illustrate how one can test for p-hacking when performing a meta-analysis and show that, while p-hacking is probably common, its effect seems to be weak relative to the real effect sizes being measured. This result suggests that p-hacking probably does not drastically alter scientific consensuses drawn from meta-analyses.
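The meta-analytic check the abstract alludes to can be caricatured in a few lines. This is a stdlib-only sketch of the bin-comparison idea (not the authors’ actual code or thresholds): if researchers nudge results just under 0.05, reported p-values pile up immediately below that threshold, and a sign test on the two bins nearest 0.05 can detect the pile-up.

```python
from math import comb

def binom_sf(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

def p_hacking_check(p_values):
    # If results are nudged just under 0.05, significant p-values pile up
    # in the upper bin (0.045, 0.05) relative to the bin just below it.
    lower = sum(1 for p in p_values if 0.040 < p <= 0.045)
    upper = sum(1 for p in p_values if 0.045 < p < 0.050)
    n = lower + upper
    return binom_sf(upper, n) if n else 1.0  # small value suggests p-hacking

# Toy "literature": an implausible excess of p-values just below 0.05.
suspicious = [0.048, 0.049, 0.047, 0.046, 0.049, 0.044, 0.041, 0.048]
print(round(p_hacking_check(suspicious), 3))
```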

In Wikipedia, there is this description and a curious graph:

The process of data dredging involves automatically testing huge numbers of hypotheses about a single data set by exhaustively searching — perhaps for combinations of variables that might show a correlation, and perhaps for groups of cases or observations that show differences in their mean or in their breakdown by some other variable. Conventional tests of statistical significance are based on the probability that a particular result would arise if chance alone were at work, and necessarily accept some risk of mistaken conclusions of a certain type (mistaken rejections of the null hypothesis). This level of risk is called the significance. When large numbers of tests are performed, some produce false results of this type, hence 5% of randomly chosen hypotheses turn out to be significant at the 5% level, 1% turn out to be significant at the 1% significance level, and so on, by chance alone. When enough hypotheses are tested, it is virtually certain that some will be statistically significant but misleading, since almost every data set with any degree of randomness is likely to contain (for example) some spurious correlations. If they are not cautious, researchers using data mining techniques can be easily misled by these results.
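That paragraph can be illustrated directly (all numbers invented): generate a pile of mutually unrelated random series, exhaustively search every pair for the strongest correlation, and something impressive always turns up.

```python
import random
import statistics

random.seed(1)

def corr(x, y):
    # Pearson correlation, stdlib only.
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# 40 unrelated random "indicators", 10 observations each: spelling-bee
# words, spider deaths, pirate counts... none causally linked to any other.
series = [[random.gauss(0, 1) for _ in range(10)] for _ in range(40)]

# Exhaustively dredge all 780 pairs for the strongest correlation.
best_r, i, j = max(
    (abs(corr(series[a], series[b])), a, b)
    for a in range(40) for b in range(a + 1, 40)
)
print(f"best |r| among {40 * 39 // 2} random pairs: {best_r:.2f}")
```

With hundreds of pairs of short noise series, a correlation strong enough to make a striking chart is essentially guaranteed.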

There’s this example provided:

An example of data produced by data dredging, apparently showing a close link between the winning word in a spelling bee competition and the number of people in the United States killed by venomous spiders.

There’s also the famous graph showing climate change correlates to the number of pirates.

Larry Kummer adds via email:

In general use, p-hacking can be used to mine datasets for patterns to support a pre-existing hypothesis — such as “AGW is increasing incidence and magnitude of exteme weather.”  Operationally, it can result from scientists each checking their own database (a specific kind of weather in a specific geographical region) for increased extremes — which can be attributed to AGW or CC.  Since negative findings are not reported, this inevitably results in “findings.”

This is a classic setup for replication failure, as so many other fields — in both “hard” and “soft” sciences — have discovered. I’ve written about this, starting with this from April 2016: The replication crisis in science has just begun. It will be big.

Scientists have been searching for years for the elusive link between “climate change” and “severe weather.” For example, this editorial in Nature put the onus on them back in 2012:

From Nature: Extreme weather

Better models are needed before exceptional events can be reliably linked to global warming.

As climate change proceeds — which the record summer melt of Arctic sea-ice suggests it is doing at a worrying pace — nations, communities and individual citizens may begin to seek compensation for losses and damage arising from global warming. Climate scientists should be prepared for their skills one day to be probed in court. Whether there is a legal basis for such claims, such as that brought against the energy company ExxonMobil by the remote Alaskan community of Kivalina, which is facing coastal erosion and flooding as the sea ice retreats, is far from certain, however. So lawyers, insurers and climate negotiators are watching with interest the emerging ability, arising from improvements in climate models, to calculate how anthropogenic global warming will change, or has changed, the probability and magnitude of extreme weather and other climate-related events. But to make this emerging science of ‘climate attribution’ fit to inform legal and societal decisions will require enormous research effort.

Attribution is the attempt to deconstruct the causes of observable weather and to understand the physics of why extremes such as floods and heatwaves occur. This is important basic research. Extreme weather and changing weather patterns — the obvious manifestations of global climate change — do not simply reflect easily identifiable changes in Earth’s energy balance such as a rise in atmospheric temperature. They usually have complex causes, involving anomalies in atmospheric circulation, levels of soil moisture and the like. Solid understanding of these factors is crucial if researchers are to improve the performance of, and confidence in, the climate models on which event attribution and longer-term climate projections depend.

Read the full editorial here.

Dr. Roger Pielke Jr. observed then:

The 116 scientists finally have come to a point where they figured out how to justify their claims with “better models” and data mining, but all the correlation in the world does not equate to causation.

Meanwhile, examining one of the most fearful severe weather metrics, tornadoes, doesn’t seem to show a correlation:

But in the case of this recent BAMS special report, the researchers truly believe the correlation must be there, and belief is a powerful motivator. So they set out on a path of self-reinforcing data discovery to prove it and, just like the spelling bee words and venomous spider deaths, they certainly found what they were looking for.

I weep for science.

NOTE: Shortly after publication, several updates and edits were added to improve the article, including additional excerpts from the 2016 report, and a new comment from Larry Kummer.

December 13, 2017 8:58 am

Anyone that reads Tony’s RealClimateScience blog….has seen post after post…..after post…
….showing that all these “extreme events” are declining

george e. smith
Reply to  Latitude
December 13, 2017 10:59 am

So almost all (97%) of wild fires are caused by either lightning strikes, plain old arson, or careless campers who let deliberately lit fires get away from them, or vehicular accidents. (car fires, plane crashes).

So these chaps are saying that climate changes affected some or all of these causes ?? A fire is an event whose origin can usually be specifically identified.

Which fire of California’s in the last 24 months was specifically caused by climate change ??


Reply to  george e. smith
December 13, 2017 3:37 pm

“Made worse” would be the claim. Never mind the Santa Ana winds occur every year and wind is one of the biggest problems. As noted previously, there are red flag warnings in the winter in many places.

The fires are made worse by urban sprawl and not removing fuel. It’s easier to blame global warming, of course.

Dr. S. Jeevananda Reddy
Reply to  Latitude
December 13, 2017 7:44 pm

Let me present two cases of events assessed by AMS, namely Flash droughts in South Africa and Drought in Northeast Brazil. In this context let me quote from my book: “Unfortunately, it has become a ritual to attribute every weather event to El Nino or Global Warming [which is used as de-facto climate change] without looking into weather & climate data of the region. In 2014, WMO Secretary General in his WMO Day release attributed the 2013 drought & warm conditions prevailing in the Southern Hemisphere to global warming; and the current Ethiopian drought conditions are attributed to El Nino by the FAO Representative in Ethiopia. I sent my response to the WMO Secretary General saying that the drought conditions and associated warming were part of the natural cycle in precipitation [based on my studies in Brazil, South African countries]. Now, the current drought conditions prevailing in Ethiopia are also due to natural cycles in the precipitation data series. All these were published in the 80s/90s.”

In the case of Brazil, I analysed the precipitation data series of northeast Brazil and divided it into homogenized zones. The Fortaleza precipitation data series of 1849–1981, with a mean of 1385 mm, presented a 52-year cycle with sub-multiples of 26, 13 & 65 years. The integrated pattern presented an M followed by a W shape, where M represented above-average and W represented below-average precipitation. Here the centre dips of W & M equal the start and end lines of the letters. The predicted below-average pattern [W-shape] is around 2002–2020 [with a short period of above average in between]. The starting period is strongly below average for around 10 years.

In the case of South Africa, Durban precipitation data presented a 66-year cycle with a sub-multiple of 22 years. The integrated predicted pattern also presented a W followed by an M [unlike in Brazil, the centre here is weak, with two prominent below-average patterns on either side]. The 66-year cycle started around 1975 — starting with M; the W started in 2008 and will end by around 2040.
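For readers wondering how such multi-decadal cycles are extracted from a rainfall record: a crude Schuster periodogram over trial periods is one standard approach. The series below is synthetic (a clean 52-year sinusoid around the 1385 mm Fortaleza mean, 1849–1981), purely to illustrate the idea; it is not Dr. Reddy’s data or his actual method.

```python
import math

def periodogram_peak(series, periods):
    """Return the trial period whose sinusoid carries the most power
    (a crude Schuster periodogram, stdlib only)."""
    n = len(series)
    mean = sum(series) / n
    best_p, best_power = None, -1.0
    for p in periods:
        c = sum((x - mean) * math.cos(2 * math.pi * t / p)
                for t, x in enumerate(series))
        s = sum((x - mean) * math.sin(2 * math.pi * t / p)
                for t, x in enumerate(series))
        power = (c * c + s * s) / n
        if power > best_power:
            best_p, best_power = p, power
    return best_p

# Synthetic "precipitation": a clean 52-year cycle around a 1385 mm mean,
# spanning 1849-1981 like the Fortaleza series discussed above.
rain = [1385 + 300 * math.sin(2 * math.pi * (y - 1849) / 52)
        for y in range(1849, 1982)]
print(periodogram_peak(rain, periods=range(10, 80)))
```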

When there is no significant rise in the global average due to “global warming,” how can we expect its impact? The 2002 and 2009 severe droughts in India increased the temperature by 0.7 and 0.9 °C. Unless we integrate all existing information around the region, simply attributing events to global warming will mislead the research and waste computer time and thus power consumption.

Dr. S. Jeevananda Reddy

December 13, 2017 9:03 am

A few years ago floods in England were the hot topic but i see that they’ve moved on as move on they must.

Reply to  chaamjamal
December 13, 2017 9:21 am

Thanks chaamjamal. Also, have been meaning to ask, is there a reason you use Excel for all your analyses rather than R? (As someone who’s just beginning to play in the statistical sandbox, I’m curious…)


Reply to  ripshin
December 13, 2017 10:29 am

Don’t know about chaamjamal, but I use Excel because I learned statistics using Fortran and punch cards. Excel is so much easier it takes the incentive out of trying to find something better.

Reply to  ripshin
December 13, 2017 10:59 am

C programming is my make-money language. For presentation graphics, I write out a data file and use GNU-Plot.
R within R-Studio is an outstanding package. The combination of a vector-based programming language with an enormously flexible graphics package just can not be beat. I use R-Studio for all my ad hoc data examinations and graphics generation. It is great for grabbing a climate data set and just working it up.
All of the graphs I have posted here over the last 2+ years have all been done with R-Studio.
edX has courses in introductory R use and programming.
The courses are free and are highly recommended.
The course I took is currently archived (pity that), but it will likely be rescheduled sometime.
R is so much more powerful than Excel, you will never look back.

george e. smith
Reply to  ripshin
December 13, 2017 11:04 am

Actually a sand box is a poor place to do statistics; it is far too unpredictable and uncertain. But statistics is quite the opposite; it ALWAYS gives exact values, since it requires only 4-H club arithmetic and a finite set of finite real numbers whose exact values are always known in advance.
Sand castles are unpredictable.


Reply to  chaamjamal
December 13, 2017 10:28 am

I seem to remember the flooding in England was due to improper drainage maintenance? Kind of similar to Detroit flooding and Windsor Canada across the river not flooding not long ago.

Yogi Bear
Reply to  ossqss
December 13, 2017 5:01 pm

That was blocking from the ‘warm blob’. The same setup happened Dec 1876 and Jan 1877, and was wetter than in Jan-Feb 2014. That was followed by California drought 1877, most of the sheep died, California floods in 1878 from the 1877-78 super El Nino, then almighty California wildfires in 1879.

Smart Rock
December 13, 2017 9:08 am

“attribution science”? A new oxymoron

Reply to  Smart Rock
December 13, 2017 9:24 am

An oxymoron is a pointed stupidity: a seemingly nonsensical phrase that upon reflection has some underlying wisdom. For example, “You have to be cruel to be kind.” Attribution science is not a pointed stupidity; it is just pointedly stupid.

Reply to  BCBill
December 13, 2017 10:51 am

‘Efficient government.’

‘peer review’

george e. smith
Reply to  BCBill
December 13, 2017 11:05 am

Common sense !


Thomas Homer
Reply to  BCBill
December 13, 2017 11:29 am

carbon pollution

Reply to  Smart Rock
December 13, 2017 10:59 am

As Kummer says,

How many kinds of extreme weather are there? How many of these irregularly defined geographic areas are there?

Very very many.

Temperature low, high, precipitation low high, freeze date, melt date, max wind, number of storms total, or in category, snow depth low high, number of days above or below temp, sunshine hours, growing season length, start, end date, wave height, number of wildfires, area burned, drought days, flood heights, sea surface temps, area of bleached coral… For example.

Then there are literally thousands of geographical areas, from Alabama to Maldives, and Greenland to New South Wales.

Then starts the fun, because you can take different time frames, like lowest September ice extent at Baffin Bay since 1979, or highest Christmas Day sea level max at the Carteret atoll since modern measurements with (gauge, satellite) started.

You really don’t need to think a lot of this – there is no way of collecting and trusting just anecdotal evidence.
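The combinatorics sketched in this comment can be made concrete (every count below is invented, purely to show the arithmetic of the point):

```python
# All counts are hypothetical, just to illustrate the scale of the search space.
metrics = 30       # temp high/low, precip, wind, snow depth, sunshine hours...
regions = 1000     # Alabama to the Maldives, Greenland to New South Wales
windows = 10       # different record periods per metric/region combination
alpha = 0.05       # conventional significance level

combos = metrics * regions * windows
expected_chance_findings = alpha * combos
p_at_least_one = 1 - (1 - alpha) ** combos  # probability of >= 1 chance "finding"

print(f"{combos} metric/region/window combinations")
print(f"about {expected_chance_findings:.0f} will look 'significant' by chance alone")
```

At these scales, chance “findings” are not a risk but a certainty; the only question is which ones get written up.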

December 13, 2017 9:13 am

A classic was: correlation between rising global temps, and the number of Harvard-Law-grad Supreme Court Justices.

December 13, 2017 9:15 am

Maybe I haven’t read widely enough about the replication crisis, but it seems to center on research in the “social sciences” and other “soft sciences”, such as psychology. Is there a replication crisis in any branch of physics or chemistry?

Reply to  Retired_Engineer_Jim
December 13, 2017 10:08 am


” it seems to center on research in the “social sciences” ”

The core of the crisis is in biomedical research, all hard science. That was where the first blockbuster failures to replicate were found — and the field in which these failures probably have had the greatest effect on society.

Failures in biomedical research were especially shocking because much of that work has stringent procedures to avoid commonplace design and execution errors. Unfortunately, biomedicine also has some of the strongest rewards for …lax conduct. Now we see which side won that tug-of-war.

Lots of lessons in that story for climate science.

See my post, cited by Anthony, for details plus links to more information.

Reply to  Larry Kummer, Editor
December 14, 2017 1:30 am

Also serious contamination of cell lines. One used for testing lung cancer drugs turned out to be liver cancer cells for example.

Reply to  Retired_Engineer_Jim
December 13, 2017 10:40 am

Replication problems also plague “things that cause cancer” studies, nutrition studies, genetic studies purporting to show that one gene variant or another is linked to a disease or condition, “chemicals are always harmful to people or the environment” studies, and the like. Genetics and environmental effects on phenotypes is my job but I don’t have time left in my lunch break to find specific references, maybe can come back to it tonight.

Chad Irby
December 13, 2017 9:16 am

I’m still waiting for “Climate Change Corresponds to Size of Chicken Entrails: Sky Gods Angry.”

David Dibbell
December 13, 2017 9:17 am

Does anyone remember this rare occurrence from 9-12-15? Is this also to be attributed to AGW? No tropical cyclones anywhere on earth.

December 13, 2017 9:19 am

Well, I should think their analysis is every bit as useful as Dr. Tyler Vigen’s.

See, for example, here:


December 13, 2017 9:31 am

It will only get worse before it gets better, thanks to data, data processing and statistical tools becoming more widely abundant and inexpensive. Almost every researcher and engineer I know has become well acquainted with R, Python and MATLAB.

To win the ensuing propaganda war related to p-hacking will require teaching people, especially children and young people, what p-hacking is and how it works, and how it relates to the scientific process. P-hacking and the SeaLegacy polar bear footage represent the same problem – information being selected primarily for its story-telling impact to support a narrative rather than to understand something.

There is lots in the way of media studies, probability and statistics being taught to students even in early grades, but far too little in the way of critical or skeptical evaluation of information, or how people might be manipulated by stories rooted in activities claiming to be scientific.

george e. smith
Reply to  DMH
December 13, 2017 11:17 am

Well I have been doing research (Physics related) and Engineering (Optics and Electronics related) for now 60 years, and I don’t do R, or Python or MATLAB for anything.
Well I do have a degree in Mathematics as well as Physics, so I just use what I learned from high school and on the job since.
Statistics requires arithmetic. You can do that with a stick on a sandy beach.


Paul Penrose
Reply to  george e. smith
December 13, 2017 3:18 pm

The critical thing is understanding what the result of any statistical calculation really means. That seems to escape many researchers.

Reply to  george e. smith
December 14, 2017 5:32 am

You also need to understand when you can’t apply a statistical technique because the underlying distribution makes it invalid. One misuse you see a fair bit in climate science is applying the Central Limit Theorem to a Cauchy distribution, and many seem unaware of the problem.
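This failure mode is easy to demonstrate (a generic sketch, not tied to any particular climate dataset): the Cauchy distribution has no finite variance, or even a mean, so the sample mean wanders no matter how much data you collect, while the sample median converges as usual.

```python
import math
import random
import statistics

random.seed(3)

def cauchy():
    # Inverse-CDF sampling: tan(pi * (U - 0.5)) is standard Cauchy.
    return math.tan(math.pi * (random.random() - 0.5))

# The CLT assumes a finite variance; a Cauchy distribution has neither a
# finite variance nor even a mean, so the sample mean never settles down.
# The sample median, by contrast, converges nicely.
means, medians = [], []
for n in (100, 10_000, 1_000_000):
    xs = [cauchy() for _ in range(n)]
    means.append(sum(xs) / n)
    medians.append(statistics.median(xs))
    print(f"n={n}: mean={means[-1]:9.3f}  median={medians[-1]:.3f}")
```

Averaging more Cauchy draws buys you nothing; robust statistics like the median are the right tool for such heavy tails.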

December 13, 2017 9:40 am

Well …..
In my experience I don’t think the total number of Pirates in the South China Sea has changed much over the last 3 thousand years.
But I am open to being convinced, and if you send money (lots of money) I am prepared to set up a study to look into this.
I might even hire some of these scientists to help me.
It would be as good a use of their time as this study and I am sure we could make a case for anything you like, just as they have done here.

Reply to  Oldseadog.
December 13, 2017 10:42 am


george e. smith
Reply to  Oldseadog.
December 13, 2017 11:19 am

Well Osd, you are just talking about a bunch of junk !

G (g too)

December 13, 2017 9:44 am

…’the “three pillars” of event attribution…’ How many pillocks of event attribution does it take to change a climate into something it isn’t? It’s not the outliers we need to worry about; it’s the out-and-out liars.

Reply to  jorgekafkazar
December 13, 2017 10:03 am

For anyone that missed this gem:

“It’s not the outliers we need to worry about; it’s the out-and-out liars.”

December 13, 2017 9:48 am

If you torture data long enough, it’ll confess to anything.
——Ronald Coase

December 13, 2017 9:57 am

When you’re intent on finding something you believe in you will eventually find it. This is nothing more than another answer looking for a question.

Reply to  markl
December 13, 2017 3:18 pm

Even if it is not there.

Reply to  markl
December 13, 2017 8:10 pm

When you are intent on finding something *you know* is there, you will eventually find it.

These scientists are just sure that CO2 is heating up the atmosphere to the point that they think natural variability is a minor player, so they attribute everything to human-caused CO2.

The truth is they couldn’t prove that human-caused CO2 is causing the Earth’s weather to change, if their lives depended on it. They look at a Hockey Stick chart and that’s enough proof for them. But of course, we know that’s not proof of anything, it is a Lie.

It’s amazing how many scientists are presuming too much when it comes to the Earth’s climate. Sheep.

December 13, 2017 9:59 am

statistics are like a bikini. What it covers up is more interesting than what it shows.

george e. smith
Reply to  AleaJactaEst
December 13, 2017 11:22 am

Well in the real world, statistics erases the real observed measured data, and replaces it with made up numbers that were never observed or measured by anybody at any place at any time.
Pure numerical origami.


Dave Fair
Reply to  george e. smith
December 13, 2017 12:37 pm

Liars, damned liars and statistics. Mark Twain.

Dave Fair
Reply to  george e. smith
December 13, 2017 12:38 pm

File that in the “nothing ever changes” folder.

Reply to  george e. smith
December 13, 2017 2:49 pm

Figures don’t lie, but liars can figure.

Steve Zell
December 13, 2017 10:29 am

How do these geniuses at AMS explain how global warming causes a cold snap in eastern China and frost in Australia? If it was really GLOBAL warming, it would warm up everywhere on the globe!

Welcome to the Orwellian Newspeak of Anthropogenic Weather Manipulation. Hot is cold, at least in Oceania and Eastasia.

george e. smith
Reply to  Steve Zell
December 13, 2017 11:31 am

Well on a typical northern midsummer day, you might find any (condensed surface) temperature you like on earth between an extreme low value of -94 deg. C to at least +60 deg. C, and some claim maybe +90 deg. C

And a simplistic argument will show that there are an infinite number of places on earth which will have whatever number you want to choose between the then existing high and low extremes.

So far as I know, that extreme low and that extreme high have never actually been both observed at the same time, but they could both happen together some time.

Certainly simultaneous +/- 60 deg. could be found almost any day of the year.


December 13, 2017 10:35 am

So, the warm blob in the Northern Pacific could not have happened without man-made global warming. Fair enough, I would like to concede the point. However, just to be sure please show me your temperature measurements in the Northern Pacific for the last 200 years. Wait, what? Nobody was taking those temperatures 200 years ago? Fine, I will settle for 100 years. Come again, not 100 years ago? Okay, one more try. Let’s demonstrate that this is a 1-in-50-year event by showing the sea surface temperatures in the North Pacific from 50 years ago, calibrated to the same standard as today’s systems and with good enough accuracy that we can compare the two.
That data doesn’t exist either? Then how do we know that this event isn’t a regular part of the Pacific’s oscillation?
By the way, who gives a flip about the temperature of the Northern Pacific anyway?
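The record-length complaint can be put in numbers with a quick simulation. This is a rough sketch, not anything from the report: assume annual maxima are standard normal (purely an illustrative assumption) and ask how often a 30-year record even contains one exceedance of the true 1-in-50-year level.

```python
import numpy as np

# How much can a short record say about a "1-in-50-year" event? Simulate
# many 30-year records of annual values (standard normal, an assumption
# purely for illustration) and count how often each record contains at
# least one exceedance of the true 50-year level.
rng = np.random.default_rng(42)
level_50yr = 2.054                        # 98th percentile of N(0,1): exceeded 1 year in 50
records = rng.normal(size=(10_000, 30))   # 10,000 hypothetical 30-year records
frac = float((records.max(axis=1) >= level_50yr).mean())
print(f"30-year records containing at least one '50-year' event: {frac:.0%}")
# theory: 1 - 0.98**30 is about 45%, so most short records never see the event at all
```

If most short records never even sample the event, the data alone cannot distinguish a 1-in-50-year occurrence from something much rarer or much more common.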

Reply to  chadb
December 13, 2017 10:54 am

Definitely! In most of these cases, the data just doesn’t exist over a long enough period to actually tell anything, or still doesn’t exist even now.

Dave Fair
Reply to  chadb
December 13, 2017 12:42 pm

The Blob was caused by the “ridiculously resilient” high pressure ridge in the North Pacific. Since it hasn’t reoccurred, how do we blame CO2?

December 13, 2017 10:35 am

And when we are in the depths of the next Maunder minimum, the ignorant fools will blame that on man as well.

Jim Gorman
December 13, 2017 10:41 am

It seems that most mathematicians need to also study how science is done. Using large data and statistics is only the first part of doing science. It can lead you to correlations which are nothing more than a general hypothesis. Then the hard part begins, using science to provide hard, real world evidence through experiments and measurements to determine if your hypothesis is true.

Mathematicians that run through a myriad of statistical gyrations on different data are not doing science. They neither report their failures nor how likely their so-called findings are to be true. And, I use the word findings loosely, very loosely in fact.

george e. smith
Reply to  Jim Gorman
December 13, 2017 11:37 am

Well science involves observation, measurement, and experimentation, which are ALL real world (universe) things.
Mathematics is NOT science; it is an art form, and completely fictional; we made it all up in our heads, and can make up more any time we feel like it.
Where are the 8 km high snow-capped mountains in x^2 + y^2 + z^2 = r^2?


Very useful as a tool though.

Bruce Ploetz
December 13, 2017 10:43 am

A friend used to work at a major university in the computer room. Back in the days of mainframes. He tells me that students used to come to him for SPSS analysis of their study data. When he asked what they wanted for analysis parameters they always wanted “All Correlations”.

Of course, students can’t afford to do a huge survey or a longitudinal data gathering effort so they had tiny data sets. So, small data sets, poor survey methodology, lumpy data, put it through a Big Blue computer and out spits a tiny slip of paper with some barely significant correlation. Success! Smoking causes cornflakes!
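The "All Correlations" trap is easy to reproduce. Here is a minimal sketch (the subject and variable counts are made up for illustration): generate pure noise, test every pairwise correlation at p < 0.05, and watch "significant" results appear.

```python
import numpy as np

# Illustrative sketch of the "All Correlations" trap: a tiny survey of
# pure noise, with every pairwise correlation tested at p < 0.05.
# All sample sizes here are made up for illustration.
rng = np.random.default_rng(0)
n_subjects, n_vars = 30, 20
data = rng.normal(size=(n_subjects, n_vars))   # noise by construction

# For n = 30 and a two-sided test at p < 0.05, the critical Pearson r
# is about 0.36 (t = 2.048 with 28 degrees of freedom).
r = np.corrcoef(data, rowvar=False)
iu = np.triu_indices(n_vars, k=1)              # each of the 190 pairs once
hits = np.abs(r[iu]) > 0.36
print(f"{int(hits.sum())} of {len(hits)} pairs 'significant' in pure noise")
```

With 190 pairwise tests at the 5% level, roughly nine or ten spurious "findings" are expected even though every variable is random by construction.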

Looks like some of these students never grew up.

But it is a really dangerous game that they are playing. No huge US funded Climate Fund, thank you Mr. President. But huge lawsuits and reparations demands from natural disaster victims. Redistribution of income on a grand scale, all based on skeevy science but politically correct in the extreme.

How does this work? They assume that any temperature rise is partly human-caused. Could be, but how do you isolate the natural rise, starting in 1750, from the possibly human caused rise, starting in 1950? Then they tie any extreme weather or drought, flood, forest fire or whatever to the temperature rise using skeevy statistics. OK, but what percentage is really human caused? And of that percentage, how much is really attributable to the big moneybags, the US? No point in trying to extract cash from India or China.

I can’t see how this ends well. The “science” is skeevy but those who stand to gain $billions from it are very tolerant about that. As Upton Sinclair said, “It is difficult to get a man to understand something, when his salary depends on his not understanding it.”

December 13, 2017 10:50 am

It would be nice if these folks would look at odd events like the 12 year drought in major US Atlantic hurricane landfalls through last year, or apply the same analysis to old events like the several extreme events in the 1930s. How could that decade be anything but a harbinger of climate upheaval?

December 13, 2017 10:56 am

Okay, so you (or someone) goes off and creates a model that does a gorgeous job of matching the tornado/hurricane/what have you data from the last decade.

Put it and projections for the next decade on the web, and see how it does for the next decade.

What’s the point in trying to tie events to climate change if the process doesn’t lead to predictive tools? Or is the point an all expense paid trip to the next December AGU meeting?

December 13, 2017 11:06 am

Can I ask who is doing the p-hacking? Is it the 35% of scientists who submitted a paper showing that
extreme events cannot be attributed to climate change? Or is it the editors who omitted a large number of papers showing heat waves can be attributed to climate change? Both would appear to be massaging the statistics to downplay the role that climate change has on extreme events.

Or are you suggesting that the 35% of papers that don’t find an effect are the only honest ones in the whole report?

Reply to  Germonio
December 13, 2017 11:35 am

Hello SkS,
I thought there was a consensus not only on attribution but also that AGW is dangerous and that we can easily stop it if we act now. Only that Obama needs some help from Trump.

Reply to  Germonio
December 13, 2017 12:43 pm


“Can I ask who is doing the p-hacking? ”

Nobody. It is an emergent property at the top level of an unstructured research project (an ad hoc project, in this case), as I described in the post.

Reply to  Larry Kummer, Editor
December 13, 2017 4:05 pm

If nobody is doing the p-hacking, then do you have any evidence that it is being done? The claim that it is an “emergent property” is quite an astonishing claim, and one which would need considerable evidence to back it up. The PLOS paper you mention states that “We then illustrate how one can test for p-hacking when performing a meta-analysis”, so it should be possible for you to test whether or not there is any p-hacking going on. Otherwise you are just making unfounded claims, of which there are already far too many.
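For what it’s worth, the kind of bin-comparison test that PLOS paper describes can be sketched in a few lines. The bin counts below are hypothetical, purely to show the mechanics: if authors nudge results past the threshold, reported p-values pile up just under 0.05, so the bin (0.045, 0.05] should hold more values than the adjacent bin (0.04, 0.045].

```python
from math import comb

# Hedged sketch of a p-curve bin test: compare how many reported p-values
# fall just under 0.05 versus slightly further below it. Under "no
# p-hacking", a p-value in the combined bin is roughly equally likely to
# land in either half. The counts here are hypothetical.
n_upper = 62   # hypothetical count of p-values in (0.045, 0.05]
n_lower = 41   # hypothetical count of p-values in (0.04, 0.045]

def binom_tail(k: int, n: int, p: float = 0.5) -> float:
    """Exact P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

pval = binom_tail(n_upper, n_upper + n_lower)
print(f"one-sided binomial test for excess just below 0.05: p = {pval:.3f}")
```

A small p-value here would indicate an excess of results just under the significance threshold, which is the statistical fingerprint of p-hacking that test looks for.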

Reply to  Larry Kummer, Editor
December 13, 2017 6:03 pm


“The claim that it is an “emergent property” is quite an astonishing claim ”

Only to you. It is a well-documented phenomenon in the many papers about the replication crisis. It is a pervasive methodological problem in modern science.

Lots of proposed solutions. It will work itself out, eventually.

December 13, 2017 11:10 am

One simple way to address this research is to get them to answer the very easy-looking and scientifically important question: ‘what would disprove AGW?’ If, like most, they cannot or will not answer that question, then you know they are merely playing a ‘heads you lose, tails I win’ game that has frack all to do with science.

Coeur de Lion
December 13, 2017 11:14 am

I’m lost on this one. ‘Climate Change’ is caused by global warming, right? Which is measurable and has happened, just. So it should produce droughts, and possibly some atmospheric effects. How come extreme snow? And unusual cold? How can all these clever people sleep at night?

December 13, 2017 11:18 am

Does anybody actually believe all this garbage?

Reply to  Phillip Bratby
December 13, 2017 12:38 pm


This story — exaggerated, as usual — probably will have a high profile in the news for a while — and be endlessly cited afterwards.

That a few don’t believe makes no difference whatsoever. It’s successful propaganda. Much like Exxon folding against its critics.

Each wave gains some ground against science, influencing the public. It’s the long ground game. Small sure steps are the short path to victory.

Rhoda R
Reply to  Phillip Bratby
December 13, 2017 2:26 pm

Yes, actually some people do. Not necessarily the ones doing the ‘science’ or pushing the narrative, but there are lots of people – primarily youngsters – who have been fed this garbage from all sides for at least thirty years now. Many of these people haven’t been alive long enough to have gone through even one half of the typical 60 year climate cycle.

December 13, 2017 11:31 am

‘In the new collection of peer-reviewed papers analyzing the link between extremes in 2016 and climate change, scientists identified three events that would not have been possible without human-caused changes in the climate.’

A juvenile appeal to ignorance.

Reply to  Gamecock
December 13, 2017 7:03 pm

Yes, Gamecock! It is the same appeal to ignorance that is the foundation of all claims of evidence of AGW in weather events. “It has to be man-made, because we don’t know of any natural variability that could account for it.” Their ignorance is understandable, as they have been working very hard at ignoring natural climate variability and even denying some well-known cycles even exist (“we have to get rid of the Medieval Warm Period”).

Our ancestors blamed the gods, because our ancestors could not explain things without them. Modern man still blames the gods, but believes he is one of them.

December 13, 2017 11:36 am

I have to challenge a massive error in this discussion. The Lack of Pirates Is Causing Global Warming. Jo Nova proved beyond a reasonable doubt that the price of postage stamps is the leading cause of global warming. 🙂

4 Eyes
December 13, 2017 11:55 am

Only when you know what the natural rate of warming is can you say how much warming is anthropogenic, and thereby draw conclusions about the causality of extreme events. This reality is avoided repeatedly, but it is why the historical temperature record is constantly being adjusted. The new breed of young scientists will be unaware that the 1930s were very hot. This is a very sinister game with a serious agenda.

Dave Fair
Reply to  4 Eyes
December 13, 2017 12:49 pm

Mann’s hockey stick worked for years and was foundational in the consensus and governmental policies. At the time, I took it as truth and it helped form the belief on my part that CAGW was real.

Reply to  4 Eyes
December 13, 2017 3:54 pm

What if the natural rate of warming is a constantly changing value that cannot be predicted or explained?

Dave Fair
Reply to  Sheri
December 13, 2017 4:25 pm

Then one shouldn’t spend bunches of OPM trying to control it, Sheri.

Svend Ferdinandsen
Reply to  4 Eyes
December 13, 2017 4:29 pm

Who says that extreme events increase with warming?
Reality shows no positive correlation.

Reply to  Svend Ferdinandsen
December 14, 2017 4:06 am

It’s frustrating, isn’t it? Start with the assumption that “warming”, or indeed any effect, is bad, and argue as though there is simply no alternative, wondering only whether it will be just bad or really bad. It’s kind of like they’re putting blinkers on the crowd up front.

December 13, 2017 12:05 pm

The real story about freak weather is that green-politicised energy policy has rendered the electricity grids in many European countries fragile and vulnerable to extreme weather outbreaks.

December 13, 2017 12:06 pm

This type of “research” is not about discovery but about persuasion and advocacy. The intended audience is not real skilled scientists who follow the scientific process, but the public, policy makers, and, most especially, funders who can be tricked into paying more for the same nonsense year after year. In medical science the journals are filled with false claims and soon to be refuted discoveries because academia seems to reward this type of nonsense. Unfortunately those who use medical services suffer both through poor care and through higher payments because of the failure to do proper science. In climate science the harm will be larger and have far greater impact as entire economies are damaged.

December 13, 2017 12:46 pm

I am surprised they did not attribute record global grain harvests to climate change.

Dave Fair
Reply to  Mohatdebos
December 13, 2017 12:51 pm

Some actually do so! Then you get “just wait; our models show harm in the future.”

December 13, 2017 2:09 pm

For more research about North American wildfires

See this slideshow by Scott St. George, Associate Professor of Geography at the University of Minnesota (University page, his website).


Gunga Din
Reply to  Larry Kummer, Editor
December 14, 2017 8:56 am

The kangaroo rat was common in CA.
Part of its habitat is dry brush.
The k.rat is now endangered, partly through loss of habitat.
Land owners are not allowed to remove dry brush from their property. (Just in case one shows up.)
CA has had a bunch of wildfires.

Kangaroo Rats were expert fire-fighters.

Gary Kerkin
December 13, 2017 3:24 pm

Post hoc, ergo propter hoc.

Svend Ferdinandsen
December 13, 2017 4:25 pm

“scientists identified three events that would not have been possible without human-caused changes in the climate.”
I ask: out of how many events? Haven’t we been told of all the disasters that have happened, and that they are signs of Global Warming? And they only found 3 events!
Yesterday I had 3 leaves falling off my tree out of 1000, but these 3 were a sure sign of Global Warming.

December 13, 2017 5:00 pm

“If your experiment needs statistics, you ought to have done a better experiment.”

Anyone who believes that doesn’t understand science, and doesn’t understand statistics.

Having said that, it does not surprise me that climate researchers resort to all sorts of fallacies and deceptions. They’ve done it right from the beginning.

Reply to  Karim Ghantous
December 13, 2017 5:18 pm

if a proposition does not resolve to true/false, then it is not reasonable and unscientific, mmk?
so i’m not persuaded that you are speaking truth and i’m betting you can’t resolve the truth of your own statement with statistics, amirite?

Reply to  gnomish
December 13, 2017 10:05 pm

Some propositions result in a probability, until they are able to be shown as false (0) or true (1). Sometimes you can’t go beyond probabilities with current methods, but probabilities, where available, are more useful than nothing at all.

Dave Fair
Reply to  Karim Ghantous
December 14, 2017 11:47 am

I still say all statistics are 50/50: Either something happens or it doesn’t.

Gary Kerkin
Reply to  Karim Ghantous
December 13, 2017 5:44 pm

The quote is attributed to Ernest, Lord Rutherford of Nelson. His long-standing reputation as a scientist, especially that gained while head of the Cavendish Laboratory at the University of Cambridge through much of the first half of the 20th century, was established through his ability to devise an experiment and then build the instrumentation which enabled him to measure its outcomes. Rutherford was born in New Zealand towards the end of the 19th century, and this particular ability stood him in good stead in his early career. At the time, science in New Zealand lacked instrumentation capable of measuring the results of experiments, and anyone who could devise appropriate instruments was bound to succeed; conversely, anyone who couldn’t was doomed to fail. So to say that he didn’t understand science and/or statistics is not helpful. I suggest that his ability to devise such instruments indicates a very, very good understanding of science.

Reply to  Gary Kerkin
December 13, 2017 10:01 pm

“So to say that he didn’t understand science and/or statistics is not helpful.”

I wasn’t directing my criticism towards Rutherford.

Yogi Bear
December 13, 2017 5:10 pm

Increased El Nino frequency and intensity is normal during a solar minimum, because of increased negative NAO/AO. The same with the AMO and Arctic warming.

John Bills
December 13, 2017 7:57 pm

I see Nick Stokes didn’t comment so the p-hacking allegation must be true.
Nick, Nick ?

Mark - Helsinki
December 13, 2017 9:10 pm

Replace “attribution” with “junk”. Attribution science has decades at least to go before anyone can even use the word “mature” to describe the field. Self-important liars.

December 14, 2017 1:33 am

Not long ago one of the medical journals said it would not publish this sort of stuff. Researchers had to state what they were looking for before they started looking, and then say if they had found it, not just claim to have found “something”. Rates of “success” fell from over 50% to around 5%.

Al Saletta
December 14, 2017 11:07 am

Back when Accutron watches were a luxury only the faculty could afford, our grad school statistics teacher rigged his to sound a buzzing alarm in the middle of a guest lecture. He apologized for his “watch” interrupting, but then said, “As long as I have everyone’s attention, I just don’t see a significant result in the data on the chart you are displaying.” “But,” the presenter replied, “it is statistically significant at the 5% level!” “Who cares?” replied our statistics expert. “I don’t see any effect in the chart.”

December 16, 2017 6:24 am

Re: p-hacking 12/13/2017: P-hacking, along with data mining, deep data diving, HARKing (Hypothesizing After Results are Known), are certain methods peculiar to Post Modern Science, where they are met with criticism by would-be gatekeepers of that species of science. It’s a bit of philosophical hand-waving, which at its roots is the result of gross failures of PMS to produce reliable scientific models. Indeed, the real gatekeeper of PMS is the American Statistical Association (ASA), which in January last year disapproved reliance on p-values at all following a half century of unreproducible scientific studies. See

Modern Science has no such restrictions. That species of science grades scientific propositions (models) as conjectures, hypotheses, theories, and laws according to their predictive power. By contrast, MS relies on Neyman-Pearson statistics along with Shannon’s Information Theory.

Science philosopher and champion of PMS, Paul Feyerabend, a student and follower of Popper, taught along with Popper that the scientific method was a fiction. Feyerabend’s problem in particular was that he believed the method was a recipe, an ordered procedure that arrived at valid models. Instead, the method as it developed in MS is a logical organization. Its elements (language, observations, measurements, prediction, experiment and validation) can come to pass in any order, including validation first. P-hacking, HARKing, and other such procedures are quite acceptable in MS.

The method called Deep Data Diving illuminates the problem. In DDD an investigator sorts his database, assigning some variables as causes and some of the others as effects, guided by derivatives of Fisherian statistics. Here, rather than postulate a Cause & Effect relationship among the variables to be tested on its prediction, the investigator rests his results on statistical coincidences. After all, the p statistic is a random variable, based solely on the null hypothesis, and the null hypothesis has no relationship to the affirmative (a much better word than alternative) hypothesis which is supposed to have generated the data.

What the AMS, along with the whole of the academic community of scientists, has yet to realize is that when the ASA disapproved reliance on p-values, it also disapproved confidence limits, statistical significance, and anything else based on the null hypothesis alone. It has disapproved Fisherian statistics.

December 16, 2017 3:42 pm

What does the past tell us about higher temperatures leading to more extreme climate change?

Throughout the world’s history, during warm periods there was less variability in the weather. During cold periods, there were larger variations and more extreme weather. Thus, warm means more stable times; cold means more extreme climate change. For example, storms were worse a few centuries ago during the Little Ice Age, the best we can tell. The climate, overall, has gotten slightly better, or is about the same, on average, for the last 100 years, the best we can tell. Not worse, as pushed by the fear peddlers and their poorly done, biased studies. And this is with 60 or so years of man’s use of fossil fuels, thus confirming CO2 doesn’t make the climate worse.

There have been close to the same number or fewer tornadoes, droughts, and forest fires worldwide recently. The last few years had the fewest hurricanes worldwide this century, and there has been a steady decline since 2005, other than this year (and this year is only slightly up).

If extreme and severe weather are not increasing, then how can the climate change movement claim that CO2 is causing the climate to get more extreme, when the climate has not been getting more extreme with 60 or so years of already-increasing CO2?
– Because their biased computer models tell them things will get worse in the future.
– They cherry pick certain locations that recently had many extreme weather events.
– Huge monies were poured into poorly done studies, backed by bad, biased data, that anyone with decent knowledge can poke huge holes in (as explained later).
– If you keep repeating and repeating, over and over again, and use the emotion of fear to scare, eventually many will believe that CO2 is causing danger.

The supposed scientific explanation (hypothesis) of the link between global temperature increase and climate change, according to the pro-global-warming side, is a long daisy chain of very unlikely events and ties that may be only slightly correlated, with no causation proven. If true, only a small percentage is responsible for each link. Sometimes the opposite holds, totally breaking the link. It is rarely factually proven (relying instead on computer models’ guesstimates of the future, as if they were fact). And if true, it would take thousands of years to happen anyway. I.e., increased CO2 from fossil fuel use causes “B”, “B” leads to “C”, “C” creates “D”, “D” results in “F”, “F” creates extreme weather. Also, this is all dependent on their belief that man’s increase of fossil fuels is increasing the temperature of the planet significantly, which was shown to be wrong on pages 11 – 26.

Then add that a less-than-1-degree change over a 136-year period doesn’t change much, if anything, anyway. I.e., if the world is warming on average 0.007 of a degree a year, how will that tiny amount cause something to get a lot more extreme, or even change anything? It will not. It just gets lost in the noise.
And most of the warming wasn’t caused by fossil fuels anyway. So even if CO2 causes an unnoticeable amount of warming, it doesn’t cause the climate to change, and if it does, it is for the better, not worse. And the politicians can’t regulate and tax CO2 to make more than an almost-nothing difference, and the difference would probably be in the opposite of the desired direction anyway.

December 18, 2017 4:48 am

Degrees of freedom.

The critical feature of a probability analysis is how many different classes an observation can fall into. When people slice and dice data into multiple classes, they should be increasing the DF much more than they do. But of course that would mean the result would not be significant…
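One way to make that concrete is a chi-square goodness-of-fit framing (the statistic value of 11.0 below is hypothetical, held fixed purely for illustration): the same raw statistic that clears the bar with a few classes can fail it once the data are sliced into more classes, because the 5% critical value grows with the degrees of freedom.

```python
from scipy.stats import chi2

# A fixed chi-square statistic judged against the 5% critical value for
# increasing numbers of classes (df = classes - 1). The statistic value
# of 11.0 is hypothetical; 21 echoes the report's 21 analyzed events.
stat = 11.0
for classes in (3, 6, 12, 21):
    df = classes - 1
    crit = chi2.ppf(0.95, df)
    verdict = "significant" if stat > crit else "not significant"
    print(f"{classes:2d} classes (df={df:2d}): 5% critical value = {crit:5.2f} -> {verdict}")
```

The critical value climbs from about 5.99 at df = 2 to about 31.4 at df = 20, so slicing data into more classes without raising the evidential bar accordingly is exactly the under-counting of degrees of freedom the comment describes.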
