UPDATE: a response to this paper has been posted, see below.
From McGill University, which blows the credibility of its science by putting the word “deniers” in it.
Statistical analysis rules out natural-warming hypothesis with more than 99 percent certainty
An analysis of temperature data since 1500 all but rules out the possibility that global warming in the industrial era is just a natural fluctuation in the earth’s climate, according to a new study by McGill University physics professor Shaun Lovejoy.
The study, published online April 6 in the journal Climate Dynamics, represents a new approach to the question of whether global warming in the industrial era has been caused largely by man-made emissions from the burning of fossil fuels. Rather than using complex computer models to estimate the effects of greenhouse-gas emissions, Lovejoy examines historical data to assess the competing hypothesis: that warming over the past century is due to natural long-term variations in temperature.
“This study will be a blow to any remaining climate-change deniers,” Lovejoy says. “Their two most convincing arguments – that the warming is natural in origin, and that the computer models are wrong – are either directly contradicted by this analysis, or simply do not apply to it.”
Lovejoy’s study applies statistical methodology to determine the probability that global warming since 1880 is due to natural variability. His conclusion: the natural-warming hypothesis may be ruled out “with confidence levels greater than 99%, and most likely greater than 99.9%.”
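For readers wondering how such a test could even be framed, here is a minimal sketch, assuming (purely for illustration) that century-scale natural fluctuations are Gaussian with a 0.2 C standard deviation; neither that number nor the Gaussian form is taken from the paper. A heavier-tailed null for natural extremes would give a larger tail probability, which is one reason a 99–99.9% bound, rather than something far stronger, might be the defensible claim.

```python
# Minimal sketch of a "could this be natural?" tail-probability test.
# The 0.2 C standard deviation and the Gaussian null are hypothetical
# placeholders, not values or assumptions taken from Lovejoy's paper.
from scipy.stats import norm

sigma_natural = 0.2    # assumed std. dev. of century-scale natural fluctuations (deg C)
observed_change = 0.9  # approximate observed warming since 1880 (deg C)

p_natural = norm.sf(observed_change, scale=sigma_natural)  # one-sided tail probability
print(f"P(natural fluctuation >= {observed_change} C) ~= {p_natural:.1e}")
```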
To assess the natural variability before much human interference, the new study uses “multi-proxy climate reconstructions” developed by scientists in recent years to estimate historical temperatures, as well as fluctuation-analysis techniques from nonlinear geophysics. The climate reconstructions take into account a variety of gauges found in nature, such as tree rings, ice cores, and lake sediments. And the fluctuation-analysis techniques make it possible to understand the temperature variations over wide ranges of time scales.
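As a rough illustration of what fluctuation analysis over wide ranges of time scales can look like in practice, the sketch below computes root-mean-square Haar fluctuations (the mean of the second half of a window minus the mean of the first half) at several scales. Haar fluctuations are a scaling technique Lovejoy has used in earlier work, but the series and scales here are synthetic placeholders, not the multiproxy reconstructions.

```python
# Illustrative Haar fluctuation analysis on a synthetic series (not proxy data).
import numpy as np

def rms_haar_fluctuations(series, scales):
    """RMS Haar fluctuation at each scale: mean of the second half of a window
    minus mean of the first half, RMS-averaged over overlapping windows."""
    out = []
    for s in scales:
        half = s // 2
        diffs = []
        for start in range(0, len(series) - s + 1, half):
            first = series[start:start + half].mean()
            second = series[start + half:start + s].mean()
            diffs.append(second - first)
        out.append(np.sqrt(np.mean(np.square(diffs))))
    return np.array(out)

rng = np.random.default_rng(0)
temps = np.cumsum(rng.normal(0.0, 0.1, 512))   # synthetic stand-in for a temperature series
scales = [4, 8, 16, 32, 64, 128]               # window lengths ("years")
print(dict(zip(scales, rms_haar_fluctuations(temps, scales).round(3))))
```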
For the industrial era, Lovejoy’s analysis uses carbon-dioxide from the burning of fossil fuels as a proxy for all man-made climate influences – a simplification justified by the tight relationship between global economic activity and the emission of greenhouse gases and particulate pollution, he says. “This allows the new approach to implicitly include the cooling effects of particulate pollution that are still poorly quantified in computer models,” he adds.
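Taken at face value, the simplest version of that idea is a regression of global-mean temperature on log2(CO2), with the residual serving as an estimate of natural variability. The sketch below shows only that simplest version; it is not the paper’s actual procedure, and the CO2 curve, coefficients, and noise level are made up.

```python
# Hedged sketch: CO2 as a stand-in for all anthropogenic forcing.
# All numbers below are fabricated for illustration, not data from the paper.
import numpy as np

years = np.arange(1880, 2014)
co2 = 290.0 * np.exp(0.0023 * (years - 1880))    # made-up CO2 curve (ppm)
rng = np.random.default_rng(1)
temp = 0.1 + 2.3 * np.log2(co2 / 277.0) + rng.normal(0.0, 0.1, years.size)  # synthetic anomalies

X = np.column_stack([np.ones(years.size), np.log2(co2 / 277.0)])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)   # [intercept, warming per CO2 doubling]
residual = temp - X @ beta                        # "natural variability" under this simplification

print(f"estimated warming per CO2 doubling: {beta[1]:.2f} C")
print(f"residual standard deviation: {residual.std():.2f} C")
```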
While his new study makes no use of the huge computer models commonly used by scientists to estimate the magnitude of future climate change, Lovejoy’s findings effectively complement those of the Intergovernmental Panel on Climate Change (IPCC), he says. His study predicts, with 95% confidence, that a doubling of carbon-dioxide levels in the atmosphere would cause the climate to warm by between 2.5 and 4.2 degrees Celsius. That range is more precise than – but in line with – the IPCC’s prediction that temperatures would rise by 1.5 to 4.5 degrees Celsius if CO2 concentrations double.
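As a quick sanity check on how a range like 2.5–4.2 C per doubling can fall out of a slope estimate, the arithmetic is just a normal 95% interval around a central value. The central value and standard error below are hypothetical, chosen only so the output matches the quoted interval.

```python
# Back-of-envelope 95% interval for warming per CO2 doubling.
# The central value and standard error are hypothetical, not from the paper.
lam, se = 3.35, 0.43
print(f"95% interval: {lam - 1.96 * se:.1f} to {lam + 1.96 * se:.1f} C per doubling")
```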
“We’ve had a fluctuation in average temperature that’s just huge since 1880 – on the order of about 0.9 degrees Celsius,” Lovejoy says. “This study shows that the odds of that being caused by natural fluctuations are less than one in a hundred and are likely to be less than one in a thousand.
“While the statistical rejection of a hypothesis can’t generally be used to conclude the truth of any specific alternative, in many cases – including this one – the rejection of one greatly enhances the credibility of the other.”
“Scaling fluctuation analysis and statistical hypothesis testing of anthropogenic warming”, S. Lovejoy, Climate Dynamics, published online April 6, 2014. http://link.springer.com/search?query=10.1007%2Fs00382-014-2128-2
=============================================================
Christopher Monckton has posted a rebuttal to this paper, see here
They need to go back to 1450.
I bet he’ll wish he could take that one back once he’s had a year or two under his belt as a professor.
Gee, if he went back 18,000 to 10,000 years, he might find that the world went from a glaciated one to one wherein the massive continental glaciers in North America and Eurasia melted almost completely. I wonder how that happened, since CO2 levels did not rise above 280 ppm during that entire period?
“Statistical analysis rules out natural-warming hypothesis with more than 99 percent certainty”
“Statistical analysis”
of what exactly?
Of course silly me……….. yeah, “statistical analysis” of ‘homogenized data’ – oh that’s alright then?!
GIGO.
Do these clowns [Lovejoy et al up at McGill…….. Gull?] proofread the stuff they regurgitate?
Chris B sez: “I bet he’ll wish he could take that one back once he’s had a year or two under his belt as a professor.”
Not if he has or gets tenure. He can probably rationalize anything, including redefining “natural factors.”
This is what you get when you take ignorance of climate history and add religious-like zeal. If this professor does not get a blank look on his face when asked about the Holocene Optimum, Minoan, Roman, and Medieval warm periods, that would indicate that he is being willfully ignorant.
The dear professor needs to brush up on confirmation, selection and survivor bias.
“We’ve had a fluctuation in average temperature that’s just huge since 1880 – on the order of about 0.9 degrees Celsius,”
And if they had just gone back 2 more years they might have noticed that Feb 1878 was warmer than Feb 2014:
HADCRUT4 Feb 1878 0.403C
HADCRUT4 Feb 2014 0.299C
Consider the past 12,000 years not the last 500.
He appears to be more about getting funded than what he is asking to be funded.
http://www.physics.mcgill.ca/~gang/eprints/eprintLovejoy/neweprint/IPC10.challenges.23.6.10.pdf
I wonder what William M. Briggs, Statistician to the Stars, will have to say about this methodology.
Will Dr. Lovejoy show the data?
Never heard of McGill university. Probably will never hear of it again.
“Statistical analysis rules out natural-warming hypothesis with more than 99 percent certainty”
Statistical analyses cannot be used to attribute global warming to natural or anthropogenic factors. The only way is with a total understanding and audit of the complex coupled ocean-atmosphere processes….which still elude the climate science community.
A complex statistical analysis must be based on a statistical model of some type which itself is based on a series of assumptions. The multiple proxy reconstruction you choose will also likely determine the assumptions in the model. Given one’s degrees of freedom here you can produce nearly any conclusion you wish.
We don’t need no stinking statistics. One look with the naked eye at a graph of 20th-century temperatures alone says otherwise. Common sense isn’t very common any more.
His study predicts, with 95% confidence, that a doubling of carbon-dioxide levels in the atmosphere would cause the climate to warm by between 2.5 and 4.2 degrees Celsius. That range is more precise than – but in line with — the IPCC’s prediction that temperatures would rise by 1.5 to 4.5 degrees Celsius if CO2 concentrations double.
Which has not mirrored measurements.
Otter, not Canadian I take it…
McGill is located in downtown Montreal, one of the top Canadian universities in many areas, probably best known for their law faculty. And statistics….;-)
The paper assumes that natural variability is only solar and volcanoes.
And then after that, it finds a residual of natural variability of +/- 0.3C.
Then it confidently finds that the 0.7C increase in temperatures cannot be caused by natural variability at the 99% confidence level.
Let’s see -0.3C changing to +0.3C = 0.6C
Since natural variability can only explain 0.6C of the 0.7C, it is ruled out at the 99% confidence level.
This is what climate science has become. It is a joke. It is actually un-natural in how obscene it has become.
Tell him, he’s dreaming!
Meanwhile from the left coast…
http://fullcomment.nationalpost.com/2014/04/10/donner-harrison-hoberg-lets-talk-about-climate-change/?utm_source=dlvr.it&utm_medium=twitter
Readers respond…
http://fullcomment.nationalpost.com/2014/04/11/todays-letters-climate-hogwash-is-no-reason-to-halt-progress/
Ross McKitrick rebuttal.
http://fullcomment.nationalpost.com/2014/04/11/ross-mckitrick-a-cost-benefit-analysis-of-pipeline-expansion/
This actually interests me, as the only reason I’m a skeptic is that I don’t have trust in models. I’m specifically interested in how he uses statistical analysis to attribute global warming. I have a math background, and a priori this seems either impossible or so difficult that you cannot have 99% certainty except in extreme cases. Although, I did not emphasize in statistics, so I do not know.
I intend to devote some time this weekend to going over the paper and seeing what method he used; however, I would like to see a post dedicated to this subject – perhaps from someone more knowledgeable in statistics than me?
Reading the script read by the publicist, it looks like the “study” gathered the same ole proxies (tree rings, lake sediments, etc.) that Mann used to create his hockey stick. Then add in CO2 as “a proxy for everything” (all industrial effects) and dash in particulate changes as required to make the numbers come out right.
An unreliable statistical analysis of unreliable proxy data has determined that warming in the past 100 years (using unreliable station data) indicates that a global temperature increase of 0.6-0.8 deg C is unprecedented at a 99% confidence level.
In the words of Joe Biden…”is this a joke?” We are lost if this is what passes for Science these days.
Doesn’t Hansen create empirical estimates of climate sensitivity using temperature data? I believe their claim that they are the first to do this is incorrect.
Natural variability?
This is from Phil Jones and Keith Briffa (in the public domain). The extract below is interesting, as in the period 1700 to 1745 AD there was a remarkable non-CO2 warming (with the exception of the winter of 1740). It shows the largest hockey stick in the entire CET record.
“The year 1740 is all the more remarkable given the anomalous warmth of the 1730s. This decade was the warmest in three of the long temperature series (CET, De Bilt and Uppsala) until the 1990s occurred (warmer by 0.3C and has since dropped sharply back). The mildness of the decade is confirmed by the early ice break-up dates for Lake Malaren and Tallinn Harbour. The rapid warming in the CET record from the 1690s to the 1730s and then the extreme cold year of 1740 are examples of the magnitude of natural changes which can potentially be recorded in long series. Consideration of variability in these records from the early 19th century, therefore, may underestimate the range that is possible.”
tonyb