UPDATE: It appears the paper has been withdrawn and credit acknowledged to Steve McIntyre; see below:
There was yet another recent “hockey stick” being foisted on the public: Gergis et al.
The paper says:
The average reconstructed temperature anomaly in Australasia during A.D. 1238–1267, the warmest 30-year pre-instrumental period, is 0.09°C (±0.19°C) below 1961–1990 levels.
Basically, another “ah-ha, man is at fault” pitch.
At Climate Audit, the paper was examined in more detail, and alarm bells went off. Concern centered around the 27 proxy data sets used in the study. Now, after Steve McIntyre found some major faults, it seems this paper has gone missing from the AMS website without explanation. All that remains is the Google cache thumbnail image, not even the cached web page. See below:
Here is the original URL:
http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-11-00649.1
Here’s a backup copy: http://static.stuff.co.nz/files/melbourne.pdf
To read about how the takedown came about, I suggest this excellent summary from Bishop Hill as the technical details are rather thick: Another Hockey Stick broken
The problems with the paper in a nutshell:
- upside down proxy data again
- preselection of data, ignoring the whole set in many cases
- though they tried to justify preselection, the paper’s methodology doesn’t hold up (circular reasoning)
- inability to replicate given the data and methods used
In Gergis’ defense, they provided *some documentation and data at the outset, unlike some other hockey stick purveyors we know. This allowed the work to be checked independently. This is how science is supposed to work, and apparently it has.
(*Added: apparently Gergis refused some additional data Steve McIntyre requested; the documentation of this is on his CA website.)
It appears from my perspective that this is a failure of peer review at the AMS.
UPDATE: Further proof that the paper has truly been taken down, and this isn’t a web glitch.
1. The DOI link is also broken over at Real Climate in their article: Fresh hockey sticks from the Southern Hemisphere
References
- J. Gergis, R. Neukom, S.J. Phipps, A.J.E. Gallant, and D.J. Karoly, “Evidence of unusual late 20th century warming from an Australasian temperature reconstruction spanning the last millennium”, Journal of Climate, 2012. doi:10.1175/JCLI-D-11-00649.1
2. On the AMS search page: http://journals.ametsoc.org/action/doSearch
I put in both the author name and the DOI, and got nada:
Search Results
Search Query: Authors: gergis
Your search did not match any articles.
Search Query: PubIdSpan: JCLI-D-11-00649.1
Your search did not match any articles.
============================================================
UPDATE2: Steve McIntyre reports the paper has been put “on hold” http://climateaudit.org/2012/06/08/gergis-et-al-put-on-hold/
![Gergis et al. reconstruction graph](http://wattsupwiththat.files.wordpress.com/2012/06/gergisgraph1.jpg)

Steve McIntyre says:
June 8, 2012 at 3:27 pm
It’s not correct to say that they credited Climate Audit. They say that they first discovered the error themselves on June 5, prior to the commentary at Climate Audit.
============
Oz is after all 14 hours ahead of Canada, so by reading Canadian Blogs, they are able to discover all sorts of things as much as 14 hours sooner than anyone in Canada.
It is a normal part of science that the errors always trend the same direction.
Apparently.
Scientific ethics lives and survives only where the funding sources are ‘blinded’ or disinterested. But since infinite money isn’t available, choices get made, and inevitably favorites are picked.
I’m beginning to think that the only way to prevent this kind of consensual lock-up is to explicitly establish funding, as a significant % of the total, for ‘dissident’ or ‘rogue’ science in any field where controversy exists. That would heighten the chances of catching groupthink disasters like Climate Science at an early stage, before they ran roughshod over public policy and academia — to the great detriment of society.
GregK says:
The claim that “The average reconstructed temperature anomaly in Australasia during A.D. 1238–1267, the warmest 30-year pre-instrumental period, is 0.09°C (±0.19°C) below 1961–1990 levels” is patently ridiculous.
You’d be hard pushed, even with modern instruments, to measure temperatures with that kind of accuracy. That’s before doing any kind of data processing, which would tend to decrease the level of accuracy.
The suggestion of a mean of 0.09 and a standard deviation of 0.19 is meaningless; the mean is below the detection limit of the analytical method used.
You see this frequently in “climate science”, including ocean pH and sea level. It’s as though these people have never heard of the term “significant figures”, combined with an apparent inability to consider the accuracy and precision of their measuring methods.
No doubt someone else could take the (same) data and come up with something like 0°C (±2.5°C)
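To put numbers on GregK’s point, here is a minimal sketch in Python. It assumes the quoted ±0.19°C is a symmetric uncertainty band around the anomaly; the paper’s exact definition of that interval may differ.

```python
# Minimal sketch of the significant-figures objection above.  Assumption:
# the quoted ±0.19°C is a symmetric uncertainty band around the anomaly;
# the paper's exact definition of the interval may differ.

anomaly = -0.09      # °C: 1238-1267 mean relative to 1961-1990, as quoted
half_width = 0.19    # °C: quoted uncertainty

low, high = anomaly - half_width, anomaly + half_width
print(f"plausible range: {low:+.2f} to {high:+.2f} °C")   # -0.28 to +0.10

# Zero sits inside the range, so the data cannot even resolve whether that
# 30-year period was warmer or cooler than 1961-1990.
sign_resolved = (low > 0) or (high < 0)
print("sign of the difference resolved?", sign_resolved)  # False
```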
Ike says:
…
June 9, 2012 at 7:28 pm
The answer – whether the result of calculation or of measurement – is both unknown and unknowable from the data or calculated results provided. Therefore, there is no probability that the difference is real when, as in this case, the range of possible answers extends from above to below the claimed answer.
…
There may be a probability the difference is real, but it is unknowable from the methodology, both physical and analytic. As many have mentioned, this issue constantly dogs climate science, and most life sciences in general. Inability to measure to the necessary resolution, inability to account for and identify confounding variables, and lack of understanding of the totality of variability inherent in the subject render most of this “science” worthless. This is a common issue when physicists, mathematicians and chemists study chaotic and complex systems, many of which are also self-determinant. They simply don’t have the knowledge base necessary to ask the right questions about the hypotheses they’re trying to resolve.
In no way will a graduate astrophysicist, organic chemist, fluid engineer or any other traditional discipline produce scientists capable of addressing “climate change” or any number of broadly described “environmental” disciplines with the science that they graduated with. Understanding climate will have to be multi-disciplinary, with the most rigorous application of scientific methodology and principle that we can muster.
We are limited in so many ways as to our ability to conduct any kind of multi-variate analysis on the scale necessary to deduce climate, that we are decades away, if not generations, from any form of conclusions about the operation of the earth’s dynamic systems, much less policy decisions.
The current approaches produce nothing but “anthropogenic global warming” (i.e., “hot air”), and contribute almost nothing to the science, only confusion. Presently active scientists need to understand that their individual contributions will, inevitably, be minuscule relative to the bigger questions, but also that their work must be of the very highest quality if it is to make any contribution at all. To do otherwise, for systems of the scale of climate, is to simply waste the money and lives of their contemporaries.
Steig et al (2009) got a scary cover in “Nature” before being refuted.
Gergis et al (2012) was on the verge of being embraced as gospel by the IPCC’s AR5.
The “climate science community” needs and can benefit from the “auditors”!!
Unless of course their purpose is not really science but propaganda and political activism.
Thank you, markx; I re-read what I had written and what Crispin in Waterloo wrote and I see that you’re correct. He’s just more polite than I. *smile* Thanks to all who wrote in reply to my comments; you’ve helped my understanding of the material considerably.
Lawyers who are neither members of a government nor attorneys for a government agency have little or no chance of altering the apparently poisonous relationship between climate research funding by governments and the resulting papers and claims. Politics, in its modern form of unlimited scope untrammelled by reason or conscience or constitutional law, is the eight-hundred-pound gorilla with a machine gun of modern life: nothing can happen without the approval of those who control the governments. Of course, due to institutionalized ignorance and other unavoidable characteristics of political institutions, nothing that is effective or productive can happen with the approval of those who control the governments. *sigh*
The answer ought to be less government intrusion and more freedom, but that seems a pipe dream at best, as I look around the world and consider what I see, based on 50 years of adult education, training and experience. Our fathers defeated the Nazis and Imperial Japan, the Soviet Union collapsed of its own incompetence, and now it seems in America that we are dominated by people who seek the destruction of Western culture, replacing it with … nothing. Nihilism made flesh, as it were.
No you wouldn’t. Because then you’d have to find another line of work, having confirmed that proxies aren’t up to the task.
Gavin at RealClimate says that when all is cleared up, he believes the Southern Hockey Stick will prevail. It sounds like a faith-based assessment, or is it a triumph of hope?
In the attribution part of their paper, Gergis et al. did NOT compare model runs with and without human influences. They compared forced runs with unforced runs. That is, with an imaginary Earth without volcanoes, sealed off from any solar or orbital influences, simulated for a 10,000-year period detached from any specific date in history. The only variation in their so-called natural run comes from the internal pseudo-randomness of the computer program. They then announce their discovery that the real world has more variation.
This is clear from the paper itself and the briefing powerpoint: “When we applied natural (volcanic events, solar activity) and human forcings (greenhouse gases) (blue), the model now reproduces the late 20th century warming. This demonstrates that the warming cannot be explained by natural variability alone.”
It’s a screaming non-sequitur: natural AND human forcings reproduce the late 20th century warming, THUS the warming cannot be explained by natural variability alone. They didn’t even look (or show) whether it can or not, even inside the computer. The comparison left out such periods as MWP and LIA.
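To illustrate the distinction the commenter is drawing, here is a toy sketch with made-up synthetic series (none of these numbers come from the paper): a standard detection/attribution test would compare an “all forcings” ensemble against a “natural forcings only” ensemble, not against an unforced control run.

```python
# Toy illustration with synthetic numbers (not the paper's data) of the
# difference between an unforced control run, a natural-forcings-only run,
# and an all-forcings run.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1000, 2001)

internal = rng.normal(0.0, 0.1, years.size)                  # control: internal noise only
natural = internal + 0.2 * np.sin(2 * np.pi * years / 60.0)  # add solar/volcanic-like variability
all_forced = natural + np.where(years > 1900, 0.006 * (years - 1900), 0.0)  # add a GHG-like trend

for name, series in (("unforced control", internal),
                     ("natural forcings only", natural),
                     ("all forcings", all_forced)):
    print(f"{name:22s} std = {series.std():.2f} °C")

# Matching observations with the "all forcings" run says nothing, by itself,
# about what the "natural forcings only" case could or could not produce --
# yet the unforced control is the only "natural" baseline used here.
```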
Incidentally, the very model they use happens to underestimate natural variability in the very area they’re studying.
It’s unfortunate that the main conclusion rests on a computer program that its developer describes as “non-physical”. Apparently programming a fully physical model just didn’t work out, so they had to settle for the lesser non-physicality.
More generally, it’s interesting how a comparison between observations and a computer program is conclusive enough to reject the natural-variability null hypothesis and blame humans, but if you want to assess the validity of said computer program, the same comparison becomes inconclusive. You can’t falsify a climate model.
The model used here was CSIRO Mk3L, based on two earlier CSIRO models that have been included in IPCC assessment reports. The main developer, S. J. Phipps, recounts his PhD project in an almost HARRY_READ_ME-esque technical report.
The pristine pre-industrial paradise unexpectedly had an energy imbalance way larger than mankind has supposedly achieved so far.
What did they do? Someone (pers. comm.) suggested fudging cloud albedo. After reducing convective cloud reflectivity to 59.5% and the rest to 86.5%, the imbalance was brought down to an acceptable level. Problem solved!
When they base these huge conclusions on observations being “consistent with” models (with human forcings), the 59.5% cloud reflectivity being inconsistent with ERBE measurements apparently doesn’t count.
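For a sense of why cloud reflectivity is such an effective tuning knob, here is a back-of-envelope zero-dimensional energy balance. Only the 59.5% figure comes from the comment above; the cloud fraction, clear-sky albedo, and outgoing longwave value are illustrative assumptions chosen so the budget roughly closes.

```python
# Back-of-envelope sketch: sensitivity of the top-of-atmosphere balance to
# cloud reflectivity.  Only the 0.595 value comes from the comment above;
# the cloud fraction, clear-sky albedo and OLR below are illustrative
# assumptions chosen so the budget roughly closes at 0.595.
S0 = 1361.0                 # W/m², solar constant
INSOLATION = S0 / 4.0       # global-mean incoming shortwave, ~340 W/m²

def toa_imbalance(cloud_albedo, cloud_fraction=0.3, clear_albedo=0.17, olr=239.0):
    """Global-mean absorbed shortwave minus outgoing longwave, in W/m²."""
    planetary_albedo = cloud_fraction * cloud_albedo + (1.0 - cloud_fraction) * clear_albedo
    return INSOLATION * (1.0 - planetary_albedo) - olr

for a in (0.55, 0.595, 0.65):
    print(f"cloud reflectivity {a:.3f} -> imbalance {toa_imbalance(a):+5.1f} W/m²")

# A ~5 percentage-point change in cloud reflectivity moves the global budget
# by roughly 5 W/m² -- larger than the forcing the attribution argument
# ultimately hinges on, which is why this is such an effective fudge knob.
```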
Negative salinities.
The model had an annoying habit of stopping due to floating point errors when ocean salinity went negative. Solution: change the code to allow negative salinities. Water has traditionally contained either a positive amount of salt, or none. It’s nice to see the climate modeling community has moved on from this kind of old-fashioned, boxed thinking.
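For readers wondering how salinity can go negative in the first place, here is an illustrative sketch, not the actual Mk3L Fortran and with numbers exaggerated for effect: surface freshwater input is commonly applied as a “virtual salt flux”, and a coarse explicit time step can overshoot past zero unless the update is guarded.

```python
# Illustrative sketch only -- not the actual Mk3L code.  Freshwater input
# applied as a "virtual salt flux" with a coarse explicit time step can push
# a low-salinity surface cell below zero unless the update is guarded.
def step_salinity(s_psu, freshwater_m_per_s, layer_depth_m, dt_s,
                  s_ref=35.0, allow_negative=False):
    """One explicit Euler step of virtual-salt-flux surface forcing (psu)."""
    tendency = -s_ref * freshwater_m_per_s / layer_depth_m   # psu per second
    s_new = s_psu + dt_s * tendency
    if not allow_negative:
        s_new = max(s_new, 0.0)   # salt content is positive or zero, never negative
    return s_new

# Exaggerated numbers purely for illustration: brackish surface water, heavy
# runoff, thin mixed layer, two-day time step.
print(step_salinity(1.0, 2e-6, 10.0, 2 * 86400, allow_negative=True))  # ≈ -0.21 psu
print(step_salinity(1.0, 2e-6, 10.0, 2 * 86400))                       # clamped at 0.0
```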
Energy conservation.
Unforced control runs had 7.3*10^23 J per millennium appearing in the ocean. That’s 7.3*10^20 (730,000,000,000,000,000,000) joules per year. It was discovered that somewhere in the depths of Fortran, water temperature was forced to -1.8501°C whenever it fell under -1.85°C.
I haven’t heard of this physical constant of -1.8501°C, but it must be part of gold-standard science. After all, the previous model using this value was accepted by the IPCC.
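To see how a tiny temperature reset can manufacture that much heat, here is a rough sketch. The -1.85°C / -1.8501°C values are the ones quoted above; the cell dimensions and reset count are my own illustrative guesses, included only to show that the orders of magnitude are compatible.

```python
# Rough sketch of how a hard temperature floor manufactures heat.  The
# -1.85 / -1.8501 °C values are the ones quoted above; the cell dimensions
# and reset count are illustrative guesses, not figures from the report.
RHO = 1025.0    # kg/m³, seawater density
CP = 3990.0     # J/(kg·K), seawater specific heat

def heat_added_by_reset(t_before, cell_volume_m3, t_floor=-1.8501, t_limit=-1.85):
    """Energy (J) silently added when a too-cold cell is reset to the floor."""
    if t_before >= t_limit:
        return 0.0
    return RHO * CP * cell_volume_m3 * (t_floor - t_before)

# Example: a 100 km x 100 km x 25 m polar surface cell cooled to -1.90 °C.
cell_volume = 100e3 * 100e3 * 25.0
per_reset = heat_added_by_reset(-1.90, cell_volume)
print(f"{per_reset:.1e} J per reset")                                # ~5e16 J

# If resets on this scale happen a few thousand times per model year, the
# total is of the same order as the ~7.3e20 J/year drift quoted above.
print(f"{per_reset * 10_000:.1e} J/year for 10,000 assumed resets")  # ~5e20 J
```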
Interpolation errors.
An error of 5.6 mm/year in sea level is 20 to 30 times the currently observed rate.
Figure D.3 shows surface heat flux interpolation errors. Some of the grid boxes have an error of 40*10^12 W. So there’s ~40,000,000,000 kW of power appearing from nowhere, or disappearing, in the area of a single grid box.
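A quick sanity check on that figure, taking only the 40*10^12 W value quoted above:

```python
# Sanity check on the quoted per-grid-box interpolation error.
error_W = 40e12                          # 40*10^12 W, as quoted above
print(f"{error_W / 1e3:.0e} kW")         # 4e+10 kW, i.e. ~40,000,000,000 kW

# For scale: a large power station delivers about 1 GW (1e9 W), so the error
# in a single grid box is roughly the output of tens of thousands of them.
print(f"{error_W / 1e9:,.0f} GW-scale power stations' worth")   # 40,000
```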
About flux corrections.
Happily, as Phipps mentions, natural feedbacks tend to dampen these drifts. Except ocean salinity drift, but that’s just salt, not temperature.
Apparently some work has been done to solve or mitigate some of these problems in the new Mk3L model, but I understand they originate from the predecessor models used by the IPCC.
I wonder whether scientists outside the modeling community actually know what’s going on there. It seems rather important, since the attribution of climate change rests on “consistency” between the real world and these models. That’s the first letter in “AGW”.