Throwing down the gauntlet on reproducibility in Climate Science – Forest et al. (2006)

After spending a year trying, without success, to get the data from the author, Nic Lewis has sent a letter to the editor of Geophysical Research Letters (GRL) and has written to me asking that I bring attention to his letter published at Judith Curry’s website, which I am happy to do. He writes:

I would much appreciate it if you could post a link at WUWT to an article of mine (as attached) that has just been published at Climate Etc. It concerns the alteration of data used in an important climate sensitivity study, Forest 2006, with a radical effect on the resulting estimated climate sensitivity PDF.

I’m including the foreword here (bolding mine) and there is a link to the entire letter to the editor of GRL.

Questioning the Forest et al. (2006) sensitivity study

By Nicholas Lewis

Re: Data inconsistencies in Forest, Stone and Sokolov (2006), GRL paper 2005GL023977, ‘Estimated PDFs of climate system properties including natural and anthropogenic forcings’

In recent years one of the most important methods of estimating probability distributions for key properties of the climate system has been comparison of observations with multiple model simulations, run at varying settings for climate parameters. Usually such studies are formulated in Bayesian terms and involve ‘optimal fingerprints’. In particular, equilibrium climate sensitivity (S), effective vertical deep ocean diffusivity (Kv) and total aerosol forcing (Faer) have been estimated in this way. Although such methods estimate climate system properties indirectly, the models concerned, unlike AOGCMs, have adjustable parameters that control those properties, that are at least in principle calibrated in terms of them, and that enable the entire parameter space to be explored.
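For readers unfamiliar with this kind of calculation, here is a minimal sketch of the general Bayesian recipe: run a model at many candidate parameter values, score each run against observations, and normalise the result into a PDF. Everything in it (the toy forward model, the ‘observation’ and its uncertainty) is invented for illustration and has nothing to do with the Forest 2006 data, code or statistical method.

```python
import numpy as np

# Toy sketch of the general Bayesian approach: compare an invented forward
# model, run at many candidate values of S, against an invented observation,
# and turn the mismatch into a posterior PDF for S.

S_grid = np.linspace(0.5, 10.0, 1000)          # candidate sensitivities, K per doubling
dS = S_grid[1] - S_grid[0]

def toy_model_warming(S):
    """Hypothetical forward model: simulated warming rises with sensitivity."""
    return 0.45 * S                            # K, purely illustrative

obs_warming = 1.3                              # assumed 'observed' warming, K (invented)
obs_sigma = 0.4                                # assumed observational uncertainty, K (invented)

prior = np.ones_like(S_grid)                   # uniform prior over the grid
likelihood = np.exp(-0.5 * ((toy_model_warming(S_grid) - obs_warming) / obs_sigma) ** 2)

posterior = prior * likelihood
posterior /= posterior.sum() * dS              # normalise so the PDF integrates to 1

print(f"posterior mode of S: {S_grid[np.argmax(posterior)]:.2f} K per doubling")
```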

In the IPCC’s Fourth Assessment Report (AR4), an appendix to WGI Chapter 9, ‘Understanding and attributing climate change’[i], was devoted to these methods, which provided six of the chapter’s eight estimated probability density functions (PDFs) for S inferred from observed changes in climate. Estimates of climate properties derived from those studies have been widely cited and used as an input to other climate science work. The PDFs for S were set out in Figure 9.20 of AR4 WG1, reproduced below.

The results of Forest 2006 and its predecessor study Forest 2002 are particularly important since, unlike all other studies utilising model simulations, they were based on direct comparisons of those simulations with a wide range of instrumental observations – surface, upper air and deep-ocean temperature changes – and they provided simultaneous estimates for Kv and Faer as well as S. Jointly estimating Kv and Faer together with S is important, as it avoids dependence on existing, very uncertain estimates of those parameters. Reflecting their importance, the IPCC featured both Forest studies in Figure 9.20. The Forest 2006 PDF has a strong peak in line with the IPCC’s central estimate of S = 3, but the PDF is poorly constrained at high S.

I have been trying for over a year, without success, to obtain from Dr Forest the data used in Forest 2006. However, I have been able to obtain without any difficulty the data used in two related studies that were stated to be based on the Forest 2006 data. It appears that Dr Forest only provided pre-processed data for use in those studies, which is understandable as the raw model dataset is very large.

Unfortunately, Dr Forest reports that the raw model data is now lost. Worse, the sets of pre-processed model data that he provided for use in the two related studies, while both apparently deriving from the same set of model simulation runs, were very different. One dataset appears to correspond to what was actually used in Forest 2006, although I have only been able to approximate the Forest 2006 results using it. In the absence of computer code and related ancillary data, replication of the Forest 2006 results is problematical. However, that dataset is compatible, when using the surface, upper air and deep-ocean data in combination, with a central estimate for climate sensitivity close to S = 3, in line with the Forest 2006 results.

The other set of data, however, supports a central estimate of S = 1, with a well constrained PDF.

I have written the below letter to the editor-in-chief of the journal in which Forest 2006 was published, seeking his assistance in resolving this mystery. Until and unless Dr Forest demonstrates that the model data used in Forest 2006 was correctly processed from the raw model simulation run data, I cannot see that much confidence can be placed in the validity of the Forest 2006 results. The difficulty is that, with the raw model data lost, there is no simple way of proving which version of the processed model data, if either, is correct. However, so far as I can see, the evidence points to the CSF 2005 version of the key surface temperature model data, at least, being the correct one. If I am right, then correct processing of the data used in Forest 2006 would lead to the conclusion that equilibrium climate sensitivity (to a doubling of CO2 in the atmosphere) is close to 1°C, not 3°C, implying that likely future warming has been grossly overestimated by the IPCC.

This sad state of affairs would not have arisen if Dr Forest had been required to place all the data and computer code used for the study in a public archive at the time of publication. Imposition by journals of such a requirement, and its enforcement, is in my view an important step in restoring trust in climate science amongst people who base their beliefs on empirical, verifiable, evidence.

Nic Lewis

==============================================================

Just let me say that there’s movement afoot to address the reproducibility issues in journal publications raised in the last paragraph. I’ll have more on this at a future date.

Here’s the foreword and letter to the GRL editor in PDF form:  Post on Forest 2006 GRL letter final

This figure from that letter by Lewis suggests a lower climate sensitivity to a doubling of CO2 than the original:

-Anthony


82 Comments
Peter Lang
June 26, 2012 4:12 pm

Nic Lewis,
Thank you for your replies to my question. In case others are interested, Nic’s replies to my initial question and follow up question are here:
http://judithcurry.com/2012/06/25/questioning-the-forest-et-al-2006-sensitivity-study/#comment-212919

timetochooseagain
June 26, 2012 5:02 pm

Peter Lang-Thank you for linking to those comments! I especially liked this:
“Note that, as I understand it, many members of the ‘subjective Bayesian’ school of statisticians would think it OK to use whatever prior they thought fit, notwithstanding that it did not result in objective probabilistic inference. IMO, and I hope in that of the vast majority of scientists, such an approach has no place in science.”
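As a toy illustration of why the choice of prior matters, the sketch below combines one and the same invented likelihood for S with two different priors: one uniform in S, and one uniform in the feedback parameter lambda = F_2x / S (expressed in S-space via the Jacobian). All numbers are made up for illustration.

```python
import numpy as np

# Toy demonstration that the prior affects the posterior. The Gaussian
# "likelihood" for S below is invented; nothing here is from any real study.

S = np.linspace(0.5, 10.0, 2000)               # sensitivity grid, K per doubling
dS = S[1] - S[0]
F_2x = 3.7                                     # nominal forcing for doubled CO2, W/m^2

likelihood = np.exp(-0.5 * ((S - 3.0) / 0.75) ** 2)   # invented likelihood

priors = {
    "uniform in S":      np.ones_like(S),
    "uniform in lambda": F_2x / S**2,          # Jacobian of lambda = F_2x / S
}

for name, prior in priors.items():
    post = prior * likelihood
    post /= post.sum() * dS                    # normalise to a PDF on the grid
    print(f"{name:18s} posterior mode: {S[np.argmax(post)]:.2f} K")
```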

Peter Lang
June 26, 2012 5:03 pm

This discussion is fascinating. It seems the estimates of climate sensitivity may be coming down (but that may be my bias).
Follow the comments from Professor Forest’s first comment here:
http://judithcurry.com/2012/06/25/questioning-the-forest-et-al-2006-sensitivity-study/#comment-212944
He has said he will get back later with more responses to Nic’s follow up questions and others.
Here is another interesting comment:
http://judithcurry.com/2012/06/25/questioning-the-forest-et-al-2006-sensitivity-study/#comment-212952
This gives information on other work that suggests climate sensitivity may be significantly lower than the IPCC AR4 consensus estimate.
And read my question to Nic Lewis and his response to this and a follow up question starting here:
http://judithcurry.com/2012/06/25/questioning-the-forest-et-al-2006-sensitivity-study/#comment-212911
This is about what progress has been made regarding his paper of a year ago, which pointed out that the IPCC had replotted the Forster and Gregory (2006) results to make the climate sensitivity much higher and the tail of high consequence much thicker (see figure 4 here: http://judithcurry.com/2011/07/05/the-ipccs-alteration-of-forster-gregorys-model-independent-climate-sensitivity-results/ ).
If we cut to the basics, these are the parameters that are really important for estimating the consequences of man’s GHG emissions, and therefore for informing optimal policy:
• What the climate will do in the absence of man made GHG emissions (it will cool as we are past the Interglacial maximum and on the cooling part of the glacial-interglacial cycle)
• Climate sensitivity
• Damage function (damage costs per degree of climate change (up and down))
• Rate at which we will convert to low-emissions energy in the absence of high-cost mitigation policies
It seems to me there is strong and growing evidence that the damages are not potentially catastrophic, not dangerous, and not high cost.
Therefore, adaptation is the best strategy, IMO.

Spector
June 26, 2012 8:17 pm

Based on the MODTRAN utility provided by the University of Chicago, the *raw* sensitivity (no feedback) for CO2 is about 0.9 deg K per doubling in clear tropical air with a nominal energy flow of 292.993 W/m² (picked from one of the standard program output values) at current CO2 concentrations. I understand MODTRAN to be a program developed by the Air Force for instrumentation calibration; it is a computer calculation based on the measured line-by-line absorbance parameters for the gases in the atmosphere. The surface temperatures were found by a hunt-and-pick process for each CO2 level to achieve the standard energy flow number at 70 km up.
Ref: http://forecast.uchicago.edu/Projects/modtran.html
I understand that this web-tool is hosted courtesy of Dr. David Archer, a non-skeptic.
Ref: http://forecast.uchicago.edu/Projects/modtran.doc.html
Another example of the minimal forcing change with a doubling (300:600 PPM) of CO2 is provided by a plot from the Wikipedia article on “Radiative Forcing.” The blue curve for 600 PPM CO2 almost completely covers the green curve for 300 PPM.
Ref: http://en.wikipedia.org/wiki/File:ModtranRadiativeForcingDoubleCO2.png
It is my understanding that the higher values posited by the IPCC are based on an assumed, dangerously high, positive feedback factor. Although interactions with the water-vapor absorption spectra are sometimes said to be the cause of this, I see no water-vapor holes in the forcing spectrum. Perhaps convection makes water vapor a leaky greenhouse gas.
It is interesting to note that a MODTRAN analysis of radiative transfer by altitude seems to indicate that most of the energy leaving the Earth is actually radiated directly from the atmosphere, specifically the troposphere. Most of the energy radiated directly from the surface (396 W/m²) is returned by back-radiation (333 W/m²). That is clearly shown in the standard IPCC heat transfer diagram.
Ref: http://climateknowledge.org/figures/WuGblog_figures/RBRWuG0086_Trenberth_Radiative_Balance_BAMS_2008.GIF
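For anyone who wants to reproduce the rough magnitude of that no-feedback figure without running MODTRAN, here is a back-of-envelope sketch using the commonly cited logarithmic forcing approximation and a simple Planck (blackbody) response. The 5.35 constant and the 255 K effective emission temperature are standard textbook values, not outputs of the MODTRAN run described above; the answer lands in the same ballpark of roughly 1 K.

```python
import math

# Back-of-envelope "no feedback" CO2 sensitivity using the logarithmic forcing
# approximation and a blackbody (Planck) response, not MODTRAN itself.

sigma = 5.670e-8                             # Stefan-Boltzmann constant, W m^-2 K^-4
T_emit = 255.0                               # effective emission temperature of Earth, K

delta_F = 5.35 * math.log(2.0)               # forcing from doubling CO2, ~3.7 W/m^2
planck_response = 4.0 * sigma * T_emit**3    # dF/dT of blackbody emission, W m^-2 K^-1

delta_T = delta_F / planck_response          # warming needed to restore balance, no feedbacks
print(f"2xCO2 forcing:        {delta_F:.2f} W/m^2")
print(f"Planck response:      {planck_response:.2f} W/m^2/K")
print(f"no-feedback warming:  {delta_T:.2f} K")   # roughly 1 K
```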

June 27, 2012 9:20 am

Spector says: “most of the energy leaving the Earth is actually radiated directly from the atmosphere–specifically the troposphere.”
Yes. Another blogger (http://troyca.wordpress.com/) and I had a paper written up, for GRL I think it was, that made the point, partially based on this fact, that the TOA flux changes are poorly correlated with the surface temperature variations, in part because the bulk of the radiating energy comes from the atmosphere and is thus determined by the atmospheric temperatures. Also, clouds don’t magically know what the sea surface is doing instantaneously; they react to the temperatures of their ambient environment, which lags the sea surface temperatures, in terms of anomalies, significantly. We found that if you use atmospheric temps (say UAH or RSS LT) the strength of the correlations improves and you should get a better estimate of climate feedback, and lower sensitivity, as it happened. Sadly the journal rejected our paper, even though our sensitivity estimate was not that low. I thought it was biased high, personally.
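For context, here is a minimal sketch, on synthetic data, of the regression-style feedback estimate being described: regress TOA net flux anomalies on temperature anomalies and convert the slope into a sensitivity. The series and the "true" feedback are invented; this is not the analysis from the rejected paper, and it omits the lag adjustment mentioned above.

```python
import numpy as np

# Regression-style feedback estimate on synthetic data: the slope of TOA net
# flux anomalies against temperature anomalies gives the feedback parameter
# (W/m^2 per K), which converts to a sensitivity via the 2xCO2 forcing.

rng = np.random.default_rng(0)
n_months = 240

true_lambda = 2.0                                     # assumed feedback, W/m^2/K (invented)
temp_anom = 0.3 * rng.standard_normal(n_months)       # temperature anomalies, K
noise = 1.0 * rng.standard_normal(n_months)           # unforced flux variability, W/m^2
toa_net_anom = -true_lambda * temp_anom + noise       # net downward flux falls as it warms

slope, intercept = np.polyfit(temp_anom, toa_net_anom, 1)
lambda_hat = -slope                                   # recovered feedback estimate

F_2x = 3.7                                            # nominal forcing for doubled CO2, W/m^2
print(f"estimated feedback:  {lambda_hat:.2f} W/m^2/K")
print(f"implied sensitivity: {F_2x / lambda_hat:.2f} K per doubling")
```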

Brian H
July 3, 2012 7:06 pm

HankH says:
June 25, 2012 at 2:26 pm

Further, all data is mirrored and stored in two data centers and warehoused with a data vault company. Such data is considered so precious that such controls are an absolute requirement.
Anyone who outright looses [loses] the original data has such bad organization and lack of controls in place that any results of their work must be called into question. I continue to be astounded at the shoddy research practices of these climatologists and even more astounded that their work is not thrown in the waste bin by the publishing journal when such gross negligence is discovered.

The pattern of brazen disregard for standards and even strict legal requirements seems to be a core strategy and tactic of the B3 crowd (B.S. Baffles Brains). See the US Administration for a close parallel.