Via the GWPF: At Last, The Right Lesson From Climategate Fiasco
A diverse group of academic research scientists from across the U.S. have written a policy paper which has been published in the journal Science, suggesting that the time has come for all science journals to begin requiring computer source code be made available as a condition of publication. Currently, they say, only three of the top twenty journals do so.
The group argues that because computer programs are now an integral part of research in almost every scientific field, it has become critical that researchers provide the source code for custom written applications in order for work to be peer reviewed or duplicated by other researchers attempting to verify results.
Not providing source code, they say, is now akin to withholding parts of the procedural process, resulting in a “black box” approach to science, which is, of course, not tolerated in virtually any other area of research in which results are published. It’s difficult to imagine any other realm of scientific research getting such a pass, and the fact that code is not published in an open-source forum detracts from the credibility of any study based upon it. Articles based on computer simulations, for example, such as many of those written about astrophysics or environmental predictions, tend to become meaningless when they are offered without the source code of the simulations on which they are based.
The team acknowledges that many researchers are clearly reticent to reveal code that they feel is amateurish due to computer programming not being their profession, and that some code may have commercial value, but suggests that such reasons should no longer be considered sufficient for withholding it. They suggest that forcing researchers to reveal their code would likely result in cleaner, more portable code and that open-source licensing could be made available for proprietary code.
They also point out that many researchers use public funds to conduct their research and suggest that entities that provide such funds should require that source code created as part of any research effort be made public, as is the case with other resource materials.
The group also points out that the use of computer code, both off-the-shelf and custom-written, will likely become ever more prevalent in research endeavors, and thus, as time passes, it becomes ever more crucial that such code be made available when results are published; otherwise, the very nature of peer review and reproducibility will cease to have meaning in the scientific context.
More information: Shining Light into Black Boxes, Science, 13 April 2012, Vol. 336, No. 6078, pp. 159-160, DOI: 10.1126/science.1218263
Abstract
The publication and open exchange of knowledge and material form the backbone of scientific progress and reproducibility and are obligatory for publicly funded research. Despite increasing reliance on computing in every domain of scientific endeavor, the computer source code critical to understanding and evaluating computer programs is commonly withheld, effectively rendering these programs “black boxes” in the research work flow. Exempting from basic publication and disclosure standards such a ubiquitous category of research tool carries substantial negative consequences. Eliminating this disparity will require concerted policy action by funding agencies and journal publishers, as well as changes in the way research institutions receiving public funds manage their intellectual property (IP).
=========================================
cool
This basic common sense rule is long overdue. It falls under the oft neglected science rule of: “If it’s not reproducible, it’s not science.”
Does this site require this of all of its posters (e.g., Dr. Spencer)?
That’s great news! I’ve wondered why this wasn’t demanded from the beginning, especially with regard to the climate studies, which were mostly based on computer models. Had this been in place, the “decline” could not have been hidden.
Of course the corollary to this is that the source data must be available as well, otherwise with what would the computer program be fed?
Does it look like this might actually have the legs to change standard practice across the research world?
The need to examine programs and code has been compelling for at least the last decade. How can you replicate/check anything without it?
Pointman
Amen….
This is right on in spades, too. The philosophical underpinnings of science simply demand complete and total openness when it comes to data, procedures and methods. Anything less is not science. Somehow we scientists must learn to overcome our egos and state clearly and publicly, in our papers, books and presentations, not just what we did but exactly what assumptions were applied and, more importantly, precisely what it is we simply do not know.
“… many researchers are clearly reticent to reveal code that they feel is amateurish due to computer programming not being their profession”. An excellent reason FOR publishing the code, since a program that has been written ‘amateurishly’ is potentially erroneous, as well as ugly.
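To make the point concrete with a deliberately hypothetical sketch (the routine and data below are invented for illustration, not taken from any published study): the output of this little trend calculation looks perfectly plausible on its own, and only a reader who can see the source would notice that missing months are silently zero-filled rather than skipped, which corrupts the yearly means and hence the fitted trend.

    import numpy as np

    # Hypothetical illustration: a "quick and dirty" decadal temperature trend.
    # The returned number looks fine; only the source code reveals the bug.
    def decadal_trend(monthly_temps):
        # Bug: nan_to_num() turns missing months (NaN) into 0.0, corrupting each
        # yearly mean; averaging with np.nanmean per year would handle the gaps
        # correctly.
        yearly = np.nan_to_num(monthly_temps).reshape(-1, 12).mean(axis=1)
        slope, _ = np.polyfit(np.arange(yearly.size), yearly, 1)
        return slope * 10.0  # degrees per decade

    # Example: ten years of monthly anomalies with a few missing months.
    rng = np.random.default_rng(0)
    temps = rng.normal(0.2, 0.5, 120)
    temps[[5, 40, 77]] = np.nan
    print(decadal_trend(temps))  # silently biased by the zero-filled gaps

No amount of staring at the published trend figure would expose that; five minutes with the code does.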
No fair! You’re asking the magicians to reveal “how they did their tricks.”
Wonderful news. This should have been government (federal, state, local, universities, non-profits, anyone receiving taxpayer funding or tax breaks or support of any kind) policy all along. Anyone receiving public funds for any purpose should be 100% transparent in everything. The only exception would be military and other legitimate national security information. And I believe that even this information should be safeguarded and archived for release at an appropriate time.
Because governments of all kinds take our taxes by force, we should know completely and without limitations where and how that money was spent. And those responsible for spending any tax money should be held accountable, fully accountable. Kind of an opt-out policy for national security; all else is public property and public information. This should extend to all fruit of publicly funded activities.
Happy day today and three cheers for Science Magazine.
The comments at Phys Org are interesting. But none addresses what I suspect is the central point of keeping the code secret: not letting on that it is filled with bias, fudge factors, and errors. (I use the term “suspect” because NO one has any way to evaluate the quality of the codes.)
Furthermore, if the code is released, it becomes impossible to argue that the data feeding it can or should be kept secret.
As a reminder, it was requests for the codes and data that started the CAGW smear campaign to begin with. I expect the scientists making this call will be attacked in short order.
Even baby steps add up. Of course, so much of science is “citing the previous guys” and all that code will still be exempt as “prior art”.
Len, FYI: As a general rule, any code developed by a DoD contractor with any contract money is the property of the US government.
Only 20 years late, but welcome none the less.
Yawn
http://esr.ibiblio.org/?p=1436
A great initiative, and not just for climate science. Of all the fields of scientific endeavor, medical research has shown the highest incidence of academic fraud. Requiring the release of source code along with the publication would have caught many of these frauds long before they saw the light of day.
Won’t matter much, since the military has adopted Carbon Sequestration as its own main mission. This will exempt all warmist code on national security grounds.
This should be required for any academic work. It is a part of the research and the source information. No academician should be able to hide numbers behind an unpublished code. No peer review committee should just accept numbers without that source provided.
As Steven Mosher says – cool.
With regard to this bit:
“many researchers are clearly reticent to reveal code that they feel is amateurish due to computer programming not being their profession”
If they’re going to draw conclusions based on the results produced by that code, they’re going to have to get used to the code flapping about in the wind with everybody staring at it. Nobody who matters will laugh, I promise. The other thing to remember is that if the best criticism that can be levelled is “your code is amateurish”, then clearly said critics can’t find issue with the way the “amateurish” code works, or the models’ outputs.
And with regard to this:
“some code may have commercial value”
If the code has commercial value, and can be shown beyond doubt to have been developed with strictly private funds only, then it gets tricky. Validation of the code by an NDA-bound third party would be a good start, as would submittal of all raw data input.
This is something that should be clear AND easy for everyone to understand. IF the science is to be trusted, shouldn’t everything be open for review?
Wow, after how many years? That’s just another affirmation that the fight is being won!
In my opinion, this is a testament to Steve McIntyre’s tenacity.
Seconded. And to many others as well.
How much of the stuff that’s out there is amateurish and therefore should be retracted until its code is improved?
I wonder also why many wouldn’t approach physics or a lab in an amateurish fashion but think nothing of writing rubbish computer code.
Don’t think I’ll hold my breath waiting for code release to be a norm…