The case for open computer programs

This paper, recently published in Nature (vol. 482, 23 February 2012) by Darrel C. Ince, Leslie Hatton and John Graham-Cumming, gives voice to a debate that has interested the scientific community for some time: linking data and models to publications. The authors argue that, with some exceptions, anything less than the release of source programs is intolerable for results that depend on computation.

Scientific communication relies on evidence that cannot be entirely included in publications, but the rise of computational science has added a new layer of inaccessibility. Although it is now accepted that data should be made available on request, the current regulations regarding the availability of software are inconsistent. 

We have reached the point at which, with some exceptions, anything less than release of the actual source code is an indefensible approach for any scientific results that depend on computation, because not releasing such code raises needless, and needlessly confusing, roadblocks to reproducibility. The vagaries of hardware, software and natural language will always ensure that exact reproducibility remains uncertain, but withholding code increases the chances that efforts to reproduce results will fail.


The authors' thesis is that journal and funding body strictures relating to code implementations of scientific ideas are now largely obsolete. They suggest one modest path to code availability in the article. There are a number of further steps that journals, academies and educational organizations might consider taking:

  • Research funding bodies should commission research and development on tools that enable code to be integrated with other elements of scientific research such as data, graphical displays and the text of an article.
  • Research funding bodies should provide metadata repositories that describe both programs and data produced by researchers. The Australian National Data Service (http://www.ands.org.au/), which acts as an index to data held by Australian research organizations, is one example of this approach (a rough sketch of what one such record might contain appears after this list).
  • Journals should expect researchers to provide some modular description of the components of the software that support a research result; referees should take advantage of their right to appraise software as part of their reviewing task. An example of a modular description can be seen in a recent article in Geoscientific Model Development (Yool, A., Popova, E. E. & Anderson, T. R. Medusa-1.0: a new intermediate complexity plankton ecosystem model for the global domain. Geosci. Model Develop. 4, 381–417 (2011)), a journal that asks for code release for model description papers and encourages code release for other categories of paper.
  • Science departments should expand their educational activities into reproducibility. Clearly such teaching should be relevant to the science at hand; however, courses on statistics, programming and experimental method could be easily expanded and combined to include the concept of reproducibility.
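To make the second recommendation more concrete, below is a minimal, hypothetical sketch (in Python) of the kind of entry such a metadata repository might hold: one record linking an article to the exact program version and dataset behind its results. The schema, field names and values are illustrative assumptions only; they are not taken from the ANDS service or from the Nature article.

  # A hypothetical metadata record linking a publication to its code and data.
  # All field names and values are illustrative assumptions, not a real schema.
  from dataclasses import dataclass, field, asdict
  import json

  @dataclass
  class ResearchSoftwareRecord:
      """One repository entry describing a program and the data it was run on."""
      title: str             # human-readable name of the study
      article_id: str        # identifier (e.g. DOI) of the associated article
      code_repository: str   # where the source code is archived
      code_version: str      # exact tagged release used to produce the results
      dataset_location: str  # where the input/output data are held
      dataset_format: str    # e.g. NetCDF, CSV
      dependencies: list = field(default_factory=list)  # compilers, libraries, etc.

  record = ResearchSoftwareRecord(
      title="Example ocean ecosystem model runs",
      article_id="10.0000/example-doi",                      # placeholder
      code_repository="https://example.org/archive/model",   # placeholder
      code_version="v1.0",
      dataset_location="https://example.org/archive/model-data",  # placeholder
      dataset_format="NetCDF",
      dependencies=["Fortran compiler", "NetCDF library"],
  )

  # Serialised form that a repository could index and expose for searching.
  print(json.dumps(asdict(record), indent=2))

The point of such a record is simply that a reader or referee can go from the article to the exact code version and data without having to contact the authors.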
For more information, please read the full version of the article.


Date: 23/02/2012