How do we standardize peer review of bioinformatics software?


The 20th Annual International Conference on Intelligent Systems for Molecular Biology (ISMB), an international meeting positioned at the intersection of computer science and biology, took place in Long Beach last week. Bioinformaticians build software on which much of modern biological computation depends, and openness – in data, software and of course journal articles – is a refreshingly familiar concept to many scientists working in this field.

I’ve recently returned from the conference, and in 2012 there were many reasons to make the trip – not least the launch of the new ‘big data’ journal GigaScience, reported elsewhere. Another reason to attend was an invitation to participate in a panel discussion at the Bioinformatics Open Source Conference 2012 (#BOSC2012), a pre-ISMB Special Interest Group meeting. Under discussion was the implementation of shared standards that all peer reviewers could use when judging bioinformatics software articles.

Image credit: Eagle Genomics’ (sponsors of the BOSC 2012 student travel awards) “The Elements of Bioinformatics”, available under a Creative Commons Attribution-Non-Commercial-ShareAlike 3.0 Unported License

Standardization in science is undoubtedly a good thing. In data sharing, standards breed efficiency through interoperability and, if implemented in the peer review of bioinformatics software, widely agreed standards would allow more objective comparison of articles. Standards might also enable more efficient transfer of articles between journals if peer reviews are shared. The CONSORT checklist for assessing the reporting of randomized trials is a case in point.

So how do you implement standards across multiple journals and publishers? There is a diversity of journal policies to be found – within information for authors and reviewer guidelines – on an equally diverse collection of issues (see here for inter-publisher differences in policy on data and source code availability). Journals endorse a number of initiatives to improve the quality and consistency of reporting of research, such as MIAME and the aforementioned CONSORT, but these are often non-mandatory. And even mandatory policies, such as prospective registration of clinical trials, have been found to be inconsistently applied by some journals. We may also need to consider the limitations of author and reviewer guidelines alone in effecting change. If Frank Davidoff, editor of Annals of Internal Medicine, 1995–2001, is to be believed, information for authors may be one of the best places to hide top-secret information.

But standardization in peer review and editorial process is possible. In 2010 the ICMJE implemented a standard conflict of interest form across its core member journals, showing the importance of collective change in influencing behaviour. Another example is the consortium of journals in ecology and evolutionary biology which collectively required all authors to deposit the data supporting their accepted articles in a repository, such as Dryad. And at BioMed Central we have been working with other publishers on standardizing journals’ expectations of peer reviewers who receive manuscripts that include supporting data as supplementary material (although this is still a work in progress).

To improve the (re)usability of bioinformatics software, the quality of the underlying code – measured, for example, through unit testing – must also improve. More transparent and objective measures of quality, which might result from standardized peer review, are valuable, but it’s important to be pragmatic and not set the bar to entry – publication – too high, particularly if this is not standard across all journals in the field. To paraphrase a recent blog post by Source Code for Biology and Medicine series editor Cameron Neylon, changing the world is hard – particularly without the support of the entire scientific community.
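To illustrate the kind of unit testing reviewers might look for, here is a minimal sketch in Python using the standard library’s unittest module. The function `gc_content` is a hypothetical example (a common bioinformatics utility), not code from any tool discussed above; the point is that even a few small tests document expected behaviour and edge cases.

```python
import unittest


def gc_content(sequence: str) -> float:
    """Return the fraction of G and C bases in a DNA sequence.

    Hypothetical example function for illustration only.
    """
    if not sequence:
        raise ValueError("sequence must not be empty")
    sequence = sequence.upper()
    return sum(base in "GC" for base in sequence) / len(sequence)


class TestGCContent(unittest.TestCase):
    def test_all_gc(self):
        # A sequence of only G and C should give a fraction of 1.0
        self.assertEqual(gc_content("GGCC"), 1.0)

    def test_mixed_and_lowercase(self):
        # Half GC, and lowercase input should be handled
        self.assertAlmostEqual(gc_content("atgc"), 0.5)

    def test_empty_sequence_raises(self):
        # Edge case: empty input is an error, not 0.0
        with self.assertRaises(ValueError):
            gc_content("")


# Run with: python -m unittest <this_file>
```

A reviewer checking a submission against a shared standard could then ask simple, objective questions: do tests exist, do they cover edge cases, and do they pass?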

Assessment of whether an article meets standard criteria (such as CONSORT) usually complements the peer-review process rather than determining its outcome. But once published, it is desirable for articles which meet the best quality standards, such as for reproducibility and availability of supporting data, to be recognized appropriately. This allows readers to decide for themselves if and how their interpretation of the article should be affected (as generally happens for competing interests statements). For example, here at BioMed Central we are looking at new ways to highlight articles which include, or have permanent links to, supporting data. The journal Biostatistics uses an approach of kite-marking for reproducibility.

The panel discussion also strayed into the familiar ground of the benefits and risks of open peer review. We know that the quality of open peer reviews is equivalent to that of closed reviews – even when those reviews may be posted online, as happens in many BioMed Central journals – although a few more reviewers might decline to review openly (GigaScience addresses this risk by offering an opt-out from open review on request). Following the discussion, panel chair Brad Chapman, who has posted his notes from the panel on his blog, is now working to draw up best-practice guidelines for reviewers, which would be published in a journal and might lead to community implementation in the future.

Arguably one of the most successful policies implemented in life sciences – international agreements for sharing of genetic sequence data – was achieved through a number of factors in combination including:

1. Journals and funders making compliance with the policy mandatory
2. Development of the technology needed to support the change
3. The clear need to work together to achieve a major shared goal (the sequencing of the human genome)

Perhaps a combination of similar factors, which have dramatically improved the availability of data in genomics, might in the future prove influential in improving the quality of the tools used to analyse those data.

This report from the panel discussion represents the key discussion points as perceived by the author and does not represent the views of all panelists.
