Reproducibility: what are we going to do about it?

In recent decades, the reproducibility of a shocking number of scientific studies has been called into question. What we’re going to do about the reproducibility ‘crisis’ is a question for everyone involved in research. Today, in an effort to address this, BioMed Central is launching the pilot of a new Minimum Standards of Reporting Checklist.


With an increasing number of studies revealing that much of science cannot be reproduced or replicated, the question now being asked is: can we do science better?

Today, BioMed Central is launching a pilot, trialling the use of a reporting checklist in a selection of our journals. For now, the checklist will be rolled out on a small group of journals: BMC Biology, BMC Neuroscience, Genome Biology, and GigaScience.

The checklist addresses three areas of reporting: experimental design and statistics, resources, and availability of data and materials.

Why is this important?

In a launch editorial for the new checklist, BioMed Central staff and the Editors of GigaScience and Genome Biology stress the need to ensure the reproducibility of scientific work:

“Clearly important for clinical research, verification is equally important for preclinical research, something we all have an equal stake in.

“No one can innovate new drugs overnight, no matter how rich they are, no matter which doctor they see. Better, more robust preclinical research benefits us all. Our ability to rely on published data for potential therapeutics is critical, and recently its reliability has been called into question.”

The editorial goes on to cite the case of a study of preclinical oncology research in which researchers were able to confirm only 11% of the published findings.

How will the checklist work?

Journals clearly have an important part to play in helping to ensure that experimental design and analysis are appropriate, and that reporting standards are met. The new checklist for authors and referees aims to do just that.

It builds on the accepted standards of EQUATOR, PRISMA and MIQE-precis and the principles behind them, formalizing, tailoring and standardizing these efforts across journals. It has also been produced in conjunction with our endorsement of the NIH Principles and Guidelines for Reporting Preclinical Research.

“Authors will be asked on submission to confirm that they have included the information asked for in the checklist or give reasons for any instances where it is not made available or not applicable,” say the authors of the Editorial.

“Likewise, reviewers will be asked to confirm the information has been satisfactorily reported and reviewed.”

In six months’ time, we plan to review the data we’ve collected during this trial, checking whether reporting has improved and collating author, editor, and reviewer feedback, with the aim of rolling out the checklist (with any revisions) across all BioMed Central journals.

What next?

This isn’t the end of our work to ensure reproducibility of research published with us. Future projects will include:

  • Providing a range of article types to adequately support and encourage reproducibility, and ensure authors receive credit for all of their work. This may include: Registered Reports (protocols), Data Notes, Short Updates, and Replication Studies.
  • Working across our journal portfolios to link related articles, products or resources, tools and lab notebooks.
  • Working with the research community to provide guidance to authors on data standards.
  • Incorporating the new checklist within our Roadshows and Author Workshops for young researchers, as well as training for our editors and reviewers, helping to ensure researchers are made aware of the reporting standards before they submit their work.

Tell us what you think

Make sure to take a look at our checklist and tell us what you think. We’d love to hear your ideas, not just about the checklist, but about what can be done to tackle the problem of reproducibility. Leave your comments below or email reproducibility@biomedcentral.com.


You can read more about reproducibility in these posts on our On Biology blog.

 


2 Comments

Stephen Eglen

I think this is a great idea, and I look forward to seeing the results of the trial. I have a few comments.

1. Do you have any papers and checklists (real or mock) that you could show prospective authors to demonstrate how to complete these checklists? Having case studies always helps authors understand what you are expecting to see. Authors are often unsure about what you might expect them to share, so examples would help.

2. As a reviewer, it can be quite hard to check that everything you’d expect to be made available has been included. Likewise, authors might not realise that a particular resource is likely to be of use to others and omit it. It may only be post-publication, when readers study a paper, that they wish to access e.g. a particular dataset that has not yet been shared. How do you recommend these cases be handled?

3. If a reviewer thinks that something useful is not being shared, what happens? Is this included in the reviewer’s report with a request to share that item?

Amye Kenall

Hi,

Thanks for your support! To answer your questions:

1. We don’t have any examples as of yet, but this is a great idea. Examples could help authors in completing their checklists. Over the next six months, we’ll be collating exactly this type of feedback to determine how best to revise the checklist when it is rolled out across more journals. This is something we’ll definitely consider.
2. Given that all journal content is essentially curated by humans (through peer review), these post-publication cases will never be 100% avoidable. However, we have implemented a workflow to try to prevent them. For example, reviewers and authors are asked a series of questions to ensure authors’ manuscripts adhere to the checklist criteria (identifying resources, depositing data, etc). At the end of the six months, we will perform a cross-sectional study looking at accepted articles before the pilot and after to evaluate whether reporting increased or not. All accepted authors and reviewers will also be asked to complete a survey on the checklist. Finally, the checklist is something we will also be incorporating into our author workshops in order to help with training the community on how to report their findings and review articles. We hope these measures will help us to strengthen the checklist in the future.
3. If a reviewer feels something wasn’t shared that should be, they will have a chance to mention this in the reviewer report (indeed, they are explicitly asked this for each section of the checklist).

Thanks again for your support, and please do continue to feed back on your experience with the checklist and how we can do our part as a publisher to improve the robustness of reported research.
