Advancing meta-research through data sharing and transparency

There is increasing concern over the lack of research reproducibility in the biomedical literature, particularly in meta-research, where there have been few attempts to investigate it. A study published today in Systematic Reviews compares two concurrent systematic reviews from the Medtronic-Yale partnership that established the Yale Open Data Access (YODA) Project, a partnership that offered a unique opportunity to study meta-research reproducibility and to test models of data sharing.

Over the past few years, scientists have grown increasingly concerned about the lack of research reproducibility in the biomedical literature. There have been multiple efforts to examine the reproducibility of research in psychology, drug development, and cancer biology, each of which has found that approximately half of studies could be reproduced.


Until now, there have been few attempts to investigate meta-research reproducibility. In many instances, meta-analyses are conducted by single research teams, with the input of peer reviewers but without external replication efforts. Previous studies of the replicability of systematic reviews have focused on new teams performing new searches and analyses of the literature at different time points; it has remained unclear whether multiple systematic reviews with the same research objectives, participant-level data, time, and funding would employ the same analytic methods and reach the same conclusions.

In 2011, Medtronic, Inc. was the first company to contract with the Yale Open Data Access (YODA) Project. The company provided all of its recombinant human bone morphogenetic protein-2 (rhBMP-2) clinical trial data to Yale, which in turn provided the data first to two highly reputable research groups (Oregon Health & Science University and the University of York) to perform independent systematic reviews of the entire body of scientific evidence regarding the safety and effectiveness of Medtronic’s rhBMP-2 product. The data were subsequently made available to the wider research community on request. This Medtronic-Yale partnership offered a unique opportunity not only to test models for sharing clinical trial data, but also to study the reproducibility of meta-research.

In the new Systematic Reviews article, Low et al., including the YODA Project team and the leads of the two independent research teams from Oregon and York, compared the methods used, results found, and conclusions drawn in the two concurrent systematic reviews. The research teams had been provided with the same objectives, resources to complete the work, and data. Furthermore, both research teams used best practices in meta-research, including registering and pre-specifying their protocols on PROSPERO.


Even in a situation such as this, where one would expect few, if any, differences between the teams, there were dissimilarities, including which trials were selected for inclusion, the methodology used for meta-analysis, and the overall study conclusions. In the end, the research teams’ effect estimates for most outcomes and adverse events were similar, suggesting that the clinical importance of the methodological and inferential differences is debatable. Yet it is impossible to argue that the methods, results, and points of emphasis were all identical. In order to understand this lack of reproducibility, it is necessary to disentangle the meaning of the word.

Recently, Steven Goodman and members of the Meta-Research Innovation Center at Stanford (METRICS) attempted to standardize the reproducibility nomenclature by pairing the word with three main descriptors: “methods reproducibility,” which refers to the ability of subsequent investigators to use the same data, tools, and procedures as a previous study to obtain the same results; “results reproducibility,” which refers to the corroboration of previous results following the same experimental methods with new data; and “inferential reproducibility,” which refers to the process of drawing conclusions of similar strength from a study’s replication or re-analysis.

This reproducibility framework illustrates the many ways that researchers may not always come up with the same answers to the exact same questions, even when they have the same resources, use the same data, or undertake the work at the same time. There is often more than one methodologically defensible approach to designing a study, each of which can lead to slight differences in meta-research results or conclusions.
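As a concrete illustration, the minimal sketch below pools the same set of trials in two defensible ways: a fixed-effect inverse-variance model and a DerSimonian-Laird random-effects model. The log risk ratios and standard errors are made up for the example and are not data from the rhBMP-2 reviews; the point is simply that two reasonable analytic choices applied to identical inputs can yield different pooled estimates.

```python
import math

# Hypothetical per-trial log risk ratios and standard errors -- illustrative
# numbers only, not data from the rhBMP-2 systematic reviews.
log_rr = [0.10, -0.25, 0.40, 0.05, -0.10]
se     = [0.20,  0.30, 0.25, 0.15,  0.35]

# Fixed-effect model: inverse-variance weights
w_fe = [1 / s ** 2 for s in se]
pooled_fe = sum(w * y for w, y in zip(w_fe, log_rr)) / sum(w_fe)

# DerSimonian-Laird estimate of between-trial variance (tau^2)
q = sum(w * (y - pooled_fe) ** 2 for w, y in zip(w_fe, log_rr))  # Cochran's Q
df = len(log_rr) - 1
c = sum(w_fe) - sum(w ** 2 for w in w_fe) / sum(w_fe)
tau2 = max(0.0, (q - df) / c)

# Random-effects model: weights incorporate the between-trial variance
w_re = [1 / (s ** 2 + tau2) for s in se]
pooled_re = sum(w * y for w, y in zip(w_re, log_rr)) / sum(w_re)

print(f"Fixed-effect pooled risk ratio:   {math.exp(pooled_fe):.3f}")
print(f"Random-effects pooled risk ratio: {math.exp(pooled_re):.3f}")
```

Multiply small divergences like this across decisions about trial inclusion, outcome definitions, and the handling of heterogeneity, and it becomes easy to see how two careful teams working from the same data can report results and conclusions that do not match exactly.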


The findings from this Medtronic-Yale reproducibility experiment ultimately underscore the importance of making data more openly available. Our goal as researchers should be to accumulate evidence so that we can eventually triangulate on the truth. The results of a single systematic review, even a high-quality one, are not etched in stone, nor are they necessarily what any other high-quality team would report. As Goodman et al. suggest, methods reproducibility in the biomedical sciences requires at a minimum “a detailed study protocol, a description of measurement procedures, the data gathered, the data used for analysis with descriptive metadata, the analysis software and code, and the final analytical results”. Although evidence suggests that we are nowhere near the desired level of research transparency, increased open access to all study methods and data will allow external researchers and research consumers to evaluate study characteristics, re-run analyses, and synthesize new data.

Meta-research transparency, including public registration, results reporting, and data sharing, is critical to enhance the reproducibility of research findings, promote scientific integrity, and encourage stricter adherence to robust and unbiased scientific methods. At a minimum, reanalysis of openly available data can strengthen our confidence in the findings of a systematic review while also allowing for updated conclusions about interventions.
