We need to develop counter-adaptations to predatory journals

To the unacquainted, predatory journals can be considered scholarly publishing’s equivalent of fake news. They are defined as journals that prioritize self-interest at the expense of scholarship, and they typically make money from the article processing charges common to the open access (OA) publishing model.

In the OA model, accepted papers are made publicly available, but journals often charge researchers a fee, known as an article processing charge, to offset the costs of producing articles. These fees can generate millions of dollars for predatory journals and their publishers. Early research examining predatory journals suggested that they will publish virtually anything in order to collect this fee.

The term ‘predatory’ was coined to describe how these journals seek to dupe authors (‘prey’) into submitting by appearing legitimate and falsely promising things like peer review, indexing, and archiving. We have previously described how even senior scientists have inadvertently submitted to predatory journals, but given the pressure to publish we suspect that other researchers knowingly publish in these outlets.

Several sting studies have been conducted (see examples here) in which a nonsensical or obviously flawed paper was submitted to a presumed predatory journal, with the outcome typically being that the work was accepted, often without peer review. In one study, researchers created a fictitious C.V. for an underqualified researcher and submitted it to several journals as an application to become an editor. Thirty-three percent of predatory journals, compared with 7% of OA journals listed in the DOAJ (Directory of Open Access Journals), accepted the application, with some even offering to share the revenue from article processing charges with the editor if they attracted article submissions.

These findings need to be reconciled with a more recent group of studies suggesting that predatory journals are adapting. As described in our recent paper, we submitted a previously published article to over 600 journals. Submissions were sent in roughly equal numbers to randomly selected predatory journals, legitimate open access journals, and traditional subscription-based journals. Approximately half of these journals received the accepted paper in the PDF-formatted version from the journal where the work was originally published, while the other half received a typical manuscript-style submission formatted in Microsoft Word. We believe this stress test was more methodologically rigorous than previous studies.

We received correspondence back from 308 (51.1%) journals within our study timeline (32 days). Just four journals (1.3%) accepted our paper, and only one of these acceptances related to the version submitted as a PDF. A total of 13 journals requested a revision to our paper (1 predatory, 6 open access, 6 subscription-based). Of the journals that responded, 94.5% rejected the paper, but just 45.7% did so because they identified ethical concerns with the submission. This suggests that, overall, predatory journals are unlikely to accept a paper quickly without evaluation. It also suggests that most journals do not have safeguards in place to identify plagiarism and promptly reject duplicate submissions, even though the Committee on Publication Ethics specifically recommends that journals address suspected plagiarism.

Our research approach is contentious: we burdened editors and peer reviewers at legitimate journals with our submission. The fact is, however, that even legitimate journals fail to operate transparently (e.g. by making their peer review openly available) or to provide clarity on the training and processes underpinning peer review at their journal. Until the black box of peer review is opened, and more journals engage with and conduct research on their own processes, including peer review, these types of studies are needed to function as a stress test on journal operations and practices.

Our finding that predatory journals may be less likely than they once were to accept an obviously problematic paper is consistent with research recently reported in a preprint which provides evidence that predatory journals have implemented peer review. These researchers showed that a proportion of reviews captured in Publons, a commercial website that allows researchers to track publications, citations, and peer reviews, were conducted in relation to work submitted to presumed predatory journals.

In previous research we surveyed more than 80 authors who had published in presumed predatory journals: 83.3% indicated their paper was peer reviewed, and 79.7% indicated that the peer review they received was helpful and substantive. Even more perplexing, more than a third indicated they did not pay fees to publish. We speculate that predatory journals may not charge a fee initially in order to attract content while the journal is getting established. These results run counter to earlier research and common perceptions.

Like an evolutionary arms race, it appears that predatory journals have adapted to the publication landscape in order to adhere more closely, or at least give the appearance of adhering, to typical publishing best practices. These adaptations make predatory journals appear less distinct from legitimate journals and, as a result, make it more challenging for researchers producing research, and for consumers of research, including the public, to evaluate the integrity of a journal.

We urgently need to counter the adaptations of predatory journals; differentiating them from legitimate OA journals is critical. Predatory journals pose a threat to the integrity of science. In a climate where public trust in science is already waning, predatory journals sow further confusion. We are concerned that the exciting push by a coalition of funders to mandate open access publishing will, in the absence of concerted action among stakeholders against predatory journals, drive more predatory publishing.

Addressing predatory publishing will require a re-evaluation of the incentives and rewards common in academia, and a shift in what is valued from the quantity of publications to appraisals of their quality. It will also require that legitimate publishers lead the way in adopting transparent and auditable practices.
