Last month, guest blogger Fiona Russell – a postdoctoral researcher studying chronic pain, inflammation, and arthritis – attended a Sense about Science peer review workshop (at which our Biology Editor, Elizabeth Moylan, was a panelist). Here, she writes about the need for young researchers to develop their peer review skills and how journals could do more to recognize and incentivize reviewers.
I remember the worry the first time I was asked to peer review a paper: Would I miss something crucial? Would I attach too much importance to something inconsequential?
It wasn’t until I received the final decision from the editor, with the other reviewer’s report attached, that I was reassured I had done a satisfactory job.
That is why, when I participated in the Sense about Science peer review workshop last month, I was most interested in peer review training. I asked the expert panel whether journals or editors give reviewers much feedback or formal training.
The consensus seemed to be that this rarely happens. Some journals rate reviewers for their own internal purposes, but this is often not fed back to the reviewers. Surely, though, just as a paper can be improved by review, reviewers can benefit from constructive criticism.
I think there should be a system whereby PhD students have to review at least three papers during their training. The supervisor would officially review the paper for the journal, but the student would write a review to be compared with the supervisor's. Feedback could then improve the student's next review.
It is also important that reviewers are acknowledged for their hard work. At the moment my CV lists the journals I have reviewed for, but it would be better if I could add that I was a 5* reviewer – or find out if I wasn't. If I knew I was being rated, I would likely work harder to ensure a good ranking. (I recently saw exciting news from PeerJ about their partnership with Publons to enable reviewers to get credit.)
Punctuality could also be rewarded. Panel member Alice Ellingham from the Editorial Office talked about the problem of chasing AWOL reviewers, and we all know the frustration of delayed reviews. Knowing that someone was a 5* reviewer with 100% punctuality would be great (although it could lead to the top reviewers being inundated with papers to review).
During the workshop we took part in a group discussion about strengths and weaknesses of the current peer review system and potential alternatives.
We discussed how peer review can improve a paper if the reviewers give constructive criticism, and someone mentioned how peer review should detect plagiarism.
I don't agree with this last idea; I feel it is the journal's job to do this using automated plagiarism checkers. As a reviewer you do not always have intimate knowledge of the topic you are reviewing, so it is unrealistic to expect reviewers to have read all the relevant papers and thereby spot plagiarism.
My favorite alternative to the current system has been successfully pioneered by F1000 Research, a new open access journal that carries out post-publication open peer review. All the reports are available to read, and it is easy to follow the discussion between the authors and the named reviewers. This is a useful teaching tool too, as young scientists can read the reports and learn what a review looks like.
To my mind, the lack of anonymity is another strength. Interestingly, one of the speakers, Elizabeth Moylan from BioMed Central, talked about research showing that when reviewers were named, their reports were more constructive, with more comments on the methods and more comments backed up with evidence.
During the workshop we also chatted about the public perception of peer review and how people need to realize that the system isn't infallible. Everyone needs to recognize the difference between peer-reviewed research and scientific claims based on little or no evidence and no peer review (something Sense about Science is addressing very well with its Ask for Evidence campaign).
Overall the workshop provoked much discussion on peer review. I have only touched on some of it in this post but would welcome more comments and ideas.