Abstract

The Role of PubPeer Comments in Alerting Editors to Serious Problems With Clinical Research Publications

Elizabeth Wager,1,2 Emma Veitch3

Objective

PubPeer is a self-described “online journal club” that facilitates commenting on published biomedical literature. We sought to determine how often postpublication comments on PubPeer identify serious misconduct or errors in clinical research articles, how often editors are alerted to problems via PubPeer, and how editors and authors respond.

Design

Two raters independently categorized all comments on PubPeer about research publications in BMC Medicine, The BMJ, and The Lancet from first comment appearance (October 2013) to December 31, 2016 (comments on editorials, letters, news, etc were counted but not analyzed). The categories, developed iteratively and by consensus, included well-supported allegations of fabrication, falsification, or plagiarism (FFP); vague FFP allegations (presenting no evidence); allegations of other misconduct; honest error; and methodological concerns. Differences were resolved by discussion. We contacted editors to ask whether PubPeer alerted them to the allegations and how they responded.

Results

We found 344 PubPeer comments relating to 150 articles. Of 177 comments relating to 99 research articles, 106 (60%) were imported from PubMed Commons (PMC) (all signed, as required by PMC), of which 11 (6%) were from journal clubs. Of the non-PMC comments, 67 (94%) were anonymous. Of the 177 comments on research articles, 7 (4%; 2 signed) made allegations about or mentioned investigations into FFP in 4 articles (3 strong, 4 vague), 5 (3%; 4 signed) identified errors in 5 articles (mainly concerning trial registration identifier numbers), 29 (16%; 26 signed) raised methodological issues about 20 articles, and 16 (9%) discussed clinical implications. Fifty-nine comments (33%) contained little or no text but gave links to other sites (eg, journal articles, blogs, retraction notices), and 10 (6%) provided extra information without criticism. Journal editors were unaware of the PubPeer postings about their published articles but had independently issued corrections (3) or expressions of concern (2). Authors responded on PubPeer to comments about 4 articles (4%). Commentary on other types of research (eg, basic science, which is discussed on PubPeer more frequently than clinical studies), commentary posted on other sites, and other editors’ responses may differ from our findings.

Conclusions

Only 12 comments (7%) in our sample, relating to 9 research articles, raised issues that might require journal action (7 alleging fraud, 5 identifying errors). The 3 journals had not been alerted to problems via PubPeer but were generally aware of the concerns from other sources and issued corrections (3) or expressions of concern (2). While PubPeer provides a useful forum for postpublication comments, the frequency of comments requiring journal action in our clinical journal sample was low.

1Sideview, Princes Risborough, UK, liz@sideview.demon.co.uk; 2University of Split Medical School, Split, Croatia; 3Freelance Editor, London, UK

Conflict of Interest Disclosures:

None reported.

Funding/Support:

This study was funded by Sideview, which is owned by Elizabeth Wager and paid Emma Veitch for her work on the study.

Acknowledgments:

We thank Jigisha Patel and Lin Lee (BMC Medicine), Theodora Bloom (The BMJ), and Sabine Kleinert (The Lancet) for supplying information about journal editors’ awareness of PubPeer comments. We thank Brandon Stell of PubPeer for answering queries about the mechanisms of PubPeer.
