Abstract

Impact of a Novel Checklist on the Peer Review Process

Jorge Finke,1 Sumi Sexton2

Objective

The peer review process has been criticized for a lack of evidence that it improves the quality of the scientific literature.1 Some authors have reported peer reviews to be incoherent, lacking constructive feedback, and slow.2 Importantly, little research has examined peer review of narrative reviews. We developed a checklist for peer reviewers and evaluated its effects on the quality of peer reviews and manuscripts.

Methods

This was a pseudorandomized, nonblinded clinical trial. A 10-question checklist was developed for use by peer reviewers evaluating manuscripts for the medical journal American Family Physician (AFP). The checklist was created through the Delphi method, using a panel of experts that included AFP medical editors and editors from other journals, who reviewed the proposed checklist. The checklist was internally validated by 2 medical editors who independently compared results of the new checklist with the journal’s existing manuscript quality scoring system. Manuscripts were assigned on alternating months to either the experimental group, in which reviewers and editors applied the new checklist, or the control group, which used the traditional system. After peer review, the medical editor assigned to each manuscript rated the quality of each review on a 0 to 100% scale to determine whether the checklist improved peer reviewer performance. The medical editor also rated the quality of the initially submitted and finalized manuscripts on a 0 to 100% scale. Changes from initial to finalized manuscript quality ratings were compared between the experimental and control groups.
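For readers who want a concrete picture of the allocation scheme and outcome measure, the sketch below illustrates the alternating-month assignment and the initial-to-final quality-rating change in Python. The even/odd month rule and the function names (assign_group, quality_change) are illustrative assumptions; the abstract specifies only that assignment alternated by month and that editors rated quality on a 0 to 100% scale.

    from datetime import date

    def assign_group(submission_date: date) -> str:
        # Pseudorandomization by alternating calendar month. The even/odd split
        # is an assumption for illustration; the abstract states only that
        # manuscripts were assigned on alternating months.
        return "checklist" if submission_date.month % 2 == 0 else "traditional"

    def quality_change(initial_rating: float, final_rating: float) -> float:
        # Change in the editor's 0 to 100% manuscript quality rating from the
        # initially submitted to the finalized manuscript.
        return final_rating - initial_rating

    # Example: a manuscript submitted in December 2022, rated 60% initially
    # and 75% after revision.
    print(assign_group(date(2022, 12, 5)), quality_change(60.0, 75.0))  # checklist 15.0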

Results

A total of 78 manuscripts were evaluated from December 2022 through March 2024. Forty of these manuscripts were assigned to peer reviewers using the newly developed checklist, and 38 underwent the traditional review process. No statistically significant improvement was noted in review quality (68.5%; 95% CI, 63.5%-72.8% vs 65.7%; 95% CI, 61.0%-69.8%; P = .40), review word count (683 words; 95% CI, 608.8-754.5 words vs 616.04 words; 95% CI, 545.8-684.1 words; P = .19), or speed of the peer review process (45.49 days; 95% CI, 37.2-52.8 days vs 37.3 days; 95% CI, 31.9-41.6 days; P = .08). Similarly, no statistically significant improvement was found in the manuscript quality rating after revisions based on peer review with the new checklist compared with the control group (difference, 15.7%; 95% CI, 9.1%-22.6% vs 12.6%; 95% CI, 6.7%-18.8%; P = .49).
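As a rough illustration of how such between-group comparisons can be computed, the sketch below applies Welch's two-sample t test and t-based 95% confidence intervals to simulated ratings with the reported group sizes (n = 40 vs n = 38). The choice of test and the simulated data are assumptions for illustration only; the abstract does not specify the statistical methods used.

    import numpy as np
    from scipy import stats

    def ci95(x):
        # t-based 95% confidence interval for a sample mean.
        half = stats.sem(x) * stats.t.ppf(0.975, df=len(x) - 1)
        return x.mean() - half, x.mean() + half

    def compare_groups(checklist, traditional):
        # Welch's two-sample t test (unequal variances) on an outcome such as
        # the 0 to 100% review quality rating; an assumed analysis, not the
        # journal's documented method.
        t_stat, p_value = stats.ttest_ind(checklist, traditional, equal_var=False)
        return {"p": p_value,
                "checklist 95% CI": ci95(checklist),
                "traditional 95% CI": ci95(traditional)}

    # Simulated review quality ratings using the reported means and group sizes.
    rng = np.random.default_rng(0)
    print(compare_groups(rng.normal(68.5, 12, 40), rng.normal(65.7, 12, 38)))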

Conclusions

Few interventions, including the appropriate use of checklists or guidelines, have had positive impacts on the peer review process.3 Although this study showed no significant improvement with the new checklist in peer review quality ratings, review word counts, speed of the review process, or overall manuscript ratings, no parameter was negatively affected. The checklist, a modified version of the previous format developed based on Delphi feedback from experts in the field, was nevertheless formally adopted.

References

1. Kelly J, Sadeghieh T, Adeli K. Peer review in scientific publications: benefits, critiques, & a survival guide. EJIFCC. 2014;25(3):227-243.

2. Sciullo N, Duncan M. Professionalizing peer review: suggestions for a more ethical and pedagogical review process. J Schol Pub. 2019;50(4):248-264. doi:10.3138/jsp.50.4.02

3. Gaudino M, Robinson NB, Di Franco A, et al. Effects of experimental interventions to improve the biomedical peer-review process: a systematic review and meta-analysis. J Am Heart Assoc. 2021;10(15):e019903. doi:10.1161/JAHA.120.019903

1Contributing editor, American Family Physician, Leawood, KS, US, jfinke@bidmc.harvard.edu; 2Editor in chief, American Family Physician, Leawood, KS, US.

Conflict of Interest Disclosures

None reported.

Acknowledgment

This study was conducted with support from American Family Physician as an internal quality improvement project, and the intervention developed was adopted for use in the journal’s peer review process. This work was conducted with support from award UL1TR002541 through Harvard Catalyst (Biostatistics/Bioinformatics Consultation Program), the Harvard Clinical and Translational Science Center (National Center for Advancing Translational Sciences, National Institutes of Health), and financial contributions from Harvard University and its affiliated academic health care centers. The content is solely the responsibility of the authors and does not necessarily represent the official views of Harvard Catalyst, Harvard University and its affiliated academic health care centers, or the National Institutes of Health.