Abstract

A Synthesis of Studies on Changes Manuscripts Underwent Between Submission or Preprint Posting and Peer-Reviewed Journal Publication

Mario Malički,1 Ana Jerončić,2 Gerben ter Riet,3,4 Lex Bouter,5,6 John P. A. Ioannidis,1,7,8,9,10 IJsbrand Jan Aalbersberg,11 Steven N. Goodman1,7,8

Objective

The ability of peer review to improve the scientific endeavor (eg, the conduct, reporting, and validity of study findings) has been questioned,1 and calls have been made to showcase the changes each study underwent due to peer review.2 Until such transparency is achieved, studies that analyzed differences between preprints or manuscript versions submitted to journals and their peer-reviewed publications are being identified and synthesized.

Design

In this stage of the living systematic review, studies were identified based on the authors’ knowledge of the field and by screening all research presented at peer review conferences (as podium presentations or posters) in the European Union and USA. References and citations of identified studies were then checked. For all studies, the following data were extracted: year of publication, sampling method, conflict of interest, funding, data and protocol sharing, number of analyzed version pairs, sample size calculation, scholarly discipline, method used to compare versions, variables (ie, manuscript sections) analyzed for changes, and the metric with which changes were quantified or qualitatively classified.

Results

Of 25 studies published from 1990 through the end of 2021, 16 (64%) analyzed changes between submitted and published papers and 9 (36%) between preprints and published papers. Most commonly, changes were analyzed by filling out questionnaires or scales separately for each of the 2 manuscript versions (11 [44%]) or by manual comparison of the 2 manuscript versions (6 [24%]). The median number of analyzed version pairs was 59 (IQR, 41-122). Most studies analyzed changes that occurred in health (18 [72%]) or social sciences (4 [16%]) manuscripts. Overall, studies’ conclusions indicated very high similarity between version pairs, with the largest changes occurring in the introduction and discussion sections. Examples of items for which the most changes were found are presented in Table 13.

Conclusions

The current results indicate that submitted or preprinted manuscript versions and their peer-reviewed journal versions are very similar, with main (analysis) methods and main findings rarely changing. Quantification of these results is pending. Large differences between studies in the types of manuscript changes analyzed and the methods used to measure them indicate a need for greater collaboration in the peer review field and for the creation of core outcome measures for manuscript version changes.

References

1. Tennant JP, Ross-Hellauer T. The limitations to our understanding of peer review. Res Integr Peer Rev. 2020;5(1):6. doi:10.1186/s41073-020-00092-1

2. Limbu S. Building trust in peer review: a Q&A with Dr Mario Malički. BioMed Central. September 18, 2020. Accessed June 24, 2022. http://blogs.biomedcentral.com/on-medicine/2020/09/18/building-trust-in-peer-review-a-qa-with-dr-mario-malicki/

1Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, CA, USA, mario.malicki@mefst.hr; 2Department of Research in Biomedicine and Health, University of Split School of Medicine, Split, Croatia; 3Urban Vitality Centre of Expertise, Amsterdam University of Applied Sciences, Amsterdam, the Netherlands; 4Amsterdam University Medical Centers, Department of Cardiology, Amsterdam, the Netherlands; 5Department of Philosophy, Faculty of Humanities, Vrije Universiteit, Amsterdam, the Netherlands; 6Department of Epidemiology and Data Science, Amsterdam University Medical Centers, Amsterdam, the Netherlands; 7Department of Medicine, Stanford University School of Medicine, Stanford, CA, USA; 8Department of Epidemiology and Population Health, Stanford University School of Medicine, Stanford, CA, USA; 9Department of Biomedical Data Science, Stanford University School of Medicine, Stanford, CA, USA; 10Department of Statistics, Stanford University School of Humanities and Sciences, Stanford, CA, USA; 11Elsevier, Amsterdam, the Netherlands

Conflict of Interest Disclosures

IJsbrand Jan Aalbersberg is senior vice president of research integrity at Elsevier. Mario Malički is a co–editor in chief of Research Integrity and Peer Review. Lex Bouter, John P. A. Ioannidis, and Steven N. Goodman are members of the Peer Review Congress Advisory Board but were not involved in the review or decision for this abstract.

Funding/Support

Elsevier funding was awarded to Stanford University for a METRICS postdoctoral position that supported Mario Malički’s work on the project.

Role of the Funder/Sponsor

IJsbrand Jan Aalbersberg is an employee of Elsevier and had a role in the design and conduct of the study; management and interpretation of the data; review and approval of the abstract; and decision to submit the abstract for presentation.
