Abstract

Feasibility of a Peer Review Intervention to Reduce Undisclosed Discrepancies Between Registrations and Publications

TARG Meta-Research Group & Collaborators

Robert T. Thibault,1,2,3 Tom E. Hardwicke,4 Robbie W. A. Clark,2,3 Charlotte R. Pennington,5,6 Gustav Nilsonne,7,8 Aoife O’Mahony,9 Katie Drax,2,3 Jacqueline Thompson,2,3 Marcus R. Munafò2,3

Objective

The authors developed a peer review intervention to reduce undisclosed discrepancies between study registrations and their associated publications, which are common.1,2 The aims of this study were to (1a) evaluate the feasibility of incorporating discrepancy review as a regular practice at scientific journals; (1b) evaluate the feasibility of conducting a trial of discrepancy review; (2) explore the benefits of, and time required for, incorporating discrepancy review as a regular practice at scientific journals; and (3) refine the discrepancy review process.

Design

The authors invited the editors in chief of 18 journals in medicine or psychology to participate and provided volunteer early-career researchers to act as peer reviewers specifically assigned to check for undisclosed discrepancies between registrations and submitted manuscripts of any study design. The authors called this process discrepancy review.

Results

Of the 18 invited journals, 5 agreed to participate. Of these, 2 did not receive any manuscripts reporting a registration during the study period, and 1 had difficulty adding discrepancy review to its manuscript handling procedures and therefore did not provide any manuscripts to review, leaving 2 participating journals. Discrepancy review was performed between January 29 and May 18, 2021, on all registered studies submitted to Nicotine and Tobacco Research (n = 18) and on all registered studies for which the editor in chief of European Journal of Personality acted as action editor (n = 3). Table 47 details the reviewed manuscripts and findings. Registrations were generally too imprecise to be evaluated effectively by the original discrepancy review process, which used a detailed, structured checklist. Thus, the authors developed an updated discrepancy review process that used a semi-structured format with 8 guiding questions covering exploratory studies, proper registration, retrospective registration, hypotheses, independent variables, outcome measures, analyses, and additional discrepancies. Discrepancy reviewers provided 59 comments on the 12 manuscripts that were accepted for publication; authors fully addressed 31 of these comments, partially addressed 10, and did not address 18. Optional questionnaires were completed by 5 of 13 action editors and 4 of 21 manuscript authors; respondents expressed no opposition to discrepancy review.

Conclusions

It was feasible for 2 journals interested in discrepancy review to implement this process when provided with discrepancy reviewers. A full trial of discrepancy review would be needed to evaluate its effect on reducing undisclosed discrepancies, possibly stratified by clinical trial vs Open Science Framework registration, given the differences in the level of detail required by each registry.

References

1. TARG Meta-Research Group and Collaborators. Estimating the prevalence of discrepancies between study registrations and publications: a systematic review and meta-analyses. medRxiv. Preprint posted online August 9, 2021. doi:10.1101/2021.07.07.21259868

2. Goldacre B, Drysdale H, Dale A, et al. COMPare: a prospective cohort study correcting and monitoring 58 misreported trials in real time. Trials. 2019;20(118):1-16. doi:10.1186/s13063-019-3173-2

1Meta-Research Innovation Center at Stanford (METRICS), Stanford, CA, USA, robert.thibault@stanford.edu; 2School of Psychological Science, University of Bristol, Bristol, UK; 3MRC Integrative Epidemiology Unit at the University of Bristol, Bristol, UK; 4Department of Psychology, University of Amsterdam, Amsterdam, the Netherlands; 5School of Psychology, Aston University, Birmingham, UK; 6Institute of Health and Neurodevelopment, Aston University, Birmingham, UK; 7Department of Clinical Neuroscience, Karolinska Institutet, Solna, Sweden; 8Department of Psychology, Stockholm University, Stockholm, Sweden; 9School of Psychology, Cardiff University, Cardiff, UK

Conflict of Interest Disclosures

Charlotte R. Pennington is the local network lead of the UK Reproducibility Network for Aston University. Gustav Nilsonne is a member of the Committee for Open Badges and served for several years as its chair. All other authors declare no conflict of interest.

Funding/Support

Robert T. Thibault is supported by a general support grant awarded to METRICS from the Laura and John Arnold Foundation and postdoctoral fellowships from the Canadian Institutes of Health Research and the Fonds de recherche du Québec–Santé. Tom E. Hardwicke receives funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 841188. Katie Drax is supported by the John Climax Benevolent Fund. Robbie W. A. Clark is supported by a SWDTP ESRC PhD studentship. Robert T. Thibault, Robbie W. A. Clark, Katie Drax, Jacqueline Thompson, and Marcus R. Munafò are all part of the MRC Integrative Epidemiology Unit (MC_UU_00011/7).

Role of the Funder/Sponsor

The funders had no role in design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the abstract; and decision to submit the abstract for presentation.

Poster