Abstract

Rejection Rates for Manuscripts Uploaded to an Artificial Intelligence–Driven Precheck Tool Compared With Manuscripts That Did Not Undergo a Precheck at a Multidisciplinary Medical Journal

Duncan A. MacRae,1 Abhishek Sudra,2 Kara Hamilton1

Objective

Online precheck tools are intended to help authors identify missing declarations, formatting problems, and common language and grammar issues prior to first submission. The purpose of this study was to evaluate the use of an artificial intelligence–driven precheck tool and to examine the association between precheck use and initial rejection rates.

Design

This cohort study involved original research manuscripts submitted to Medicine, an open access multidisciplinary medical journal, during a 7-month period from June 2021 to January 2022. Prior to submission, authors were encouraged to upload their manuscript to an online artificial intelligence–driven precheck tool, which analyzes the meaning of phrases within a document and automatically recognizes both semantic and syntactic variations. The tool is configured to check language and grammar quality as well as the presence of ethics statements and conflict of interest declarations and adherence to word count limits. The precheck tool offers 2 levels of feedback: a free basic report, which summarizes issues that the system suggests should be addressed prior to submission, and a premium check (US $29), which provides the author with a downloadable Word document containing all suggested changes in detail. Authors were not required to use the precheck tool, and the choice to purchase the premium report was entirely at the author’s discretion. The resulting report was provided to the authors so that changes could be made prior to submission; the journal editors did not receive a copy of the report. All manuscripts also underwent a technical check carried out by the editorial office prior to the assignment of editors or reviewers. Articles uploaded to the precheck platform were then crosschecked against all articles submitted to the journal’s submission platform, allowing the journal to compare the proportions initially rejected (ie, rejected before peer review) among the 3 resulting groups: no precheck, basic precheck, and premium precheck.
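For illustration only, the sketch below shows one way such a crosscheck and grouping could be performed. The record fields, the use of a normalized manuscript title as the matching key, and the function names are assumptions made for this example and are not drawn from the journal’s or the vendor’s actual systems.

```python
# Hypothetical sketch of the crosscheck described above: assign each journal
# submission to one of the 3 analysis groups (no precheck, basic precheck,
# premium precheck). Field names and the title-based matching key are assumptions.

def normalize(title: str) -> str:
    """Reduce a manuscript title to a crude matching key."""
    return " ".join(title.lower().split())

def assign_groups(submissions, precheck_uploads):
    """Label each submission record with the precheck tier it used, if any.

    submissions      -- list of dicts with a "title" key (submission platform export)
    precheck_uploads -- list of dicts with "title" and "tier" ("basic" or "premium")
    """
    tier_by_title = {normalize(u["title"]): u["tier"] for u in precheck_uploads}
    for sub in submissions:
        sub["group"] = tier_by_title.get(normalize(sub["title"]), "no precheck")
    return submissions
```

In practice the matching would more likely rely on manuscript identifiers than on titles; once each submission carries a group label, the proportion initially rejected can be compared across the 3 groups.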

Results

Among 7904 submitted manuscripts, the distribution of author selections across the 3 groups (no precheck, basic precheck, and premium precheck) and the numbers initially rejected are detailed in Table 51. Among manuscripts in the no precheck group, 2073 of 6062 (34.2%) were rejected following technical check compared with 333 of 1661 (20.1%) in the basic precheck group and 13 of 181 (7.3%) in the premium precheck group. Overall, the initial rejection rate among manuscripts that underwent prechecking was 15.4 percentage points lower than among those that did not (346 of 1842 [18.8%] vs 2073 of 6062 [34.2%]).
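The pooled comparison can be reproduced directly from the per-group counts reported above; the short sketch below is a minimal illustration using only the numbers given in this abstract and recomputes the combined precheck rejection rate and the 15.4 percentage point difference.

```python
# Recompute the pooled precheck comparison from the counts reported in this abstract.
rejected_basic, total_basic = 333, 1661
rejected_premium, total_premium = 13, 181
rejected_none, total_none = 2073, 6062

# Combine the basic and premium precheck groups.
rejected_precheck = rejected_basic + rejected_premium   # 346
total_precheck = total_basic + total_premium            # 1842

rate_precheck = 100 * rejected_precheck / total_precheck  # ~18.8%
rate_none = 100 * rejected_none / total_none              # ~34.2%

print(f"prechecked:  {rejected_precheck}/{total_precheck} = {rate_precheck:.1f}%")
print(f"no precheck: {rejected_none}/{total_none} = {rate_none:.1f}%")
print(f"difference:  {rate_none - rate_precheck:.1f} percentage points")  # 15.4
```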

Conclusions

The use of a precheck tool to assist authors in identifying language errors and missing manuscript elements prior to submission was associated with a lower rate of initial manuscript rejection (Table 51).

1Wolters Kluwer Health, Philadelphia, PA, USA, duncan.macrae@wolterskluwer.com; 2Cactus Communications, Mumbai, India

Conflict of Interest Disclosures

None reported.

Poster