
RadCLARE: an automated clinical language engine for detecting semantic errors in radiology reports.

December 22, 2025 · PubMed

Authors

Pan F, Lou J, Guo Y, Du W, Wang Z, Fan Q, Wang H, Zheng C, Yang L

Affiliations (11)

  • Department of Radiology, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China.
  • Hubei Provincial Clinical Research Center for Precision Radiology & Interventional Medicine, Wuhan, China.
  • Hubei Key Laboratory of Molecular Imaging, Wuhan, China.
  • WanLiCloud Healthcare IT Co., Ltd., Beijing, China.
  • Information and Data Center, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China.
  • Department of Radiology, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China. [email protected].
  • Hubei Provincial Clinical Research Center for Precision Radiology & Interventional Medicine, Wuhan, China. [email protected].
  • Hubei Key Laboratory of Molecular Imaging, Wuhan, China. [email protected].
  • Department of Radiology, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China. [email protected].
  • Hubei Provincial Clinical Research Center for Precision Radiology & Interventional Medicine, Wuhan, China. [email protected].
  • Hubei Key Laboratory of Molecular Imaging, Wuhan, China. [email protected].

Abstract

Errors in radiology reports can result in inappropriate or harmful clinical decisions. We investigated whether large language models can reduce the error rate. We developed the radiology-specific clinical language anomaly recognition engine (RadCLARE), an automated engine based on the bidirectional encoder representations from transformers (BERT)-base model, designed to detect semantic errors in Chinese radiology reports. It was trained on 1.4 million reports, comprising 615,920 digital radiography, 560,310 computed tomography, and 223,480 magnetic resonance reports. One thousand reports were randomly selected for expert manual annotation. Inter-reader agreement for error detection and classification was assessed using Cohen κ and Gwet AC1, and RadCLARE's detections were compared against the expert reference standard. Changes in error rates before (baseline test dataset, BTD) and after (experimental test dataset, ETD) RadCLARE implementation were analyzed. Finally, radiologists were invited to complete questionnaires evaluating satisfaction and rating the system across five dimensions.

Among the 1,000 reports, 506 errors were identified as the reference standard. Inter-reader agreement was substantial for error detection (κ = 0.77) and excellent for error classification (Gwet AC1 = 0.94). RadCLARE detected 437/506 errors, achieving 87.3% accuracy, 88.3% precision, 86.4% recall, and an 87.4% F1-score. The BTD comprised 571,264 reports and the ETD 873,030 reports. After RadCLARE implementation, the semantic error rate dropped significantly relative to the BTD (0.85% [7,408/873,030] versus 4.19% [23,909/571,264]; p < 0.001). Questionnaire results showed that 95.7% (44/46) of radiologists were satisfied with RadCLARE.

RadCLARE demonstrated high performance in automatically detecting semantic errors in radiology reports. Future studies should aim to extend its applicability across multiple languages and institutions. With the aid of RadCLARE, the semantic error rate in radiology reports dropped significantly from 4.19% to 0.85%, and the large majority (96%) of radiologists who participated in the test were satisfied with RadCLARE and felt that it reduced stress.
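As a quick plausibility check, the headline metrics can be recomputed from the counts the abstract reports. The following Python sketch is illustrative only (it is not the authors' evaluation code); every number in it comes from the abstract above, except the false-positive count, which is not reported, so precision is taken directly as the stated 88.3%:

```python
# Illustrative sanity check of the figures reported in the abstract.
# All inputs are taken from the text; this is NOT the authors' code.

def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Detection performance: 437 of 506 reference-standard errors found.
recall = 437 / 506                # ~0.864, matching the reported 86.4%
precision = 0.883                 # as reported; false-positive count not given
f1 = f1_score(precision, recall)  # ~0.873, in line with the reported 87.4%
                                  # (small gap explained by input rounding)

# Error-rate reduction after deployment.
rate_btd = 23_909 / 571_264       # ~4.19% before RadCLARE
rate_etd = 7_408 / 873_030        # ~0.85% after RadCLARE

print(f"recall={recall:.3f} f1={f1:.3f} "
      f"BTD={rate_btd:.2%} ETD={rate_etd:.2%}")
```

The recomputed recall and error rates match the abstract exactly at the stated precision; the F1 differs only at the third decimal place because the published precision and recall are themselves rounded.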

Topics

Semantics · Radiology Information Systems · Diagnostic Errors · Radiology · Journal Article
