A South Korean study finds that AI-generated chest x-ray reports are nearly as clinically acceptable as radiologist-written reports under standard criteria.
Key Details
- AI-generated and radiologist-written reports showed similar acceptability under a standard criterion: 88.4% vs 89.2% (p = 0.36).
- Under a more stringent criterion (acceptable without revision), AI reports were less acceptable: 66.8% vs 75.7% (p < 0.001).
- The model (KARA-CXR) was trained on 8.8 million chest x-rays from 42 institutions across South Korea and the U.S.
- AI-generated reports showed higher sensitivity for referable abnormalities (81.2% vs 59.4%) but lower specificity (81% vs 93.6%) than radiologists (see the sketch after this list).
- Seven thoracic radiologists independently evaluated report acceptability; most felt AI was not yet ready to replace human radiologists.
- Editorials note that AI's diagnostic performance falls between that of residents and board-certified radiologists.
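
The sensitivity/specificity trade-off in the fourth bullet can be made concrete with a minimal sketch. The confusion-matrix counts below are hypothetical, since the summary does not give raw counts; they are chosen only so the resulting percentages approximate the reported figures.

```python
# Minimal sketch: computing sensitivity and specificity from a confusion matrix.
# All counts are hypothetical, chosen only so the percentages roughly match
# the figures reported in the study summary.

def sensitivity(tp: int, fn: int) -> float:
    """Fraction of studies with referable abnormalities that were flagged."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of studies without referable abnormalities left unflagged."""
    return tn / (tn + fp)

# Hypothetical cohort: 1,000 chest x-rays, 200 with referable findings.
ai_tp, ai_fn, ai_fp, ai_tn = 162, 38, 152, 648      # ~81% sens, ~81% spec
rad_tp, rad_fn, rad_fp, rad_tn = 119, 81, 51, 749   # ~59% sens, ~94% spec

print(f"AI reports:   sensitivity {sensitivity(ai_tp, ai_fn):.1%}, "
      f"specificity {specificity(ai_tn, ai_fp):.1%}")
print(f"Radiologists: sensitivity {sensitivity(rad_tp, rad_fn):.1%}, "
      f"specificity {specificity(rad_tn, rad_fp):.1%}")
```

Read this as an illustration of the reported trade-off: the AI catches more referable findings (higher sensitivity) at the cost of more false positives (lower specificity).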
Why It Matters
The findings suggest AI-generated chest x-ray reporting is approaching clinical usability, but the drop in acceptability when revisions are not allowed and the lower specificity indicate that radiologist oversight is still required.
Source
AuntMinnie
Related News

Debate at RSNA 2025 Examines If AI Is Ready for Autonomous Chest X-ray Reads
Experts at RSNA 2025 debated whether AI is ready for fully autonomous interpretation of chest x-rays, concluding that while technical progress is evident, significant challenges remain.

a2z Radiology Raises $5M, Lands FDA Clearance for Multi-Condition CT AI
Boston-based a2z Radiology raised $4.5M and earned FDA clearance for its Unified Triage AI solution for abdominal and pelvic CT scans.

AI Ultrasound Tool May Cut Unnecessary Breast Biopsies by 60%
An FDA-cleared AI tool for breast ultrasound may reduce unnecessary biopsies of benign breast lesions by about 60%.