An explainable deep learning AI model using gadoxetic acid-enhanced MRI improves sensitivity in diagnosing hepatocellular carcinoma (HCC).
Key Details
- The AI model was trained on 1,023 liver lesions from 839 patients using multi-phase MRI.
- It classified lesions as HCC or non-HCC and provided visual explanations (LI-RADS feature identification).
- On a test set, the model achieved an AUC of 0.97 for HCC diagnosis.
- Compared to LI-RADS 5, the AI had higher sensitivity (91.6% vs. 74.8%) with similar specificity (90.7% vs. 96%); a minimal sketch of these metrics follows this list.
- Radiologists assisted by the AI showed improved sensitivity (up to 89%) with no loss in specificity.
- Explainability is highlighted, aligning with regulatory emphasis on interpretable AI.
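For context, the headline figures above (AUC, sensitivity, specificity) are standard binary-classification metrics. The sketch below shows how they are typically computed for an HCC vs. non-HCC classifier; the labels, scores, and 0.5 decision threshold are illustrative assumptions, not data or code from the study.

```python
# Minimal sketch (illustrative only): computing AUC, sensitivity, and specificity
# for a binary HCC vs. non-HCC classifier. All values are made-up placeholders.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

# Hypothetical ground-truth labels (1 = HCC, 0 = non-HCC) and model scores
y_true = np.array([1, 1, 1, 0, 0, 1, 0, 0, 1, 0])
y_score = np.array([0.92, 0.81, 0.40, 0.15, 0.35, 0.77, 0.62, 0.08, 0.88, 0.20])

# AUC is threshold-independent: it summarizes ranking quality across all cutoffs
auc = roc_auc_score(y_true, y_score)

# Sensitivity and specificity require choosing a decision threshold (0.5 here, arbitrary)
y_pred = (y_score >= 0.5).astype(int)
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)  # true-positive rate: HCC lesions correctly flagged
specificity = tn / (tn + fp)  # true-negative rate: non-HCC lesions correctly cleared

print(f"AUC={auc:.2f}, sensitivity={sensitivity:.1%}, specificity={specificity:.1%}")
```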
Why It Matters
This study demonstrates that explainable AI models can significantly increase diagnostic sensitivity for HCC on MRI, supporting radiologists without sacrificing specificity. The focus on explainability is crucial for regulatory compliance and clinical trust in AI tools for radiology.

Source
AuntMinnie
Related News

• Radiology Business
Hybrid AI Approach Cuts Mammography Workload by 38%
A Dutch research team demonstrated that a 'hybrid' AI strategy can reduce radiologist workload in mammography screening by nearly 40% without affecting performance.

• AuntMinnie
Habitat AI Model Improves Risk Stratification of Lung Nodules on LDCT
A 'habitat' AI model outperforms standard 2D approaches in stratifying lung adenocarcinoma risk in subsolid nodules on low-dose CT scans.

• AuntMinnie
AI Model Uses Chest CT to Diagnose and Grade COPD Severity
A machine learning model based on chest CT images accurately diagnoses and grades the severity of COPD.