
Clinically interpretable nomogram incorporating radiomics and deep learning feature fusion from abdominal CT for preclinical type 2 diabetes.

May 12, 2026

Authors

Alanazi MA, Alharbi SS

Affiliations (2)

  • Diabetes and Chronic Diseases Unit, Faculty of Medicine, University of Tabuk, Tabuk, Saudi Arabia.
  • Department of Family and Community Medicine, Faculty of Medicine, University of Tabuk, Tabuk, Saudi Arabia. [email protected].

Abstract

To develop and externally validate a multimodal artificial intelligence framework for opportunistic detection of preclinical type 2 diabetes mellitus (T2DM) from routine portal venous-phase abdominal CT in patients without recent laboratory testing.

In this multicenter retrospective study, 1257 adults without prior diabetes who underwent routine portal venous-phase abdominal CT were included. Patients were classified as preclinical T2DM or normal glucose tolerance based on fasting plasma glucose and oral glucose tolerance testing. Pancreatic segmentation was performed using an nnU-Net-based deep learning model with expert validation. From the segmented pancreas, 1708 Image Biomarker Standardization Initiative (IBSI)-compliant radiomic features were extracted following standardized preprocessing, reproducibility filtering, and ComBat harmonization. In parallel, multi-scale deep features were derived from five state-of-the-art encoder backbones, including transformer-based and segmentation-derived architectures. Clinical variables were incorporated to construct clinical-only, radiomics-only, deep-only, and multimodal fusion models. Six feature selection methods and five classifiers were systematically evaluated using stratified cross-validation.

In 1257 patients (879 training, 378 external), preclinical T2DM cases were significantly older and had higher BMI, waist circumference, and glycemic indices than controls (all p < 0.001), with comparable demographics between cohorts despite protocol heterogeneity. Clinical-only models reached AUC 0.738. Radiomics improved discrimination (best AUC 0.792). Deep-feature models performed better still, led by MedFormer-v2 (AUC 0.834), significantly surpassing radiomics. Multimodal fusion achieved the highest external performance (best AUC 0.861). In a secondary analysis excluding all glycemic laboratory variables, the multimodal model still reached AUC 0.837, confirming the added value of imaging biomarkers for opportunistic detection. The stacking ensemble was well calibrated (AUC 0.856) and significantly outperformed three abdominal radiologists who evaluated CT images alone (mean AUC 0.671), while AI assistance improved readers' performance.

Multimodal analysis of routine abdominal CT enables accurate and generalizable detection of preclinical T2DM, supporting opportunistic imaging-based metabolic risk assessment. A simplified nomogram was developed to support individualized risk estimation, although its interpretability remains partial due to the inclusion of a multimodal fusion score.
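The model-selection protocol the abstract describes (multiple feature-selection methods crossed with multiple classifiers, scored by stratified cross-validation) can be sketched with scikit-learn. The specific selectors (`anova`, `mutual_info`) and classifiers below are illustrative stand-ins, not the six methods and five classifiers actually used in the study, and the data are synthetic:

```python
# Hedged sketch of a selector x classifier grid scored by stratified CV.
# Components and data are illustrative, not the paper's configuration.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Synthetic stand-in for a fused radiomic/deep/clinical feature matrix.
X, y = make_classification(n_samples=300, n_features=50, n_informative=8,
                           random_state=0)

selectors = {
    "anova": SelectKBest(f_classif, k=20),
    "mutual_info": SelectKBest(mutual_info_classif, k=20),
}
classifiers = {
    "logreg": LogisticRegression(max_iter=1000),
    "rf": RandomForestClassifier(n_estimators=100, random_state=0),
}

# Stratified folds preserve the case/control ratio in every split.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
results = {}
for s_name, selector in selectors.items():
    for c_name, clf in classifiers.items():
        pipe = Pipeline([("select", selector), ("clf", clf)])
        auc = cross_val_score(pipe, X, y, cv=cv, scoring="roc_auc").mean()
        results[(s_name, c_name)] = auc

best = max(results, key=results.get)
print("best combination:", best, "AUC:", round(results[best], 3))
```

Wrapping selection and classification in a single `Pipeline` matters: it refits the selector inside each fold, which avoids the feature-selection leakage that would otherwise inflate cross-validated AUC.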
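A calibrated stacking ensemble of the kind reported in the results can be sketched as follows; the base learners and meta-learner here are assumptions for illustration, since the abstract does not list the study's actual configuration:

```python
# Hedged sketch of a stacking ensemble with a calibration check.
# Base/meta learners are assumed; data are synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from sklearn.calibration import calibration_curve

X, y = make_classification(n_samples=400, n_features=30, n_informative=6,
                           random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)

stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=1)),
        ("svm", SVC(probability=True, random_state=1)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,  # out-of-fold predictions feed the meta-learner
)
stack.fit(X_tr, y_tr)
prob = stack.predict_proba(X_te)[:, 1]
auc = roc_auc_score(y_te, prob)
print("held-out AUC:", round(auc, 3))

# Reliability check: binned predicted probability vs observed event rate.
frac_pos, mean_pred = calibration_curve(y_te, prob, n_bins=5)
for f, m in zip(frac_pos, mean_pred):
    print(f"predicted {m:.2f} -> observed {f:.2f}")
```

A well-calibrated model is what makes a nomogram-style individualized risk estimate meaningful: the predicted probabilities, not just the ranking, have to track observed event rates.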

Topics

Journal Article
