
Interpretable multimodal PET/CT-EHR fusion via mixture-of-experts for prognostic stratification in mantle cell lymphoma: a multicenter study.

April 16, 2026

Authors

Jiang C, Zhang Z, Jiang Z, Ding C, Teng Y, Gao L, Jiang M, Qu L, Tian R

Affiliations (9)

  • Department of Nuclear Medicine and Clinical Nuclear Medicine Research Lab, West China Hospital, Sichuan University, Chengdu, 610041, Sichuan, China.
  • Department of Nuclear Medicine, West China Hospital, Sichuan University, Chengdu, Sichuan, China.
  • West China Biomedical Big Data Center, West China Hospital, Sichuan University, Chengdu, China.
  • Department of Nuclear Medicine, the First Affiliated Hospital of Nanjing Medical University, Jiangsu Province Hospital, Nanjing, China.
  • Department of Nuclear Medicine, Nanjing Drum Tower Hospital, the Affiliated Hospital of Nanjing University Medical School, Nanjing, China.
  • Department of Pathology, West China Hospital, Sichuan University, Chengdu, China. [email protected].
  • Department of Oncology, West China Hospital of Sichuan University, Chengdu, China. [email protected].
  • Shanghai Medical College, Fudan University, Shanghai, China. [email protected].
  • Department of Nuclear Medicine, West China Hospital, Sichuan University, Chengdu, Sichuan, China. [email protected].

Abstract

Mantle cell lymphoma (MCL) is a rare, biologically heterogeneous B-cell malignancy with highly variable outcomes. Existing prognostic tools are suboptimal. We developed an interpretable deep learning framework integrating baseline [¹⁸F]FDG PET/CT and electronic health record (EHR) data for individualized risk stratification. In this multicenter study, 187 treatment-naïve MCL patients were analyzed. A mixture-of-experts (MoE) fusion network integrated multimodal representations from PET/CT and EHR data. Expert modules comprising vision encoders, radiomics extractors, and a medical language model were integrated through an attention-based gating mechanism to construct multimodal radiomic signatures (R-signatures) predictive of progression-free survival (PFS) and overall survival (OS). R-signatures were validated and incorporated with clinical and metabolic factors into multiparametric models. Deep learning model interpretability was evaluated using attention visualization, expert-level contributions, and pathologic correlation. R-signatures robustly discriminated relapse (AUC = 0.893 training, 0.755 validation) and death (AUC = 0.804 and 0.844), and independently predicted adverse outcomes (PFS: HR = 27.70, P < 0.001; OS: HR = 6.86, P = 0.001). Multiparametric models integrating R-signatures with total lesion glycolysis, β2-microglobulin, WBC, and Ki-67 outperformed conventional indices (C-indices: PFS 0.892 training, 0.781 validation; OS 0.877 training, 0.862 validation). Time-dependent ROC analyses consistently showed AUCs approaching or exceeding 0.800. Calibration and decision curve analyses confirmed excellent agreement and superior clinical net benefit. Attention maps localized high-weighted regions to hypermetabolic tumor areas, with higher R-signature values in blastoid and pleomorphic variants versus classical histology (P = 0.028 and P = 0.010, respectively).
This interpretable PET/CT-EHR fusion framework substantially improves prognostic precision in MCL, providing a noninvasive, clinically translatable tool for risk-adapted management.
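The attention-based gating described in the abstract can be illustrated with a minimal sketch: each expert (vision encoder, radiomics extractor, medical language model) emits an embedding, a learned gating vector scores each expert's relevance, and a softmax over those scores produces the convex weights used to fuse the embeddings. All names, shapes, and the shared-dimension assumption below are illustrative, not the authors' implementation.

```python
# Hedged sketch of attention-gated mixture-of-experts fusion (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
d = 16  # shared embedding dimension (assumed for the sketch)

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_fuse(expert_embeddings, gate_w):
    """Fuse per-expert embeddings via attention-based gating.

    expert_embeddings: (n_experts, d) array, one row per expert.
    gate_w: (d,) gating vector that scores each expert's relevance.
    Returns the fused (d,) representation and the per-expert weights.
    """
    scores = expert_embeddings @ gate_w   # one relevance score per expert
    weights = softmax(scores)             # convex combination over experts
    fused = weights @ expert_embeddings   # weighted sum of embeddings, shape (d,)
    return fused, weights

# Toy usage: PET/CT vision, radiomics, and EHR-text embeddings for one patient.
experts = rng.normal(size=(3, d))
gate_w = rng.normal(size=d)
fused, w = moe_fuse(experts, gate_w)
```

The gating weights themselves are what make such a fusion inspectable: reporting them per patient is one way to obtain the expert-level contributions the study evaluates.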

Topics

Journal Article
