Deep learning radiopathomics based on pretreatment MRI and whole slide images for predicting overall survival in locally advanced nasopharyngeal carcinoma.

Authors

Yi X, Yu X, Li C, Li J, Cao H, Lu Q, Li J, Hou J

Affiliations (5)

  • Department of Diagnostic Radiology, Hunan Cancer Hospital and the Affiliated Cancer Hospital of Xiangya School of Medicine, Central South University, Changsha 410013 Hunan, PR China.
  • Hunan Provincial Key Laboratory on Bioinformatics, School of Computer Science and Engineering, Central South University, Changsha 410083, PR China.
  • Department of Health Service Center, Hunan Cancer Hospital and the Affiliated Cancer Hospital of Xiangya School of Medicine, Central South University, Changsha 410013 Hunan, PR China.
  • Department of Pathology, Hunan Cancer Hospital and the Affiliated Cancer Hospital of Xiangya School of Medicine, Central South University, 283 Tongzipo Road, Yuelu District, Changsha, Hunan 410013, PR China.
  • Department of Diagnostic Radiology, Hunan Cancer Hospital and the Affiliated Cancer Hospital of Xiangya School of Medicine, Central South University, Changsha 410013 Hunan, PR China. Electronic address: [email protected].

Abstract

To develop an integrative radiopathomic model based on deep learning to predict overall survival (OS) in locally advanced nasopharyngeal carcinoma (LANPC) patients. A cohort of 343 LANPC patients with pretreatment MRI and whole slide images (WSIs) was randomly divided into training (n = 202), validation (n = 91), and external test (n = 50) sets. For WSIs, a self-attention mechanism was employed to assess the significance of individual patches for the prognostic task and aggregate them into a WSI-level representation. For MRI, a multilayer perceptron was used to encode the extracted radiomic features into an MRI-level representation. The two representations were combined in a multimodal fusion model to produce prognostic predictions. Model performance was evaluated using the concordance index (C-index), and Kaplan-Meier curves were employed for risk stratification. To enhance model interpretability, attention-based and Integrated Gradients techniques were applied to explain how WSI and MRI features contribute to the prognostic predictions. The radiopathomics model achieved high accuracy in predicting OS, with C-indices of 0.755 (95 % CI: 0.673-0.838) and 0.744 (95 % CI: 0.623-0.808) in the training and validation sets, respectively, outperforming the single-modality models (radiomic signature: 0.636, 95 % CI: 0.584-0.688; deep pathomic signature: 0.736, 95 % CI: 0.684-0.810). In the external test set, the radiopathomics model, radiomic signature, and deep pathomic signature showed similar relative performance, with C-indices of 0.735, 0.626, and 0.660, respectively. The radiopathomics model effectively stratified patients into high- and low-risk groups (P < 0.001), and attention heatmaps revealed that high-attention regions corresponded to tumor areas in both risk groups. In conclusion, the radiopathomics model holds promise for predicting clinical outcomes in LANPC patients, offering a potential tool for improving clinical decision-making.
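
Implementation note (not from the paper): the architecture described above can be sketched in PyTorch as an attention-based pooling of WSI patch embeddings, an MLP encoder for the MRI radiomic features, and a concatenation-based fusion head that outputs a prognostic risk score. Feature dimensions (1024-dimensional patch embeddings, 107 radiomic features), hidden sizes, and the fusion strategy below are assumptions for illustration, not the authors' reported configuration.

import torch
import torch.nn as nn


class AttentionMILPooling(nn.Module):
    """Aggregate patch-level WSI embeddings into a slide-level representation
    by learning per-patch attention weights (attention-based pooling)."""

    def __init__(self, in_dim=1024, hidden_dim=256):
        super().__init__()
        self.attn = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, patch_feats):              # patch_feats: (n_patches, in_dim)
        scores = self.attn(patch_feats)          # (n_patches, 1)
        weights = torch.softmax(scores, dim=0)   # attention over patches
        slide_feat = (weights * patch_feats).sum(dim=0)   # (in_dim,)
        return slide_feat, weights.squeeze(-1)   # weights can drive heatmaps


class RadioPathomicFusion(nn.Module):
    """Fuse the WSI-level representation with an MLP encoding of MRI radiomic
    features and output a single prognostic risk score."""

    def __init__(self, wsi_dim=1024, radiomic_dim=107, hidden_dim=256):
        super().__init__()
        self.wsi_pool = AttentionMILPooling(wsi_dim, hidden_dim)
        self.mri_encoder = nn.Sequential(        # MLP for radiomic features
            nn.Linear(radiomic_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        self.head = nn.Sequential(               # multimodal fusion -> risk
            nn.Linear(wsi_dim + hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, patch_feats, radiomic_feats):
        slide_feat, attn = self.wsi_pool(patch_feats)
        mri_feat = self.mri_encoder(radiomic_feats)
        risk = self.head(torch.cat([slide_feat, mri_feat], dim=-1))
        return risk.squeeze(-1), attn


# Example forward pass for one patient with dummy tensors.
model = RadioPathomicFusion()
patches = torch.randn(500, 1024)       # 500 WSI patch embeddings
radiomics = torch.randn(107)           # radiomic feature vector from MRI
risk_score, attention = model(patches, radiomics)

A similarly hedged sketch of the evaluation described in the abstract (C-index plus Kaplan-Meier risk stratification) is shown below using the lifelines package; the software actually used by the authors is not stated, and the data here are random placeholders.

import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test
from lifelines.utils import concordance_index

rng = np.random.default_rng(0)
risk = rng.normal(size=100)                   # model risk scores
time = rng.exponential(scale=36, size=100)    # OS time, e.g. in months
event = rng.integers(0, 2, size=100)          # 1 = death observed

# C-index: higher risk should imply shorter survival, so the negated
# risk score is passed as the predicted score.
cindex = concordance_index(time, -risk, event)

# Median split into high- and low-risk groups, compared with a log-rank test.
high = risk >= np.median(risk)
result = logrank_test(time[high], time[~high], event[high], event[~high])
print(f"C-index = {cindex:.3f}, log-rank P = {result.p_value:.4f}")

# Kaplan-Meier curve for the high-risk group (repeat for low risk).
kmf = KaplanMeierFitter()
kmf.fit(time[high], event[high], label="high risk")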

Topics

Journal Article
