Leveraging co-attention-based interactions between pathology and radiology for pan-cancer survival prediction.
Authors
Affiliations (5)
- Department of Bioinformatics and Biostatistics, School of Life Sciences and Biotechnology, Shanghai Jiao Tong University, Shanghai, China; SJTU-Yale Joint Center for Biostatistics and Data Science, School of Life Sciences and Biotechnology, Shanghai Jiao Tong University, Shanghai, China.
- Department of Statistics, School of Mathematical Sciences, Shanghai Jiao Tong University, Shanghai, China; SJTU-Yale Joint Center for Biostatistics and Data Science, School of Life Sciences and Biotechnology, Shanghai Jiao Tong University, Shanghai, China.
- Clinical Research Institute, Shanghai Jiao Tong University School of Medicine, Shanghai, China.
- Department of Urology, Renji Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China.
- Department of Bioinformatics and Biostatistics, School of Life Sciences and Biotechnology, Shanghai Jiao Tong University, Shanghai, China; Department of Statistics, School of Mathematical Sciences, Shanghai Jiao Tong University, Shanghai, China; SJTU-Yale Joint Center for Biostatistics and Data Science, School of Life Sciences and Biotechnology, Shanghai Jiao Tong University, Shanghai, China; Clinical Research Institute, Shanghai Jiao Tong University School of Medicine, Shanghai, China. Electronic address: [email protected].
Abstract
Integrating macroscopic radiology with microscopic pathology provides a multiscale view of tumors, thereby advancing oncology research. However, exploiting the interplay between the two modalities for survival analysis, particularly across diverse cancer types, remains underexplored. This study developed PaRa-MIL, a co-attention-based early-fusion framework, to enhance survival prediction and uncover pan-cancer pathological-radiological interactions across seven cancer types comprising 1436 patients. Multi-faceted features were extracted from both pathological and radiological images and packed into omics bags. PaRa-MIL then fuses the multi-modal features for prognostic prediction, using a co-attention module to characterize pathomics-radiomics relationships. Cancer-specific model training validated the advantage of the PaRa-MIL architecture. Next, inspired by foundation models that integrate heterogeneous data during pretraining to assist downstream few-shot fine-tuning, we jointly trained on pathological and radiological features from different cancer types to uncover cross-cancer pathomics-radiomics interaction patterns and then used them to assist small-sample survival modeling. PaRa-MIL outperformed all comparative deep learning-based methods as well as conventional staging systems in prognostic prediction. In addition, compared to training from scratch, leveraging the learned cross-cancer pathological-radiological interactions yielded an average improvement of nearly 5% in the c-index for survival prediction under small-sample scenarios. Furthermore, specific forms of the cross-cancer pathological-radiological synergies were uncovered through gradient-based analysis.
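The exact PaRa-MIL architecture is not detailed in this abstract; the following is a minimal NumPy sketch of one common form of co-attention between modality bags, in which radiological features act as queries attending over the pathological patch bag. All dimensions, the projection weights, and the single-head formulation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(path_bag, radio_bag, d_k=32, seed=0):
    """Single-head co-attention sketch: radiological features (queries)
    attend over pathological patch features (keys/values), producing
    radiology-guided pathology embeddings for downstream fusion.

    path_bag  : (n_path, d_path) pathomics instance features
    radio_bag : (n_radio, d_radio) radiomics instance features
    Returns (fused, attn) with shapes (n_radio, d_k) and (n_radio, n_path).
    """
    rng = np.random.default_rng(seed)
    d_path, d_radio = path_bag.shape[1], radio_bag.shape[1]
    # Random projections stand in for learned weights in this sketch.
    W_q = rng.standard_normal((d_radio, d_k)) / np.sqrt(d_radio)
    W_k = rng.standard_normal((d_path, d_k)) / np.sqrt(d_path)
    W_v = rng.standard_normal((d_path, d_k)) / np.sqrt(d_path)
    Q = radio_bag @ W_q                 # (n_radio, d_k)
    K = path_bag @ W_k                  # (n_path, d_k)
    V = path_bag @ W_v                  # (n_path, d_k)
    attn = softmax(Q @ K.T / np.sqrt(d_k), axis=-1)  # (n_radio, n_path)
    return attn @ V, attn
```

Each row of `attn` is a distribution over pathology patches conditioned on one radiological feature, which is what makes the learned cross-modal interactions inspectable, e.g. via the gradient-based analysis mentioned above.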
This study deepens the understanding of radiology and pathology and their interactions in oncology, facilitating deeper insights into cancer imaging phenotypes and pathobiological traits and contributing to more accurate prognostic prediction.