HPCSMN: A Classification Method of Chemotherapy Sensitivity of Hypopharyngeal Cancer Based on Multimodal Network.
Authors
Affiliations (5)
- National Key Laboratory of Intelligent Tracking and Forecasting for Infectious Diseases, Engineering Research Center of Trusted Behavior Intelligence, Ministry of Education, College of Artificial Intelligence, Nankai University, Tianjin, 300350, China.
- Department of Otolaryngology, Tianjin Huanhu Hospital, Tianjin, 300350, China.
- National Key Laboratory of Intelligent Tracking and Forecasting for Infectious Diseases, Engineering Research Center of Trusted Behavior Intelligence, Ministry of Education, College of Artificial Intelligence, Nankai University, Tianjin, 300350, China. [email protected].
- Department of Maxillofacial & Ear, Nose and Throat, Tianjin Medical University Cancer Institute and Hospital, National Clinical Research Center for Cancer, Tianjin Clinical Research Center for Cancer, Key Laboratory of Cancer Prevention and Therapy, Tianjin, 300060, China. [email protected].
- College of Information and Intelligence, Hunan Agricultural University, Changsha, 410128, China.
Abstract
The treatment of hypopharyngeal cancer poses complex challenges, and accurate prediction of chemotherapy sensitivity is crucial for personalized treatment. In this study, a deep-learning multimodal fusion network was used to classify the chemotherapy sensitivity of hypopharyngeal cancer, improving prediction accuracy by integrating 3D CT images with radiomic features. Preprocessed and augmented 3D CT images were analyzed by a 3D ResNet branch to extract spatial features, while radiomic features screened by LASSO regression were processed by a three-layer fully connected branch handling the tabular data. The extracted feature vectors were fused by fully connected layers, exploiting the complementary strengths of the two modalities to capture both complex spatial dependencies and detailed radiomic characteristics. Experiments on the manually segmented NKU-TMU-hphc dataset (containing 102 hypopharyngeal cancer CT images) showed that the multimodal fusion network achieved high accuracy and outperformed single-modality methods and other models on multiple evaluation metrics. Statistical analyses were performed on the extracted features and clinical characteristics. The model effectively integrates imaging and clinical data, provides a new method for chemotherapy sensitivity classification, and is expected to advance personalized medicine.
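The two-branch architecture described above (an image branch for 3D CT volumes, a fully connected branch for LASSO-selected radiomic features, and a fully connected fusion head) can be sketched as follows. This is a minimal illustrative PyTorch sketch, not the authors' implementation: the small 3D CNN stands in for the paper's 3D ResNet, and the class name `FusionNet`, the layer widths, and the radiomic feature count are assumptions chosen for clarity.

```python
import torch
import torch.nn as nn

class FusionNet(nn.Module):
    """Hypothetical two-branch multimodal fusion model.

    A small 3D CNN (stand-in for the paper's 3D ResNet branch) encodes the
    CT volume; a three-layer fully connected branch encodes the
    LASSO-screened radiomic features; the two vectors are concatenated and
    classified by fully connected layers. All sizes are illustrative.
    """

    def __init__(self, n_radiomic=20, n_classes=2):
        super().__init__()
        self.img_branch = nn.Sequential(            # 3D CT image branch
            nn.Conv3d(1, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),  # -> (B, 16)
        )
        self.tab_branch = nn.Sequential(            # three fully connected layers
            nn.Linear(n_radiomic, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 16), nn.ReLU(),           # -> (B, 16)
        )
        self.head = nn.Sequential(                  # fusion + classification
            nn.Linear(16 + 16, 32), nn.ReLU(),
            nn.Linear(32, n_classes),
        )

    def forward(self, ct, radiomics):
        # Concatenate the two modality embeddings before the fusion head.
        z = torch.cat([self.img_branch(ct), self.tab_branch(radiomics)], dim=1)
        return self.head(z)

model = FusionNet()
# Batch of 2: single-channel 32x64x64 CT crops plus 20 radiomic features each.
logits = model(torch.randn(2, 1, 32, 64, 64), torch.randn(2, 20))
print(tuple(logits.shape))
```

Concatenation-then-classification is the simplest late-fusion design; it lets each branch specialize in its modality while the shared head learns cross-modal interactions.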