HPCSMN: A Classification Method of Chemotherapy Sensitivity of Hypopharyngeal Cancer Based on Multimodal Network.

November 18, 2025 | PubMed

Authors

Fu W, Li H, Quan X, Wang X, Huang W, Zhang H

Affiliations (5)

  • National Key Laboratory of Intelligent Tracking and Forecasting for Infectious Diseases, Engineering Research Center of Trusted Behavior Intelligence, Ministry of Education, College of Artificial Intelligence, Nankai University, Tianjin, 300350, China.
  • Department of Otolaryngology, Tianjin Huanhu Hospital, Tianjin, 300350, China.
  • National Key Laboratory of Intelligent Tracking and Forecasting for Infectious Diseases, Engineering Research Center of Trusted Behavior Intelligence, Ministry of Education, College of Artificial Intelligence, Nankai University, Tianjin, 300350, China. [email protected].
  • Department of Maxillofacial & Ear, Nose and Throat, Tianjin Medical University Cancer Institute and Hospital, National Clinical Research Center for Cancer, Tianjin Clinical Research Center for Cancer, Key Laboratory of Cancer Prevention and Therapy, Tianjin, 300060, China. [email protected].
  • College of Information and Intelligence, Hunan Agricultural University, Changsha, 410128, China.

Abstract

The treatment of hypopharyngeal cancer poses complex challenges, and accurate prediction of chemotherapy sensitivity is crucial for personalized treatment. In this study, a deep-learning-based multimodal fusion network was used to classify the chemotherapy sensitivity of hypopharyngeal cancer, improving prediction accuracy by integrating 3D CT images with radiomic features. Preprocessed and augmented 3D CT images were analyzed by a 3D ResNet branch to extract spatial features, while radiomic features screened by LASSO regression were processed by a three-layer fully connected branch handling the tabular data. The extracted feature vectors were fused through fully connected layers, exploiting the complementary strengths of the two modalities to capture both complex spatial dependencies and fine-grained radiomic detail. Experiments on the manually segmented NKU-TMU-hphc dataset (102 hypopharyngeal cancer CT images) showed that the multimodal fusion network achieved high accuracy and outperformed single-modality methods and other models on multiple evaluation metrics. Statistical analysis was performed on the extracted features and clinical characteristics. The model effectively integrates imaging and clinical data, offering a new method for chemotherapy sensitivity classification that is expected to advance personalized medicine.
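The two-branch late-fusion design described in the abstract can be sketched in PyTorch. This is a minimal illustration, not the authors' implementation: the lightweight 3D CNN stands in for their 3D ResNet, and all layer widths, the number of LASSO-selected radiomic features, and the binary output are assumptions.

```python
import torch
import torch.nn as nn


class FusionNet(nn.Module):
    """Two-branch late-fusion classifier: a 3D CNN for CT volumes and an
    MLP for LASSO-selected radiomic features (all sizes are illustrative)."""

    def __init__(self, n_radiomic: int = 20, n_classes: int = 2):
        super().__init__()
        # Image branch: a small 3D CNN stands in for the paper's 3D ResNet.
        self.image_branch = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1),
            nn.BatchNorm3d(16), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.BatchNorm3d(32), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),   # -> (B, 32)
        )
        # Tabular branch: three fully connected layers, as in the abstract.
        self.radiomic_branch = nn.Sequential(
            nn.Linear(n_radiomic, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 16), nn.ReLU(),            # -> (B, 16)
        )
        # Fusion head: concatenate both feature vectors, then classify.
        self.classifier = nn.Sequential(
            nn.Linear(32 + 16, 32), nn.ReLU(),
            nn.Linear(32, n_classes),
        )

    def forward(self, volume: torch.Tensor, radiomics: torch.Tensor) -> torch.Tensor:
        fused = torch.cat(
            [self.image_branch(volume), self.radiomic_branch(radiomics)], dim=1
        )
        return self.classifier(fused)


model = FusionNet()
# Dummy batch: 2 patients, single-channel 32x64x64 CT volumes + 20 radiomic features.
logits = model(torch.randn(2, 1, 32, 64, 64), torch.randn(2, 20))
print(tuple(logits.shape))
```

Fusing at the feature-vector level (late fusion) lets each branch stay specialized for its modality; only the joint classifier has to learn cross-modal interactions.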

Topics

Journal Article
