
A Knowledge-Guided Multi-modal Neural Network for Breast Cancer Molecular Subtyping.

December 11, 2025

Authors

Ye J, Liu Y, Ren S, Wang C, Zhou Y, Yang L, Zhang W

Abstract

Precise determination of HER2 subtype is essential for selecting appropriate targeted therapies in breast cancer. However, current HER2 assessment methods remain dependent on invasive tissue biopsies, which are limited by tumor heterogeneity and sampling bias. To address these challenges, this paper proposes a knowledge-guided multi-modal neural network (KMNet) for non-invasive HER2 subtyping by integrating clinical data and ultrasound images. KMNet introduces a Graph-based Clinical Feature encoder (GCF), which constructs a causal graph among clinical indicators based on medical knowledge and extracts high-order feature relationships via a Graph Convolutional Network (GCN). Meanwhile, a Convolutional Neural Network (CNN) and Vision Transformer (ViT)-based hybrid image encoder (CVUIF) captures both local details (calcifications and blood flow) and global dependencies between intra- and peritumoral regions. In addition, a Reduced Dimensional Fusion (RDF) module integrates key information from clinical graph features, ultrasound image features, and structured clinical data to construct a unified multi-modal representation for the downstream HER2 subtyping task. Experiments were conducted on a private dataset (HER2USC) and public datasets (BCW, BCa, and SIIM-ISIC). Experimental results demonstrate that KMNet outperformed other reported state-of-the-art multi-modal algorithms on the HER2 subtyping task, offering strong potential for clinical decision support in breast cancer treatment.
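To make the pipeline described in the abstract concrete, here is a minimal NumPy sketch of the overall data flow: clinical indicators as nodes of a knowledge-based graph propagated through one GCN-style layer, a stand-in vector for the CNN/ViT image encoding, and a concatenate-then-project fusion in the spirit of the RDF module. All node choices, edge structure, dimensions, and weights below are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_clinical = 5                       # e.g. age, ER, PR, Ki-67, tumor size (assumed)
A = np.array([[0, 1, 0, 0, 1],       # assumed knowledge-based causal edges
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)

# Symmetrically normalized adjacency with self-loops (standard GCN propagation)
A_hat = A + np.eye(n_clinical)
d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

X = rng.normal(size=(n_clinical, 8))         # per-indicator embeddings
W = rng.normal(size=(8, 16))
graph_feat = np.maximum(A_norm @ X @ W, 0)   # one GCN layer with ReLU

image_feat = rng.normal(size=(32,))          # stand-in for the CNN/ViT image encoding
clinical_vec = graph_feat.mean(axis=0)       # pool node features to one vector

# RDF-style fusion: concatenate modalities, then project to a reduced dimension
fused = np.concatenate([clinical_vec, image_feat])
W_fuse = rng.normal(size=(fused.size, 24))
z = fused @ W_fuse                           # unified multi-modal representation
print(z.shape)
```

A classifier head on `z` would then predict the HER2 subtype; the sketch stops at the fused representation since the paper's head architecture is not specified in the abstract.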

Topics

Journal Article
