
Improving Breast Cancer Diagnosis in Ultrasound Images Using Deep Learning with Feature Fusion and Attention Mechanism.

May 27, 2025

Authors

Asif S, Yan Y, Feng B, Wang M, Zheng Y, Jiang T, Fu R, Yao J, Lv L, Song M, Sui L, Yin Z, Wang VY, Xu D

Affiliations (9)

  • Taizhou Key Laboratory of Minimally Invasive Interventional Therapy & Artificial Intelligence, Taizhou Campus of Zhejiang Cancer Hospital (Taizhou Cancer Hospital), Taizhou, Zhejiang 317502, China (S.A., Y.Y., B.F., L.S., D.X.); Center of Intelligent Diagnosis and Therapy (Taizhou), Hangzhou Institute of Medicine (HIM), Chinese Academy of Sciences, Taizhou, Zhejiang 317502, China (S.A., Y.Y., B.F., L.S., V.Y.W., D.X.); Wenling Institute of Big Data and Artificial Intelligence in Medicine, Taizhou, Zhejiang 317502, China (S.A., Y.Y., B.F., L.S., V.Y.W., D.X.); Department of Diagnostic Ultrasound Imaging & Interventional Therapy, Zhejiang Cancer Hospital, Hangzhou Institute of Medicine (HIM), Chinese Academy of Sciences, Hangzhou, Zhejiang 310022, China (S.A., Y.Y., B.F., T.J., J.Y., L.L., M.S., M.S., L.S., V.Y.W., D.X.).
  • Taizhou Key Laboratory of Minimally Invasive Interventional Therapy & Artificial Intelligence, Taizhou Campus of Zhejiang Cancer Hospital (Taizhou Cancer Hospital), Taizhou, Zhejiang 317502, China (S.A., Y.Y., B.F., L.S., D.X.); Center of Intelligent Diagnosis and Therapy (Taizhou), Hangzhou Institute of Medicine (HIM), Chinese Academy of Sciences, Taizhou, Zhejiang 317502, China (S.A., Y.Y., B.F., L.S., V.Y.W., D.X.); Wenling Institute of Big Data and Artificial Intelligence in Medicine, Taizhou, Zhejiang 317502, China (S.A., Y.Y., B.F., L.S., V.Y.W., D.X.); Department of Diagnostic Ultrasound Imaging & Interventional Therapy, Zhejiang Cancer Hospital, Hangzhou Institute of Medicine (HIM), Chinese Academy of Sciences, Hangzhou, Zhejiang 310022, China (S.A., Y.Y., B.F., T.J., J.Y., L.L., M.S., M.S., L.S., V.Y.W., D.X.); Zhejiang Provincial Research Center for Cancer Intelligent Diagnosis and Molecular Technology, Hangzhou 310022, China (Y.Y., T.J., J.Y., L.S., D.X.).
  • Second Clinical College, Zhejiang University of Traditional Chinese Medicine, Hangzhou, China (M.W., Y.Z., Z.Y.).
  • Department of Diagnostic Ultrasound Imaging & Interventional Therapy, Zhejiang Cancer Hospital, Hangzhou Institute of Medicine (HIM), Chinese Academy of Sciences, Hangzhou, Zhejiang 310022, China (S.A., Y.Y., B.F., T.J., J.Y., L.L., M.S., M.S., L.S., V.Y.W., D.X.); Zhejiang Provincial Research Center for Cancer Intelligent Diagnosis and Molecular Technology, Hangzhou 310022, China (Y.Y., T.J., J.Y., L.S., D.X.).
  • Hangzhou Fuyang Hospital of Orthopedics of Traditional Chinese Medicine Special Inspection Department, Zhejiang, China (R.F.).
  • Department of Diagnostic Ultrasound Imaging & Interventional Therapy, Zhejiang Cancer Hospital, Hangzhou Institute of Medicine (HIM), Chinese Academy of Sciences, Hangzhou, Zhejiang 310022, China (S.A., Y.Y., B.F., T.J., J.Y., L.L., M.S., M.S., L.S., V.Y.W., D.X.); Zhejiang Provincial Research Center for Cancer Intelligent Diagnosis and Molecular Technology, Hangzhou 310022, China (Y.Y., T.J., J.Y., L.S., D.X.); Key Laboratory of Head & Neck Cancer Translational Research of Zhejiang Province, Hangzhou 310022, China (J.Y.).
  • Department of Diagnostic Ultrasound Imaging & Interventional Therapy, Zhejiang Cancer Hospital, Hangzhou Institute of Medicine (HIM), Chinese Academy of Sciences, Hangzhou, Zhejiang 310022, China (S.A., Y.Y., B.F., T.J., J.Y., L.L., M.S., M.S., L.S., V.Y.W., D.X.).
  • Center of Intelligent Diagnosis and Therapy (Taizhou), Hangzhou Institute of Medicine (HIM), Chinese Academy of Sciences, Taizhou, Zhejiang 317502, China (S.A., Y.Y., B.F., L.S., V.Y.W., D.X.); Wenling Institute of Big Data and Artificial Intelligence in Medicine, Taizhou, Zhejiang 317502, China (S.A., Y.Y., B.F., L.S., V.Y.W., D.X.); Department of Diagnostic Ultrasound Imaging & Interventional Therapy, Zhejiang Cancer Hospital, Hangzhou Institute of Medicine (HIM), Chinese Academy of Sciences, Hangzhou, Zhejiang 310022, China (S.A., Y.Y., B.F., T.J., J.Y., L.L., M.S., M.S., L.S., V.Y.W., D.X.).
  • Taizhou Key Laboratory of Minimally Invasive Interventional Therapy & Artificial Intelligence, Taizhou Campus of Zhejiang Cancer Hospital (Taizhou Cancer Hospital), Taizhou, Zhejiang 317502, China (S.A., Y.Y., B.F., L.S., D.X.); Center of Intelligent Diagnosis and Therapy (Taizhou), Hangzhou Institute of Medicine (HIM), Chinese Academy of Sciences, Taizhou, Zhejiang 317502, China (S.A., Y.Y., B.F., L.S., V.Y.W., D.X.); Wenling Institute of Big Data and Artificial Intelligence in Medicine, Taizhou, Zhejiang 317502, China (S.A., Y.Y., B.F., L.S., V.Y.W., D.X.); Department of Diagnostic Ultrasound Imaging & Interventional Therapy, Zhejiang Cancer Hospital, Hangzhou Institute of Medicine (HIM), Chinese Academy of Sciences, Hangzhou, Zhejiang 310022, China (S.A., Y.Y., B.F., T.J., J.Y., L.L., M.S., M.S., L.S., V.Y.W., D.X.); Zhejiang Provincial Research Center for Cancer Intelligent Diagnosis and Molecular Technology, Hangzhou 310022, China (Y.Y., T.J., J.Y., L.S., D.X.). Electronic address: [email protected].

Abstract

Early detection of malignant lesions in ultrasound images is crucial for effective cancer diagnosis and treatment. While traditional methods rely on radiologists, deep learning models can improve accuracy, reduce errors, and enhance efficiency. This study explores the application of a deep learning model for classifying benign and malignant lesions, focusing on its performance and interpretability. We proposed a feature fusion-based deep learning model for classifying benign and malignant lesions in ultrasound images. The model leverages advanced architectures such as MobileNetV2 and DenseNet121, enhanced with feature fusion and attention mechanisms to boost classification accuracy. The clinical dataset comprises 2171 images collected from 1758 patients between December 2020 and May 2024. Additionally, we utilized the publicly available BUSI dataset, consisting of 780 images from female patients aged 25 to 75, collected in 2018. To enhance interpretability, we applied Grad-CAM, Saliency Maps, and Shapley additive explanations (SHAP) to explain the model's decision-making. A comparative analysis with radiologists of varying expertise levels was also conducted. The proposed model exhibited the highest performance, achieving an area under the curve (AUC) of 0.9320 on our private dataset and an AUC of 0.9834 on the public dataset, significantly outperforming traditional deep convolutional neural network models. It also exceeded the diagnostic performance of radiologists, showcasing its potential as a reliable tool for medical image classification. The model's success can be attributed to its incorporation of advanced architectures, feature fusion, and attention mechanisms. The model's decision-making process was further clarified using interpretability techniques such as Grad-CAM, Saliency Maps, and SHAP, offering insights into its ability to focus on relevant image features for accurate classification. The proposed deep learning model offers superior accuracy in classifying benign and malignant lesions in ultrasound images, outperforming traditional models and radiologists. Its strong performance, coupled with interpretability techniques, demonstrates its potential as a reliable and efficient tool for medical diagnostics. The datasets generated and analyzed during the current study are not publicly available due to the nature of this research and its participants, but may be available from the corresponding author on reasonable request.
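
To make the described architecture concrete, the sketch below shows one plausible way to fuse pooled MobileNetV2 and DenseNet121 features with a channel-attention block for benign-versus-malignant classification. This is a minimal illustration, not the authors' released code: the framework (PyTorch/torchvision), the squeeze-and-excitation style attention, and the concatenation-based fusion are assumptions, since the abstract does not specify these details.

```python
# Hypothetical sketch of feature fusion + attention for binary lesion classification.
# Framework, attention design, and layer dimensions are assumptions for illustration.
import torch
import torch.nn as nn
from torchvision import models


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style attention over the fused feature vector."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels) pooled features; reweight each channel
        return x * self.fc(x)


class FusionClassifier(nn.Module):
    """Concatenates pooled MobileNetV2 and DenseNet121 features,
    applies channel attention, and predicts benign vs. malignant."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.mobilenet = models.mobilenet_v2(weights="IMAGENET1K_V1").features
        self.densenet = models.densenet121(weights="IMAGENET1K_V1").features
        self.pool = nn.AdaptiveAvgPool2d(1)
        fused_dim = 1280 + 1024  # MobileNetV2 + DenseNet121 feature widths
        self.attention = ChannelAttention(fused_dim)
        self.classifier = nn.Linear(fused_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        f1 = self.pool(self.mobilenet(x)).flatten(1)  # (B, 1280)
        f2 = self.pool(self.densenet(x)).flatten(1)   # (B, 1024)
        fused = torch.cat([f1, f2], dim=1)            # feature fusion
        return self.classifier(self.attention(fused))


model = FusionClassifier()
logits = model(torch.randn(1, 3, 224, 224))  # one dummy ultrasound frame
```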
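
For the interpretability side, the following is a minimal Grad-CAM sketch applied to the fused model above, hooked on the DenseNet121 branch. The target layer, class index, and normalization are illustrative assumptions rather than the paper's exact pipeline; Saliency Maps and SHAP would be applied analogously.

```python
# Hypothetical Grad-CAM sketch: highlights image regions that drive a chosen logit.
import torch
import torch.nn.functional as F


def grad_cam(model, image, target_class, layer):
    feats, grads = [], []
    h1 = layer.register_forward_hook(lambda m, i, o: feats.append(o))
    h2 = layer.register_full_backward_hook(lambda m, gi, go: grads.append(go[0]))
    logits = model(image)
    model.zero_grad()
    logits[0, target_class].backward()
    h1.remove(); h2.remove()
    act, grad = feats[0], grads[0]                 # (1, C, H, W)
    weights = grad.mean(dim=(2, 3), keepdim=True)  # channel importance
    cam = F.relu((weights * act).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=image.shape[2:], mode="bilinear",
                        align_corners=False)
    return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)


# Example: heatmap for the "malignant" logit (class index 1 assumed).
heatmap = grad_cam(model, torch.randn(1, 3, 224, 224), target_class=1,
                   layer=model.densenet)
```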

Topics

Journal Article
