Automated whole-breast ultrasound tumor diagnosis using attention-inception network.

Authors

Zhang J, Huang YS, Wang YW, Xiang H, Lin X, Chang RF

Affiliations (6)

  • Graduate Institute of Biomedical Electronics and Bioinformatics, National Taiwan University, Taipei, Taiwan.
  • Department of Computer Science and Information Engineering, National Changhua University of Education, Changhua, Taiwan.
  • Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan.
  • Department of Ultrasound, Sun Yat-sen University Cancer Center, Guangzhou, China.
  • Department of Ultrasound, Sun Yat-sen University Cancer Center, Guangzhou, China. Electronic address: [email protected].
  • Graduate Institute of Biomedical Electronics and Bioinformatics, National Taiwan University, Taipei, Taiwan; Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan; Graduate Institute of Network and Multimedia, National Taiwan University, Taipei, Taiwan. Electronic address: [email protected].

Abstract

Automated Whole-Breast Ultrasound (ABUS) has become an important tool in breast cancer diagnosis because it provides complete three-dimensional (3D) images of the breast. To reduce the risk of misdiagnosis, computer-aided diagnosis (CADx) systems have been proposed to assist radiologists. Convolutional neural networks (CNNs), renowned for their automatic feature extraction capabilities, have developed rapidly in medical image analysis, and this study proposes a CADx system based on a 3D CNN for ABUS. The study used a private dataset of 396 breast tumor patients collected at Sun Yat-sen University Cancer Center (SYSUCC). First, the tumor volume of interest (VOI) was extracted and resized, and the tumor was enhanced by histogram equalization. Second, a 3D U-Net++ was employed to segment the tumor mask. Finally, the VOI, the enhanced VOI, and the corresponding tumor mask were fed into a 3D Attention-Inception network to classify the tumor as benign or malignant. The experimental results show an accuracy of 89.4%, a sensitivity of 91.2%, a specificity of 87.6%, and an area under the receiver operating characteristic curve (AUC) of 0.9262, suggesting that the proposed CADx system rivals the performance of experienced radiologists in tumor diagnosis. In summary, this study proposes a CADx system for ABUS images consisting of a 3D U-Net++ tumor segmentation model and a 3D Attention-Inception tumor classification model; the results indicate that the system is both effective and efficient in tumor diagnosis tasks.
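The paper does not publish its implementation, but the preprocessing steps named in the abstract (resizing the tumor VOI and enhancing it with histogram equalization before feeding it, alongside the segmentation mask, to the classifier) can be sketched as follows. This is a minimal NumPy-only illustration under stated assumptions: the VOI shape (32×32×32), the 8-bit intensity range, and the nearest-neighbor resize are all hypothetical choices for the demo, not details from the paper, and the 3D U-Net++ and Attention-Inception networks themselves are not shown.

```python
import numpy as np

def resize_nearest(vol, out_shape):
    # Nearest-neighbor resize of a 3D volume (illustrative stand-in for
    # whatever interpolation the authors actually used).
    zi = (np.arange(out_shape[0]) * vol.shape[0] // out_shape[0])
    yi = (np.arange(out_shape[1]) * vol.shape[1] // out_shape[1])
    xi = (np.arange(out_shape[2]) * vol.shape[2] // out_shape[2])
    return vol[np.ix_(zi, yi, xi)]

def hist_equalize(vol, levels=256):
    # Global histogram equalization over the whole VOI: map intensities
    # through the normalized cumulative histogram to spread contrast.
    hist, _ = np.histogram(vol.ravel(), bins=levels, range=(0, levels))
    cdf = hist.cumsum().astype(np.float64)
    cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1) * (levels - 1)
    return cdf[np.clip(vol.astype(int), 0, levels - 1)]

# Demo on a synthetic 8-bit "VOI"; a real pipeline would extract this
# region from the ABUS scan around the tumor.
rng = np.random.default_rng(0)
voi = rng.integers(0, 256, size=(40, 50, 60))
voi = resize_nearest(voi, (32, 32, 32))
enhanced = hist_equalize(voi)
# voi, enhanced, and a binary tumor mask would then form the
# multi-channel input to the 3D classification network.
print(voi.shape, enhanced.shape)
```

In the described system, these two arrays plus the 3D U-Net++ mask would be stacked as input channels for the 3D Attention-Inception classifier.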

Topics

Journal Article
