Hybrid adaptive attention deep supervision-guided U-Net for breast lesion segmentation in ultrasound computed tomography images.

Authors

Liu X, Zhou L, Cai M, Zheng H, Zheng S, Wang X, Wang Y, Ding M

Affiliations (6)

  • Department of Bio-Medical Engineering, School of Life Science and Technology, Advanced Biomedical Imaging Facility, Huazhong University of Science and Technology, Wuhan, 430074, Hubei, China.
  • WeSee Medical Imaging Inc, Wuhan, 430070, Hubei, China.
  • Breast Cancer Center, Hubei Cancer Hospital, Tongji Medical College, Huazhong University of Science and Technology, National Key Clinical Specialty Discipline Construction Program, Hubei Provincial Clinical Research Center for Breast Cancer, Wuhan Clinical Research Center for Breast Cancer, Wuhan, 430079, Hubei, China.
  • Department of Ultrasound, Hubei Cancer Hospital, Wuhan, 430079, Hubei, China.
  • Department of Ultrasound, People's Hospital of Macheng City, Macheng City, 438399, Hubei, China.
  • Department of Bio-Medical Engineering, School of Life Science and Technology, Advanced Biomedical Imaging Facility, Huazhong University of Science and Technology, Wuhan, 430074, Hubei, China. [email protected].

Abstract

Breast cancer is the second deadliest cancer among women after lung cancer. Although the breast cancer death rate has continued to decline over the past 20 years, death rates for stage III and stage IV breast cancer remain high. An automated breast cancer diagnosis system is therefore of great significance for the early screening of breast lesions and for improving patient survival. This paper proposes a deep learning-based network, the hybrid adaptive attention deep supervision-guided U-Net (HAA-DSUNet), for breast lesion segmentation in breast ultrasound computed tomography (BUCT) images. The network replaces the conventional convolution modules in the sampling paths of U-Net with the hybrid adaptive attention module (HAAM), aiming to enlarge the receptive field and capture rich global features while preserving fine details. Moreover, we apply a contrast loss to intermediate outputs as deep supervision to minimize the information loss during upsampling. Finally, the segmentation predictions are post-processed by filtering, segmentation, and morphological operations to obtain the final results. Experiments on our two UCT image datasets, HCH and HCH-PHMC, achieved a highest Dice score of 0.8729 and a highest IoU of 0.8097, outperforming all other state-of-the-art methods compared. These results demonstrate that our algorithm is effective in segmenting lesions from BUCT images.
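To make the deep-supervision idea in the abstract concrete, the following is a minimal PyTorch sketch of supervising intermediate decoder outputs alongside the final prediction. The HAAM internals and the exact form of the paper's contrast loss are not given in the abstract, so a standard Dice + BCE loss and generic 1x1 conv heads are used here purely as hypothetical placeholders, not as the authors' implementation.

```python
# Minimal sketch of deep supervision on intermediate U-Net decoder outputs.
# The loss below (Dice + BCE) is a placeholder; the paper uses a contrast loss
# whose definition is not available from the abstract alone.
import torch
import torch.nn as nn
import torch.nn.functional as F

def dice_bce_loss(logits, target, eps=1e-6):
    """Placeholder supervision loss combining BCE and soft Dice."""
    bce = F.binary_cross_entropy_with_logits(logits, target)
    probs = torch.sigmoid(logits)
    inter = (probs * target).sum(dim=(2, 3))
    union = probs.sum(dim=(2, 3)) + target.sum(dim=(2, 3))
    dice = 1.0 - (2.0 * inter + eps) / (union + eps)
    return bce + dice.mean()

class DeepSupervisionHeads(nn.Module):
    """1x1 conv heads that map intermediate decoder feature maps to auxiliary
    segmentation logits, upsampled to the label resolution."""
    def __init__(self, channels_per_stage):
        super().__init__()
        self.heads = nn.ModuleList(
            nn.Conv2d(c, 1, kernel_size=1) for c in channels_per_stage
        )

    def forward(self, decoder_feats, out_size):
        return [
            F.interpolate(head(feat), size=out_size,
                          mode="bilinear", align_corners=False)
            for head, feat in zip(self.heads, decoder_feats)
        ]

def total_loss(final_logits, aux_logits_list, target, aux_weight=0.4):
    """Main loss on the final prediction plus weighted losses on each
    deeply supervised intermediate output."""
    loss = dice_bce_loss(final_logits, target)
    for aux in aux_logits_list:
        loss = loss + aux_weight * dice_bce_loss(aux, target)
    return loss
```

In use, the decoder's intermediate feature maps would be passed through `DeepSupervisionHeads` and fed to `total_loss` together with the final logits; the auxiliary weight (0.4 here) is an assumed value, not one reported by the authors.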

Topics

Journal Article
