BreAST-U²Net: A Twin-Stream U²Net with Attention-based Tumor Fusion for 2-D Tumor Segmentation in Automated Breast Ultrasound.

November 1, 2025

Authors

Haqkiem N, Mohd Faizal Abdullah ER, Saw SN, Tan LK, Wong JHD, Rahmat K, Ian PTW, Mumin NA

Affiliations (5)

  • Department of Artificial Intelligence, Faculty of Computer Science & Information Technology, Universiti Malaya, Kuala Lumpur, Malaysia. Electronic address: [email protected].
  • Department of Artificial Intelligence, Faculty of Computer Science & Information Technology, Universiti Malaya, Kuala Lumpur, Malaysia. Electronic address: [email protected].
  • Department of Artificial Intelligence, Faculty of Computer Science & Information Technology, Universiti Malaya, Kuala Lumpur, Malaysia.
  • Department of Biomedical Imaging, Faculty of Medicine, Universiti Malaya, Kuala Lumpur, Malaysia; Universiti Malaya Research Imaging Centre (UMRIC), Universiti Malaya, Kuala Lumpur, Malaysia.
  • Department of Biomedical Imaging, Faculty of Medicine, Universiti Malaya, Kuala Lumpur, Malaysia; Universiti Malaya Research Imaging Centre (UMRIC), Universiti Malaya, Kuala Lumpur, Malaysia; Department of Radiology, Faculty of Medicine, Universiti Teknologi MARA, Selangor, Malaysia.

Abstract

Automated breast ultrasound (ABUS) shows potential for breast cancer diagnosis but faces tumor segmentation challenges due to limited annotated data and the tendency of deep convolutional neural networks to introduce irrelevant features, hindering performance. In this paper, we introduce BreAST-U²Net, a collaborative twin-stream neural network featuring an attention-based fusion mechanism. This hybrid architecture enables the model to collaboratively enrich tumor features from neighboring 2-D axial slices within a single 3-D ABUS volume, allowing for minimal short-term spatial continuity between ABUS slices while maintaining competitive and robust performance. BreAST-U²Net combines a twin-stream U²Net encoder with an efficient channel attention-based fusion module and a shared decoder tailored for breast tumor segmentation in 3-D ABUS. We evaluated its performance using the Dice score, as demonstrated in ablation studies comparing BreAST-U²Net with its single-stream and twin-stream variants. Additionally, the model was benchmarked against state-of-the-art methods, including DSGMFFN and DEMAC-Net. BreAST-U²Net achieved strong performance in cases with acceptable segmentation quality (Dice ≥ 0.1), reaching 66.4% on the Tumor Detection, Segmentation, and Classification (TDSC) dataset and 65.2% on the Universiti Malaya Medical Center (UMMC) dataset. It consistently ranked among the top models under certain shadow thresholds, demonstrating robustness to acoustic artifacts and domain shifts. However, certain levels of tumor complexity may still require further tuning, which helps to explain why overall Dice scores on the full test sets remained more modest: 38.1% for TDSC and 23.8% for UMMC. BreAST-U²Net demonstrates domain-robust segmentation performance across both source and unseen ABUS datasets, achieving Dice scores of up to 66.4% on cases with acceptable prediction quality and showing resilience to shadow artifacts through its dual-stream encoding and efficient channel attention-based feature fusion.
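
The fusion step described in the abstract can be pictured with a short PyTorch sketch: two encoder streams process neighboring 2-D axial slices, their feature maps are combined, and efficient channel attention (ECA) re-weights the fused channels before a shared decoder would consume them. The module names, channel sizes, placeholder encoders, and the additive fusion rule below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of ECA-based fusion of twin encoder streams (assumed design).
import torch
import torch.nn as nn


class ECAFusion(nn.Module):
    """Fuse two same-shaped feature maps with efficient channel attention."""

    def __init__(self, channels: int, k_size: int = 3):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)                      # global context per channel
        self.conv = nn.Conv1d(1, 1, kernel_size=k_size,
                              padding=k_size // 2, bias=False)   # 1-D conv across channels
        self.sigmoid = nn.Sigmoid()

    def forward(self, f_main: torch.Tensor, f_neighbor: torch.Tensor) -> torch.Tensor:
        fused = f_main + f_neighbor                               # assumed additive fusion
        y = self.pool(fused)                                      # (B, C, 1, 1)
        y = self.conv(y.squeeze(-1).transpose(1, 2))              # (B, 1, C): attention over channels
        y = self.sigmoid(y.transpose(1, 2).unsqueeze(-1))         # (B, C, 1, 1) channel weights
        return fused * y                                          # re-weighted fused features


if __name__ == "__main__":
    # Stand-ins for the two U²Net encoder streams; in the paper these would be
    # full U²Net encoders, and a shared decoder would follow the fusion module.
    encoder_a = nn.Conv2d(1, 64, 3, padding=1)
    encoder_b = nn.Conv2d(1, 64, 3, padding=1)
    slice_t, slice_t1 = torch.randn(2, 1, 1, 128, 128)            # two neighboring axial slices
    fused = ECAFusion(64)(encoder_a(slice_t), encoder_b(slice_t1))
    print(fused.shape)                                            # torch.Size([1, 64, 128, 128])
```

The appeal of ECA here is that it adds channel-wise attention with a single 1-D convolution over pooled channel descriptors, so the fusion module stays lightweight even when applied at multiple encoder scales.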

Topics

Journal Article
