Real-time deep learning for tumor segmentation and tool tracking: development and validation of an AI navigation system in vacuum-assisted breast biopsy.

November 13, 2025

Authors

Shao X, Shen Y, Sun P, Sun Y, Ju X, Zhu H, Li H, Li Q, Ting R, Liu J, Wang Y, Guo Q, Ma Y, Fei X, Sun H, Cui J

Affiliations (10)

  • Department of Thyroid and Breast Surgery, People's Hospital of China Medical University (People's Hospital of Liaoning Province), Shenyang, China.
  • Dalian Medical University, Dalian, China.
  • Department of Cardiology, People's Hospital of China Medical University (People's Hospital of Liaoning Province), Shenyang, China.
  • Department of General Medicine, People's Hospital of China Medical University (People's Hospital of Liaoning Province), Shenyang, China.
  • The College of Medicine and Biological Information Engineering, Northeastern University, Shenyang, China.
  • Liaoning University of Traditional Chinese Medicine, Shenyang, China.
  • The Graduate School of Artificial Intelligence, China Medical University, Shenyang, China.
  • Department of Thyroid and Breast Surgery, People's Hospital of China Medical University (People's Hospital of Liaoning Province), Shenyang, China. [email protected].
  • The School of Information Science and Engineering, Shenyang Ligong University, Shenyang, China. [email protected].
  • Department of Thyroid and Breast Surgery, People's Hospital of China Medical University (People's Hospital of Liaoning Province), Shenyang, China. [email protected].

Abstract

Vacuum-assisted breast biopsy (VABB) is a widely adopted minimally invasive technique for the diagnosis and treatment of breast lesions. However, the procedure relies heavily on real-time ultrasound guidance, posing significant challenges for junior surgeons who lack the radiological experience to accurately localize both the lesion and the biopsy needle. Currently, there is no dedicated real-time artificial intelligence (AI) navigation system specifically designed for VABB procedures. We developed a novel two-stage real-time AI navigation system based on the YOLOv11 deep learning architecture. Model 1 performs initial localization of the tumor and the cutter slot, while Model 2 focuses on real-time tracking of the knife and tumor during resection. The system was trained and validated using 22,278 annotated ultrasound images from 167 VABB procedures conducted at People's Hospital of China Medical University. A rigorous three-fold cross-validation was implemented to assess model performance, and localization accuracy was compared with that of junior surgeons. Additionally, we evaluated the system's real-time processing performance on both GPU and CPU platforms. The AI system demonstrated superior performance across all evaluated metrics. For Model 1, the mean Average Precision at an IoU threshold of 0.5 (mAP50) for tumor detection and cutter-slot (groove) localization reached 0.907 and 0.671, respectively, significantly outperforming junior surgeons (0.551 and 0.120). For Model 2, the mAP50 values for tumor and needle-tip tracking were 0.829 and 0.765, respectively, compared with 0.758 and 0.350 achieved by surgeons. The system achieved a real-time processing speed of 1.2 ms per frame on GPU and 32.6 ms per frame on CPU. This study presents the first dedicated AI-based navigation system for VABB, showing substantial improvement in localization accuracy over manual operation by junior surgeons. The system's robust detection capability and real-time performance highlight its strong potential for clinical application, especially in surgical training and complex cases requiring precise instrument control. Clinical Trial Registration No. 2022JH2/101300026.
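As an illustration of how a two-stage, frame-by-frame detection pipeline of this kind could be assembled, the sketch below uses the open-source Ultralytics YOLO API, the model family the system is built on. It is a minimal sketch under stated assumptions, not the authors' implementation: the weight file names (model1_tumor_slot.pt, model2_tumor_needle.pt), the confidence and IoU thresholds, the video source, and the hand-off logic between the two stages are hypothetical placeholders.

```python
# Hypothetical two-stage real-time detection loop (illustrative only).
# Weight files, thresholds, and stage-switching logic are assumptions,
# not taken from the paper.
import time
import cv2
from ultralytics import YOLO

# Stage 1: initial localization of the tumor and cutter slot (hypothetical weights).
stage1 = YOLO("model1_tumor_slot.pt")
# Stage 2: tracking of the tumor and needle tip during resection (hypothetical weights).
stage2 = YOLO("model2_tumor_needle.pt")

cap = cv2.VideoCapture(0)  # placeholder for the live ultrasound video feed
resection_started = False  # in a real system this would be toggled when cutting begins

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    t0 = time.perf_counter()
    model = stage2 if resection_started else stage1
    # conf/iou values are illustrative, not the paper's settings
    results = model.predict(frame, conf=0.25, iou=0.5, verbose=False)
    latency_ms = (time.perf_counter() - t0) * 1000.0

    # Draw every detected box on the current frame
    for box in results[0].boxes:
        x1, y1, x2, y2 = map(int, box.xyxy[0])
        cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)

    # Overlay per-frame latency, analogous to the ms-per-frame figures reported
    cv2.putText(frame, f"{latency_ms:.1f} ms/frame", (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 255, 255), 2)
    cv2.imshow("VABB navigation (sketch)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

Timing the predict call per frame mirrors the millisecond-per-frame figures in the abstract; on a CPU-only workstation the same loop would simply run at a lower frame rate rather than requiring a different code path.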

Topics

Deep Learning, Breast Neoplasms, Image-Guided Biopsy, Breast, Journal Article, Validation Study
