
Identification and Localization of Breast Tumor Components via a Convolutional Neural Network Based on High-Frequency Ultrasound Combined With Histopathologic Registration: Prospective Study.

January 23, 2026

Authors

Yao JQ, Zhou WW, Chai ZF, Ren F, Huang TY, Zhen TT, Shi HJ, Xie XY, Zhao Z, Xu M

Affiliations (4)

  • Department of Medical Ultrasonics, The First Affiliated Hospital, Sun Yat-sen University, 58 Zhongshan 2nd Road, Guangzhou, 510080, China, +86-020-8776 518.
  • Department of Medical Ultrasonics, Suzhou Municipal Hospital Affiliated with Nanjing Medical University, Suzhou, China.
  • Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China.
  • Department of Pathology, The First Affiliated Hospital, Sun Yat-sen University, Guangzhou, China.

Abstract

Given the highly heterogeneous biology of breast cancer, a more effective noninvasive diagnostic tool that unravels microscopic histopathology patterns is urgently needed. This study aimed to identify cancerous regions in ultrasound images of breast cancer via a convolutional neural network based on registered grayscale ultrasound images and readily accessible biopsy whole slide images (WSIs).

This single-center study consecutively and prospectively included participants who underwent ultrasound-guided core needle biopsy for Breast Imaging Reporting and Data System category 4 or 5 breast lesions between July 2022 and February 2023 and in whom breast cancer was pathologically confirmed. Basic information, ultrasound image data, biopsy tissue specimens, and the corresponding WSIs were collected. After the core needle biopsy procedures, the stained breast tissue specimens were sliced and coregistered with an ultrasound image of the needle tract. Convolutional neural network models for identifying breast cancer cells in ultrasound images were developed using the FCN-101 and DeepLabV3 networks. Image-level predictive performance was evaluated and compared quantitatively by pixel accuracy, Dice similarity coefficient, and recall; pixel-level classification was illustrated through confusion matrices. The cancerous regions in the test dataset were further visualized in ultrasound images, and potential clinical applications were qualitatively assessed by comparing the automatic segmentation results with the actual pathological tissue distributions.

A total of 105 participants with 386 ultrasound images of breast cancer were included, with 270 (70%), 78 (20.2%), and 38 (9.8%) images in the training, validation, and test datasets, respectively. Both models performed well in predicting cancerous regions in the biopsy area, but the FCN-101 model was superior to the DeepLabV3 model in pixel accuracy (86.91% vs 69.55%; P=.002) and Dice similarity coefficient (77.47% vs 69.90%; P<.001). The two models yielded recall values of 54.64% and 58.46%, with no significant difference between them (P=.80). Furthermore, the FCN-101 model had an advantage in predicting cancerous regions, while the DeepLabV3 model classified pixels in normal tissue more accurately (both P<.05). Visualization of cancerous regions on grayscale ultrasound images showed high consistency with those identified on WSIs.

A technique for spatially registering breast WSIs with ultrasound images of the needle tract was established. Breast cancer regions were accurately identified and localized at the pixel level in high-frequency ultrasound images via an advanced convolutional neural network, with histopathologic WSIs as the reference standard.
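The evaluation metrics reported above (pixel accuracy, Dice similarity coefficient, and recall) can all be derived from the same pixel-level confusion-matrix counts over a pair of binary segmentation masks. The sketch below is purely illustrative and is not the authors' code; the function name and the toy masks are invented for the example.

```python
import numpy as np

def segmentation_metrics(pred: np.ndarray, truth: np.ndarray) -> dict:
    """Pixel accuracy, Dice similarity coefficient, and recall for
    binary segmentation masks (1 = cancerous region, 0 = normal tissue)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()       # cancerous, predicted cancerous
    tn = np.logical_and(~pred, ~truth).sum()     # normal, predicted normal
    fp = np.logical_and(pred, ~truth).sum()      # normal, predicted cancerous
    fn = np.logical_and(~pred, truth).sum()      # cancerous, predicted normal
    return {
        "pixel_accuracy": (tp + tn) / (tp + tn + fp + fn),
        "dice": 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 1.0,
        "recall": tp / (tp + fn) if (tp + fn) else 1.0,
    }

# Toy 2x3 masks, not real data.
pred = np.array([[1, 1, 0], [0, 1, 0]])
truth = np.array([[1, 0, 0], [0, 1, 1]])
m = segmentation_metrics(pred, truth)
# Here tp=2, tn=2, fp=1, fn=1, so all three metrics equal 2/3.
```

The same four counts (tp, tn, fp, fn) are exactly the cells of the pixel-level confusion matrix that the study uses to contrast the two models' behavior on cancerous versus normal tissue.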

Topics

  • Breast Neoplasms
  • Neural Networks, Computer
  • Ultrasonography, Mammary
  • Journal Article
