
Convolutional-transformer fusion of ultrasound and diffuse optical tomography for breast lesion classification.

April 15, 2026

Authors

Xue M, Bennett D, Hagemann IS, Mannix J, Wiele K, Tang A, Hossain MI, Poplack SP, Zhu Q

Affiliations (6)

  • Biomedical Engineering Department, Washington University in St. Louis, MO, 63130, USA.
  • Department of Radiology, School of Medicine, Washington University in St. Louis, MO, 63110, USA.
  • Department of Pathology and Immunology, School of Medicine, Washington University in St. Louis, MO, 63110, USA.
  • Biomedical Engineering Department, Washington University in St. Louis, MO, 63130, USA; Imaging Science Program, Washington University in St. Louis, MO, 63130, USA.
  • Department of Radiology, Stanford University School of Medicine, 300 Pasteur Drive, Stanford, CA 94305, USA.
  • Biomedical Engineering Department, Washington University in St. Louis, MO, 63130, USA; Department of Radiology, School of Medicine, Washington University in St. Louis, MO, 63110, USA; Imaging Science Program, Washington University in St. Louis, MO, 63130, USA. Electronic address: [email protected].

Abstract

Accurate breast cancer diagnosis remains a significant challenge in clinical practice. This study presents a deep learning fusion framework that integrates high-resolution ultrasound (US) images with low-resolution diffuse optical tomography (DOT) total hemoglobin (HbT) images to improve diagnostic performance and reduce unnecessary benign biopsies. A cohort of 287 patients who underwent US and US-guided DOT imaging before biopsy of suspicious breast lesions was analyzed. A novel Convolutional-Transformer-Translator (CTT) fusion model was developed to automatically combine complementary US and DOT images for lesion classification. Standalone US and DOT models achieved mean areas under the ROC curve (AUCs) of 0.829 and 0.830, respectively, whereas the CTT model reached an AUC of 0.946, surpassing radiologist assessments using standard Breast Imaging Reporting and Data System (BI-RADS) and DOT-enhanced BI-RADS. At a matched sensitivity of 98%, the CTT model achieved biopsy specificity of 41.22%, compared with 8.59% for standard BI-RADS and 30.28% for DOT-enhanced BI-RADS evaluated by four study radiologists. The proposed CTT model was also compared with ten state-of-the-art classification models and demonstrated superior performance. These findings highlight the potential of the CTT fusion framework to improve breast lesion diagnostic accuracy and reduce benign biopsies in clinical workflows.
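The comparison above fixes sensitivity at 98% and reports the specificity each method achieves at that operating point. As a minimal sketch of how that metric is typically computed from classifier scores (the function name, toy scores, and labels below are illustrative assumptions, not the authors' code):

```python
import math

def specificity_at_sensitivity(scores, labels, target_sensitivity=0.98):
    """Pick the highest score threshold whose sensitivity meets the
    target, then report the specificity at that threshold.
    labels: 1 = malignant (positive), 0 = benign (negative)."""
    pos = sorted((s for s, y in zip(scores, labels) if y == 1), reverse=True)
    neg = [s for s, y in zip(scores, labels) if y == 0]
    # Smallest number of positives that must score at or above threshold.
    k = math.ceil(target_sensitivity * len(pos))
    threshold = pos[k - 1]  # classify as malignant when score >= threshold
    tn = sum(1 for s in neg if s < threshold)  # benign lesions spared biopsy
    return tn / len(neg)

# Toy cohort: 5 malignant and 5 benign lesions with hypothetical scores.
scores = [0.95, 0.90, 0.85, 0.80, 0.60, 0.70, 0.40, 0.30, 0.20, 0.10]
labels = [1,    1,    1,    1,    1,    0,    0,    0,    0,    0]
print(specificity_at_sensitivity(scores, labels, 0.98))  # → 0.8
```

In this framing, a higher specificity at the fixed 98% sensitivity directly translates to more benign lesions avoiding biopsy while keeping nearly all cancers detected.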

Topics

  • Breast Neoplasms
  • Tomography, Optical
  • Ultrasonography, Mammary
  • Image Interpretation, Computer-Assisted
  • Journal Article
