End-to-end CNN-based deep learning enhances breast lesion characterization using quantitative ultrasound (QUS) spectral parametric images.

Authors

Osapoetra LO, Moslemi A, Moore-Palhares D, Halstead S, Alberico D, Hwang A, Sannachi L, Curpen B, Czarnota GJ

Affiliations (8)

  • Physical Sciences, Sunnybrook Research Institute, Toronto, Canada.
  • Department of Radiation Oncology, Sunnybrook Health Sciences Centre, 2075 Bayview Avenue, Suite T2-167, Toronto, ON, M4N 3M5, Canada.
  • Department of Medical Imaging, University of Toronto, Toronto, ON, Canada.
  • Physical Sciences, Sunnybrook Research Institute, Toronto, Canada. [email protected].
  • Department of Radiation Oncology, Sunnybrook Health Sciences Centre, 2075 Bayview Avenue, Suite T2-167, Toronto, ON, M4N 3M5, Canada. [email protected].
  • Department of Radiation Oncology, University of Toronto, Toronto, Canada. [email protected].
  • Department of Physics, Toronto Metropolitan University, Toronto, Canada. [email protected].
  • Department of Medical Biophysics, University of Toronto, Toronto, Canada. [email protected].

Abstract

QUS spectral parametric imaging offers a fast and accurate method for breast lesion characterization. This study explored the use of deep convolutional neural networks (CNNs) to classify breast lesions from QUS spectral parametric images, aiming to improve on radiomics-based and conventional machine learning approaches. Predictive models were developed using transfer learning with pre-trained CNNs to distinguish malignant from benign lesions. The dataset included 276 participants: 184 malignant cases (median age, 51 years [IQR: 27-81 years]) and 92 benign cases (median age, 46 years [IQR: 18-75 years]). QUS spectral parametric imaging was applied to the ultrasound radiofrequency (RF) data, yielding 1764 parametric images of QUS spectral parameters (mid-band fit [MBF], spectral slope [SS], and spectral intercept [SI]) and QUS scattering parameters (average scatterer diameter [ASD] and average acoustic concentration [AAC]). The data were randomly split into 60% training, 20% validation, and 20% test sets, stratified by lesion subtype, and this split was repeated five times. The number of convolutional blocks was optimized, and the final convolutional layer was fine-tuned. Models tested included ResNet, Inception-v3, Xception, and EfficientNet. Xception-41 achieved a recall of 86 ± 3%, specificity of 87 ± 5%, balanced accuracy of 87 ± 3%, and an AUC of 0.93 ± 0.02 on the test sets. EfficientNetV2-M showed similar performance, with a recall of 91 ± 1%, specificity of 81 ± 7%, balanced accuracy of 86 ± 3%, and an AUC of 0.92 ± 0.02. The CNN models outperformed radiomics and conventional machine learning (p-values < 0.05). This study demonstrated the capability of end-to-end CNN-based models for the accurate characterization of breast masses from QUS spectral parametric images.
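The abstract describes a 60/20/20 train/validation/test split, stratified by lesion subtype and repeated five times. A minimal sketch of such a repeated stratified split, assuming scikit-learn and placeholder labels matching the reported cohort (184 malignant, 92 benign), might look like this; the variable names and seeding scheme are illustrative, not taken from the paper:

```python
# Hypothetical sketch of a 60/20/20 stratified split repeated five times,
# mirroring the protocol described in the abstract.
import numpy as np
from sklearn.model_selection import train_test_split

# Placeholder labels mimicking the cohort: 1 = malignant (184), 0 = benign (92).
labels = np.array([1] * 184 + [0] * 92)
indices = np.arange(len(labels))

splits = []
for seed in range(5):  # five independent repetitions with different seeds
    # First carve off the 60% training portion, stratified by class.
    train_idx, rest_idx = train_test_split(
        indices, train_size=0.6, stratify=labels, random_state=seed)
    # Split the remaining 40% evenly into validation and test (20% each),
    # again stratified on the corresponding labels.
    val_idx, test_idx = train_test_split(
        rest_idx, train_size=0.5, stratify=labels[rest_idx], random_state=seed)
    splits.append((train_idx, val_idx, test_idx))

train_idx, val_idx, test_idx = splits[0]
print(len(train_idx), len(val_idx), len(test_idx))
```

In practice the split would be applied at the participant level (so that all parametric images from one lesion stay in the same partition), which is what stratifying on lesion subtype implies here.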

Topics

  • Deep Learning
  • Breast Neoplasms
  • Ultrasonography, Mammary
  • Journal Article
