Generative AI for weakly supervised segmentation and downstream classification of brain tumors on MR images.

Authors

Yoo JJ, Namdar K, Wagner MW, Yeom KW, Nobre LF, Tabori U, Hawkins C, Ertl-Wagner BB, Khalvati F

Affiliations (20)

  • University of Toronto, Institute of Medical Science, Toronto, M5S 1A8, Canada.
  • Department of Diagnostic & Interventional Radiology, The Hospital for Sick Children, Toronto, M5G 1X8, Canada.
  • Department of Computer Science, University of Toronto, Toronto, M5S 1A8, Canada.
  • Vector Institute, Toronto, M5G 1M1, Canada.
  • Department of Medical Imaging, University of Toronto, Toronto, M5S 1A8, Canada.
  • University Hospital Augsburg, Institute of Diagnostic and Interventional Neuroradiology, Augsburg, 86156, Germany.
  • Department of Radiology, Stanford University School of Medicine, Stanford, 94305, USA.
  • Lucile Packard Children's Hospital at Stanford, Stanford, 94304, USA.
  • Division of Hematology/Oncology (iHOPE), Department of Pediatrics, University of Alberta, Edmonton, T6G 1C9, Canada.
  • Division of Hematology/Oncology, The Hospital for Sick Children, Toronto, M5G 1X8, Canada.
  • Developmental and Stem Cell Biology Program, The Hospital for Sick Children, Toronto, M5G 1X8, Canada.
  • Department of Medical Biophysics, University of Toronto, Toronto, M5S 1A8, Canada.
  • Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, M5S 1A8, Canada.
  • Department of Paediatric Laboratory Medicine, The Hospital for Sick Children, Toronto, M5G 1X8, Canada.
  • University of Toronto, Institute of Medical Science, Toronto, M5S 1A8, Canada. [email protected].
  • Department of Diagnostic & Interventional Radiology, The Hospital for Sick Children, Toronto, M5G 1X8, Canada. [email protected].
  • Department of Medical Imaging, University of Toronto, Toronto, M5S 1A8, Canada. [email protected].
  • Department of Computer Science, University of Toronto, Toronto, M5S 1A8, Canada. [email protected].
  • Department of Mechanical and Industrial Engineering, University of Toronto, Toronto, M5S 1A8, Canada. [email protected].
  • Vector Institute, Toronto, M5G 1M1, Canada. [email protected].

Abstract

Segmenting abnormalities is a leading problem in medical imaging. Using machine learning for segmentation generally requires manually annotated segmentations, demanding extensive time and resources from radiologists. We propose a weakly supervised approach that uses binary image-level labels, which are far simpler to acquire than manual annotations, to segment brain tumors on magnetic resonance images. The proposed method generates healthy variants of cancerous images for use as priors when training the segmentation model. However, using weakly supervised segmentations for downstream tasks such as classification can be challenging because some segmentations are unreliable. To address this, we propose using the generated non-cancerous variants to identify the most effective segmentations without requiring ground truths. The proposed method generates segmentations that achieve Dice coefficients of 79.27% on the Multimodal Brain Tumor Segmentation (BraTS) 2020 dataset and 73.58% on an internal dataset of pediatric low-grade glioma (pLGG), which increase to 88.69% and 80.29%, respectively, when suboptimal segmentations identified by the proposed method are removed. Using the weakly supervised segmentations for downstream tumor classification yields areas under the receiver operating characteristic curve (AUC) of 93.54% and 83.74% on the BraTS and pLGG datasets, respectively. These results are comparable to those obtained with manual annotations, which achieve AUCs of 95.80% and 83.03% on the BraTS and pLGG datasets, respectively.
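
The abstract does not give implementation details, but the two quantities it reports, the Dice coefficient and a ground-truth-free reliability check derived from the generated healthy variants, can be illustrated with a minimal NumPy sketch. The function names, the difference-and-threshold prior, and the agreement cutoff below are assumptions chosen for illustration, not the authors' implementation:

    import numpy as np

    def dice_coefficient(pred, target, eps=1e-7):
        # Dice overlap between two binary masks, reported as a percentage.
        pred, target = pred.astype(bool), target.astype(bool)
        intersection = np.logical_and(pred, target).sum()
        return 100.0 * (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

    def pseudo_mask_from_healthy_variant(image, healthy_variant, threshold=0.1):
        # Hypothetical weak prior: voxels where the cancerous image deviates
        # strongly from its generated healthy variant are treated as tumor.
        diff = np.abs(image - healthy_variant)
        diff = diff / (diff.max() + 1e-7)
        return (diff > threshold).astype(np.uint8)

    def is_unreliable(predicted_mask, pseudo_mask, min_agreement=50.0):
        # Hypothetical ground-truth-free filter: flag predictions that agree
        # poorly with the healthy-variant pseudo-mask.
        return dice_coefficient(predicted_mask, pseudo_mask) < min_agreement

    # Toy usage with random data standing in for a 2D MR slice.
    rng = np.random.default_rng(0)
    healthy_variant = rng.random((128, 128))
    image = healthy_variant.copy()
    image[40:60, 40:60] += 0.5                 # simulated tumor signal
    prior = pseudo_mask_from_healthy_variant(image, healthy_variant)
    print(dice_coefficient(prior, prior))      # 100.0 by construction
    print(is_unreliable(prior, prior))         # False

In this toy example the pseudo-mask and the "predicted" mask coincide, so the Dice score is trivially 100%; in practice, the agreement cutoff used to discard suboptimal segmentations would need to be tuned, since the abstract reports Dice gains from 79.27% to 88.69% (BraTS) and from 73.58% to 80.29% (pLGG) after filtering.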

Topics

  • Brain Neoplasms
  • Magnetic Resonance Imaging
  • Glioma
  • Image Processing, Computer-Assisted
  • Supervised Machine Learning
  • Image Interpretation, Computer-Assisted
  • Journal Article
