A deep learning model leveraging SAM improved segmentation speed and matched radiologist performance in classifying ovarian lesions on MRI.
Key Details
- Researchers from Johns Hopkins developed an MRI-based, end-to-end DL pipeline incorporating Meta’s Segment Anything Model (SAM) and DenseNet-121 (see the sketch after this list).
- The integrated model reduced segmentation time by 4 minutes per lesion compared with manual segmentation.
- SAM achieved a Dice coefficient of 0.86 to 0.88 for lesion segmentation.
- The DL model achieved an AUC of 0.85 internally and 0.79 externally, on par with radiologists’ AUC of 0.84 (p > 0.05).
- Data included 534 lesions (internal) and 87 lesions (external) from the US and Taiwan.
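The summary does not spell out the pipeline's implementation, so the following is only a minimal sketch of how a SAM segmentation step could feed a DenseNet-121 classifier, assuming the public `segment_anything` and `torchvision` APIs; the checkpoint path, bounding-box prompt, input slice, and crop size are illustrative placeholders, and the Dice function shows how the 0.86-0.88 overlap metric is defined.

```python
import numpy as np
import torch
from torchvision.models import densenet121
from segment_anything import sam_model_registry, SamPredictor

# --- Lesion segmentation with SAM (checkpoint path is a placeholder) ---
sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b_01ec64.pth")
predictor = SamPredictor(sam)

# mri_slice: an H x W x 3 uint8 array from an MRI slice (hypothetical input)
mri_slice = np.zeros((256, 256, 3), dtype=np.uint8)
predictor.set_image(mri_slice)

# Bounding-box prompt around the lesion (coordinates are illustrative only)
box = np.array([60, 60, 180, 180])
masks, scores, _ = predictor.predict(box=box, multimask_output=False)
lesion_mask = masks[0]  # boolean H x W mask

# --- Dice coefficient against a reference mask (the 0.86-0.88 metric) ---
def dice(pred: np.ndarray, ref: np.ndarray) -> float:
    pred, ref = pred.astype(bool), ref.astype(bool)
    denom = pred.sum() + ref.sum()
    return 2.0 * np.logical_and(pred, ref).sum() / denom if denom else 1.0

# --- Lesion classification with DenseNet-121 (2-class head assumed) ---
classifier = densenet121(weights=None)
classifier.classifier = torch.nn.Linear(classifier.classifier.in_features, 2)

# Masked lesion crop fed to the classifier (tensor shape is illustrative)
lesion_crop = torch.zeros(1, 3, 224, 224)
with torch.no_grad():
    logits = classifier(lesion_crop)
    prob_malignant = torch.softmax(logits, dim=1)[0, 1].item()
```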
Why It Matters
An automated SAM-based pipeline could cut the manual segmentation workload for ovarian lesion characterization on MRI while maintaining classification performance comparable to radiologists.
Source
AuntMinnie
Related News

Toronto Study: LLMs Must Cite Sources for Radiology Decision Support
University of Toronto researchers found that large language models (LLMs) such as DeepSeek V3 and GPT-4o offer promising support for radiology decision-making in pancreatic cancer when their recommendations cite guideline sources.

AI Model Using Mammograms Enhances Five-Year Breast Cancer Risk Assessment
A new image-only AI model more accurately predicts five-year breast cancer risk than breast density alone, according to multinational research presented at RSNA 2025.

AI Model Uses CT Scans to Reveal Biomarker for Chronic Stress
Researchers developed an AI model to measure chronic stress using adrenal gland volume on routine CT scans.