
Artificial Intelligence-Enabled Imaging for Predicting Preoperative Extraprostatic Extension in Prostate Cancer: Systematic Review and Meta-Analysis.

December 9, 2025 · PubMed

Authors

Zhang X, Qi Y, Wang X, Chen H, Li J

Affiliations (3)

  • Department of Nursing, Zhuhai Campus of Zunyi Medical University, 368 Jinhaian Community, Sanzao Town, Jinwan District, Zhuhai, Guangdong Province, 519000, China, 86 137 2625 6630.
  • Department of Ultrasound Imaging, Zhuhai People's Hospital, The Affiliated Hospital of Beijing Institute of Technology, Zhuhai Clinical Medical College of Jinan University, Zhuhai, Guangdong, China.
  • Department of Nursing, Kiang Wu Nursing College of Macau, Macau, China.

Abstract

Artificial intelligence (AI) techniques, particularly machine learning and deep learning approaches that analyze multimodal imaging data, have shown considerable promise in enhancing preoperative prediction of extraprostatic extension (EPE) in prostate cancer. This meta-analysis compares the diagnostic performance of AI-enabled imaging with that of radiologists for preoperative prediction of EPE in prostate cancer.

We conducted a systematic literature search in PubMed, Embase, and Web of Science up to September 2025, following the PRISMA-DTA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses of Diagnostic Test Accuracy) guidelines. Studies applying AI techniques to predict EPE from multiparametric magnetic resonance imaging (mpMRI) or prostate-specific membrane antigen positron emission tomography (PSMA PET) were included. Sensitivity, specificity, and area under the curve (AUC) for both internal and external validation sets were extracted and pooled using a bivariate random-effects model. Study quality was assessed with the modified Quality Assessment of Diagnostic Accuracy Studies 2 (QUADAS-2) tool.

A total of 21 studies were included in the analysis. For internal validation sets in patient-based analyses, mpMRI-based AI demonstrated a pooled sensitivity of 0.77 (95% CI 0.71-0.82), specificity of 0.71 (95% CI 0.64-0.78), and AUC of 0.81 (95% CI 0.77-0.84). In external validation, mpMRI-based AI achieved a sensitivity of 0.66 (95% CI 0.43-0.84), specificity of 0.80 (95% CI 0.64-0.90), and AUC of 0.80 (95% CI 0.77-0.84). In comparison, radiologists achieved a pooled sensitivity of 0.69 (95% CI 0.60-0.76), specificity of 0.73 (95% CI 0.66-0.78), and AUC of 0.77 (95% CI 0.73-0.80). Statistical comparisons between mpMRI-based AI and radiologists showed no significant difference in sensitivity (Z=1.61; P=.10) or specificity (Z=0.43; P=.67). By contrast, the AUC of mpMRI-based AI was significantly higher than that of PSMA PET-based AI (Z=2.77; P=.01). PSMA PET-based AI showed moderate performance, with a sensitivity of 0.73 (95% CI 0.65-0.80), specificity of 0.61 (95% CI 0.30-0.85), and AUC of 0.74 (95% CI 0.70-0.77) in internal validation; in external validation, it achieved a sensitivity of 0.77 (95% CI 0.57-0.89) and specificity of 0.50 (95% CI 0.22-0.78), offering no significant advantage over radiologists.

mpMRI-based AI demonstrated improved diagnostic performance for preoperative prediction of EPE in prostate cancer compared with conventional radiological assessment, achieving a higher AUC. PSMA PET-based AI models, however, currently offer no significant advantage over either mpMRI-based AI or radiologists. Limitations include the retrospective design of the included studies and high heterogeneity, which may introduce bias and limit generalizability. Larger, more diverse cohorts are essential to confirm these findings and to optimize the integration of AI into clinical practice.
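
To make the pooling step more concrete, the sketch below shows in Python how per-study sensitivity and specificity can be combined on the logit scale with a DerSimonian-Laird random-effects estimator, and how two pooled AUCs can be compared with a Z-test of the kind reported above. It is a simplified univariate approximation of the bivariate random-effects model used in the review, and every study count and standard error in it is hypothetical.

```python
# Simplified sketch of random-effects pooling for diagnostic accuracy data.
# This is a univariate (per-outcome) DerSimonian-Laird approximation, not the
# bivariate random-effects model used in the meta-analysis; every count and
# standard error below is hypothetical.
import numpy as np
from scipy import stats

# Hypothetical per-study 2x2 counts against the pathology reference standard:
# (true positives, false negatives, true negatives, false positives).
studies = [
    (45, 12, 60, 20),
    (30,  9, 40, 15),
    (55, 18, 70, 25),
]

def pool_logit(events, totals):
    """Pool proportions on the logit scale with a DerSimonian-Laird estimator."""
    events, totals = np.asarray(events, float), np.asarray(totals, float)
    p = (events + 0.5) / (totals + 1.0)              # continuity-corrected proportion
    y = np.log(p / (1 - p))                          # logit transform
    v = 1.0 / (events + 0.5) + 1.0 / (totals - events + 0.5)  # within-study variance
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)               # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)          # between-study variance
    w_star = 1.0 / (v + tau2)
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    inv_logit = lambda x: 1.0 / (1.0 + np.exp(-x))
    return inv_logit(pooled), (inv_logit(pooled - 1.96 * se), inv_logit(pooled + 1.96 * se))

tp, fn, tn, fp = (np.array(col) for col in zip(*studies))
sens, sens_ci = pool_logit(tp, tp + fn)
spec, spec_ci = pool_logit(tn, tn + fp)
print(f"pooled sensitivity {sens:.2f} (95% CI {sens_ci[0]:.2f}-{sens_ci[1]:.2f})")
print(f"pooled specificity {spec:.2f} (95% CI {spec_ci[0]:.2f}-{spec_ci[1]:.2f})")

# Comparing two pooled AUCs with a Z-test, as in the comparisons reported
# above; the standard errors here are hypothetical.
auc_a, se_a = 0.81, 0.018
auc_b, se_b = 0.77, 0.018
z = (auc_a - auc_b) / np.sqrt(se_a ** 2 + se_b ** 2)
p_value = 2 * (1 - stats.norm.cdf(abs(z)))
print(f"Z = {z:.2f}, P = {p_value:.2f}")
```

In practice, diagnostic test accuracy meta-analyses typically rely on the full bivariate approach (for example, the Reitsma model available through R's mada package), which models the correlation between sensitivity and specificity across studies rather than pooling each outcome separately.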

Topics

Prostatic Neoplasms · Artificial Intelligence · Journal Article · Systematic Review · Meta-Analysis · Review
