
Artificial intelligence-driven prostate cancer diagnosis: Enhancing accuracy and personalizing patient care.

December 20, 2025

Authors

Zhang X, Xiao N, Liang H, Li P, Zhang Y, Zhang S, Zhou B, Yao S, Yang Z, Chen J

Affiliations (6)

  • NYC Health & Hospital/Jacobi, Albert Einstein College of Medicine, New York, NY.
  • Shandong University, Cheeloo College of Medicine, Jinan, Shandong, China.
  • Department of Urology, Qilu Hospital of Shandong University (Qingdao), Qingdao, Shandong, China.
  • Department of Urology, Qilu Hospital of Shandong University, Jinan, Shandong, China.
  • Urology Service, Department of Surgery, Memorial Sloan Kettering Cancer Center, New York, NY.
  • Department of Urology, Qilu Hospital of Shandong University, China. Electronic address: [email protected].

Abstract

Prostate cancer remains a major global burden; diagnostic pathways rely on prostate-specific antigen (PSA), multiparametric magnetic resonance imaging (mpMRI), and histopathology but face false positives, interobserver variability, and risk of overtreatment. We conducted a narrative review of peer-reviewed human studies (2015-February 2025; PubMed, Web of Science, Google Scholar) on artificial intelligence (AI) across imaging and digital pathology. Evidence shows that assistive AI can match or exceed expert performance while improving workflow. In a large international paired confirmatory study (PI-CAI), an MRI-based AI system achieved an area under the receiver operating characteristic curve (AUROC) of 0.91 versus 0.86 for 62 radiologists, detected 6.8% more Grade Group (GG) ≥2 cancers at matched specificity, and yielded ∼50% fewer false positives and 20% fewer indolent (GG1) detections at matched sensitivity. Risk tools configured for high-sensitivity rule-out (90%-95% sensitivity) report high negative predictive values (NPV) of 97.5% to 98.0% and enable meaningful biopsy avoidance. In digital pathology, independent assessments of Paige Prostate report 97.7% sensitivity and 99.3% specificity on core biopsies, while real-world deployments reduce immunohistochemistry requests, second-opinion rates, and reporting time. Collectively, these data support deploying AI as a second-reader/triage tool with standardized acquisition and quality assurance, local calibration, and drift monitoring. Priority evidence needs include multicenter prospective studies and pragmatic real-world evidence (RWE) reporting patient outcomes and cost-effectiveness, with continued attention to fairness, privacy, and regulatory compliance.
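
As a rough illustration of the rule-out arithmetic (not drawn from the paper), the minimal Python sketch below shows how an NPV in the 97%-98% range can follow from a roughly 95%-sensitivity operating point via Bayes' rule; the 50% specificity and 20% prevalence of GG ≥2 cancer used here are assumed values for illustration only, not figures reported in the review.

def npv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Negative predictive value: P(no clinically significant cancer | negative result)."""
    true_negatives = specificity * (1 - prevalence)
    false_negatives = (1 - sensitivity) * prevalence
    return true_negatives / (true_negatives + false_negatives)

# Hypothetical rule-out operating point: 95% sensitivity, 50% specificity,
# 20% prevalence of GG >= 2 cancer in the tested cohort (assumed values).
print(f"NPV: {npv(0.95, 0.50, 0.20):.3f}")  # -> 0.976

Under these assumptions, roughly 40% of men would test negative and could potentially avoid biopsy, at the cost of missing about 1% of the cohort with GG ≥2 disease; actual trade-offs depend on the local prevalence and the calibrated threshold.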

Topics

Journal Article, Review
