
AI-Assisted Real-Time Cytologic Diagnosis During EUS-FNA of Pancreatic Masses (With Video).

December 15, 2025

Authors

Ashida R, Kuwahara T, Koshikawa T, Hashimoto K, Okuno N, Haba S, Kawaji Y, Tamura T, Yamashita Y, Itonaga M, Kiriyama Y, Yamao K, Hara K, Kitano M

Affiliations (5)

  • Second Department of Internal Medicine, Wakayama Medical University, Wakayama, Japan.
  • Department of Gastroenterology, Aichi Cancer Center Hospital, Nagoya, Japan.
  • Department of Medical Technology, Shubun University Faculty of Medical Sciences, Ichinomiya, Aichi, Japan.
  • Department of Pathology, Narita Memorial Hospital, Toyohashi, Aichi, Japan.
  • Department of Gastroenterology, Narita Memorial Hospital, Toyohashi, Aichi, Japan.

Abstract

Rapid on-site evaluation (ROSE) enhances the diagnostic yield of endoscopic ultrasound-guided fine needle aspiration (EUS-FNA), but the availability of cytopathologists is limited. This study aimed to assess the diagnostic capability of artificial intelligence-assisted ROSE (AI-ROSE) for EUS-FNA. The study included 137 patients who underwent EUS-FNA of a pancreatic mass between April 2019 and August 2021. Participants were divided into training (n = 96), validation (n = 15), and test (n = 26) cohorts. From the training and validation cohorts, 5157 and 615 digital images of cell clusters, respectively, were extracted and divided into 288 × 288-pixel patches. These cell clusters were annotated and labeled into five classes, and a semantic segmentation architecture was developed. From the test cohort, 120 cell clusters were extracted to compare diagnostic performance between AI-ROSE and 21 endosonographers and 5 cytotechnologists with varying levels of experience in ROSE. In total, 1,097,840 training, 31,817 validation, and 1920 test regions were extracted. In the test cohort, AI-ROSE accuracy for three-category classification (class 1/2, class 3, and class 4/5) was 89.8%. For two-category classification (class 1/2/3 and class 4/5), sensitivity, specificity, and accuracy were 89.3%, 98.1%, and 95.1%, respectively. In the comparison cohort, AI-ROSE accuracy for two-category classification was 93.3%, significantly higher than that of all endosonographers (68.3%; range, 45.8%-86.7%) and cytotechnologists (76.3%; range, 72.5%-78.3%). The AI-ROSE evaluation time for the 120 cell clusters was 6.04 s, much shorter than that of all endosonographers (1800 s; range, 480-6000 s) and cytotechnologists (2160 s; range, 1020-3600 s). The AI-ROSE model shows remarkable speed and accuracy in diagnosing pancreatic cell clusters, enabling rapid decision-making during EUS-FNA. UMIN-CTR No. 000042212.
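
The abstract outlines a pipeline of tiling cell-cluster images into 288 × 288-pixel patches, collapsing the five cytologic classes into the two-category scheme (class 1/2/3 vs. class 4/5), and scoring sensitivity, specificity, and accuracy. The Python sketch below is only an illustration of that arithmetic under stated assumptions; it is not the authors' code. Non-overlapping tiling, treating class 4/5 as the "positive" call, the use of NumPy, and all function names are assumptions introduced here.

    import numpy as np

    PATCH = 288  # patch edge length in pixels, per the abstract

    def tile_patches(image: np.ndarray, size: int = PATCH) -> list[np.ndarray]:
        """Split an H x W (x C) image into size x size patches.

        The abstract only states that images were divided into 288 x 288-pixel
        patches; non-overlapping tiling is an assumption made here.
        """
        h, w = image.shape[:2]
        return [
            image[r:r + size, c:c + size]
            for r in range(0, h - size + 1, size)
            for c in range(0, w - size + 1, size)
        ]

    def to_two_category(five_class_label: int) -> int:
        """Collapse the five classes into the paper's two categories:
        0 = class 1/2/3, 1 = class 4/5 (assumed here to be the positive call)."""
        return int(five_class_label >= 4)

    def binary_metrics(y_true, y_pred):
        """Sensitivity, specificity, and accuracy for binary labels (1 = positive)."""
        y_true = np.asarray(y_true)
        y_pred = np.asarray(y_pred)
        tp = np.sum((y_true == 1) & (y_pred == 1))
        tn = np.sum((y_true == 0) & (y_pred == 0))
        fp = np.sum((y_true == 0) & (y_pred == 1))
        fn = np.sum((y_true == 1) & (y_pred == 0))
        sensitivity = tp / (tp + fn) if tp + fn else float("nan")
        specificity = tn / (tn + fp) if tn + fp else float("nan")
        accuracy = (tp + tn) / len(y_true)
        return sensitivity, specificity, accuracy

    if __name__ == "__main__":
        # Toy example: hypothetical five-class calls for a handful of cell clusters.
        truth = [1, 2, 3, 4, 5, 4, 2]
        calls = [1, 2, 4, 4, 5, 3, 2]
        y_true = [to_two_category(c) for c in truth]
        y_pred = [to_two_category(c) for c in calls]
        print(binary_metrics(y_true, y_pred))

The example only reproduces the reported evaluation scheme on toy labels; the segmentation architecture itself is not specified in the abstract and is therefore not sketched.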

Topics

Journal Article
