
Real-Time Identification of Cricothyrotomy Landmarks in Emergency Care and Obstetric Patients Using Wireless Handheld Ultrasound and Edge-Computing Artificial Intelligence: A Prospective Observational Study.

Authors

Wu CY, Li JD, Shih PY, Huang CC, Cheng HL, Wu CY, Tay J, Wu MC, Wang CH, Chen CS, Huang CH

Affiliations (10)

  • Department of Emergency Medicine, National Taiwan University Hospital, Taipei, Taiwan.
  • NTU Joint Research Center for AI Technology and All Vista Healthcare, National Taiwan University, Taipei, Taiwan.
  • Department of Anesthesiology, National Taiwan University Hospital, Taipei, Taiwan.
  • Department of Computer Science and Information Engineering, National Taiwan University, No. 1, Sec. 4, Roosevelt Rd., Taipei 106, Taipei, Taiwan.
  • Department of Anesthesiology, College of Medicine, National Taiwan University Hsinchu Branch, National Taiwan University, Taipei, Taiwan.
  • Department of Emergency Medicine, National Taiwan University Hospital, Taipei, Taiwan.
  • Department of Emergency Medicine, College of Medicine, National Taiwan University, No.7, Zhongshan S. Rd., Zhongzheng Dist., Taipei City 100, Taipei, Taiwan.
  • Department of Emergency Medicine, National Taiwan University Hospital Yunlin Branch, Yunlin, Taiwan.
  • Department of Computer Science and Information Engineering, National Taiwan University, No. 1, Sec. 4, Roosevelt Rd., Taipei 106, Taipei, Taiwan.
  • Department of Emergency Medicine, College of Medicine, National Taiwan University, No.7, Zhongshan S. Rd., Zhongzheng Dist., Taipei City 100, Taipei, Taiwan.

Abstract

This study aimed to develop machine learning-based algorithms to assist physicians in ultrasound-guided localization of the cricoid cartilage (CC), thyroid cartilage (TC), and cricothyroid membrane (CTM) for cricothyroidotomy. Adult female participants presenting to the emergency department with dyspnea or to the obstetrics and gynecology department for a scheduled cesarean section between August 2022 and July 2024 were prospectively recruited. Ultrasonographic images were collected using a wireless handheld ultrasound device connected to an edge computing tablet. Three You Only Look Once (YOLO) model variants (v5n6, v8n, and v10n) were selected for development and evaluation. A total of 608 participants (median age: 58.0 years, interquartile range [IQR]: 40.0-73.0; median body mass index: 23.2 kg/m², IQR: 20.2-26.5) contributed 117,094 ultrasonographic frames. All three YOLO-based models demonstrated high accuracy in detecting CC, TC, and CTM, with area under the receiver operating characteristic curve (AUROC) values exceeding 0.88. In correctly identified frames, the models effectively localized CC (intersection over union [IoU] values: YOLOv5n6, 0.713 [95% confidence interval (CI): 0.698-0.726]; YOLOv8n, 0.718 [95% CI: 0.702-0.733]; YOLOv10n, 0.718 [95% CI: 0.701-0.734]; p value: 0.03) and TC (YOLOv5n6, 0.700 [95% CI: 0.683-0.717]; YOLOv8n, 0.706 [95% CI: 0.687-0.725]; YOLOv10n, 0.703 [95% CI: 0.783-0.721]; p value: 0.037), though localization accuracy was lower for CTM (YOLOv5n6, 0.364 [95% CI: 0.333-0.394]; YOLOv8n, 0.363 [95% CI: 0.331-0.394]; YOLOv10n, 0.354 [95% CI: 0.325-0.381]; p value: 0.053). The mean frames per second for YOLOv5n6, YOLOv8n, and YOLOv10n were 3.67, 13.83, and 14.13, respectively, when deployed on the handheld ultrasound platform. YOLO-based models demonstrated high accuracy in detecting and localizing CC, TC, and CTM. YOLOv8n and YOLOv10n achieved clinically acceptable real-time imaging performance when deployed on a wireless handheld ultrasound device with an edge computing tablet. Further studies are needed to assess whether this favorable performance translates into actual clinical benefits.
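
For reference, the intersection over union (IoU) values reported above compare a predicted bounding box with an annotated ground-truth box. The short Python sketch below shows one common way this metric is computed; the box format, coordinates, and function name are illustrative assumptions and are not taken from the study's code.

    # Minimal sketch: IoU between two axis-aligned bounding boxes given as
    # (x_min, y_min, x_max, y_max) in pixel coordinates. All names and values
    # are hypothetical, not from the study.
    def iou(box_a, box_b):
        # Overlapping rectangle (zero area if the boxes are disjoint).
        ix_min = max(box_a[0], box_b[0])
        iy_min = max(box_a[1], box_b[1])
        ix_max = min(box_a[2], box_b[2])
        iy_max = min(box_a[3], box_b[3])
        inter = max(0.0, ix_max - ix_min) * max(0.0, iy_max - iy_min)

        area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
        area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
        union = area_a + area_b - inter
        return inter / union if union > 0 else 0.0

    # Example: a hypothetical predicted box vs. a ground-truth box.
    print(iou((50, 80, 150, 140), (60, 90, 160, 150)))  # 0.6

An IoU of 1.0 indicates perfect overlap and 0.0 indicates no overlap, so the reported values of roughly 0.7 for CC and TC reflect substantially tighter localization than the roughly 0.36 obtained for CTM.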

Topics

Cricoid Cartilage, Thyroid Cartilage, Artificial Intelligence, Anatomic Landmarks, Journal Article, Observational Study
