Evaluating the impact of AI assistance on decision-making in emergency doctors interpreting chest X-rays: a multi-reader multi-case study.

Authors

Lyell D, Dinh M, Gillett M, Abraham N, Symes ER, Susanto AP, Chakar BA, Seimon RV, Coiera E, Magrabi F

Affiliations (7)

  • Australian Institute of Health Innovation, Macquarie University, North Ryde, New South Wales, Australia.
  • Emergency Department, Royal Prince Alfred Hospital, Sydney Local Health District, Sydney, New South Wales, Australia.
  • RPA Green Light Institute for Emergency Care, Sydney Local Health District, Sydney, New South Wales, Australia.
  • Emergency Department, Royal North Shore Hospital, Northern Sydney Local Health District, Sydney, New South Wales, Australia.
  • Emergency Department, Canterbury Hospital, Sydney Local Health District, Sydney, New South Wales, Australia.
  • Australian Institute of Health Innovation, Macquarie University, North Ryde, New South Wales, Australia.
  • Medical Technology Cluster, Indonesian Medical Education and Research Institute, Universitas Indonesia, Jakarta, Indonesia.

Abstract

Artificial intelligence (AI) tools could assist emergency doctors in interpreting chest X-rays to inform urgent care. However, the impact of AI assistance on clinical decision-making, a precursor to enhanced care and patient outcomes, remains understudied. This study evaluates the effect of AI assistance on the clinical decisions of emergency doctors interpreting chest X-rays. Junior and senior residents, emergency registrars and consultants working in Australian emergency departments were eligible. Doctors completed 18 clinical vignettes involving chest X-ray interpretation, representative of typical patient presentations. Vignettes were randomly selected from a bank of 49 based on the emergency medicine curriculum and contained a chest X-ray, presenting complaint, and relevant symptoms and observations. For each doctor, half of the 18 vignettes were randomly assigned to be completed with the assistance of a commercial AI tool capable of detecting 124 different chest X-ray findings. Four vignettes contained X-rays known to produce incorrect AI findings. Primary outcomes were correct diagnosis and correct patient management. X-ray interpretation time, confidence in diagnosis, perceptions of the AI tool and the differential impact of AI assistance by seniority were also examined. In total, 200 doctors participated. AI assistance increased correct diagnosis by 5.9% (95% CI 2.7% to 9.2%) compared with unassisted vignettes, with the largest increase among senior residents (11.8%; 95% CI 5.2% to 18.3%). Correct patient management increased by 3.2% (95% CI 0.1% to 6.4%). Confidence in diagnosis increased by 5% (95% CI 3.4% to 6.6%; p<0.001) and interpretation time increased by 4.9 s (p=0.08). Incorrect AI findings decreased correct diagnosis by 1% for false-positive findings (p=0.9) and by 9% for false-negative findings (p=0.1). Participants found the AI tool helpful for interpreting chest X-rays and for highlighting missed findings, but were neutral about its accuracy. Improvements in diagnosis and patient management without meaningful increases in interpretation time suggest that AI assistance could benefit clinical decisions involving chest X-ray interpretation. Further studies are required to ascertain whether such improvements translate into improved patient care.
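
The within-subject design described in the abstract, in which each doctor's 18 vignettes are drawn from a bank of 49 and half are randomly assigned to AI assistance, can be illustrated with a minimal sketch. The code below is not the authors' implementation: the function and variable names, the string-based seeding and the per-doctor sampling scheme are assumptions made purely for illustration, and the four X-rays known to produce incorrect AI findings are a property of the vignette bank rather than of this assignment step.

    import random

    # Illustrative sketch only; NOT the study's actual allocation code.
    VIGNETTE_BANK_SIZE = 49     # bank of 49 curriculum-based vignettes
    VIGNETTES_PER_DOCTOR = 18   # each doctor completes 18 vignettes

    def assign_vignettes(doctor_id: int, seed: str = "demo") -> list[tuple[int, bool]]:
        """Return (vignette_id, ai_assisted) pairs for one doctor.

        18 vignettes are drawn at random from the bank of 49; half of
        them are then randomly assigned to be completed with AI
        assistance, the other half unassisted.
        """
        rng = random.Random(f"{seed}:{doctor_id}")  # reproducible per-doctor stream
        drawn = rng.sample(range(VIGNETTE_BANK_SIZE), VIGNETTES_PER_DOCTOR)
        assisted = set(rng.sample(drawn, VIGNETTES_PER_DOCTOR // 2))
        return [(v, v in assisted) for v in drawn]

    if __name__ == "__main__":
        for vignette, ai in assign_vignettes(doctor_id=1):
            print(f"vignette {vignette:2d}: {'AI-assisted' if ai else 'unassisted'}")

Because assistance is randomized within each doctor, every participant serves as their own control, which is what allows the paired comparisons of diagnosis, management, confidence and interpretation time reported above.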

Topics

Journal Article
