EchoAgent: guideline-centric reasoning agent for echocardiography measurement and interpretation.
Authors
Affiliations (4)
- University of British Columbia, Vancouver, British Columbia, Canada. [email protected].
- University of British Columbia, Vancouver, British Columbia, Canada. [email protected].
- University of British Columbia, Vancouver, British Columbia, Canada.
- Vancouver General Hospital, Vancouver, British Columbia, Canada.
Abstract
Echocardiographic interpretation requires video-level reasoning and guideline-based measurement analysis, which current deep learning models for cardiac ultrasound do not support. We present EchoAgent, a framework that enables structured, interpretable automation for this domain. EchoAgent orchestrates specialized vision tools under large language model (LLM) control to perform temporal localization, spatial measurement, and clinical interpretation. A key contribution is a measurement-feasibility prediction model that determines whether anatomical structures are reliably measurable in each frame, enabling autonomous tool selection. We curated a benchmark of diverse, clinically validated video-query pairs for evaluation. To assess robustness across institutions, we further evaluate EchoAgent on a curated subset of the publicly available MIMIC-IV-EchoQA benchmark, specifically targeting questions answerable via linear measurements to remain within the current framework's scope. EchoAgent outperforms current medical VLMs and cardiac foundation models in video-level reasoning, demonstrating superior accuracy and interpretability on both our internal benchmark and the external MIMIC-IV-EchoQA subset. Outputs are grounded in visual evidence and clinical guidelines, supporting transparency and traceability. This work demonstrates the feasibility of agentic, guideline-aligned reasoning for echocardiographic video analysis, enabled by task-specific tools and full video-level automation. EchoAgent provides a framework for enhancing transparency and guideline adherence, representing a step toward more trustworthy AI in cardiac ultrasound. Our code will be made publicly available at https://github.com/DeepRCL/EchoAgent.
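The abstract describes an agentic loop in which an LLM controller invokes specialized vision tools, with a measurement-feasibility predictor gating which frames a measurement tool is applied to. The sketch below illustrates that control pattern only; all names (`feasibility_model`, `measure_tool`, `run_agent`), structures, and values are hypothetical stand-ins, not the actual EchoAgent API.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Frame:
    """One video frame with the structures the feasibility model approved."""
    index: int
    feasible_structures: List[str]


def feasibility_model(frame: Frame, structure: str) -> bool:
    # Stand-in for the learned measurement-feasibility predictor: decides
    # whether `structure` is reliably measurable in this frame.
    return structure in frame.feasible_structures


def measure_tool(frame: Frame, structure: str) -> float:
    # Stand-in for a spatial-measurement vision tool (values in mm, made up).
    return {"LVIDd": 48.0, "IVSd": 9.0}.get(structure, 0.0)


def run_agent(frames: List[Frame], structure: str) -> Dict[int, float]:
    """Apply the measurement tool only on frames the feasibility model approves."""
    results: Dict[int, float] = {}
    for frame in frames:
        if feasibility_model(frame, structure):
            results[frame.index] = measure_tool(frame, structure)
    return results


frames = [Frame(0, ["LVIDd"]), Frame(1, []), Frame(2, ["LVIDd", "IVSd"])]
print(run_agent(frames, "LVIDd"))  # measurements only for frames 0 and 2
```

The key design point mirrored here is that tool selection is autonomous: the controller never forces a measurement on a frame the feasibility model rejects, which is what grounds the final interpretation in reliable visual evidence.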