DUSTrack: Semi-automated point tracking in ultrasound videos.
Authors
Affiliations (8)
- Institute for Medical Engineering and Science, MIT, Cambridge, MA, 02139, USA. [email protected].
- MIT.nano Immersion Lab, MIT, Cambridge, MA, 02139, USA. [email protected].
- Department of Mechanical Engineering, MIT, Cambridge, MA, 02139, USA.
- Fraunhofer Portugal AICOS, Porto, 4200-135, Portugal.
- Comprehensive Health Research Center (CHRC), Porto, 4200-135, Portugal.
- Institute for Medical Engineering and Science, MIT, Cambridge, MA, 02139, USA. [email protected].
- MIT.nano Immersion Lab, MIT, Cambridge, MA, 02139, USA. [email protected].
- Department of Mechanical Engineering, MIT, Cambridge, MA, 02139, USA. [email protected].
Abstract
Ultrasound technology enables safe, non-invasive imaging of dynamic tissue behavior, making it a valuable tool in medicine, biomechanics, and sports science. However, accurately tracking tissue motion in B-mode ultrasound remains challenging due to speckle noise, low edge contrast, and out-of-plane movement. These challenges complicate the task of tracking anatomical landmarks over time, which is essential for quantifying tissue dynamics in many clinical and research applications. This manuscript introduces DUSTrack (Deep learning and optical flow-based toolkit for UltraSound Tracking), a semi-automated framework for tracking arbitrary points in B-mode ultrasound videos. We combine deep learning with optical flow to deliver high-quality and robust tracking across diverse anatomical structures and motion patterns. The toolkit includes a graphical user interface that streamlines the generation of high-quality training data and supports iterative model refinement. It also implements a novel optical-flow-based filtering technique that reduces high-frequency frame-to-frame noise while preserving rapid tissue motion. Semi-automated tracking with DUSTrack demonstrates superior accuracy compared to contemporary zero-shot point trackers and performs on par with specialized methods. This establishes its potential as a general tool for clinical and biomechanical research—one that can generate accurate training data for developing foundation models in ultrasound point tracking. We demonstrate DUSTrack’s versatility through three use cases: cardiac wall motion tracking in echocardiograms, muscle deformation analysis during reaching tasks, and fascicle tracking during ankle plantarflexion. As an open-source solution, DUSTrack offers a powerful, flexible framework for point tracking to quantify tissue motion from ultrasound videos. DUSTrack is available at https://github.com/praneethnamburi/DUSTrack. 
The online version contains supplementary material available at 10.1038/s41598-026-42795-3.
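The abstract mentions an optical-flow-based filtering technique that suppresses high-frequency frame-to-frame noise in tracked trajectories while preserving rapid tissue motion. DUSTrack's actual filter is not specified here; as a generic, hedged illustration of the underlying idea only, a plain-Python moving-average smoother over a tracked point's per-frame positions can be sketched. The function name, window size, and data layout below are illustrative assumptions, not part of DUSTrack's API.

```python
def smooth_trajectory(points, window=5):
    """Moving-average smoother for a tracked point's (x, y) trajectory.

    This is an illustrative stand-in for trajectory denoising, NOT
    DUSTrack's optical-flow-based filter.

    points: list of (x, y) tuples, one per video frame.
    window: odd number of frames to average over (illustrative default).
    Returns a list of smoothed (x, y) tuples of the same length.
    """
    half = window // 2
    smoothed = []
    for i in range(len(points)):
        # Clip the averaging window at the start/end of the video.
        lo = max(0, i - half)
        hi = min(len(points), i + half + 1)
        xs = [p[0] for p in points[lo:hi]]
        ys = [p[1] for p in points[lo:hi]]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed
```

Averaging over a short window damps frame-to-frame jitter but also blurs genuinely fast motion, which is precisely the trade-off the abstract says DUSTrack's filtering technique is designed to avoid.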