
A deep learning framework for finger motion recognition using forearm ultrasound imaging.

November 12, 2025

Authors

Lee Y, Ko K, Jang J, Lee JM, Yoon C

Affiliations (6)

  • Department of Orthopedic Surgery, Seoul National University Boramae Medical Center, Seoul, Republic of Korea.
  • Department of Artificial Intelligence, The Catholic University of Korea, Bucheon, Republic of Korea.
  • Department of Nanoscience Engineering, Inje University, Gimhae, Republic of Korea.
  • Department of Orthopedic Surgery, Keimyung University Dongsan Hospital, Daegu, Republic of Korea.
  • Department of Nanoscience Engineering, Inje University, Gimhae, Republic of Korea. [email protected].
  • Department of Biomedical Engineering, Inje University, Gimhae, Republic of Korea. [email protected].

Abstract

Precise classification of finger movements is essential for effective hand gesture recognition, as it facilitates reliable control across multiple degrees of freedom. Surface electromyography (sEMG) has been widely used to monitor forearm muscle activity and applied to the classification of finger motion. However, due to its inherent limitations, sEMG fails to provide a satisfactory solution for finger motion recognition. As an alternative, A-mode ultrasound-based sensing methods have been proposed and have demonstrated great potential. In this paper, we present a finger motion classification method based on deep learning using forearm ultrasound imaging. Since B-mode ultrasound offers 2D visualization of muscle activity, it captures broader anatomical features and reduces sensitivity to transducer placement compared to A-mode ultrasound. Real-time ultrasound images of forearm muscles were acquired during nine predefined finger motions, comprising five single-finger motions and four multi-finger motions. A deep learning framework was developed to classify these finger motions. The framework demonstrated high accuracy in recognizing finger motions, with an overall average classification accuracy of 95.64% and an F1 score of 0.9563. The experimental results demonstrated the feasibility of the proposed method in accurately classifying various finger movements. The proposed method holds potential for applications such as gesture recognition in virtual reality (VR)/augmented reality (AR), robotic hand control, and medical rehabilitation of finger dexterity.
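The abstract reports two evaluation metrics for the nine-class problem: overall classification accuracy (95.64%) and an F1 score (0.9563). As an illustration only, and assuming the F1 score is macro-averaged over the nine motion classes (the abstract does not specify the averaging scheme), these metrics can be computed from predicted and true labels as follows:

```python
# Illustrative sketch: accuracy and macro-averaged F1 for a
# nine-class finger-motion classifier. The averaging scheme is an
# assumption; the paper does not state how its F1 was computed.
NUM_CLASSES = 9  # five single-finger + four multi-finger motions

def accuracy(y_true, y_pred):
    """Overall accuracy: fraction of correctly classified samples."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def macro_f1(y_true, y_pred, num_classes=NUM_CLASSES):
    """Macro F1: per-class F1 scores averaged with equal weight."""
    f1s = []
    for c in range(num_classes):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        f1s.append(f1)
    return sum(f1s) / num_classes

# Toy example: perfect predictions on one sample per class.
labels = list(range(NUM_CLASSES))
print(accuracy(labels, labels))  # 1.0
print(macro_f1(labels, labels))  # 1.0
```

With a balanced test set, accuracy and macro F1 are typically close, which is consistent with the near-identical values (95.64% vs. 0.9563) reported above.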

Topics

Deep Learning, Fingers, Forearm, Journal Article
