
Deep learning-based framework for comprehensive quantification of thigh and calf muscles and adipose tissues from MRI.

December 4, 2025

Authors

Wohlfarth V, Kway YM, Sood A, Jeon YS, Chong LR, Marx UC, Tay J, Eriksson JG, Bendahan D, Michel CP, Sadananthan SA, Velan SS

Affiliations (15)

  • Institute for Human Development and Potential, Agency for Science, Technology, and Research, Singapore, Singapore.
  • Pforzheim University, Pforzheim, Germany.
  • Department of Medicine, Yong Loo Lin School of Medicine, National University of Singapore, Singapore, Singapore.
  • Institute of Data Science, National University of Singapore, Singapore, Singapore.
  • Department of Radiology, Changi General Hospital, Singapore, Singapore.
  • Human Potential Translational Research Programme, Yong Loo Lin School of Medicine, National University of Singapore, Singapore, Singapore.
  • Department of Obstetrics and Gynaecology, Yong Loo Lin School of Medicine, National University of Singapore, Singapore, Singapore.
  • Department of General Practice and Primary Health Care, University of Helsinki and Helsinki University Hospital, Helsinki, Finland.
  • Folkhälsan Research Center, Helsinki, Finland.
  • The French National Centre for Scientific Research, Center for Magnetic Resonance in Biology and Medicine, Aix Marseille University, Marseille, France.
  • Institute for Human Development and Potential, Agency for Science, Technology, and Research, Singapore, Singapore. [email protected].
  • Department of Medicine, Yong Loo Lin School of Medicine, National University of Singapore, Singapore, Singapore. [email protected].
  • Human Potential Translational Research Programme, Yong Loo Lin School of Medicine, National University of Singapore, Singapore, Singapore. [email protected].
  • Department of Physiology, Yong Loo Lin School of Medicine, National University of Singapore, Singapore, Singapore. [email protected].
  • Human Magnetic Resonance Centre, Institute for Applied Life Sciences, University of Massachusetts Amherst, 240 Thatcher Road, Amherst, MA, 01003, USA. [email protected].

Abstract

Quantification of muscles and adipose depots is essential for characterising pathological changes in neuromuscular, musculoskeletal, and metabolic diseases. This study presents a deep learning framework for automated, comprehensive analysis of muscle and adipose tissue in the lower extremities. Axial two-point Dixon magnetic resonance imaging data from the thigh and calf were retrospectively collected from 25 participants (mean age: 40.5 ± 5.86 years; 64% male) in the Asian Indian Prediabetes Study. A cascade of 3D Attention-Res-V-Net models was trained on expert-labelled ground-truth data: the first stage quantified the entire muscle region and subcutaneous adipose tissue (SAT) in the thigh and calf, and thigh- and calf-specific networks then quantified 13 thigh and 9 calf muscles, respectively. Intermuscular (InterMAT) and intramuscular (IntraMAT) adipose tissues were quantified by intensity-thresholding fat-only image volumes within muscle-specific segmentation masks. The resulting fat voxels were multiplied by the voxel resolution to obtain adipose tissue volumes, which were evaluated as relative errors against the ground-truth volumes. Whole-muscle segmentation achieved mean Dice similarity coefficients (DSCs) of 92% (thigh) and 87% (calf); SAT reached 95%. Muscle-specific DSCs ranged from 76 to 90% (thigh) and 68 to 90% (calf). InterMAT errors were ~21% (thigh) and ~19% (calf), while IntraMAT errors ranged from 17.4 to 58.8%. In addition, the high-quality, expert-annotated dataset generated in this study will be publicly released to facilitate future research. The framework advances muscle-fat composition analysis in the lower limbs by enabling granular quantification of individual muscles, SAT, InterMAT, and muscle-specific IntraMAT.
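The adipose quantification and evaluation steps in the abstract follow from first principles, and a minimal sketch can make them concrete. The snippet below illustrates, under stated assumptions, how fat voxels might be thresholded within a muscle mask, converted to a volume via voxel dimensions, and scored as a relative error and a DSC. The threshold value, array names, and voxel spacing are illustrative placeholders, not the paper's actual parameters or code.

```python
# Illustrative sketch of the quantification steps described in the abstract.
# The threshold (0.5), voxel dimensions, and synthetic data are assumptions.
import numpy as np

def adipose_volume(fat_image: np.ndarray,
                   muscle_mask: np.ndarray,
                   voxel_dims_mm: tuple,
                   threshold: float = 0.5) -> float:
    """Volume (mL) of fat voxels inside a muscle-specific segmentation mask.

    fat_image     -- fat-only image volume, same shape as the mask
    muscle_mask   -- binary muscle-specific segmentation mask
    voxel_dims_mm -- voxel spacing in mm (x, y, z)
    threshold     -- intensity cutoff separating fat from muscle (assumed)
    """
    fat_voxels = (fat_image > threshold) & (muscle_mask > 0)
    voxel_volume_ml = np.prod(voxel_dims_mm) / 1000.0  # mm^3 -> mL
    return fat_voxels.sum() * voxel_volume_ml

def relative_error(predicted: float, ground_truth: float) -> float:
    """Relative volume error (%), as used to evaluate InterMAT/IntraMAT."""
    return abs(predicted - ground_truth) / ground_truth * 100.0

def dice(pred_mask: np.ndarray, gt_mask: np.ndarray) -> float:
    """Dice similarity coefficient (%) between two binary masks."""
    pred, gt = pred_mask > 0, gt_mask > 0
    intersection = np.logical_and(pred, gt).sum()
    return 200.0 * intersection / (pred.sum() + gt.sum())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fat = rng.random((32, 64, 64))           # synthetic fat-only volume
    mask = np.zeros_like(fat, dtype=bool)
    mask[8:24, 16:48, 16:48] = True          # synthetic muscle mask
    vol = adipose_volume(fat, mask, voxel_dims_mm=(1.0, 1.0, 5.0))
    print(f"IntraMAT volume: {vol:.1f} mL")
```

The same mask-and-threshold logic applies per muscle: running it within each of the 13 thigh or 9 calf muscle masks yields the muscle-specific IntraMAT volumes that the paper reports as relative errors.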

Topics

Journal Article
