Modality-projection universal model for comprehensive full-body medical imaging segmentation.
Affiliations (9)
- Institute of Medical Technology, Peking University Health Science Center, Peking University, Beijing, China.
- Department of Nuclear Medicine, The First Affiliated Hospital of Shandong First Medical University & Shandong Provincial Qianfoshan Hospital, Jinan, China.
- Department of Radiology, Peking University Third Hospital, Beijing, China.
- Beijing Key Laboratory of Magnetic Resonance Imaging Devices and Technology, Peking University Third Hospital, Beijing, China.
- Department of Radiology, Guangdong Provincial People's Hospital (Guangdong Academy of Medical Sciences), Southern Medical University, Guangdong, China.
- Key Laboratory of Carcinogenesis and Translational Research (Ministry of Education/Beijing), Key Laboratory for Research and Evaluation of Radiopharmaceuticals (National Medical Products Administration), Department of Nuclear Medicine, Peking University Cancer Hospital & Institute, Beijing, China.
- Department of Nuclear Medicine, The First Affiliated Hospital of Shandong First Medical University & Shandong Provincial Qianfoshan Hospital, Jinan, China. [email protected].
- Institute of Medical Technology, Peking University Health Science Center, Peking University, Beijing, China. [email protected].
- National Biomedical Imaging Center, College of Future Technology, Peking University, Beijing, China. [email protected].
Abstract
The integration of deep learning in medical imaging has significantly advanced diagnostic, therapeutic, and research outcomes. However, applying universal models across multiple modalities remains challenging due to inherent inter-modality variability. Here we present the Modality Projection Universal Model (MPUM), trained on 861 subjects, which dynamically adapts to diverse imaging modalities through a modality-projection strategy. MPUM achieves state-of-the-art whole-body organ segmentation, providing rapid localization for computer-aided diagnosis and precise anatomical quantification to support clinical decision-making. A controller-based convolutional layer further enables saliency-map visualization, enhancing model interpretability for clinical use. Beyond segmentation, MPUM reveals metabolic correlations along the brain-body axis and between distinct brain regions, offering insights into systemic and physiological interactions from a whole-body perspective. Here we show that this universal framework accelerates diagnosis, facilitates large-scale imaging analysis, and bridges anatomical and metabolic information, enabling the discovery of cross-organ disease mechanisms and advancing integrative brain-body research.
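The abstract does not include implementation details, but the core idea it names, a controller-based convolutional layer driven by a modality-projection strategy, is a known pattern: a small controller network generates the convolution kernels from a learned modality embedding, so a single layer adapts its filters to CT, MRI, or PET inputs. The sketch below is a minimal, hypothetical PyTorch illustration of that pattern, assuming a made-up class `ModalityConditionedConv3d`, arbitrary layer sizes, and a three-modality setup; it is not the authors' released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ModalityConditionedConv3d(nn.Module):
    """Hypothetical controller-based convolution: a small controller MLP
    maps a learned modality embedding to this layer's kernel weights and
    bias, so one layer can regenerate its filters per imaging modality."""
    def __init__(self, in_ch, out_ch, k=3, n_modalities=3, emb_dim=32):
        super().__init__()
        self.in_ch, self.out_ch, self.k = in_ch, out_ch, k
        # One learned embedding per modality (e.g. CT, MRI, PET -- assumption).
        self.embed = nn.Embedding(n_modalities, emb_dim)
        n_weights = out_ch * in_ch * k ** 3
        # Controller: modality embedding -> flattened kernel + bias.
        self.controller = nn.Sequential(
            nn.Linear(emb_dim, 128), nn.ReLU(),
            nn.Linear(128, n_weights + out_ch),
        )

    def forward(self, x, modality_id):
        params = self.controller(self.embed(modality_id))
        w = params[:-self.out_ch].view(
            self.out_ch, self.in_ch, self.k, self.k, self.k)
        b = params[-self.out_ch:]
        return F.conv3d(x, w, b, padding=self.k // 2)

# Usage: the same layer processes a CT volume (modality 0) or an MRI
# volume (modality 1) with kernels projected for that modality.
layer = ModalityConditionedConv3d(in_ch=1, out_ch=8, n_modalities=3)
ct = torch.randn(1, 1, 32, 32, 32)   # toy single-channel 3D volume
out = layer(ct, torch.tensor(0))     # kernels generated for modality 0
print(out.shape)                     # torch.Size([1, 8, 32, 32, 32])
```

Because the kernels are produced by the controller rather than stored statically, gradients flowing through the controller can in principle be used to render saliency maps over the input, which is one plausible reading of the interpretability claim in the abstract; the actual MPUM mechanism may differ.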