A unified deep learning framework for cross-platform harmonization of multi-tracer PET quantification

October 22, 2025 · medRxiv preprint

Authors

Wang, J., Zhong, A., Xu, Q., Huang, H., Zhu, Y., Lu, J., Wang, M., Jiang, J., Li, C., Ni, M., Sun, K., Guan, Y., Lu, J., Tian, M., Shen, D., Zhang, H., Wang, Q., Zuo, C.

Affiliations (1)

  • Department of Nuclear Medicine/PET center, Huashan Hospital, Fudan University, Shanghai, China

Abstract

Quantitative PET underpins diagnosis and treatment monitoring in neurodegenerative disease, yet systematic biases between PET-MRI and PET-CT preclude threshold transfer and cross-site comparability. We present a unified, anatomically guided deep-learning framework that harmonizes multi-tracer PET-MRI to PET-CT. The model learns CT-anchored attenuation representations with a Vision Transformer autoencoder, aligns MRI features to CT space via contrastive objectives, and performs attention-guided residual correction. In paired same-day scans (N = 70; amyloid, tau, FDG), cross-platform bias fell by >80% while preserving inter-regional biological topology. The framework generalized zero-shot to held-out tracers (18F-florbetapir and 18F-FP-CIT) without retraining. Multicentre validation (N = 420; three sites, four vendors) reduced amyloid Centiloid discrepancies from 23.6 to 4.1 (within PET-CT test-retest precision) and aligned tau SUVR thresholds. These results enable platform-agnostic diagnostic cutoffs and reliable longitudinal monitoring when patients transition between modalities, establishing a practical route to scalable, radiation-sparing quantitative PET in therapeutic workflows.
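The abstract outlines three architectural steps: CT-anchored attenuation encoding, contrastive alignment of MRI features to CT space, and attention-guided residual correction. As a rough illustration of the latter two ideas, the PyTorch sketch below uses small convolutional encoders in place of the paper's Vision Transformer autoencoder; every module name, shape, and hyperparameter here is an illustrative assumption, not the authors' implementation.

```python
# Conceptual sketch only: module names, shapes, and hyperparameters are
# illustrative assumptions, NOT the authors' released code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PatchEncoder(nn.Module):
    """Toy stand-in for a ViT-style encoder: maps a 3D patch to a unit-norm embedding."""
    def __init__(self, in_ch: int = 1, dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(in_ch, 32, 3, stride=2, padding=1), nn.GELU(),
            nn.Conv3d(32, 64, 3, stride=2, padding=1), nn.GELU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(), nn.Linear(64, dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.normalize(self.net(x), dim=-1)

def contrastive_alignment_loss(z_mri: torch.Tensor, z_ct: torch.Tensor,
                               temperature: float = 0.07) -> torch.Tensor:
    """InfoNCE-style objective: each MRI embedding should match its paired CT embedding."""
    logits = z_mri @ z_ct.t() / temperature                       # (B, B) similarities
    targets = torch.arange(z_mri.size(0), device=logits.device)   # positives on diagonal
    return F.cross_entropy(logits, targets)

class AttentionResidualCorrector(nn.Module):
    """Adds an attention-gated residual to a PET-MRI volume (additive bias correction)."""
    def __init__(self, in_ch: int = 1):
        super().__init__()
        self.residual = nn.Conv3d(in_ch, in_ch, 3, padding=1)
        self.gate = nn.Sequential(nn.Conv3d(in_ch, 1, 3, padding=1), nn.Sigmoid())

    def forward(self, pet_mri: torch.Tensor) -> torch.Tensor:
        return pet_mri + self.gate(pet_mri) * self.residual(pet_mri)

# Smoke test on random tensors standing in for paired same-day patches.
mri_enc, ct_enc = PatchEncoder(), PatchEncoder()
mri = torch.randn(4, 1, 32, 32, 32)   # batch of 4 MRI-derived patches
ct = torch.randn(4, 1, 32, 32, 32)    # paired CT patches
loss = contrastive_alignment_loss(mri_enc(mri), ct_enc(ct))
corrected = AttentionResidualCorrector()(torch.randn(4, 1, 32, 32, 32))
print(loss.item(), corrected.shape)   # scalar loss, torch.Size([4, 1, 32, 32, 32])
```

For context on the reported numbers, the Centiloid scale is the standard 0-100 linear rescaling of amyloid SUVR, CL = 100 × (SUVR − SUVR_YC) / (SUVR_AD − SUVR_YC), anchored to young-control and typical-AD reference cohorts, so reducing cross-platform discrepancies from 23.6 to 4.1 CL brings them within the test-retest noise the abstract cites.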

Topics

neurology
