
UTMorph: A hybrid CNN-transformer network for weakly-supervised multimodal image registration in biopsy puncture.

January 10, 2026

Authors

Guo X, Chen P, Wang H, Yan Z, Jiang Q, Wang R, Bin J

Affiliations (7)

  • School of Health Science and Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China. Electronic address: [email protected].
  • School of Health Science and Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China.
  • Department of Urology, Shanghai East Hospital, Tongji University School of Medicine, China. Electronic address: [email protected].
  • Department of Urology, Shanghai East Hospital, Tongji University School of Medicine, China.
  • Department of Information Management, Shanghai East Hospital, Tongji University School of Medicine, Shanghai 200120, China.
  • Huzhou Key Laboratory of Precise Diagnosis and Treatment of Urinary Tumor, Huzhou 313000, China.
  • School of Health Science and Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China; Department of Radiology, Shanghai East Hospital, Tongji University School of Medicine, Shanghai 200120, China.

Abstract

Accurate registration of preoperative magnetic resonance imaging (MRI) and intraoperative ultrasound (US) images is essential for precise biopsy punctures and targeted ablation procedures using robotic systems. To improve the speed and accuracy of registration while accounting for soft-tissue deformation during puncture, we propose UTMorph, a hybrid framework that combines a convolutional neural network (CNN) with a Transformer on a U-Net backbone, enabling efficient deformable multimodal image registration. We introduce a novel attention mechanism that focuses on the structured features of images, ensuring precise deformation estimation while reducing computational complexity. In addition, we propose a hybrid edge loss function that supplements shape and boundary information, further improving registration accuracy. Experiments were conducted on data from 704 patients, comprising private datasets from Shanghai East Hospital, public datasets from The Cancer Imaging Archive, and the µ-ProReg Challenge. UTMorph was compared with six commonly used registration methods and loss functions. It achieved superior performance across multiple evaluation metrics (Dice similarity coefficient: 0.890, 95th percentile Hausdorff distance: 2.679 mm, mean surface distance: 0.284 mm, Jacobian determinant: 0.040) and maintained accurate registration with minimal memory usage, even under large modality differences. These findings validate the effectiveness of UTMorph with the hybrid edge loss function for MR-US deformable medical image registration. The code is available at https://github.com/Prps7/UTMorph.
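The abstract does not give the exact formulation of the hybrid edge loss, so the sketch below is a minimal, hypothetical PyTorch rendering of one common way to combine an intensity-similarity term, an edge-map (boundary) term, and weak segmentation supervision. The function names, the weights alpha and beta, and the finite-difference edge operator are all illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def gradient_edges_3d(vol):
    """Approximate 3D gradient-magnitude edge map via central differences.

    vol: (B, 1, D, H, W) tensor. Central differences stand in for a full
    Sobel-style kernel to keep the sketch short.
    """
    dz = vol[:, :, 2:, :, :] - vol[:, :, :-2, :, :]
    dy = vol[:, :, :, 2:, :] - vol[:, :, :, :-2, :]
    dx = vol[:, :, :, :, 2:] - vol[:, :, :, :, :-2]
    # Pad each component back to the input shape so the three align.
    dz = F.pad(dz, (0, 0, 0, 0, 1, 1))
    dy = F.pad(dy, (0, 0, 1, 1, 0, 0))
    dx = F.pad(dx, (1, 1, 0, 0, 0, 0))
    return torch.sqrt(dx ** 2 + dy ** 2 + dz ** 2 + 1e-8)

def hybrid_edge_loss(warped, fixed, warped_seg, fixed_seg,
                     alpha=0.5, beta=1.0):
    """Hypothetical hybrid loss: intensity similarity + edge-map
    agreement + weak (Dice) supervision from organ masks."""
    sim = F.mse_loss(warped, fixed)                       # intensity term
    edge = F.l1_loss(gradient_edges_3d(warped),
                     gradient_edges_3d(fixed))            # boundary term
    inter = (warped_seg * fixed_seg).sum()
    dice = 1 - (2 * inter + 1e-5) / (warped_seg.sum()
                                     + fixed_seg.sum() + 1e-5)
    return sim + alpha * edge + beta * dice
```

In a weakly-supervised MR-US setting like the one described, a Dice term on warped organ masks typically supplies the cross-modality supervision that raw intensity similarity alone cannot, while the edge term encourages agreement on shape and boundary structure.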

Topics

Journal Article
