MoE-Morph: Lightweight Pyramid Model with Heterogeneous Mixture of Experts for Deformable Medical Image Registration.

Authors

Lin H, Song Y, Su Y, Ma Y

Abstract

Deformable image registration aims to achieve nonlinear alignment of image spaces by estimating dense displacement fields. It is widely used in clinical tasks such as surgical planning, assisted diagnosis, and surgical navigation. While efficient, deep learning registration methods often struggle with large, complex displacements. Pyramid-based approaches address this with a coarse-to-fine strategy, but their single-feature processing can lead to error accumulation. In this paper, we introduce a dense Mixture of Experts (MoE) pyramid registration model that uses routing schemes and multiple heterogeneous experts to increase the width and flexibility of feature processing within a single layer. The collaboration among heterogeneous experts enables the model to retain more precise details and maintain greater feature freedom when handling complex displacements. We use deformation fields alone as the information passed between pyramid levels; these inter-level deformation-field interactions encourage the model to focus on feature-location matching and to register in the correct direction. We do not use complex mechanisms such as attention or ViT modules, keeping the model in its simplest form. Its strong deformation capability allows the model to register volumes directly and accurately without prior affine registration. Experimental results show that the model achieves outstanding performance across four public datasets, including brain registration, lung registration, and abdominal multi-modal registration. The code will be published at https://github.com/Darlinglinlinlin/MOE_Morph.
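The core idea of a dense MoE layer with heterogeneous experts can be illustrated with a minimal sketch: a learned gate assigns softmax scores to several structurally different experts, and every expert's output is blended by its score rather than hard-selected. The expert functions and gating scheme below are hypothetical illustrations, not the paper's implementation.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical heterogeneous experts: each transforms the same feature
# vector in a structurally different way (stand-ins for the paper's
# heterogeneous expert branches).
def expert_identity(f):
    return f

def expert_smooth(f):
    # Local averaging, a crude analogue of a larger receptive field.
    return np.convolve(f, np.ones(3) / 3.0, mode="same")

def expert_scale(f):
    return 0.5 * f

EXPERTS = [expert_identity, expert_smooth, expert_scale]

def dense_moe_layer(features, gate_weights):
    """Route one feature vector through ALL experts (dense MoE) and
    return the gate-weighted combination of their outputs."""
    scores = softmax(features @ gate_weights)            # (num_experts,)
    outputs = np.stack([e(features) for e in EXPERTS])   # (num_experts, D)
    return scores @ outputs                              # (D,)

rng = np.random.default_rng(0)
feat = rng.standard_normal(8)                # toy feature vector
gate = rng.standard_normal((8, len(EXPERTS)))  # toy gating matrix
out = dense_moe_layer(feat, gate)
print(out.shape)  # → (8,)
```

Because the gating is dense (all experts contribute, weighted by score), the combination stays differentiable end to end, which is what lets heterogeneous experts cooperate within a single pyramid level.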

Topics

Journal Article
