Large-Scale Multimodality via Dual-Path Cooperative Feature Fusion Strategy for Medical Image Segmentation.

February 25, 2026

Authors

Tan D, Wang X, Su Y, Xia J, Zheng C, Zhong W

Abstract

Convolutional Neural Networks struggle to model long-range dependencies in medical image segmentation, while traditional Transformer models rely on Multi-Layer Perceptrons (MLPs) for channel mixing, whose performance degrades as data dimensionality grows. These issues prompt a reassessment of model design to improve segmentation performance and capture long-range dependencies effectively. Consequently, this study presents Kadformer, a novel network optimized for fine-grained multi-organ segmentation. Kadformer adopts a U-shaped architecture that enhances the extraction of spatial and channel features in the encoder through the KAN-Enhanced Multi-Dimensional Attention (KMA) mechanism, effectively compensating for information loss during downsampling. We design a Dynamic Path Selection (DPS) strategy to mitigate the feature-extraction discrepancies that linear attention encounters on category-sparse versus category-dense images, while enhancing feature discrimination through the long-range sequence model Mamba. Furthermore, we construct the Data Interaction (DAI) module to guide channel and spatial information filtering in the dual-path encoder and to integrate the semantically inconsistent features produced by the KMA and DPS modules. Our approach reduces parameter count by more than 30% compared to state-of-the-art methods, and Kadformer outperforms existing segmentation methods on six public datasets. The code is available on GitHub: https://github.com/wxc9927/Kadformer.
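To make the dual-path idea concrete, the sketch below shows a generic channel-attention path and spatial-attention path fused by a gated sum. This is a minimal NumPy illustration of dual-path cooperative feature fusion in general, not the authors' KMA/DPS/DAI implementation (see the linked GitHub repository for that); all function names and the fixed gate `alpha` are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def channel_attention(feat):
    # feat: (C, H, W). Global-average-pool per channel, then reweight channels.
    w = softmax(feat.mean(axis=(1, 2)))                              # (C,)
    return feat * w[:, None, None]

def spatial_attention(feat):
    # Average across channels, then reweight each spatial position.
    w = softmax(feat.mean(axis=0).ravel()).reshape(feat.shape[1:])   # (H, W)
    return feat * w[None, :, :]

def dual_path_fusion(feat, alpha=0.5):
    # Gated sum of the two paths; a learned, data-dependent gate would
    # replace the fixed scalar in a real model.
    return alpha * channel_attention(feat) + (1 - alpha) * spatial_attention(feat)

feat = np.random.default_rng(0).standard_normal((4, 8, 8))  # (C, H, W)
fused = dual_path_fusion(feat)
print(fused.shape)
```

The fused tensor keeps the input shape, so such a block can be dropped into any encoder stage of a U-shaped network.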

Topics

Journal Article
