A multi-dimensional lightweight attention-enhanced model for medical image segmentation.

December 10, 2025 · PubMed

Authors

Shang M, Kim P

Affiliations (2)

  • Department of Computer Engineering, Silla University, Busan, 46958, South Korea.
  • Department of Computer Engineering, Silla University, Busan, 46958, South Korea. [email protected].

Abstract

Medical image segmentation is crucial in medical imaging analysis: based on grayscale, texture, and structural features, it precisely partitions an image into semantic subregions to enable accurate localization and analysis of lesions or anatomical structures. In recent years, deep learning methods, especially convolutional neural networks (CNNs), have achieved remarkable progress in this field; however, fixed kernel sizes and limited receptive fields hinder adequate modeling of global dependencies. Although multi-branch designs or global perception based on vision Transformers can partially alleviate this issue, they often increase computational complexity and are therefore limited in resource-constrained medical scenarios. To address this, we propose a multi-dimensional, lightweight, attention-enhanced medical image segmentation model. Our model integrates omni-dimensional dynamic convolution and mask attention: it performs adaptive modeling across the spatial, channel, and kernel-number dimensions to significantly expand the effective receptive field, and it employs binary masks to suppress background interference and focus on critical boundaries, at lower computational cost than global self-attention. Evaluated on three public benchmark datasets, our model achieves segmentation accuracy and inference efficiency superior to or comparable with current mainstream methods, demonstrating its potential applicability and practical value in clinical settings.
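The core idea of omni-dimensional dynamic convolution described in the abstract (attention applied jointly along the spatial, channel, and kernel-number dimensions of a bank of candidate kernels) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the projection matrices `w_kernel`, `w_channel`, and `w_spatial`, the pooling choice, and the sigmoid/softmax split are illustrative assumptions modeled on the general ODConv idea.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def od_dynamic_kernel(x, kernels, w_kernel, w_channel, w_spatial):
    """Aggregate K candidate kernels into one input-conditioned kernel.

    x        : (C, H, W) input feature map
    kernels  : (K, C, kH, kW) bank of candidate convolution kernels
    w_*      : hypothetical projections mapping the pooled descriptor to
               per-kernel / per-channel / per-position attention logits
    """
    # Global average pooling -> (C,) descriptor summarizing the input
    desc = x.mean(axis=(1, 2))
    K, C, kH, kW = kernels.shape
    # Attention over the kernel-number dimension (softmax, sums to 1)
    a_kernel = softmax(w_kernel @ desc)                  # (K,)
    # Sigmoid attentions over the channel and spatial kernel dimensions
    a_channel = 1 / (1 + np.exp(-(w_channel @ desc)))    # (C,)
    a_spatial = 1 / (1 + np.exp(-(w_spatial @ desc)))    # (kH*kW,)
    a_spatial = a_spatial.reshape(kH, kW)
    # Reweight each candidate kernel along all three dimensions, then sum
    weighted = (kernels
                * a_kernel[:, None, None, None]
                * a_channel[None, :, None, None]
                * a_spatial[None, None, :, :])
    return weighted.sum(axis=0)                          # (C, kH, kW)

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16, 16))          # 8-channel feature map
kernels = rng.standard_normal((4, 8, 3, 3))   # K=4 candidate 3x3 kernels
w_kernel = rng.standard_normal((4, 8))
w_channel = rng.standard_normal((8, 8))
w_spatial = rng.standard_normal((9, 8))
k = od_dynamic_kernel(x, kernels, w_kernel, w_channel, w_spatial)
print(k.shape)  # (8, 3, 3)
```

Because the aggregated kernel is a small weighted sum conditioned on a pooled descriptor, the per-input cost is far below that of global self-attention, which is the lightweight trade-off the abstract highlights.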

Topics

Journal Article
