
Multi-modal abdominal multi-organ segmentation using the CADSTransN-Net model.

April 7, 2026

Authors

Sun P, Wu G, Tan Y, Mei L, Mo T, Chen Z

Affiliations (5)

  • School of Electronic Engineering and Automation, Guilin University of Electronic Technology, Guilin, Guangxi, China.
  • Guangxi Key Laboratory of Automatic Detecting Technology and Instruments (Guilin University of Electronic Technology), Guilin, China.
  • School of Life and Environmental Science, Guilin University of Electronic Technology, Guilin, Guangxi, China.
  • Guangxi Human Physiological Information Non-Invasive Detection Engineering Technology Research Center, Guilin, Guangxi, China.
  • Guangxi Colleges and Universities Key Laboratory of Biomedical Sensors and Intelligent Instruments, Guilin, Guangxi, China.

Abstract

Background: Deep learning has advanced medical image segmentation, but limited dataset diversity constrains its performance. The AMOS22 dataset addresses this by providing large-scale, varied clinical data to improve algorithm robustness.

Purpose: This study develops and validates CADSTransN-Net (Convolutional Attention and Deep Supervision TransN-Net) to optimize abdominal organ segmentation for the AMOS22 challenge.

Methods: CADSTransN-Net integrates three core innovations: a novel N-shaped feature flow path (departing from symmetric architectures for efficient encoder-decoder fusion), a convolutional attention mechanism (prioritizing anatomically relevant regions), and layer-wise deep supervision (ensuring careful gradient propagation and faster convergence).

Results: Evaluated on the full AMOS22 dataset, CADSTransN-Net achieved strong overall performance: average Dice Similarity Coefficient (DSC) of 0.907, Normalized Surface Dice (NSD) of 0.850, 95th-percentile Hausdorff Distance (HD95) of 3.98 mm, Average Surface Distance (ASD) of 0.75 mm, Absolute Volumetric Difference (AVD) of 39,755.88 mm³, and Relative Volumetric Difference (RVD) of 1.53%. These metrics confirm its accuracy in region overlap, boundary consistency, and volume estimation for multi-modal abdominal multi-organ segmentation.

Conclusions: CADSTransN-Net effectively meets AMOS22's challenges, delivering robust performance across region, boundary, and volume metrics. It offers a reliable solution for multi-modal abdominal multi-organ segmentation, with clinical potential for tasks such as surgical navigation.
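The abstract's headline metric, the Dice Similarity Coefficient (DSC), measures voxel-wise overlap between a predicted and a ground-truth segmentation mask. As a minimal illustration (not the paper's evaluation pipeline, whose exact implementation is not given here), the standard DSC for binary masks can be sketched as:

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice Similarity Coefficient between two binary masks.

    DSC = 2*|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap,
    0.0 means no overlap. `eps` guards against division by zero
    when both masks are empty.
    """
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    return 2.0 * intersection / (pred.sum() + target.sum() + eps)

# Toy example: two 4x4 masks, 3 foreground pixels each, 2 shared.
pred = np.zeros((4, 4), dtype=bool)
target = np.zeros((4, 4), dtype=bool)
pred[0, :3] = True      # columns 0, 1, 2
target[0, 1:4] = True   # columns 1, 2, 3
print(round(dice_coefficient(pred, target), 3))  # → 0.667
```

In multi-organ evaluation such as AMOS22, this per-class score is typically computed for each organ label separately and then averaged, which is what an "average DSC of 0.907" refers to.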

Topics

Journal Article
