AIP-Net: an attention-integrated pyramid network for computer-aided diagnosis and segmentation of gastric lesion in ultrasound images.
Authors
Affiliations (4)
- College of Computer Science and Technology, Huaqiao University, Xiamen Campus, No. 668 Jimei Avenue, Jimei District, Xiamen, Fujian 361021, China.
- Huaqiao University, Xiamen Campus, No. 668 Jimei Avenue, Jimei District, Xiamen, Fujian 361021, China.
- Fujian Medical University Union Hospital, No. 29 Xinquan Road, Gulou District, Fuzhou, Fujian 350001, China.
- Fujian Gongtian Software Co., Ltd., Quanzhou, Fujian 362017, China. [email protected]
Abstract
Research on automatic gastric lesion segmentation in ultrasound is relatively scarce, despite its critical role in the early diagnosis and treatment of gastric cancer, the second leading cause of cancer-related deaths globally. The highly variable morphology of gastric lesions, combined with artifacts, blurred boundaries, and intensity inhomogeneity, makes accurate lesion segmentation particularly challenging, especially for clinicians with limited experience. To address these challenges, we propose a novel Attention-Integrated Pyramid Network (AIP-Net) to improve segmentation accuracy and assist clinical decision-making. In the encoder phase, the model integrates a Convolutional Neural Network (CNN) with a Lesion Boundary Detection (BD) module to enhance the focused extraction of lesion-specific features, and a connected mask is applied to capture complex and subtle directional information. In the decoder phase, segmentation performance is further refined through spatial feature fusion and channel processing. Experimental results demonstrate that our approach outperforms state-of-the-art segmentation methods on gastric cancer ultrasound datasets, particularly for lesions with unclear or ambiguous boundaries. Experiments on the breast ultrasound dataset further validate the generalization capability of the proposed method. Comparisons with annotations from clinicians of varying experience levels show that our method performs well, offering valuable diagnostic support, improving targeting accuracy, and facilitating precise diagnosis and treatment planning in gastric ultrasound examination.
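To make the high-level description above concrete, the sketch below shows a minimal attention-integrated, pyramid-style encoder-decoder in PyTorch. It is an illustrative assumption only: the module names (BoundaryAttention, AIPNetSketch), the use of a sigmoid spatial gate as a stand-in for the boundary-detection module, and all layer sizes are ours, not the authors' published architecture; the connected-mask and channel-processing components are omitted.

```python
# Hypothetical sketch of an attention-integrated pyramid encoder-decoder for
# lesion segmentation; names and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with batch norm and ReLU, as in a typical U-shaped encoder.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )

class BoundaryAttention(nn.Module):
    # Stand-in for the boundary-detection idea: a 1-channel spatial gate that
    # re-weights encoder features toward likely lesion boundaries.
    def __init__(self, ch):
        super().__init__()
        self.gate = nn.Sequential(nn.Conv2d(ch, 1, 1), nn.Sigmoid())

    def forward(self, x):
        return x * self.gate(x)

class AIPNetSketch(nn.Module):
    def __init__(self, in_ch=1, num_classes=1, base=32):
        super().__init__()
        self.enc1 = conv_block(in_ch, base)
        self.enc2 = conv_block(base, base * 2)
        self.enc3 = conv_block(base * 2, base * 4)
        self.attn = nn.ModuleList([BoundaryAttention(c) for c in (base, base * 2, base * 4)])
        self.pool = nn.MaxPool2d(2)
        # Pyramid-style decoder: upsample deep features and fuse with attended skips.
        self.dec2 = conv_block(base * 4 + base * 2, base * 2)
        self.dec1 = conv_block(base * 2 + base, base)
        self.head = nn.Conv2d(base, num_classes, 1)

    def forward(self, x):
        e1 = self.attn[0](self.enc1(x))
        e2 = self.attn[1](self.enc2(self.pool(e1)))
        e3 = self.attn[2](self.enc3(self.pool(e2)))
        up = lambda t: F.interpolate(t, scale_factor=2, mode="bilinear", align_corners=False)
        d2 = self.dec2(torch.cat([up(e3), e2], dim=1))
        d1 = self.dec1(torch.cat([up(d2), e1], dim=1))
        return torch.sigmoid(self.head(d1))  # per-pixel lesion probability map

if __name__ == "__main__":
    model = AIPNetSketch()
    mask = model(torch.randn(1, 1, 128, 128))  # e.g. a grayscale ultrasound patch
    print(mask.shape)  # torch.Size([1, 1, 128, 128])
```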