Improving 3D Thin Vessel Segmentation in Brain TOF-MRA via a Dual-space Context-Aware Network.

Authors

Shan W, Li X, Wang X, Li Q, Wang Z

Abstract

3D cerebrovascular segmentation poses a significant challenge, akin to locating a line within a vast 3D environment. This complexity can be substantially reduced by projecting the vessels onto a 2D plane, enabling easier segmentation. In this paper, we create a vessel-segmentation-friendly space using a clinical visualization technique called maximum intensity projection (MIP). Leveraging this, we propose a Dual-space Context-Aware Network (DCANet) for 3D vessel segmentation, designed to capture even the finest vessel structures accurately. DCANet begins by transforming a magnetic resonance angiography (MRA) volume into a 3D Regional-MIP volume, where each Regional-MIP slice is constructed by projecting adjacent MRA slices. This transformation highlights vessels as prominent continuous curves rather than the small circular or ellipsoidal cross-sections seen in MRA slices. DCANet encodes vessels separately in the MRA and the projected Regional-MIP spaces and introduces the Regional-MIP Image Fusion Block (MIFB) between these dual spaces to selectively integrate contextual features from Regional-MIP into MRA. Following dual-space encoding, DCANet employs a Dual-mask Spatial Guidance TransFormer (DSGFormer) decoder to focus on vessel regions while effectively excluding background areas, which reduces the learning burden and improves segmentation accuracy. We benchmark DCANet on four datasets: two public datasets, TubeTK and IXI-IOP, and two in-house datasets, Xiehe and IXI-HH. The results demonstrate that DCANet achieves superior performance, with improvements in average DSC values of at least 2.26%, 2.17%, 2.62%, and 2.58% for thin vessels, respectively. Codes are available at: https://github.com/shanwq/DCANet.
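The Regional-MIP transform described above can be sketched in a few lines of NumPy: each output slice is the maximum intensity projection over a window of adjacent input slices. This is a minimal illustration of the idea only; the window size of 5 and the function name `regional_mip` are assumptions, not taken from the paper.

```python
import numpy as np

def regional_mip(volume: np.ndarray, window: int = 5) -> np.ndarray:
    """Build a Regional-MIP volume from an MRA volume.

    Each output slice z is the voxel-wise maximum over the `window`
    adjacent input slices centered on z (clipped at the boundaries),
    so vessels appear as continuous curves rather than small
    circular cross-sections. Window size is an assumed parameter.
    """
    depth = volume.shape[0]
    half = window // 2
    out = np.empty_like(volume)
    for z in range(depth):
        lo = max(0, z - half)          # clip window at the first slice
        hi = min(depth, z + half + 1)  # clip window at the last slice
        out[z] = volume[lo:hi].max(axis=0)  # per-voxel max projection
    return out
```

A bright vessel voxel in one MRA slice will then appear in every Regional-MIP slice whose window covers it, which is what lets thin vessels show up as prominent continuous curves in the projected space.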

Topics

Journal Article
