DRIPS: Domain Randomisation for Image-based Perivascular Spaces Segmentation
Authors
Affiliations (1)
- Cerebrovascular Imaging and Research Lab, Department of Artificial Intelligence in Biomedical Engineering (AIBE), Friedrich-Alexander-Universität Erlangen-Nürnberg
Abstract
Perivascular spaces (PVS) are emerging as sensitive imaging markers of brain health. However, out-of-sample PVS segmentation remains challenging, as existing methods are modality-specific or require cohort-specific tuning. We propose DRIPS (Domain Randomisation for Image-based PVS Segmentation), a physics-inspired domain randomisation framework for out-of-sample PVS segmentation. We tested DRIPS out-of-sample on real imaging data from five cohorts comprising individuals with diverse health conditions (N=165; T1w and T2w, isotropic and anisotropic imaging) and on a 3D ex vivo brain model reconstructed from histology. We evaluated its segmentations against manual PVS segmentations using the area under the precision-recall curve (AUPRC) and the Dice similarity coefficient (DSC), and compared it with both classical and deep learning methods, namely Frangi, RORPO, SHIVA-PVS, and nnU-Net. Only DRIPS and Frangi achieved AUPRC values above chance level across all cohorts and the ex vivo brain model. On isotropic images, DRIPS and nnU-Net performed comparably, surpassing the third-best method by a median of +0.17–0.39 AUPRC and +0.09–0.26 DSC. On anisotropic images, DRIPS outperformed all competitors, exceeding the second-best method by a median of +0.13–0.22 AUPRC and +0.07–0.14 DSC. Unlike the other methods, the performance of DRIPS was not associated with the volume of white matter hyperintensities. These results position DRIPS as an effective method for PVS segmentation that generalises well across heterogeneous imaging settings without the need for manually labelled data, modality-specific models, or cohort-specific tuning.
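As a point of reference for the reported metrics, the standard voxel-wise formulations are assumed here (the notation P, G, and r is introduced only for this note and does not appear in the abstract): for a binarised prediction P and a manual reference segmentation G,

\[
\mathrm{DSC}(P, G) = \frac{2\,\lvert P \cap G \rvert}{\lvert P \rvert + \lvert G \rvert},
\qquad
\mathrm{AUPRC} = \int_{0}^{1} \mathrm{Precision}(r)\, \mathrm{d}r,
\]

where the precision-recall curve is traced by sweeping a threshold over the method's voxel-wise output and r denotes recall. Under this formulation, the chance level of the AUPRC equals the prevalence of PVS voxels in the reference segmentation, which is the sense in which "above chance level" is used above.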