Perivascular space identification nnUNet for generalised usage (PINGU).

December 4, 2025 · PubMed

Authors

Sinclair B, Pham W, Vivash L, Moses J, Lynch M, Dorfman K, Marotta C, Koh S, Bunyamin J, Rowsthorn E, Jarema A, Peiris H, Chen Z, Shultz SR, Wright DK, Kong D, Naismith SL, O'Brien TJ, Law M

Affiliations (9)

  • Department of Neuroscience, The School of Translational Medicine, Monash University, Melbourne, Australia. Electronic address: [email protected].
  • Department of Neuroscience, The School of Translational Medicine, Monash University, Melbourne, Australia.
  • Department of Neuroscience, The School of Translational Medicine, Monash University, Melbourne, Australia; Department of Neurology, Alfred Health, Melbourne, Australia; Department of Medicine, The Royal Melbourne Hospital, The University of Melbourne, Parkville, Australia.
  • Department of Radiology, Alfred Health, Melbourne, Australia.
  • Monash Biomedical Imaging, Monash University, Melbourne, Australia; Department of Data Science and AI, Monash University, Melbourne, Australia.
  • Department of Neuroscience, The School of Translational Medicine, Monash University, Melbourne, Australia; Department of Neurology, Alfred Health, Melbourne, Australia; Centre for Trauma and Mental Health, Vancouver Island University, Nanaimo, Canada.
  • School of Psychology, Faculty of Science, University of Sydney, Sydney, Australia; Healthy Brain Ageing Program, Brain and Mind Centre, University of Sydney, Camperdown, NSW 2050, Australia; Charles Perkins Centre, University of Sydney, Camperdown, NSW 2050, Australia.
  • Department of Neuroscience, The School of Translational Medicine, Monash University, Melbourne, Australia; Department of Neurology, Alfred Health, Melbourne, Australia.
  • Department of Neuroscience, The School of Translational Medicine, Monash University, Melbourne, Australia; Department of Radiology, Alfred Health, Melbourne, Australia.

Abstract

Perivascular spaces (PVSs) form a central component of the brain's waste clearance system, the glymphatic system. These structures are visible on MRI when enlarged, and their morphology is associated with aging and neurological disease. Manual quantification of PVSs is time-consuming and subjective, and numerous deep learning methods have been developed for automated PVS segmentation. However, the majority of these algorithms have been developed and evaluated on homogeneous datasets and high-resolution scans, potentially limiting their applicability to the wide range of image qualities acquired in clinical and research settings. In this work we train an nnUNet, a top-performing, self-configuring deep learning algorithm for biomedical image segmentation, on a heterogeneous training sample of manually segmented MRIs spanning a range of qualities and resolutions, drawn from 7 datasets acquired on 6 different scanners. The resulting model is compared with the two currently publicly available deep learning methods for 3D PVS segmentation, evaluated on scans with a range of resolutions and qualities. Our model, PINGU (Perivascular space Identification Nnunet for Generalised Usage), achieved voxel- and cluster-level Dice scores of 0.50 (SD = 0.15) and 0.63 (0.17) in the white matter (WM), and 0.54 (0.11) and 0.66 (0.17) in the basal ganglia (BG). Performance on data from unseen "external" sites was substantially lower for both PINGU (0.20-0.38 [WM, voxel], 0.29-0.58 [WM, cluster], 0.22-0.36 [BG, voxel], 0.46-0.60 [BG, cluster]) and the publicly available algorithms (0.18-0.30 [WM, voxel], 0.29-0.38 [WM, cluster], 0.10-0.20 [BG, voxel], 0.15-0.37 [BG, cluster]). Nonetheless, PINGU strongly outperformed the publicly available algorithms, particularly in the BG. PINGU stands out as a broad-use PVS segmentation tool, with particular strength in the BG, a region where PVSs are strongly related to vascular disease and pathology.
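
The abstract reports both voxel-level and cluster-level Dice scores. Below is a minimal sketch of how these two metrics could be computed from binary 3D segmentation masks. The abstract does not specify the exact cluster-matching rule, so this sketch assumes 26-connectivity for defining PVS clusters and counts a cluster as detected if it overlaps the other mask by at least one voxel; the function names (`voxel_dice`, `cluster_dice`) are illustrative, not from the paper.

```python
import numpy as np
from scipy import ndimage


def voxel_dice(pred: np.ndarray, truth: np.ndarray) -> float:
    """Voxel-level Dice: 2*|P & T| / (|P| + |T|)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, truth).sum() / denom


def cluster_dice(pred: np.ndarray, truth: np.ndarray) -> float:
    """Cluster-level Dice over connected components.

    Assumed matching rule: a component counts as detected if it
    shares at least one voxel with the other mask. The paper's
    exact criterion may differ.
    """
    pred, truth = pred.astype(bool), truth.astype(bool)
    structure = ndimage.generate_binary_structure(3, 3)  # 26-connectivity
    pred_lab, n_pred = ndimage.label(pred, structure=structure)
    truth_lab, n_truth = ndimage.label(truth, structure=structure)
    if n_pred + n_truth == 0:
        return 1.0
    # Predicted components that touch the ground truth, and vice versa
    # (label 0 is background, so it is excluded from the counts).
    pred_hits = np.setdiff1d(np.unique(pred_lab[truth]), [0]).size
    truth_hits = np.setdiff1d(np.unique(truth_lab[pred]), [0]).size
    return (pred_hits + truth_hits) / (n_pred + n_truth)
```

Both functions expect same-shaped binary masks, e.g. the WM or BG region of a predicted segmentation against its manual reference; the cluster-level score is typically higher than the voxel-level one because it rewards detecting a PVS at all, without penalising imperfect boundary agreement.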

Topics

Journal Article
