Multilevel Correlation-aware and Modal-aware Graph Convolutional Network for Diagnosing Neurodevelopmental Disorders
Abstract
Graph-based methods using resting-state functional magnetic resonance imaging (rs-fMRI) demonstrate strong capabilities in modeling brain networks. However, existing graph-based methods often overlook inter-graph relationships, limiting their ability to capture the intrinsic features shared across individuals. Moreover, their simplistic integration strategies may fail to exploit multimodal information fully. To address these challenges, this paper proposes a Multilevel Correlation-aware and Modal-aware Graph Convolutional Network (MCM-GCN) for the reliable diagnosis of neurodevelopmental disorders. At the individual level, we design a correlation-driven feature generation module that incorporates a pooling layer with external graph attention to perceive inter-graph correlations, generating discriminative brain embeddings and identifying disease-related regions. At the population level, to deeply integrate multimodal and multi-atlas information, a multimodal-decoupled feature enhancement module learns unique and shared embeddings from brain graphs and phenotypic data and then fuses them adaptively with graph channel attention for reliable disease classification. Extensive experiments on two public datasets for Autism Spectrum Disorder (ASD) and Attention Deficit Hyperactivity Disorder (ADHD) show that MCM-GCN outperforms competing methods, achieving an accuracy of 92.88% for ASD and 76.55% for ADHD. By integrating individual-level and population-level analyses, the MCM-GCN framework offers a comprehensive perspective on neurodevelopmental disorder diagnosis, significantly improving diagnostic accuracy while identifying key disease indicators. These findings highlight the potential of MCM-GCN for imaging-assisted diagnosis of neurodevelopmental disorders and advance interpretable deep learning in medical image analysis.
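To make the adaptive fusion step concrete, the following is a minimal numerical sketch of channel-attention-style fusion of an imaging embedding with a phenotypic embedding. The function name, tensor shapes, and the softmax-over-modalities weighting are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=0):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def channel_attention_fusion(z_img, z_phe, scores):
    """Fuse two modality embeddings with per-channel attention weights.

    z_img, z_phe: (n_subjects, d) embeddings (hypothetical shapes)
    scores: (2, d) learnable per-modality, per-channel scores (assumed)
    """
    alpha = softmax(scores, axis=0)      # weights for the two modalities sum to 1 per channel
    return alpha[0] * z_img + alpha[1] * z_phe

rng = np.random.default_rng(0)
z_img = rng.standard_normal((4, 8))      # imaging-derived embeddings
z_phe = rng.standard_normal((4, 8))      # phenotypic embeddings
scores = np.zeros((2, 8))                # equal scores -> fusion reduces to a simple average
fused = channel_attention_fusion(z_img, z_phe, scores)
```

In practice the scores would be produced by a learned attention network rather than fixed; with equal scores the fusion degenerates to averaging, which makes the role of the learned weights easy to see.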