Integrating Time and Frequency Domain Features of fMRI Time Series for Alzheimer's Disease Classification Using Graph Neural Networks.

Authors

Peng W, Li C, Ma Y, Dai W, Fu D, Liu L, Liu L, Yu N, Liu J

Affiliations (6)

  • Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming, 650050, China. [email protected].
  • Computer Technology Application Key Lab of Yunnan Province, Kunming University of Science and Technology, Kunming, 650050, China. [email protected].
  • Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming, 650050, China.
  • Computer Technology Application Key Lab of Yunnan Province, Kunming University of Science and Technology, Kunming, 650050, China.
  • Department of Computing Sciences, the College at Brockport, State University of New York, Brockport, 14422, USA.
  • Hunan Provincial Key Lab on Bioinformatics, School of Computer Science and Engineering, Central South University, Changsha, 410083, China.

Abstract

Accurate and early diagnosis of Alzheimer's Disease (AD) is crucial for timely intervention and treatment advancement. Functional Magnetic Resonance Imaging (fMRI), which measures changes in the brain's blood-oxygen levels over time, is a powerful tool for AD diagnosis. However, current fMRI-based AD diagnosis methods rely on noise-susceptible time-domain features and focus only on synchronous brain-region interactions within the same time phase, neglecting asynchronous ones. To overcome these issues, we propose the Frequency-Time Fusion Graph Neural Network (FTF-GNN). It integrates frequency- and time-domain features for robust AD classification, considering both asynchronous and synchronous brain-region interactions. First, we construct a fully connected hypervariate graph in which each node represents a brain region's Blood Oxygen Level-Dependent (BOLD) value at a single time-series point. A Discrete Fourier Transform (DFT) transforms these BOLD values from the spatial domain to the frequency domain for frequency-component analysis. Second, a Fourier-based Graph Neural Network (FourierGNN) processes the frequency features to capture asynchronous brain-region connectivity patterns. Third, these features are converted back to the time domain and reshaped into a matrix whose rows represent brain regions and whose columns hold their frequency-derived features at each time point. Each brain region then fuses its frequency-derived features with a position encoding along the time series, preserving temporal and spatial information. Next, we build a brain-region network based on synchronous BOLD-value associations and feed this network, together with the fused features, into a Graph Convolutional Network (GCN) to capture synchronous brain-region connectivity patterns. Finally, a fully connected network classifies the brain-region features. Experiments on the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset demonstrate the method's effectiveness. Our model achieves 91.26% accuracy and 96.79% AUC in AD versus Normal Control (NC) classification, showing promising performance. For early-stage detection, it attains state-of-the-art performance in distinguishing NC from Late Mild Cognitive Impairment (LMCI), with 87.16% accuracy and 93.22% AUC. Notably, in the challenging task of differentiating LMCI from AD, FTF-GNN achieves the best performance (85.30% accuracy, 94.56% AUC), while also delivering competitive results (77.40% accuracy, 91.17% AUC) in distinguishing Early MCI (EMCI) from LMCI, the most clinically complex subtype classification. These results indicate that leveraging complementary frequency- and time-domain information, and considering both asynchronous and synchronous brain-region interactions, can address the limitations of existing approaches and offer a robust neuroimaging-based diagnostic solution.
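To make the frequency-domain branch concrete, the following minimal PyTorch sketch maps BOLD series to the frequency domain with a real FFT, mixes brain regions within each frequency bin using complex-valued weights as a single-layer stand-in for the paper's FourierGNN stack, and maps the result back to the time domain. The class name FrequencyBranch, the single-layer design, and all tensor shapes are illustrative assumptions, not the authors' released code.

    # Minimal sketch of the frequency-domain (asynchronous) branch.
    # All names and shapes are hypothetical illustrations.
    import torch
    import torch.nn as nn

    class FrequencyBranch(nn.Module):
        """DFT -> complex-weight mixing per frequency -> inverse DFT."""

        def __init__(self, num_regions: int, num_timepoints: int):
            super().__init__()
            self.num_freqs = num_timepoints // 2 + 1  # rFFT output length
            # Complex weights applied per frequency component (assumption:
            # a one-layer stand-in for the paper's FourierGNN stack).
            self.weight = nn.Parameter(
                0.02 * torch.randn(self.num_freqs, num_regions, num_regions,
                                   dtype=torch.cfloat))
            self.bias = nn.Parameter(
                torch.zeros(self.num_freqs, num_regions, dtype=torch.cfloat))

        def forward(self, bold: torch.Tensor) -> torch.Tensor:
            # bold: (batch, regions, timepoints) BOLD series per brain region.
            freq = torch.fft.rfft(bold, dim=-1)      # (B, R, F), complex
            freq = freq.permute(0, 2, 1)             # (B, F, R)
            # Mixing regions within one frequency bin couples all time lags
            # at once, which captures asynchronous interactions.
            mixed = torch.einsum("bfr,frs->bfs", freq, self.weight) + self.bias
            mixed = mixed.permute(0, 2, 1)           # (B, R, F)
            # Back to the time domain at the original series length.
            return torch.fft.irfft(mixed, n=bold.shape[-1], dim=-1)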
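The synchronous branch, a brain-region network built from BOLD-value associations plus a GCN over position-encoded features and a classifier head, might look like the sketch below. The Pearson-correlation adjacency, its threshold, the learned positional encoding, and the names build_adjacency and SynchronousGCN are all hypothetical choices for illustration.

    # Sketch of the synchronous branch: correlation graph + GCN + classifier.
    # Names, threshold, and layer sizes are assumptions, not the paper's code.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def build_adjacency(bold: torch.Tensor, threshold: float = 0.3) -> torch.Tensor:
        # bold: (regions, timepoints). Pearson correlation as edge weights,
        # thresholded to keep only stronger synchronous associations.
        corr = torch.corrcoef(bold)
        adj = torch.where(corr.abs() >= threshold, corr, torch.zeros_like(corr))
        adj.fill_diagonal_(1.0)                       # self-loops
        deg = adj.abs().sum(dim=-1)
        d_inv_sqrt = deg.clamp(min=1e-6).pow(-0.5)
        return d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]  # sym. norm.

    class SynchronousGCN(nn.Module):
        def __init__(self, num_timepoints: int, hidden: int = 64,
                     num_classes: int = 2):
            super().__init__()
            # Learned positional encoding over the time axis (assumption:
            # fused with each region's frequency-derived features).
            self.pos = nn.Parameter(torch.zeros(1, 1, num_timepoints))
            self.gcn = nn.Linear(num_timepoints, hidden)
            self.head = nn.Linear(hidden, num_classes)

        def forward(self, feats: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
            # feats: (batch, regions, timepoints), e.g. FrequencyBranch output.
            x = feats + self.pos                      # fuse temporal positions
            x = F.relu(self.gcn(adj @ x))             # one GCN propagation step
            x = x.mean(dim=1)                         # pool over brain regions
            return self.head(x)                       # class logits

Chaining the two sketches, feeding FrequencyBranch outputs and a build_adjacency graph into SynchronousGCN, mirrors the fusion order the abstract describes; in the actual FTF-GNN, deeper FourierGNN and GCN stages would replace these single-layer stand-ins.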

Topics

Journal Article
