Exploring transfer learning techniques for classifying Alzheimer's disease with rs-fMRI.
Authors
Affiliations (3)
- Department of Industrial Engineering, Faculty of Engineering, Alzahra University, Tehran, Iran. Electronic address: [email protected].
- Department of Industrial Engineering, Faculty of Engineering, Alzahra University, Tehran, Iran. Electronic address: [email protected].
- Department of Industrial Engineering, University of Science and Technology of Mazandaran, Behshahr, Iran. Electronic address: [email protected].
Abstract
Alzheimer's disease, the most prevalent form of dementia, progressively destroys memory at each stage and ultimately leads to a fatal outcome. This irreversible disease occurs more frequently in older populations. Although research on Alzheimer's disease has grown in recent years, the intricacy of brain structure and function makes accurate diagnosis challenging. Resting-state functional magnetic resonance imaging (rs-fMRI) is a neuroimaging technology that enables researchers to study debilitating neural diseases by scanning the brain. This study investigates rs-fMRI approaches and deep learning methods to distinguish Alzheimer's patients from normal individuals. rs-fMRI data from 97 participants are obtained from the Alzheimer's Disease Neuroimaging Initiative database, with 56 participants in the Alzheimer's disease group and 41 in the normal control group. Extensive preprocessing is applied to the rs-fMRI data before classification. Using transfer learning, classification between the normal control and Alzheimer's disease groups is performed with the proposed VGG19, AlexNet, and ResNet50 models, which achieve classification accuracies of 96.91%, 98.71%, and 98.20%, respectively. Precision, recall, and F1-score are used as additional evaluation metrics. The AlexNet model achieves the highest accuracy and also outperforms the other models on precision, recall, and F1-score. While AlexNet achieves the best overall classification performance, ResNet50 demonstrates superior interpretability through Grad-CAM visualizations, producing more anatomically focused and clinically meaningful attention maps.
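The abstract reports accuracy alongside precision, recall, and F1-score for each model. As a minimal illustrative sketch (not the authors' code), these four metrics can be computed from binary predictions, where 1 denotes the Alzheimer's disease class and 0 the normal control class; the labels below are hypothetical and for illustration only.

```python
# Sketch: evaluation metrics for a binary AD vs. normal-control classifier.
# Not the study's implementation; labels are hypothetical.

def binary_metrics(y_true, y_pred):
    """Compute accuracy, precision, recall, and F1 for binary labels (1 = AD)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

# Hypothetical ground-truth and predicted labels (not study data).
y_true = [1, 1, 1, 0, 0, 1, 0, 1]
y_pred = [1, 1, 0, 0, 0, 1, 1, 1]
metrics = binary_metrics(y_true, y_pred)
```

In practice these values would be computed on a held-out test split of the 97-participant cohort; libraries such as scikit-learn provide equivalent functions (`accuracy_score`, `precision_recall_fscore_support`).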