NeuroEmo: A neuroimaging-based fMRI dataset to extract temporal affective brain dynamics for Indian movie video clips stimuli using dynamic functional connectivity approach with graph convolution neural network (DFC-GCNN).
Authors
Affiliations (3)
- Department of Computer Science and Engineering, Birla Institute of Technology, Mesra, Ranchi, India. Electronic address: [email protected].
- Department of Computer Science and Engineering, Birla Institute of Technology, Mesra, Ranchi, India.
- Department of Psychiatry, Central Institute of Psychiatry, Kanke, Ranchi, India.
Abstract
Functional MRI (fMRI), a non-invasive neuroimaging technique, can detect emotional brain activation patterns. It allows researchers to observe functional changes in the brain, making it a valuable tool for emotion recognition. Improving emotion recognition systems requires an understanding of the neural mechanisms behind emotional processing in the brain. Although this topic has been studied extensively worldwide, research on fMRI-based emotion recognition within the Indian population remains scarce, limiting the generalizability of existing models. To address this gap, a culturally relevant neuroimaging dataset (https://openneuro.org/datasets/ds005700) has been created for identifying five emotional states (calm, afraid, delighted, depressed, and excited) in a diverse group of Indian participants. To ensure cultural relevance, emotional stimuli were derived from Bollywood movie clips. This study outlines the fMRI task design, experimental setup, data collection procedures, preprocessing steps, statistical analysis using the General Linear Model (GLM), and region-of-interest (ROI)-based dynamic functional connectivity (DFC) extraction with parcellation based on the Power et al. (2011) functional atlas. A supervised emotion classification model is proposed using a Graph Convolutional Neural Network (GCNN), in which graph structures are constructed from DFC matrices at varying thresholds. The DFC-GCNN model achieved 95% classification accuracy under 5-fold cross-validation, highlighting emotion-specific connectivity dynamics in key affective regions, including the amygdala, prefrontal cortex, and anterior insula. These findings underscore the importance of temporal variability in classifying emotional states. By introducing a culturally specific neuroimaging dataset and a GCNN-based emotion recognition framework, this research enhances the applicability of graph-based models for identifying region-wise connectivity patterns in fMRI data and offers novel insights into cross-cultural differences in emotional processing at the neural level. Furthermore, the high spatial and temporal resolution of the fMRI dataset provides a valuable resource for future studies in emotional neuroscience and related disciplines.
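To make the DFC-GCNN pipeline described above concrete, the sketch below shows, in outline only, how sliding-window dynamic functional connectivity can be computed from parcellated ROI time series, thresholded into graph adjacency matrices, and passed through a single graph-convolution layer. This is not the authors' code: the window length, stride, correlation threshold, feature choice, and layer size are illustrative assumptions, and the 264-node count simply mirrors the Power et al. (2011) parcellation mentioned in the abstract.

```python
# Minimal sketch (assumptions noted above), not the published DFC-GCNN implementation.
import numpy as np
import torch
import torch.nn as nn

def sliding_window_dfc(ts, win=30, stride=5):
    """ts: (T, R) ROI time series -> list of (R, R) Pearson correlation matrices."""
    T, R = ts.shape
    mats = []
    for start in range(0, T - win + 1, stride):
        seg = ts[start:start + win]          # (win, R) windowed segment
        mats.append(np.corrcoef(seg.T))      # (R, R) correlation over the window
    return mats

def threshold_graph(corr, thresh=0.3):
    """Binarize a correlation matrix into an adjacency matrix at |r| >= thresh (illustrative threshold)."""
    A = (np.abs(corr) >= thresh).astype(np.float32)
    np.fill_diagonal(A, 0.0)                 # no self-loops here; added as identity below
    return A

class GCNLayer(nn.Module):
    """One Kipf-Welling style graph convolution: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, A, H):
        A_hat = A + torch.eye(A.size(0))     # add self-loops
        deg = A_hat.sum(dim=1)
        D_inv_sqrt = torch.diag(deg.pow(-0.5))
        A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
        return torch.relu(self.lin(A_norm @ H))

# Toy usage on synthetic data: 264 ROIs, 300 volumes; node features are each ROI's connectivity profile.
ts = np.random.randn(300, 264)
windows = sliding_window_dfc(ts)
A = torch.tensor(threshold_graph(windows[0]))
H = torch.tensor(windows[0], dtype=torch.float32)
layer = GCNLayer(264, 64)
out = layer(A, H)                            # (264, 64) node embeddings for one window
print(out.shape)
```

In a full classifier, per-window node embeddings of this kind would typically be pooled over ROIs and time and fed to a dense softmax head trained with cross-validation; those downstream choices are not specified in the abstract and are left out here.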