Evidence
IEEE J Biomed Health Inform. 2024 Sep 5;PP. doi: 10.1109/JBHI.2024.3449083. Online ahead of print.
ABSTRACT
Thanks to advances in artificial intelligence and brain-computer interface (BCI) research, emotion recognition techniques based on the electroencephalogram (EEG) have recently attracted increasing attention. The complexity of EEG data makes it challenging to classify emotions accurately by integrating time-, frequency-, and spatial-domain features. To address this challenge, this paper proposes a fusion model called DC-ASTGCN, which combines the strengths of a deep convolutional neural network (DCNN) and an adaptive spatio-temporal graph convolutional network (ASTGCN) to comprehensively analyze and understand EEG signals. The DCNN focuses on extracting frequency-domain and local spatial features from EEG signals to identify brain-region activity patterns, while the ASTGCN, with its spatio-temporal attention mechanism and adaptive brain-topology layer, reveals the functional connectivity between brain regions in different emotional states. This integration significantly enhances the model's ability to understand and recognize emotional states. Extensive experiments on the DEAP and SEED datasets demonstrate that the DC-ASTGCN model outperforms existing state-of-the-art methods in emotion recognition accuracy.
PMID:39236139 | DOI:10.1109/JBHI.2024.3449083
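The abstract describes a two-branch fusion: a DCNN branch for frequency-domain and local spatial features, and an ASTGCN branch with an adaptive (learned) electrode adjacency and spatio-temporal attention, whose outputs are fused for classification. The abstract gives no layer sizes or exact operations, so the following is a minimal NumPy sketch under assumed shapes (32 electrodes as in DEAP, 5 frequency bands, 10 temporal segments); all weights, dimensions, and pooling choices here are illustrative, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_electrodes = 32  # DEAP records 32 EEG channels
n_bands = 5        # assumed frequency bands (delta..gamma)
n_time = 10        # assumed number of temporal segments

# Toy EEG feature tensor: (electrodes, bands, time)
x = rng.standard_normal((n_electrodes, n_bands, n_time))

def dcnn_branch(x):
    """Stand-in for the DCNN branch: pool over time, then a linear
    mixing of band features per electrode (a 1x1-conv analogue)."""
    w = rng.standard_normal((n_bands, 8))
    return np.maximum(x.mean(axis=2) @ w, 0.0)   # ReLU, (electrodes, 8)

def astgcn_branch(x):
    """Stand-in for the ASTGCN branch: an adaptive adjacency over
    electrodes plus a temporal attention weighting."""
    # Adaptive adjacency: row-softmax of electrode feature similarity
    flat = x.reshape(n_electrodes, -1)
    sim = flat @ flat.T
    adj = np.exp(sim - sim.max(axis=1, keepdims=True))
    adj /= adj.sum(axis=1, keepdims=True)
    # Temporal attention: weight time steps by mean activation
    att = np.exp(x.mean(axis=(0, 1)))
    att /= att.sum()
    pooled = (x * att).sum(axis=2)               # (electrodes, bands)
    w = rng.standard_normal((n_bands, 8))
    return np.maximum(adj @ pooled @ w, 0.0)     # graph conv + ReLU

def dc_astgcn(x, n_classes=4):
    """Fuse both branches and classify (e.g. valence/arousal quadrants)."""
    fused = np.concatenate([dcnn_branch(x), astgcn_branch(x)], axis=1)
    w_out = rng.standard_normal((fused.shape[1], n_classes))
    return fused.mean(axis=0) @ w_out            # pool electrodes -> logits

print(dc_astgcn(x).shape)  # (4,)
```

In a trained model the mixing weights, adjacency, and attention would be learned end-to-end; this sketch only shows how the two branches' per-electrode features are computed and concatenated before classification.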
DC-ASTGCN: EEG Emotion Recognition Based on Fusion Deep Convolutional and Adaptive Spatio-temporal Graph Convolutional Networks