EEG-based Graph-guided Domain Adaptation for Robust Cross-Session Emotion Recognition
By: Maryam Mirzaei, Farzaneh Shayegh, Hamed Narimani
Accurate recognition of human emotional states is critical for effective human-machine interaction. Electroencephalography (EEG) offers a reliable source for emotion recognition due to its high temporal resolution and its direct reflection of neural activity. Nevertheless, variations across recording sessions present a major challenge for model generalization. To address this issue, we propose EGDA, a framework that reduces cross-session discrepancies by jointly aligning the global (marginal) and class-specific (conditional) distributions, while preserving the intrinsic structure of EEG data through graph regularization. Experimental results on the SEED-IV dataset demonstrate that EGDA achieves robust cross-session performance, obtaining accuracies of 81.22%, 80.15%, and 83.27% across three transfer tasks, and surpassing several baseline methods. Furthermore, the analysis highlights the Gamma frequency band as the most discriminative and identifies the central-parietal and prefrontal brain regions as critical for reliable emotion recognition.
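The abstract does not spell out EGDA's exact objective, but the joint alignment of marginal and conditional distributions with graph regularization resembles JDA/MEDA-style transfer learning. The sketch below illustrates that general idea under explicit assumptions: an MMD matrix covering both the marginal and class-conditional terms, a k-NN heat-kernel graph Laplacian to preserve local EEG feature structure, and a generalized eigenproblem for the shared projection. All function names and hyperparameters (`mu`, `lam`, `k`) are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist


def build_mmd_matrix(ns, nt, ys=None, yt_pseudo=None, num_classes=4):
    """Marginal MMD matrix plus class-conditional terms (JDA-style)."""
    n = ns + nt
    e = np.vstack((np.ones((ns, 1)) / ns, -np.ones((nt, 1)) / nt))
    M = e @ e.T  # marginal (global) alignment term
    if ys is not None and yt_pseudo is not None:
        for c in range(num_classes):
            e = np.zeros((n, 1))
            src_idx = np.where(ys == c)[0]
            tgt_idx = ns + np.where(yt_pseudo == c)[0]
            if len(src_idx) and len(tgt_idx):
                e[src_idx] = 1.0 / len(src_idx)
                e[tgt_idx] = -1.0 / len(tgt_idx)
                M += e @ e.T  # conditional (class-specific) alignment term
    return M / np.linalg.norm(M, "fro")


def build_graph_laplacian(X, k=5):
    """k-NN heat-kernel graph Laplacian preserving local feature structure."""
    D = cdist(X, X)
    W = np.zeros_like(D)
    for i in range(X.shape[0]):
        nn = np.argsort(D[i])[1:k + 1]  # skip self at distance 0
        W[i, nn] = np.exp(-D[i, nn] ** 2)
    W = np.maximum(W, W.T)               # symmetrize the adjacency
    return np.diag(W.sum(axis=1)) - W    # L = D - W


def joint_alignment_projection(Xs, Xt, ys, yt_pseudo, dim=10, mu=1.0, lam=0.1):
    """Solve a JDA-style generalized eigenproblem with graph regularization."""
    X = np.vstack((Xs, Xt)).T             # features as columns, d x n
    d, n = X.shape
    M = build_mmd_matrix(len(Xs), len(Xt), ys, yt_pseudo)
    L = build_graph_laplacian(np.vstack((Xs, Xt)))
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    A = X @ (M + lam * L) @ X.T + mu * np.eye(d)
    B = X @ H @ X.T + 1e-6 * np.eye(d)    # small ridge keeps B positive definite
    _, vecs = eigh(A, B)                  # eigenvalues in ascending order
    W = vecs[:, :dim]                     # smallest eigenvectors give the projection
    return Xs @ W, Xt @ W
```

In this style of method, target pseudo-labels would typically be refined iteratively: project both sessions, train a classifier on the projected source data, re-estimate `yt_pseudo`, and repeat until the conditional alignment stabilizes.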