Cross-domain EEG-based Emotion Recognition with Contrastive Learning
By: Rui Yan, Yibo Li, Han Ding, et al.
Potential Business Impact:
Reads your feelings from brain waves.
Electroencephalogram (EEG)-based emotion recognition is vital for affective computing but faces challenges in feature utilization and cross-domain generalization. This work introduces EmotionCLIP, which reformulates recognition as an EEG-text matching task within the CLIP framework. A tailored backbone, SST-LegoViT, captures spatial, spectral, and temporal features using multi-scale convolution and Transformer modules. Experiments on SEED and SEED-IV datasets show superior cross-subject accuracies of 88.69% and 73.50%, and cross-time accuracies of 88.46% and 77.54%, outperforming existing models. Results demonstrate the effectiveness of multimodal contrastive learning for robust EEG emotion recognition.
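The core idea of reformulating recognition as EEG-text matching follows the CLIP recipe: embed EEG windows and emotion-label prompts into a shared space, then train with a symmetric contrastive loss so matched pairs score higher than mismatched ones. Below is a minimal numpy sketch of that loss; the function names (`clip_style_loss`, `l2_normalize`), the temperature value, and the embedding shapes are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def l2_normalize(x, axis=-1):
    # Project embeddings onto the unit sphere so dot products are cosine similarities
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def clip_style_loss(eeg_emb, text_emb, temperature=0.07):
    """Symmetric InfoNCE loss over a batch of (EEG, text) embedding pairs.

    Row i of eeg_emb is assumed to match row i of text_emb
    (temperature value is a common default, not from the paper).
    """
    eeg = l2_normalize(eeg_emb)
    txt = l2_normalize(text_emb)
    logits = eeg @ txt.T / temperature          # pairwise similarity matrix
    n = logits.shape[0]
    labels = np.arange(n)                       # diagonal entries are the true pairs

    def cross_entropy(lg):
        lg = lg - lg.max(axis=1, keepdims=True)             # numerical stability
        log_p = lg - np.log(np.exp(lg).sum(axis=1, keepdims=True))
        return -log_p[labels, labels].mean()

    # Average the EEG-to-text and text-to-EEG directions
    return 0.5 * (cross_entropy(logits) + cross_entropy(logits.T))

# Usage: correctly paired embeddings should yield a lower loss than mismatched ones
rng = np.random.default_rng(0)
emb = l2_normalize(rng.normal(size=(8, 32)))    # 8 hypothetical EEG/text pairs
loss_matched = clip_style_loss(emb, emb)
loss_mismatched = clip_style_loss(emb, emb[::-1])
```

At inference, the same similarity matrix serves as a classifier: an EEG embedding is compared against the text embeddings of all candidate emotion labels, and the highest-scoring label wins.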
Similar Papers
EEG Emotion Recognition Through Deep Learning
Signal Processing
Reads your feelings from brain waves.
An Emotion Recognition Framework via Cross-modal Alignment of EEG and Eye Movement Data
Multimedia
Reads your feelings using brain waves and eyes.
Cross-Modal Consistency-Guided Active Learning for Affective BCI Systems
Machine Learning (CS)
Helps computers understand feelings from brain waves.