Salience Adjustment for Context-Based Emotion Recognition
By: Bin Han, Jonathan Gratch
Potential Business Impact:
Helps computers understand feelings in real-life situations.
Emotion recognition in dynamic social contexts requires an understanding of the complex interaction between facial expressions and situational cues. This paper presents a salience-adjusted framework for context-aware emotion recognition that combines Bayesian Cue Integration (BCI) with Vision-Language Models (VLMs) to dynamically weight facial and contextual information based on the expressivity of facial cues. We evaluate this approach using human annotations and automatic emotion recognition systems in prisoner's dilemma scenarios, which are designed to evoke emotional reactions. Our findings demonstrate that incorporating salience adjustment enhances emotion recognition performance, offering promising directions for future research to extend this framework to broader social contexts and multimodal applications.
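The core idea of salience-adjusted cue integration can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: it assumes each channel (face, context) yields a probability distribution over emotions, and uses a hypothetical expressivity score in [0, 1] as the salience weight in a log-linear fusion. The function name and weighting scheme are illustrative assumptions.

```python
import numpy as np

EMOTIONS = ["joy", "anger", "sadness", "neutral"]

def salience_adjusted_fusion(p_face, p_context, expressivity):
    """Fuse face and context emotion distributions, weighting the face
    channel by how expressive it is (hypothetical weighting scheme).

    expressivity ~ 1.0: trust the face; ~ 0.0: fall back on context.
    """
    p_face = np.asarray(p_face, dtype=float)
    p_context = np.asarray(p_context, dtype=float)
    w = float(np.clip(expressivity, 0.0, 1.0))
    # Log-linear (product-of-experts style) combination with salience weight w.
    fused = (p_face ** w) * (p_context ** (1.0 - w))
    return fused / fused.sum()  # renormalize to a valid distribution

# Highly expressive face: the facial channel dominates the fused estimate.
face = [0.70, 0.10, 0.10, 0.10]     # face model says "joy"
context = [0.10, 0.60, 0.20, 0.10]  # situation suggests "anger"
print(salience_adjusted_fusion(face, context, expressivity=0.9))
```

With expressivity near 1 the fused distribution follows the face model; at 0 it reduces exactly to the context distribution, which is the behavior the salience adjustment is meant to capture.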
Similar Papers
Saliency-guided Emotion Modeling: Predicting Viewer Reactions from Video Stimuli
CV and Pattern Recognition
Predicts how viewers feel from what a video shows.
Modelling Emotions in Face-to-Face Setting: The Interplay of Eye-Tracking, Personality, and Temporal Dynamics
Human-Computer Interaction
Helps computers understand how you feel.
Bimodal Connection Attention Fusion for Speech Emotion Recognition
Sound
Helps computers understand feelings from voices and words.