Reinforcing Trustworthiness in Multimodal Emotional Support Systems
By: Huy M. Le, Dat Tien Nguyen, Ngan T. T. Vo, and more
Potential Business Impact:
Helps computers give better emotional support.
In today's world, emotional support is increasingly essential, yet it remains challenging for both those seeking help and those offering it. Multimodal approaches to emotional support show great promise by integrating diverse data sources to provide empathetic, contextually relevant responses, fostering more effective interactions. However, current methods have notable limitations: they often rely solely on text, convert other data types into text, or offer only emotion recognition, thus overlooking the full potential of multimodal inputs. Moreover, many studies prioritize response generation without accurately identifying critical emotional support elements or ensuring the reliability of outputs. To overcome these issues, we introduce MultiMood, a new framework that (i) leverages multimodal embeddings from video, audio, and text to predict emotional components and to produce responses aligned with professional therapeutic standards. To improve trustworthiness, we (ii) incorporate novel psychological criteria and apply Reinforcement Learning (RL) to optimize large language models (LLMs) for consistent adherence to these standards. We also (iii) analyze several advanced LLMs to assess their multimodal emotional support capabilities. Experimental results show that MultiMood achieves state-of-the-art performance on the MESC and DFEW datasets, while its RL-driven trustworthiness improvements are validated through human and LLM evaluations, demonstrating its superior capability in applying a multimodal framework in this domain.
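To make the two main ideas in the abstract concrete, here is a minimal, hypothetical sketch: late fusion of per-modality embeddings into an emotion-component classifier, and a scalar reward built from psychological-criteria scores that an RL loop could maximize. All dimensions, module names, and criteria below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the paper's code): fuse video/audio/text embeddings to
# predict emotional-support components, plus a toy reward that averages
# hypothetical psychological-criteria scores for RL-style fine-tuning.
import torch
import torch.nn as nn


class MultimodalFusion(nn.Module):
    """Concatenate per-modality embeddings and predict emotion components."""

    def __init__(self, video_dim=512, audio_dim=256, text_dim=768,
                 hidden_dim=512, num_emotions=7):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Linear(video_dim + audio_dim + text_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_emotions),
        )

    def forward(self, video_emb, audio_emb, text_emb):
        fused = torch.cat([video_emb, audio_emb, text_emb], dim=-1)
        return self.fuse(fused)  # logits over emotion components


def trust_reward(criteria_scores):
    """Toy scalar reward: mean of per-criterion scores in [0, 1].

    Each score stands in for a judgment against a psychological criterion
    (e.g. empathy, safety); the actual criteria are defined in the paper.
    """
    return sum(criteria_scores) / len(criteria_scores)


if __name__ == "__main__":
    model = MultimodalFusion()
    v, a, t = torch.randn(1, 512), torch.randn(1, 256), torch.randn(1, 768)
    print(model(v, a, t).shape)           # torch.Size([1, 7])
    print(trust_reward([0.8, 0.6, 0.9]))  # ~0.767
```

In this kind of setup, the reward would score a generated response against each criterion and the RL objective would push the LLM toward responses with higher average scores; the fusion head handles the emotion-component prediction separately.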
Similar Papers
A Unified Framework for Emotion Recognition and Sentiment Analysis via Expert-Guided Multimodal Fusion with Large Language Models
Computation and Language
Computers understand feelings from talking, seeing, and writing.
Computational emotion analysis with multimodal LLMs: Current evidence on an emerging methodological opportunity
Computation and Language
AI can't reliably tell emotions in real speeches.