Why We Need a New Framework for Emotional Intelligence in AI
By: Max Parks, Kheli Atluru, Meera Vinod, and more
In this paper, we develop the position that current frameworks for evaluating emotional intelligence (EI) in artificial intelligence (AI) systems need refinement because they do not adequately or comprehensively measure the aspects of EI relevant to AI. Human EI often involves a phenomenological component and a felt sense of understanding that AI systems lack; some aspects of EI are therefore irrelevant to evaluating AI systems. However, EI also includes the abilities to sense an emotional state, explain it, respond appropriately, and adapt to new contexts (e.g., multicultural ones), and AI systems can do such things to greater or lesser degrees. Several benchmark frameworks specialize in evaluating the capacity of different AI models to perform tasks related to EI, but these often lack a solid foundation regarding the nature of emotion and what it is to be emotionally intelligent. In this project, we begin by reviewing theories of emotion and of EI generally, evaluating the extent to which each applies to artificial systems. We then critically assess the available benchmark frameworks, identifying where each falls short in light of the account of EI developed in the first section. Lastly, we outline options for improving evaluation strategies so that EI evaluation in AI systems avoids these shortcomings.