Mind the Motions: Benchmarking Theory-of-Mind in Everyday Body Language
By: Seungbeen Lee, Jinhong Jeong, Donghyun Kim, and more
Potential Business Impact:
Teaches computers to understand body language like people.
Our ability to interpret others' mental states through nonverbal cues (NVCs) is fundamental to our survival and social cohesion. While existing Theory of Mind (ToM) benchmarks have primarily focused on false-belief tasks and reasoning with asymmetric information, they overlook other mental states beyond belief and the rich tapestry of human nonverbal communication. We present Motion2Mind, a framework for evaluating the ToM capabilities of machines in interpreting NVCs. Leveraging an expert-curated body-language reference as a proxy knowledge base, we build a carefully curated video dataset with fine-grained nonverbal cue annotations paired with manually verified psychological interpretations, encompassing 222 types of nonverbal cues and 397 mind states. Our evaluation reveals that current AI systems struggle significantly with NVC interpretation, exhibiting not only a substantial performance gap in Detection but also patterns of over-interpretation in Explanation compared to human annotators.
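To make the setup concrete, here is a minimal sketch of what a Motion2Mind-style record and a Detection score could look like. The schema (`video_id`, `cue`, `mind_state`) and the set-level F1 metric are illustrative assumptions for exposition, not the paper's actual data format or evaluation protocol.

```python
from dataclasses import dataclass, field

@dataclass
class NVCAnnotation:
    """One annotated nonverbal cue with its verified interpretation (hypothetical schema)."""
    cue: str         # one of the 222 nonverbal cue types, e.g. "arms crossed"
    mind_state: str  # one of the 397 mind states, e.g. "defensiveness"

@dataclass
class Motion2MindExample:
    """One video clip with its cue annotations (hypothetical schema)."""
    video_id: str
    annotations: list[NVCAnnotation] = field(default_factory=list)

def detection_f1(predicted: set[str], gold: set[str]) -> float:
    """Set-level F1 between predicted and gold cue labels (illustrative metric)."""
    if not predicted or not gold:
        return 0.0
    tp = len(predicted & gold)
    if tp == 0:
        return 0.0
    precision = tp / len(predicted)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)

# Toy usage: a model that predicts extra cues scores low precision,
# mirroring the over-interpretation pattern the abstract describes.
example = Motion2MindExample(
    video_id="clip_0001",
    annotations=[NVCAnnotation("arms crossed", "defensiveness")],
)
gold_cues = {a.cue for a in example.annotations}
model_cues = {"arms crossed", "lip compression", "gaze aversion"}  # over-predicted
print(f"Detection F1: {detection_f1(model_cues, gold_cues):.2f}")  # -> 0.50
```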
Similar Papers
MindPower: Enabling Theory-of-Mind Reasoning in VLM-based Embodied Agents
Artificial Intelligence
Robots understand what people think and do.
Theory of Mind in Large Language Models: Assessment and Enhancement
Computation and Language
Helps computers understand what people are thinking.
RecToM: A Benchmark for Evaluating Machine Theory of Mind in LLM-based Conversational Recommender Systems
Artificial Intelligence
Helps computers understand what people want and need.