Mind the Motions: Benchmarking Theory-of-Mind in Everyday Body Language

Published: November 19, 2025 | arXiv ID: 2511.15887v1

By: Seungbeen Lee, Jinhong Jeong, Donghyun Kim, and more

Potential Business Impact:

Teaches computers to understand body language the way people do.

Business Areas:
Motion Capture, Media and Entertainment, Video

Our ability to interpret others' mental states through nonverbal cues (NVCs) is fundamental to our survival and social cohesion. While existing Theory of Mind (ToM) benchmarks have primarily focused on false-belief tasks and reasoning with asymmetric information, they overlook mental states beyond belief and the rich tapestry of human nonverbal communication. We present Motion2Mind, a framework for evaluating the ToM capabilities of machines in interpreting NVCs. Leveraging an expert-curated body-language reference as a proxy knowledge base, we build Motion2Mind, a carefully curated video dataset with fine-grained nonverbal cue annotations paired with manually verified psychological interpretations. It encompasses 222 types of nonverbal cues and 397 mind states. Our evaluation reveals that current AI systems struggle significantly with NVC interpretation, exhibiting not only a substantial performance gap in Detection but also patterns of over-interpretation in Explanation compared to human annotators.
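The abstract names two evaluation axes: Detection (spotting which nonverbal cue occurs in a clip) and Explanation (interpreting the mind state behind it). As an illustration only, the sketch below shows one plausible way to score the Detection task as set-level precision/recall/F1 over per-clip cue labels; the data format, cue names, and function are assumptions for this sketch, not the paper's actual protocol.

```python
def detection_prf(gold_cues, pred_cues):
    """Set-level precision/recall/F1 for one video clip.

    gold_cues / pred_cues: iterables of cue labels, e.g.
    {"crossed_arms", "averted_gaze"} (labels here are hypothetical).
    """
    gold, pred = set(gold_cues), set(pred_cues)
    tp = len(gold & pred)  # cues the model found that annotators also marked
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Toy example with made-up annotations: the model over-predicts cues,
# mirroring the over-interpretation pattern the abstract describes.
gold = {"crossed_arms", "averted_gaze"}
pred = {"crossed_arms", "averted_gaze", "lip_biting", "foot_tapping"}
p, r, f = detection_prf(gold, pred)
print(f"precision={p:.2f} recall={r:.2f} f1={f:.2f}")
# precision=0.50 recall=1.00 f1=0.67
```

Under this toy scoring, a model that hallucinates extra cues keeps perfect recall but loses precision, which is one simple way a benchmark could surface the over-interpretation behavior the authors report.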

Country of Origin
🇰🇷 Korea, Republic of

Page Count
20 pages

Category
Computer Science:
Computation and Language