Score: 1

XToM: Exploring the Multilingual Theory of Mind for Large Language Models

Published: June 3, 2025 | arXiv ID: 2506.02461v1

By: Chunkit Chan, Yauwai Yim, Hongchuan Zeng, and more

Potential Business Impact:

Tests whether AI models can reason about people's beliefs and mental states across different languages.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Theory of Mind (ToM), the ability to infer mental states in others, is pivotal for human social cognition. Existing evaluations of ToM in large language models (LLMs) are largely limited to English, neglecting the linguistic diversity that shapes human cognition. This limitation raises a critical question: can LLMs exhibit multilingual Theory of Mind, i.e., the capacity to reason about mental states across diverse linguistic contexts? To address this gap, we present XToM, a rigorously validated multilingual benchmark that evaluates ToM across five languages and incorporates diverse, contextually rich task scenarios. Using XToM, we systematically evaluate LLMs (e.g., DeepSeek R1), revealing a pronounced dissonance: while models excel in multilingual language understanding, their ToM performance varies across languages. Our findings expose limitations in LLMs' ability to replicate human-like mentalizing across linguistic contexts.
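
As a rough, hypothetical sketch of how per-language ToM accuracy might be tallied on a benchmark like this: the item schema, the query_model stub, and the exact-match scoring below are illustrative assumptions, not the paper's actual evaluation protocol.

```python
from collections import defaultdict

# Toy illustration of per-language scoring on a multilingual ToM benchmark.
# The item schema, query_model stub, and exact-match scoring are assumptions
# for illustration; they are not XToM's actual protocol.
items = [
    {"lang": "en",
     "prompt": "Sally leaves her ball in the basket; Anne moves it to the box "
               "while Sally is away. Where will Sally look?",
     "gold": "basket"},
    {"lang": "de",
     "prompt": "(the same false-belief story, asked in German)",
     "gold": "Korb"},
]

def query_model(prompt: str) -> str:
    # Stand-in for a real LLM call (e.g., an API client); always answers
    # "basket" here so the sketch runs end to end.
    return "basket"

def per_language_accuracy(items):
    # Score each item by exact match and aggregate accuracy per language tag.
    correct, total = defaultdict(int), defaultdict(int)
    for item in items:
        answer = query_model(item["prompt"])
        total[item["lang"]] += 1
        correct[item["lang"]] += answer.strip().lower() == item["gold"].strip().lower()
    return {lang: correct[lang] / total[lang] for lang in total}

print(per_language_accuracy(items))  # e.g., {'en': 1.0, 'de': 0.0}
```

In this contrived run the stub answers correctly in English but not in German, mirroring the kind of cross-language variance the paper reports.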

Country of Origin
🇭🇰 Hong Kong

Repos / Data Links

Page Count
36 pages

Category
Computer Science:
Computation and Language