Emotion Recognition in Signers

Published: December 17, 2025 | arXiv ID: 2512.15376v1

By: Kotaro Funakoshi, Yaoxiong Zhu

Potential Business Impact:

Helps computers recognize the emotions of sign language users more accurately.

Business Areas:
Speech Recognition Data and Analytics, Software

Recognition of signers' emotions faces one theoretical challenge and one practical challenge: the overlap between grammatical and affective facial expressions, and the scarcity of data for model training. This paper addresses both challenges in a cross-lingual setting using our eJSL dataset, a new benchmark for emotion recognition in Japanese Sign Language signers, and BOBSL, a large British Sign Language dataset with subtitles. In eJSL, two signers expressed 78 distinct utterances in each of seven emotional states, yielding 1,092 video clips. We empirically demonstrate that 1) textual emotion recognition in spoken language mitigates data scarcity in sign language, 2) temporal segment selection has a significant impact, and 3) incorporating hand motion enhances emotion recognition in signers. Finally, we establish a stronger baseline than spoken language LLMs.
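As a quick sanity check on the eJSL dataset size reported above, the clip count follows directly from the design (a minimal sketch; the three factors come from the abstract):

```python
# eJSL design: each signer records every utterance in every emotional state.
signers = 2
utterances = 78
emotions = 7

total_clips = signers * utterances * emotions
print(total_clips)  # 1092, matching the 1,092 video clips reported
```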


Page Count
9 pages

Category
Computer Science:
CV and Pattern Recognition