Perspectives on Capturing Emotional Expressiveness in Sign Language
By: Phoebe Chua, Cathy Mengying Fang, Yasith Samaradivakara, and more
Potential Business Impact:
Helps computers understand feelings in sign language.
Significant advances have been made in our ability to understand and generate emotionally expressive content such as text and speech, yet comparable progress in sign language technologies remains limited. While computational approaches to sign language translation have focused on capturing lexical content, the emotional dimensions of sign language communication remain largely unexplored. Through semi-structured interviews with eight sign language users across Singapore, Sri Lanka, and the United States, including both Deaf and Hard of Hearing (DHH) and hearing signers, we investigate how emotions are expressed and perceived in sign languages. Our findings highlight the role of both manual and non-manual elements in emotional expression, revealing universal patterns as well as individual and cultural variations in how signers communicate emotions. We identify key challenges in capturing emotional nuance for sign language translation and propose design considerations for developing more emotionally aware sign language technologies. This work contributes to both the theoretical understanding of emotional expression in sign language and the practical development of interfaces that better serve diverse signing communities.
Similar Papers
Challenges and opportunities in portraying emotion in generated sign language
Computation and Language
Makes computer sign language avatars show feelings.
"Nothing about us without us": Perspectives of Global Deaf and Hard-of-hearing Community Members on Sign Language Technologies
Human-Computer Interaction
Makes sign language tech work for everyone, not just some.
EASL: Multi-Emotion Guided Semantic Disentanglement for Expressive Sign Language Generation
Computer Vision and Pattern Recognition
Makes sign language videos show feelings.