A Hybrid Deep Learning Framework for Emotion Recognition in Children with Autism During NAO Robot-Mediated Interaction
By: Indranil Bhattacharjee, Vartika Narayani Srinet, Anirudha Bhattacharjee, and more
Potential Business Impact:
Helps robots understand autistic children's feelings.
Understanding emotional responses in children with Autism Spectrum Disorder (ASD) during social interaction remains a critical challenge in both developmental psychology and human-robot interaction. This study presents a novel deep learning pipeline for emotion recognition in autistic children responding to a name-calling event by a humanoid robot (NAO) under controlled experimental settings. The dataset comprises around 50,000 facial frames extracted from video recordings of 15 children with ASD. A hybrid model, combining a fine-tuned ResNet-50-based Convolutional Neural Network (CNN) and a three-layer Graph Convolutional Network (GCN), was trained on both visual features and geometric features extracted from MediaPipe FaceMesh landmarks. Emotions were probabilistically labeled using a weighted ensemble of two models, DeepFace and FER, each contributing to soft-label generation across seven emotion classes. Final classification leveraged a fused embedding optimized via Kullback-Leibler divergence. The proposed method demonstrates robust performance in modeling subtle affective responses and offers significant promise for affective profiling of children with ASD in clinical and therapeutic human-robot interaction contexts: the pipeline effectively captures micro-emotional cues in neurodivergent children, addressing a major gap in autism-specific HRI research. This work presents the first large-scale, real-world dataset and pipeline from India for autism-focused emotion analysis using social robotics, laying an essential foundation for future personalized assistive technologies.
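To make the labeling and loss steps concrete, here is a minimal sketch of the two numerical ideas the abstract describes: a weighted ensemble of two per-frame emotion distributions producing a soft label over seven classes, and the Kullback-Leibler divergence used as the training objective. The ensemble weights, helper names, and the specific emotion ordering are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

# Assumed seven-class ordering; the paper's exact label set/order is not specified here.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def soft_labels(p_deepface, p_fer, w_deepface=0.5, w_fer=0.5):
    """Weighted ensemble of two model outputs into one soft label.

    p_deepface, p_fer: length-7 probability vectors from the two labelers.
    The equal weights are an illustrative assumption.
    """
    p = w_deepface * np.asarray(p_deepface, dtype=float) \
        + w_fer * np.asarray(p_fer, dtype=float)
    return p / p.sum()  # renormalize so the soft label is a valid distribution

def kl_divergence(soft_label, predicted, eps=1e-12):
    """KL(soft_label || predicted): the divergence minimized when fitting
    the fused CNN+GCN embedding's output to the ensemble soft labels."""
    p = np.clip(np.asarray(soft_label, dtype=float), eps, 1.0)
    q = np.clip(np.asarray(predicted, dtype=float), eps, 1.0)
    return float(np.sum(p * np.log(p / q)))
```

In practice the KL term would be applied per frame over minibatches (e.g. via a framework loss such as PyTorch's `KLDivLoss`); the standalone function above just shows the quantity being optimized.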
Similar Papers
Generation of Real-time Robotic Emotional Expressions Learning from Human Demonstration in Mixed Reality
Robotics
Robots show feelings like humans do.
Awakening Facial Emotional Expressions in Human-Robot
Robotics
Robots learn to make human-like faces.
Deep Learning-Based Real-Time Sequential Facial Expression Analysis Using Geometric Features
CV and Pattern Recognition
Lets computers understand your feelings from your face.