A Comparative Study of Human Activity Recognition: Motion, Tactile, and Multi-Modal Approaches
By: Valerio Belcamino, Nhat Minh Dinh Le, Quan Khanh Luu, and more
Potential Business Impact:
Robots better understand what you're doing.
Human activity recognition (HAR) is essential for effective Human-Robot Collaboration (HRC), enabling robots to interpret and respond to human actions. This study evaluates the ability of a vision-based tactile sensor to classify 15 activities, comparing its performance to an IMU-based data glove. Additionally, we propose a multi-modal framework combining tactile and motion data to leverage their complementary strengths. We examined three approaches: motion-based classification (MBC) using IMU data, tactile-based classification (TBC) with single or dual video streams, and multi-modal classification (MMC) integrating both. Offline validation on segmented datasets assessed each configuration's accuracy under controlled conditions, while online validation on continuous, unsegmented action sequences tested real-time performance. Results showed the multi-modal approach consistently outperformed single-modality methods, highlighting the potential of integrating tactile and motion sensing to enhance HAR systems for collaborative robotics.
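As a rough illustration of how such a multi-modal classifier might be structured, the sketch below fuses an IMU (motion) branch with a tactile-video branch before a 15-class activity head. This is a minimal late-fusion sketch, assuming a PyTorch implementation; the layer sizes, sensor channel counts, window lengths, and fusion strategy are illustrative assumptions and are not taken from the paper's architecture.

```python
# Hypothetical sketch (not the authors' released code): late fusion of an
# IMU branch and a tactile-video branch for 15-class activity recognition.
# All dimensions below are assumptions chosen only to make the example run.
import torch
import torch.nn as nn

class MotionBranch(nn.Module):
    """Encodes a window of IMU readings from a data glove."""
    def __init__(self, imu_channels=36, hidden=128):
        super().__init__()
        self.gru = nn.GRU(imu_channels, hidden, batch_first=True)

    def forward(self, x):              # x: (batch, time, imu_channels)
        _, h = self.gru(x)             # h: (1, batch, hidden)
        return h.squeeze(0)            # (batch, hidden)

class TactileBranch(nn.Module):
    """Encodes frames from a vision-based tactile sensor."""
    def __init__(self, hidden=128):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, hidden), nn.ReLU(),
        )

    def forward(self, frames):         # frames: (batch, time, 3, H, W)
        b, t, c, h, w = frames.shape
        feats = self.cnn(frames.view(b * t, c, h, w)).view(b, t, -1)
        return feats.mean(dim=1)       # temporal average pooling

class MultiModalClassifier(nn.Module):
    """Late fusion of motion and tactile features over 15 activity classes."""
    def __init__(self, num_classes=15, hidden=128):
        super().__init__()
        self.motion = MotionBranch(hidden=hidden)
        self.tactile = TactileBranch(hidden=hidden)
        self.head = nn.Linear(2 * hidden, num_classes)

    def forward(self, imu_seq, tactile_seq):
        fused = torch.cat([self.motion(imu_seq), self.tactile(tactile_seq)], dim=-1)
        return self.head(fused)

# Example forward pass on dummy data: 2 samples, 50 IMU steps, 8 tactile frames.
model = MultiModalClassifier()
logits = model(torch.randn(2, 50, 36), torch.randn(2, 8, 3, 64, 64))
print(logits.shape)                    # torch.Size([2, 15])
```

Concatenating branch features before a shared linear head is only one plausible way to realize the MMC configuration described in the abstract; the paper's actual fusion scheme may differ.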
Similar Papers
Skin-Machine Interface with Multimodal Contact Motion Classifier
Robotics
Robots learn to move by feeling your touch.
TACT: Humanoid Whole-body Contact Manipulation through Deep Imitation Learning with Tactile Modality
Robotics
Robot learns to grab things by feeling them.
Scaling Human Activity Recognition: A Comparative Evaluation of Synthetic Data Generation and Augmentation Techniques
CV and Pattern Recognition
Creates fake motion data to train activity trackers.