Skin-Machine Interface with Multimodal Contact Motion Classifier
By: Alberto Confente, Takanori Jin, Taisuke Kobayashi, and more
Potential Business Impact:
Robots learn to move by feeling your touch.
This paper proposes a framework that uses skin sensors as an operation interface for complex robots. The skin sensors employed in this study measure multimodal tactile information at multiple contact points, and the resulting time-series data are expected to allow classification of the diverse contact motions an operator performs. By mapping the classification results to robot motion primitives, a wide range of robot motions can be generated simply by changing how the skin sensors are touched. The paper focuses on a learning-based contact motion classifier built on recurrent neural networks, which is the pivotal component of this framework, and it clarifies the software and hardware design conditions the framework requires. First, multimodal sensing and its comprehensive encoding substantially improve classification accuracy and learning stability; feeding all modalities to the classifier simultaneously proves effective. Second, the skin sensors must be mounted on a flexible, compliant support so that their three-axis accelerometers are activated. These accelerometers capture horizontal tactile information, strengthening the correlation with the other modalities, and the compliant mounting also absorbs the noise generated by the robot's own movements during deployment. With these findings, the developed classifier exceeded 95% accuracy, enabling a dual-arm mobile manipulator to execute a diverse range of tasks via the Skin-Machine Interface.
Video: https://youtu.be/UjUXT4Z4BC8
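The paper itself includes no code, but the pipeline it describes — multimodal tactile time series from the skin sensors fed to a recurrent classifier whose output class selects a motion primitive — can be sketched. Below is a minimal PyTorch illustration; the taxel count, modality count, class count, GRU architecture, and primitive names are all assumptions for the example, not details from the paper.

```python
import torch
import torch.nn as nn

class ContactMotionClassifier(nn.Module):
    """Recurrent classifier for multimodal skin-sensor time series.

    A sketch, not the authors' implementation: each time step
    concatenates all modalities from all sensing points (taxels)
    into one vector, a GRU encodes the sequence, and a linear head
    outputs logits over contact motion classes.
    """

    def __init__(self, n_taxels=16, n_modalities=4, n_classes=8, hidden=64):
        super().__init__()
        # "Comprehensive encoding": all modalities enter the network together.
        self.rnn = nn.GRU(n_taxels * n_modalities, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):
        # x: (batch, time, n_taxels * n_modalities)
        _, h = self.rnn(x)       # h: (1, batch, hidden) = final hidden state
        return self.head(h[-1])  # (batch, n_classes) logits

# Classify one 100-step window and map the result to a motion primitive.
# The primitive names are hypothetical placeholders.
PRIMITIVES = {0: "move_forward", 1: "turn_left", 2: "turn_right",
              3: "open_gripper", 4: "close_gripper", 5: "lift_arm",
              6: "lower_arm", 7: "stop"}

model = ContactMotionClassifier()
window = torch.randn(1, 100, 16 * 4)             # dummy tactile recording
motion_id = model(window).argmax(dim=-1).item()  # predicted class index
print(PRIMITIVES[motion_id])
```

Concatenating every modality at each time step mirrors the paper's finding that using all modalities simultaneously as classifier inputs improves accuracy and learning stability; in a real system the input window would come from the skin-sensor driver rather than torch.randn.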
Similar Papers
A Comparative Study of Human Activity Recognition: Motion, Tactile, and Multi-modal Approaches
Robotics
Robots understand what you're doing better.
Whole-body Multi-contact Motion Control for Humanoid Robots Based on Distributed Tactile Sensors
Robotics
Robots use knees and elbows to balance better.
TACT: Humanoid Whole-body Contact Manipulation through Deep Imitation Learning with Tactile Modality
Robotics
Robot learns to grab things by feeling them.