A Personalized Data-Driven Generative Model of Human Motion
By: Angelo Di Porzio, Marco Coraggio
Potential Business Impact:
Makes robots move like real people.
The deployment of autonomous virtual avatars (in extended reality) and robots in human group activities, such as rehabilitation therapy, sports, and manufacturing, is expected to increase as these technologies become more pervasive. Designing the cognitive architectures and control strategies that drive these agents requires realistic models of human motion, yet existing models offer only simplified descriptions of human motor behavior. In this work, we propose a fully data-driven approach, based on Long Short-Term Memory (LSTM) neural networks, to generate original motion that captures the unique characteristics of specific individuals. We validate the architecture on real recordings of scalar oscillatory motion. Extensive analyses show that our model effectively replicates the velocity distribution and amplitude envelopes of the individual it was trained on, remains distinguishable from other individuals, and outperforms state-of-the-art models in terms of similarity to human data.
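The abstract does not give the network's configuration, but the core mechanism it names, an LSTM rolled out autoregressively to generate a scalar motion signal, can be sketched in plain Python. Everything below (the single-unit cell, the random weights, the feedback of each output as the next input) is an illustrative assumption for exposition, not the authors' implementation.

```python
import math
import random


def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))


class TinyLSTM:
    """Minimal single-unit LSTM cell for scalar sequences.

    Hypothetical sketch: the paper's actual architecture, layer sizes,
    and training procedure are not specified here; weights are random
    rather than learned from individual motion data.
    """

    def __init__(self, seed=0):
        rng = random.Random(seed)
        # One (input weight, hidden weight, bias) triple per gate:
        # input (i), forget (f), output (o), and candidate cell (c).
        self.w = {g: (rng.uniform(-0.5, 0.5), rng.uniform(-0.5, 0.5), 0.0)
                  for g in ("i", "f", "o", "c")}

    def step(self, x, h, c):
        """One LSTM update: gate the cell state, emit a new hidden value."""
        z = {g: wx * x + wh * h + b for g, (wx, wh, b) in self.w.items()}
        i, f, o = sigmoid(z["i"]), sigmoid(z["f"]), sigmoid(z["o"])
        c_new = f * c + i * math.tanh(z["c"])
        h_new = o * math.tanh(c_new)
        return h_new, c_new

    def generate(self, seed_value, n_steps):
        """Autoregressive rollout: each output is fed back as the next input,
        so the cell generates an original trajectory from one seed sample."""
        h = c = 0.0
        x = seed_value
        out = []
        for _ in range(n_steps):
            h, c = self.step(x, h, c)
            x = h  # feedback loop: prediction becomes the next input
            out.append(x)
        return out
```

With trained rather than random weights, a rollout like `TinyLSTM().generate(0.1, 100)` would correspond to sampling a synthetic motion trace; the `tanh` output nonlinearity keeps every generated value bounded in (-1, 1), which is why such cells are a natural fit for normalized oscillatory signals.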
Similar Papers
A Survey on Human Interaction Motion Generation
CV and Pattern Recognition
Teaches computers to move like people interacting.
Synthetic Human Action Video Data Generation with Pose Transfer
CV and Pattern Recognition
Makes fake videos of people move realistically.
From Motion to Behavior: Hierarchical Modeling of Humanoid Generative Behavior Control
Robotics
Makes robots move like real people.