HHI-Assist: A Dataset and Benchmark of Human-Human Interaction in Physical Assistance Scenario
By: Saeed Saadatnejad, Reyhaneh Hosseininejad, Jose Barreiros, and more
Potential Business Impact:
Helps robots learn how people move together.
The increasing labor shortage and aging population underline the need for assistive robots to support human care recipients. To enable safe and responsive assistance, robots require accurate human motion prediction in physical interaction scenarios. However, this remains a challenging task due to the variability of assistive settings and the complexity of coupled dynamics in physical interactions. In this work, we address these challenges through two key contributions: (1) HHI-Assist, a dataset comprising motion capture clips of human-human interactions in assistive tasks; and (2) a conditional Transformer-based denoising diffusion model for predicting the poses of interacting agents. Our model effectively captures the coupled dynamics between caregivers and care receivers, demonstrating improvements over baselines and strong generalization to unseen scenarios. By advancing interaction-aware motion prediction and introducing a new dataset, our work has the potential to significantly enhance robotic assistance policies. The dataset and code are available at: https://sites.google.com/view/hhi-assist/home
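To make the modeling idea concrete, below is a minimal PyTorch sketch of a conditional Transformer denoiser for two interacting agents, trained with a standard DDPM-style epsilon-prediction objective. This is an illustration of the general technique the abstract names, not the authors' released code: the module names (ConditionalPoseDenoiser, training_step), pose dimensionality, network sizes, and the linear noise schedule are all assumptions.

```python
import torch
import torch.nn as nn

class ConditionalPoseDenoiser(nn.Module):
    """Illustrative Transformer denoiser for two interacting agents.

    Predicts the noise added to the future pose sequence of both agents,
    conditioned on their observed (past) motion. Sizes are hypothetical.
    """
    def __init__(self, pose_dim=66, d_model=128, nhead=4, num_layers=3):
        super().__init__()
        # Poses of both agents (caregiver + care receiver) are
        # concatenated per frame, coupling their dynamics in one sequence.
        self.in_proj = nn.Linear(2 * pose_dim, d_model)
        self.cond_proj = nn.Linear(2 * pose_dim, d_model)
        self.t_embed = nn.Sequential(
            nn.Linear(1, d_model), nn.SiLU(), nn.Linear(d_model, d_model))
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.out_proj = nn.Linear(d_model, 2 * pose_dim)

    def forward(self, noisy_future, past, t):
        # noisy_future: (B, T_fut, 2*pose_dim); past: (B, T_obs, 2*pose_dim); t: (B,)
        temb = self.t_embed(t.float().unsqueeze(-1)).unsqueeze(1)  # (B, 1, d)
        tokens = torch.cat(
            [self.cond_proj(past), self.in_proj(noisy_future) + temb], dim=1)
        h = self.encoder(tokens)
        # Return noise prediction only for the future (denoised) frames.
        return self.out_proj(h[:, past.size(1):])

def training_step(model, past, future, num_steps=1000):
    """One DDPM-style training step (epsilon prediction); schedule is illustrative."""
    betas = torch.linspace(1e-4, 0.02, num_steps)
    alphas_bar = torch.cumprod(1.0 - betas, dim=0)
    t = torch.randint(0, num_steps, (future.size(0),))
    a = alphas_bar[t].view(-1, 1, 1)
    eps = torch.randn_like(future)
    noisy = a.sqrt() * future + (1 - a).sqrt() * eps  # forward diffusion
    return nn.functional.mse_loss(model(noisy, past, t), eps)

# Usage with random stand-in data (shapes only; not HHI-Assist samples):
model = ConditionalPoseDenoiser()
past, future = torch.randn(8, 30, 132), torch.randn(8, 60, 132)
loss = training_step(model, past, future)
loss.backward()
```

Conditioning on the joint past motion of both agents is what lets a model of this shape capture coupled caregiver/care-receiver dynamics; at inference, the learned denoiser would be run through the reverse diffusion steps to sample future poses for both agents.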
Similar Papers
Learning to Generate Human-Human-Object Interactions from Textual Descriptions
CV and Pattern Recognition
Teaches computers to show people interacting with objects.
PhysHSI: Towards a Real-World Generalizable and Natural Humanoid-Scene Interaction System
Robotics
Robots learn to move and interact like people.
The Human Robot Social Interaction (HSRI) Dataset: Benchmarking Foundational Models' Social Reasoning
Human-Computer Interaction
Teaches robots to act nicely with people.