SafeHumanoid: VLM-RAG-driven Control of Upper Body Impedance for Humanoid Robot
By: Yara Mahmoud, Jeffrin Sam, Nguyen Khang, and more
Potential Business Impact:
Robot safely changes speed and stiffness around people.
Safe and trustworthy Human-Robot Interaction (HRI) requires robots not only to complete tasks but also to regulate impedance and speed according to scene context and human proximity. We present SafeHumanoid, an egocentric vision pipeline that links Vision-Language Models (VLMs) with Retrieval-Augmented Generation (RAG) to schedule impedance and velocity parameters for a humanoid robot. Egocentric frames are processed through a structured VLM prompt, embedded, and matched against a curated database of validated scenarios, and the result is mapped to joint-level impedance commands via inverse kinematics. We evaluate the system on tabletop manipulation tasks with and without human presence, including wiping, object handovers, and liquid pouring. The results show that the pipeline adapts stiffness, damping, and speed profiles in a context-aware manner, maintaining task success while improving safety. Although current inference latency (up to 1.4 s) limits responsiveness in highly dynamic settings, SafeHumanoid demonstrates that semantic grounding of impedance control is a viable path toward safer, standards-compliant humanoid collaboration.
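To make the retrieval-and-scheduling step concrete, the following is a minimal Python sketch of the idea described in the abstract: a VLM-produced scene description is embedded, matched against a curated database of validated scenarios, and the closest match supplies the impedance and velocity profile. All names, the toy character-count embedding, and the numeric stiffness/damping/speed values are illustrative assumptions, not the authors' implementation or the paper's actual parameters.

```python
import numpy as np
from dataclasses import dataclass


@dataclass
class Scenario:
    """Hypothetical record pairing a validated scene description with its
    approved impedance/velocity profile (field names and values are illustrative)."""
    description: str
    embedding: np.ndarray   # precomputed embedding of the description
    stiffness: float        # Cartesian stiffness scale [N/m]
    damping: float          # Cartesian damping scale [N*s/m]
    max_speed: float        # end-effector speed limit [m/s]


def embed(text: str) -> np.ndarray:
    """Stand-in for a real text-embedding model; a toy bag-of-characters
    vector so the sketch runs without external services."""
    vec = np.zeros(256)
    for ch in text.lower():
        vec[ord(ch) % 256] += 1.0
    return vec / (np.linalg.norm(vec) + 1e-9)


def retrieve(scene_description: str, database: list[Scenario]) -> Scenario:
    """RAG step: match the VLM's scene description against the curated
    database of validated scenarios and return the closest one."""
    query = embed(scene_description)
    sims = [float(query @ s.embedding) for s in database]
    return database[int(np.argmax(sims))]


if __name__ == "__main__":
    # Curated database of validated scenarios (entries are examples only).
    db = [
        Scenario("wiping a table, no human nearby",
                 embed("wiping a table, no human nearby"), 800.0, 40.0, 0.6),
        Scenario("handing an object to a person at close range",
                 embed("handing an object to a person at close range"), 250.0, 25.0, 0.2),
        Scenario("pouring liquid with a human seated at the table",
                 embed("pouring liquid with a human seated at the table"), 400.0, 30.0, 0.3),
    ]

    # In the real pipeline this description would come from a structured VLM
    # prompt over the robot's egocentric camera frame.
    vlm_output = "a person is reaching toward the robot to receive a cup"

    profile = retrieve(vlm_output, db)
    print(f"Selected profile: stiffness={profile.stiffness}, "
          f"damping={profile.damping}, max_speed={profile.max_speed}")
    # Downstream, these Cartesian gains would be mapped to joint-level
    # impedance commands via inverse kinematics before execution.
```

In practice the embedding and retrieval would use a proper embedding model and vector store, and the returned profile would feed the inverse-kinematics and joint-level impedance controller; the sketch only illustrates the scenario-matching logic.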
Similar Papers
GentleHumanoid: Learning Upper-body Compliance for Contact-rich Human and Object Interaction
Robotics
Robots can gently hug and help people.
SwarmVLM: VLM-Guided Impedance Control for Autonomous Navigation of Heterogeneous Robots in Dynamic Warehousing
Robotics
Drones and robots work together to move things.
Variable Impedance Control for Floating-Base Supernumerary Robotic Leg in Walking Assistance
Robotics
Robot leg helps people walk safely and smoothly.