Skill-Aware Diffusion for Generalizable Robotic Manipulation
By: Aoshen Huang, Jiaming Chen, Jiyu Cheng, and others
Potential Business Impact:
Robots learn new jobs faster by sharing skills.
Robust generalization in robotic manipulation is crucial for robots to adapt flexibly to diverse environments. Existing methods usually improve generalization by scaling data and networks, but they model tasks independently and overlook skill-level information. Observing that tasks within the same skill share similar motion patterns, we propose Skill-Aware Diffusion (SADiff), which explicitly incorporates skill-level information to improve generalization. SADiff learns skill-specific representations through a skill-aware encoding module with learnable skill tokens, and conditions a skill-constrained diffusion model to generate object-centric motion flow. A skill-retrieval transformation strategy further exploits skill-specific trajectory priors to refine the mapping from 2D motion flow to executable 3D actions. Furthermore, we introduce IsaacSkill, a high-fidelity dataset of fundamental robotic skills for comprehensive evaluation and sim-to-real transfer. Experiments in simulation and real-world settings show that SADiff achieves strong performance and generalization across diverse manipulation tasks. Code, data, and videos are available at https://sites.google.com/view/sa-diff.
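The core idea of conditioning a policy on learnable skill tokens can be sketched as follows. This is a minimal illustration, not the paper's implementation: the dimensions, the lookup-plus-concatenation scheme, and the function names are all assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; the paper does not specify these.
NUM_SKILLS, TOKEN_DIM, OBS_DIM = 4, 8, 16

# Learnable skill tokens: one embedding per skill
# (e.g. pick, place, push, pull). In training these would be
# optimized jointly with the diffusion model's weights.
skill_tokens = rng.normal(size=(NUM_SKILLS, TOKEN_DIM))

def skill_conditioned_input(obs, skill_id):
    """Concatenate an observation embedding with its skill token,
    so a downstream diffusion model can condition its denoising
    steps on skill-level information shared across tasks."""
    return np.concatenate([obs, skill_tokens[skill_id]])

obs = rng.normal(size=OBS_DIM)          # placeholder observation embedding
cond = skill_conditioned_input(obs, skill_id=2)
print(cond.shape)                       # (24,)
```

Because tasks within the same skill share one token, gradients from every such task update the same embedding, which is one way skill-level structure can aid generalization to new task instances.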
Similar Papers
S$^2$-Diffusion: Generalizing from Instance-level to Category-level Skills in Robot Manipulation
Robotics
Robots learn new tasks from fewer examples.
Learning Diffusion Policy from Primitive Skills for Robot Manipulation
Robotics
Teaches robots to do tasks by breaking them down.