Sketch-to-Skill: Bootstrapping Robot Learning with Human Drawn Trajectory Sketches

Published: March 14, 2025 | arXiv ID: 2503.11918v1

By: Peihong Yu, Amisha Bhaskar, Anukriti Singh and more

Potential Business Impact:

Draw a sketch to teach robots new tricks.

Business Areas:
Robotics Hardware, Science and Engineering, Software

Training robotic manipulation policies traditionally requires numerous demonstrations and/or environmental rollouts. While recent Imitation Learning (IL) and Reinforcement Learning (RL) methods have reduced the number of required demonstrations, they still rely on expert knowledge to collect high-quality data, limiting scalability and accessibility. We propose Sketch-to-Skill, a novel framework that leverages human-drawn 2D sketch trajectories to bootstrap and guide RL for robotic manipulation. Our approach goes beyond previous sketch-based methods, which focused primarily on imitation learning or policy conditioning and were limited to specific trained tasks. Sketch-to-Skill employs a Sketch-to-3D Trajectory Generator that translates 2D sketches into 3D trajectories, which are then used to autonomously collect initial demonstrations. We utilize these sketch-generated demonstrations in two ways: to pre-train an initial policy through behavior cloning and to refine this policy through RL with guided exploration. Experimental results demonstrate that Sketch-to-Skill achieves ~96% of the performance of a baseline model trained on teleoperated demonstration data, and exceeds the performance of a pure reinforcement learning policy by ~170%, using only sketch inputs. This makes robotic manipulation learning more accessible and potentially broadens its applications across various domains.
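To make the pipeline concrete, the sketch below illustrates the two ideas the abstract describes: lifting a 2D sketch into a 3D trajectory, and using that trajectory to guide exploration. This is a minimal illustrative sketch, not the paper's actual method: the paper trains a learned Sketch-to-3D Trajectory Generator, whereas here the depth is simply interpolated linearly as an assumption, and `lift_sketch_to_3d` / `guided_action` are hypothetical helper names.

```python
# Hypothetical sketch of the Sketch-to-Skill idea (illustrative only):
# (1) lift 2D sketch waypoints to 3D, (2) blend a policy's action with a
# pull toward the sketch trajectory to guide RL exploration.

def lift_sketch_to_3d(sketch_2d, z_start, z_end):
    """Lift 2D (x, y) sketch points to 3D by linearly interpolating depth.

    Assumption: depth varies linearly along the stroke; the paper instead
    learns this mapping with a trained generator.
    """
    n = len(sketch_2d)
    traj = []
    for i, (x, y) in enumerate(sketch_2d):
        t = i / (n - 1) if n > 1 else 0.0
        z = (1 - t) * z_start + t * z_end
        traj.append((x, y, z))
    return traj

def guided_action(policy_action, waypoint, state, alpha=0.5):
    """Blend the policy's proposed action with a pull toward a waypoint.

    alpha controls how strongly exploration is guided by the sketch.
    """
    pull = tuple(w - s for w, s in zip(waypoint, state))
    return tuple((1 - alpha) * a + alpha * p
                 for a, p in zip(policy_action, pull))

# Usage: lift a three-point sketch, then guide an action toward waypoint 1.
traj = lift_sketch_to_3d([(0.0, 0.0), (0.5, 0.2), (1.0, 0.4)],
                         z_start=0.1, z_end=0.3)
action = guided_action((0.0, 0.0, 0.0), traj[1], (0.0, 0.0, 0.0))
```

In the actual framework, trajectories like `traj` would be executed to collect initial demonstrations for behavior cloning before RL refinement.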

Country of Origin
🇺🇸 United States

Page Count
16 pages

Category
Computer Science:
Robotics