Robotic Sim-to-Real Transfer for Long-Horizon Pick-and-Place Tasks in the Robotic Sim2Real Competition
By: Ming Yang, Hongyu Cao, Lixuan Zhao, and others
Potential Business Impact:
A robot trained in simulation learns to perform tasks in the real world.
This paper presents a fully autonomous robotic system that performs sim-to-real transfer in complex long-horizon tasks involving navigation, recognition, grasping, and stacking in an environment with multiple obstacles. The key feature of the system is its ability to overcome typical sensing and actuation discrepancies during sim-to-real transfer and to achieve consistent performance without any algorithmic modifications. To accomplish this, a lightweight noise-resistant visual perception system and a nonlinearity-robust servo system are adopted. We conduct a series of tests in both simulated and real-world environments. The visual perception system processes each frame in 11 ms thanks to its lightweight design, and the servo system achieves sub-centimeter accuracy with the proposed controller. Both exhibit high consistency during sim-to-real transfer. Benefiting from this consistency, our robotic system took first place in the mineral-searching task of the Robotic Sim2Real Challenge hosted at ICRA 2024.
Similar Papers
Scalable Real2Sim: Physics-Aware Asset Generation Via Robotic Pick-and-Place Setups
Robotics
Robots learn object shapes and weights automatically.
Vid2Sim: Realistic and Interactive Simulation from Video for Urban Navigation
CV and Pattern Recognition
Robots learn better by practicing in realistic simulated worlds.
Towards bridging the gap: Systematic sim-to-real transfer for diverse legged robots
Robotics
Robots walk better and use less power.