Robot Localization Using a Learned Keypoint Detector and Descriptor with a Floor Camera and a Feature Rich Industrial Floor
By: Piet Brömmel, Dominik Brämer, Oliver Urbann, and more
Potential Business Impact:
Robot finds its place using floor patterns.
The localization of moving robots depends on the availability of good features in the environment. Sensor systems such as LiDAR are popular, but unique features can also be extracted from images of the ground. This work presents the Keypoint Localization Framework (KOALA), which uses deep neural networks to extract sufficient features from an industrial floor for accurate localization without relying on readable markers. For this purpose, we use a floor covering that can be produced as cheaply as common industrial floors. Although we do not use any filtering, prior, or temporal information, we can estimate our position in 75.7% of all images with a mean position error of 2 cm and a rotation error of 2.4%. Thus, the robot kidnapping problem can be solved with high precision in every frame, even while the robot is moving. Furthermore, we show that our framework, with our detector and descriptor combination, outperforms comparable approaches.
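The per-frame pipeline implied by the abstract (detect keypoints on the floor image, match their descriptors against a floor map, then recover position and heading from the matches) can be illustrated with a small sketch. The snippet below is not the authors' KOALA code; it assumes hypothetical, already-matched keypoint coordinates and shows only the final step of turning correspondences into a 2D pose via a least-squares rigid alignment (2D Kabsch).

```python
# Minimal sketch (not the authors' KOALA implementation): given keypoint matches
# between a query floor image and a mapped floor patch, recover the robot's 2D
# pose (rotation + translation) with a least-squares rigid alignment.
# The synthetic data and all names below are illustrative assumptions.
import numpy as np

def estimate_rigid_pose(map_pts: np.ndarray, img_pts: np.ndarray):
    """Estimate R (2x2) and t (2,) such that R @ img_pts[i] + t ~= map_pts[i]."""
    mu_map, mu_img = map_pts.mean(axis=0), img_pts.mean(axis=0)
    # Cross-covariance of the centred correspondences.
    H = (img_pts - mu_img).T @ (map_pts - mu_map)
    U, _, Vt = np.linalg.svd(H)
    # Enforce a proper rotation (det = +1), i.e. rule out a reflection.
    D = np.diag([1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_map - R @ mu_img
    return R, t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical map-frame keypoint positions (metres) and a ground-truth pose.
    map_kp = rng.uniform(0.0, 1.0, size=(50, 2))
    theta, trans = np.deg2rad(30.0), np.array([0.40, -0.15])
    R_true = np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta),  np.cos(theta)]])
    # Simulated image-frame observations of the same keypoints (with noise),
    # standing in for descriptor matches produced by a learned detector.
    img_kp = (map_kp - trans) @ R_true + rng.normal(0.0, 0.002, size=(50, 2))
    R, t = estimate_rigid_pose(map_kp, img_kp)
    yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    print(f"estimated yaw: {yaw:.1f} deg, translation: {t.round(3)} m")
```

Because this alignment uses only the matches from the current frame, it mirrors the abstract's claim that each image is localized independently, without filtering or temporal priors; in practice a robust variant (e.g. RANSAC over the matches) would be used to reject descriptor mismatches.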
Similar Papers
Graph-based Robot Localization Using a Graph Neural Network with a Floor Camera and a Feature Rich Industrial Floor
CV and Pattern Recognition
Helps robots find their way using floor patterns.
Diffusion Based Robust LiDAR Place Recognition
Robotics
Robot finds its exact spot on building sites.
DeepDetect: Learning All-in-One Dense Keypoints
CV and Pattern Recognition
Finds important spots in pictures better than before.