RAVES-Calib: Robust, Accurate and Versatile Extrinsic Self Calibration Using Optimal Geometric Features
By: Haoxin Zhang, Shuaixin Li, Xiaozhou Zhu, et al.
Potential Business Impact:
Automatically aligns 3D LiDAR sensors and cameras without calibration targets.
In this paper, we present a user-friendly LiDAR-camera calibration toolkit that is compatible with various LiDAR and camera sensors and requires only a single pair of a laser scan and a camera image in targetless environments. Our approach eliminates the need for an initial transform and remains robust even under large positional and rotational LiDAR-camera extrinsic offsets. We employ the GlueStick pipeline to establish 2D-3D point and line feature correspondences for a robust and automatic initial guess. To enhance accuracy, we quantitatively analyze the impact of feature distribution on calibration results and adaptively weight the cost of each feature based on these metrics. As a result, extrinsic parameters are optimized while the adverse effects of inferior features are filtered out. We validated our method through extensive experiments across various LiDAR-camera sensors in both indoor and outdoor settings. The results demonstrate that our method provides superior robustness and accuracy compared to state-of-the-art (SOTA) techniques. Our code is open-sourced on GitHub to benefit the community.
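The abstract describes weighting each feature's cost by a distribution-based quality metric before optimizing the extrinsics. The paper's actual metrics and cost function are not given here, so the following is only a minimal sketch of the general idea: a hypothetical weighting scheme (here, up-weighting features spread far from the image-feature centroid) applied to a standard weighted reprojection cost. The function names and the specific metric are illustrative assumptions, not the authors' method.

```python
import numpy as np

def adaptive_weights(points_2d):
    """Hypothetical distribution-based weights (illustration only).

    Features far from the centroid of the detected 2D features tend to
    constrain rotation better, so this toy metric up-weights them.
    Returns weights in [0.5, 1.0].
    """
    centroid = points_2d.mean(axis=0)
    spread = np.linalg.norm(points_2d - centroid, axis=1)
    w = spread / (spread.max() + 1e-9)
    return 0.5 + 0.5 * w

def weighted_reprojection_cost(T, points_3d, points_2d, K, weights):
    """Sum of weighted squared reprojection errors.

    T       : 4x4 LiDAR-to-camera extrinsic transform
    K       : 3x3 camera intrinsic matrix
    points_3d / points_2d : matched 3D LiDAR points and 2D image points
    """
    pts_h = np.hstack([points_3d, np.ones((len(points_3d), 1))])
    cam = (T @ pts_h.T).T[:, :3]            # points in camera frame
    proj = (K @ cam.T).T
    uv = proj[:, :2] / proj[:, 2:3]         # perspective division
    residuals = np.linalg.norm(uv - points_2d, axis=1)
    return float(np.sum(weights * residuals ** 2))
```

In a full pipeline, such a cost would be minimized over the 6-DoF extrinsics with a nonlinear least-squares solver; the snippet only shows how per-feature weights enter the objective.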
Similar Papers
L2M-Calib: One-key Calibration Method for LiDAR and Multiple Magnetic Sensors
Robotics
Makes robots see better using magnets and lasers.
CaLiV: LiDAR-to-Vehicle Calibration of Arbitrary Sensor Setups
Robotics
Helps self-driving cars see better with lasers.
PLK-Calib: Single-shot and Target-less LiDAR-Camera Extrinsic Calibration using Plücker Lines
Robotics
Helps self-driving cars see better with lasers and cameras.