Uncertainty-Aware Knowledge Distillation for Compact and Efficient 6DoF Pose Estimation
By: Nassim Ali Ousalah, Anis Kacem, Enjie Ghorbel, and more
Potential Business Impact:
Helps robots locate objects accurately with smaller, faster models.
Compact and efficient 6DoF object pose estimation is crucial in applications such as robotics, augmented reality, and autonomous space navigation, where lightweight models are critical for accurate real-time performance. This paper introduces a novel uncertainty-aware end-to-end Knowledge Distillation (KD) framework for keypoint-based 6DoF pose estimation. Keypoints predicted by a large teacher model exhibit varying levels of uncertainty that can be exploited within the distillation process to enhance the accuracy of the student model while ensuring its compactness. To this end, we propose a distillation strategy that aligns the student and teacher predictions by adjusting the knowledge transfer based on the uncertainty associated with each teacher keypoint prediction. Additionally, the proposed KD leverages this uncertainty-aware alignment of keypoints to transfer knowledge at key locations of their respective feature maps. Experiments on the widely used LINEMOD benchmark demonstrate the effectiveness of our method, achieving superior 6DoF object pose estimation with lightweight models compared to state-of-the-art approaches. Further validation on the SPEED+ dataset for spacecraft pose estimation highlights the robustness of our approach across diverse 6DoF pose estimation scenarios.
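The core idea of the abstract's distillation strategy, down-weighting the transfer of teacher keypoints that the teacher itself is uncertain about, can be sketched as follows. This is an illustrative minimal example only, not the paper's actual loss: the function name, the inverse-variance weighting scheme, and the squared-error alignment term are all assumptions made for clarity.

```python
import numpy as np

def uncertainty_weighted_kd_loss(student_kpts, teacher_kpts, teacher_sigma):
    """Illustrative uncertainty-aware keypoint distillation loss (a sketch,
    not the paper's formulation).

    student_kpts, teacher_kpts: (K, 2) arrays of predicted 2D keypoints.
    teacher_sigma: (K,) array of per-keypoint teacher uncertainty (std. dev.).
    """
    # Weight each keypoint inversely to the teacher's variance, so confident
    # teacher predictions dominate the knowledge transfer.
    w = 1.0 / (teacher_sigma ** 2 + 1e-8)
    w = w / w.sum()
    # Squared alignment error between student and teacher per keypoint.
    sq_err = np.sum((student_kpts - teacher_kpts) ** 2, axis=-1)
    return float(np.sum(w * sq_err))
```

With this weighting, a keypoint where the teacher reports high uncertainty contributes less to the loss than the same alignment error at a confident keypoint, which is the behavior the abstract describes.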
Similar Papers
Uncertainty-Aware Dual-Student Knowledge Distillation for Efficient Image Classification
CV and Pattern Recognition
Teaches small computers to learn like big ones.
Distilling Future Temporal Knowledge with Masked Feature Reconstruction for 3D Object Detection
CV and Pattern Recognition
Helps self-driving cars see the future.
Uncertainty-Aware Multi-Expert Knowledge Distillation for Imbalanced Disease Grading
CV and Pattern Recognition
Helps doctors grade disease pictures better.