DynOPETs: A Versatile Benchmark for Dynamic Object Pose Estimation and Tracking in Moving Camera Scenarios
By: Xiangting Meng, Jiaqi Yang, Mingshu Chen, and more
Potential Business Impact:
Helps robots see moving things from moving cameras.
In the realm of object pose estimation, scenarios involving both dynamic objects and moving cameras are prevalent. However, the scarcity of corresponding real-world datasets significantly hinders the development and evaluation of robust pose estimation models, largely because accurately annotating object poses in dynamic scenes captured by moving cameras is inherently challenging. To bridge this gap, this paper presents DynOPETs, a novel dataset, together with a dedicated data acquisition and annotation pipeline tailored for object pose estimation and tracking in such unconstrained environments. The efficient annotation method integrates pose estimation and pose tracking techniques to generate pseudo-labels, which are subsequently refined through pose graph optimization. The resulting dataset offers accurate pose annotations for dynamic objects observed from moving cameras. To validate the effectiveness and value of the dataset, the authors perform comprehensive evaluations using 18 state-of-the-art methods, demonstrating its potential to accelerate research in this challenging domain. The dataset will be made publicly available to facilitate further exploration and advancement in the field.
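The pipeline described above fuses two pseudo-label sources, per-frame pose estimation (absolute but noisy) and frame-to-frame pose tracking (relative but locally accurate), and refines them jointly via pose graph optimization. The toy sketch below illustrates that idea on simplified 2D positions rather than full 6D poses; the data, weights, and residual structure are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np
from scipy.optimize import least_squares

# Toy pose graph refinement (assumed setup, not the paper's pipeline):
# nodes are per-frame object positions; unary factors come from noisy
# absolute "estimation" pseudo-labels, binary factors from accurate
# relative "tracking" pseudo-labels.
rng = np.random.default_rng(0)
T = 20
t = np.linspace(0, 3, T)
gt = np.stack([np.linspace(0, 10, T), np.sin(t)], axis=1)  # ground-truth trajectory

abs_meas = gt + rng.normal(0, 0.3, gt.shape)                # noisy absolute poses
rel_meas = np.diff(gt, axis=0) + rng.normal(0, 0.02, (T - 1, 2))  # accurate relative motion

def residuals(x):
    p = x.reshape(T, 2)
    r_abs = (p - abs_meas).ravel()                          # estimation (unary) factors
    r_rel = (np.diff(p, axis=0) - rel_meas).ravel() * 10.0  # tracking (binary) factors, weighted higher
    return np.concatenate([r_abs, r_rel])

sol = least_squares(residuals, abs_meas.ravel())
refined = sol.x.reshape(T, 2)

err_before = np.linalg.norm(abs_meas - gt, axis=1).mean()
err_after = np.linalg.norm(refined - gt, axis=1).mean()
print(f"mean error before: {err_before:.3f}, after: {err_after:.3f}")
```

Because the relative constraints are weighted heavily, the optimizer effectively averages the independent absolute noise along the trajectory, so the refined poses are markedly closer to ground truth than the raw pseudo-labels. A full SE(3) pose graph would replace the vector differences with relative-transform residuals on rotations and translations.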
Similar Papers
Dynamic Camera Poses and Where to Find Them
CV and Pattern Recognition
Creates realistic videos by tracking camera movement.
DynamicPose: Real-time and Robust 6D Object Pose Tracking for Fast-Moving Cameras and Objects
CV and Pattern Recognition
Tracks moving things even when camera and object move fast.
AthletePose3D: A Benchmark Dataset for 3D Human Pose Estimation and Kinematic Validation in Athletic Movements
CV and Pattern Recognition
Helps computers track athletes' fast moves better.