6-DoF Object Tracking with Event-based Optical Flow and Frames
By: Zhichao Li, Arren Glover, Chiara Bartolozzi and more
Potential Business Impact:
Tracks fast-moving objects with special cameras.
Tracking the position and orientation of objects in space (i.e., in 6-DoF) in real time is a fundamental problem in robotics for environment interaction. It becomes more challenging when objects move at high speed, due to the frame-rate limitations and motion blur of conventional cameras. Event cameras are characterized by high temporal resolution, low latency, and high dynamic range, which can potentially overcome the effects of motion blur. Traditional RGB cameras provide rich visual information that is better suited to the challenging task of single-shot object pose estimation. In this work, we propose combining event-based optical flow with an RGB-based global object pose estimator for 6-DoF pose tracking of objects at high speed, exploiting the core advantages of both types of vision sensors. Specifically, we propose an event-based optical flow algorithm that measures object motion to implement a 6-DoF object velocity tracker. By integrating the tracked 6-DoF velocity with low-frequency pose estimates from the global pose estimator, the method can track pose even when objects move at high speed. The proposed algorithm is tested and validated on both synthetic and real-world data, demonstrating its effectiveness, especially in high-speed motion scenarios.
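The fusion idea in the abstract, integrating a high-rate 6-DoF velocity between low-frequency global pose estimates, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Euler integration step, the update rates, and all variable names and values are assumptions made for the example.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def integrate_pose(R, t, omega, v, dt):
    """One Euler step: rotate by omega*dt and translate by v*dt (world frame).
    A real tracker would integrate the twist on SE(3); this is a simple sketch."""
    dR = Rotation.from_rotvec(omega * dt)
    return dR * R, t + v * dt

# Placeholder 6-DoF velocity, standing in for the event-based optical flow output.
omega = np.array([0.0, 0.0, 2.0])   # angular velocity, rad/s about z
v = np.array([0.5, 0.0, 0.0])       # linear velocity, m/s along x

R = Rotation.identity()
t = np.zeros(3)
dt = 1e-3                            # assumed 1 kHz velocity updates

for step in range(1000):             # 1 s of high-rate integration
    R, t = integrate_pose(R, t, omega, v, dt)
    if step == 499:                  # a low-frequency global pose estimate arrives
        # Replace the drifting integrated pose with the estimator's output
        # (dummy values here; the paper's estimator is RGB-based).
        R = Rotation.from_rotvec([0.0, 0.0, 1.0])
        t = np.array([0.25, 0.0, 0.0])

print(t)                             # fused pose translation after 1 s
```

The key property this sketch shows is that drift from integrating velocity accumulates only over the short interval between global pose updates, so the high-rate track stays anchored to the low-frequency estimator.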
Similar Papers
Planar Velocity Estimation for Fast-Moving Mobile Robots Using Event-Based Optical Flow
Robotics
Helps cars know speed even on slippery roads.
Unleashing the Temporal Potential of Stereo Event Cameras for Continuous-Time 3D Object Detection
CV and Pattern Recognition
Lets self-driving cars see moving objects better.
E-MoFlow: Learning Egomotion and Optical Flow from Event Data via Implicit Regularization
CV and Pattern Recognition
Helps cameras understand movement without seeing everything.