Event-RGB Fusion for Spacecraft Pose Estimation Under Harsh Lighting
By: Mohsi Jawaid, Marcus Märtens, Tat-Jun Chin
Potential Business Impact:
Helps robots see in space, even in bright light.
Spacecraft pose estimation is crucial for autonomous in-space operations such as rendezvous, docking, and on-orbit servicing. Vision-based pose estimation methods, which typically employ RGB imaging sensors, are a compelling solution for spacecraft pose estimation, but they are challenged by harsh lighting conditions, which produce imaging artifacts such as glare, over-exposure, blooming, and lens flare. Due to their much higher dynamic range, neuromorphic or event sensors are more resilient to extreme lighting conditions. However, event sensors generally have lower spatial resolution and suffer from a reduced signal-to-noise ratio during periods of low relative motion. This work addresses these individual sensor limitations by introducing a sensor fusion approach that combines RGB and event sensors. A beam-splitter prism was employed to achieve precise optical and temporal alignment between the two sensors. A RANSAC-based technique was then developed to fuse the information from the RGB and event channels, yielding pose estimates that leverage the strengths of both modalities. The pipeline is complemented by dropout-based uncertainty estimation to detect extreme conditions that affect either channel. To benchmark the performance of the proposed event-RGB fusion method, we collected a comprehensive real dataset of RGB and event data for satellite pose estimation in a laboratory setting under a variety of challenging illumination conditions. Encouraging results on this dataset demonstrate the efficacy of our event-RGB fusion approach and further support the use of event sensors for spacecraft pose estimation. To support community research on this topic, our dataset will be released publicly.
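To make the fusion step concrete, below is a minimal sketch (not the authors' code) of how RANSAC can fuse 2D-3D correspondences from two optically aligned channels into a single PnP pose estimate. It assumes each channel supplies 2D keypoints matched to known 3D landmarks on the target spacecraft, with shared intrinsics after beam-splitter alignment; all function and variable names here are illustrative.

```python
# Hypothetical sketch: RANSAC-based fusion of RGB and event correspondences
# for PnP pose estimation. Assumes keypoint detection/matching per channel
# has already been run; names and thresholds are illustrative assumptions.
import numpy as np
import cv2

def fuse_pose_ransac(pts3d_rgb, pts2d_rgb, pts3d_evt, pts2d_evt, K):
    """Pool correspondences from both sensors and let RANSAC pick inliers.

    pts3d_*: (N, 3) 3D landmark coordinates in the target body frame
    pts2d_*: (N, 2) matching image observations (the beam-splitter gives
             both channels the same optical axis, so one intrinsics K)
    K:       (3, 3) camera intrinsic matrix
    """
    pts3d = np.vstack([pts3d_rgb, pts3d_evt]).astype(np.float64)
    pts2d = np.vstack([pts2d_rgb, pts2d_evt]).astype(np.float64)

    # RANSAC rejects outlier correspondences from whichever channel is
    # degraded (e.g. glare in RGB, low-motion noise in the event stream).
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        pts3d, pts2d, K, None,
        reprojectionError=4.0, iterationsCount=500,
        flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        return None

    # Refine the pose on inliers only with an iterative solve.
    idx = inliers.ravel()
    _, rvec, tvec = cv2.solvePnP(
        pts3d[idx], pts2d[idx], K, None,
        rvec, tvec, useExtrinsicGuess=True,
        flags=cv2.SOLVEPNP_ITERATIVE)
    return rvec, tvec, idx
```

The dropout-based uncertainty step could similarly be approximated with Monte-Carlo dropout: keep dropout active at inference and use the spread of repeated predictions to flag frames where a channel is unreliable. A hedged sketch, assuming a PyTorch keypoint model with dropout layers:

```python
# Hypothetical sketch: Monte-Carlo dropout for per-channel uncertainty.
import torch

def mc_dropout_uncertainty(model, image, n_samples=20):
    model.train()  # keep dropout layers stochastic at inference time
    with torch.no_grad():
        preds = torch.stack([model(image) for _ in range(n_samples)])
    model.eval()
    # High variance flags frames degraded by glare (RGB) or low motion (event).
    return preds.mean(dim=0), preds.var(dim=0)
```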
Similar Papers
RGB-Event Fusion with Self-Attention for Collision Prediction
Robotics
Helps robots avoid crashing into things.
Spatially-guided Temporal Aggregation for Robust Event-RGB Optical Flow Estimation
CV and Pattern Recognition
Makes cameras see fast motion better.
A Survey of 3D Reconstruction with Event Cameras
CV and Pattern Recognition
Helps robots see in fast, dark, or bright places.