EREBUS: End-to-end Robust Event Based Underwater Simulation
By: Hitesh Kyatham, Arjun Suresh, Aadi Palnitkar, and more
Potential Business Impact:
Teaches robots to see underwater better.
The underwater domain presents a vast array of challenges for roboticists and computer vision researchers alike, such as poor lighting conditions and high dynamic range scenes. Under these adverse conditions, traditional vision techniques struggle to adapt, leading to suboptimal performance. Event-based cameras offer an attractive solution: rather than capturing full frames at a fixed rate, they asynchronously report per-pixel brightness changes, mitigating many of the limitations of traditional cameras. In this paper, we introduce a pipeline for generating realistic synthetic data from an event-based camera mounted on an AUV (Autonomous Underwater Vehicle) in an underwater environment, for use in training vision models. We demonstrate the effectiveness of our pipeline on the task of rock detection under poor visibility with suspended particulate matter, but the approach generalizes to other underwater tasks.
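To illustrate the core idea behind synthesizing event data from rendered video, here is a minimal sketch of how per-pixel log-intensity changes can be thresholded into events. This is a simplified illustration of the general event-camera model, not the EREBUS pipeline itself; the function name, threshold value, and output format are all illustrative assumptions.

```python
import numpy as np

def frames_to_events(frame_prev, frame_curr, threshold=0.2, eps=1e-6):
    """Emit synthetic events from a pair of grayscale frames.

    An event camera fires an event at a pixel when the log intensity
    there changes by more than a contrast threshold. This sketch
    approximates that by differencing the log of two consecutive
    rendered frames (values in [0, 1]). The threshold and function
    name are illustrative, not taken from the paper.
    """
    log_prev = np.log(frame_prev.astype(np.float64) + eps)
    log_curr = np.log(frame_curr.astype(np.float64) + eps)
    diff = log_curr - log_prev

    # Pixels whose log-intensity change exceeds the threshold fire events.
    ys, xs = np.nonzero(np.abs(diff) >= threshold)
    polarities = np.sign(diff[ys, xs]).astype(np.int8)  # +1 brighter, -1 darker
    return xs, ys, polarities

# Example: a dark frame followed by a brightened 2x2 patch
# produces positive-polarity events at the changed pixels.
prev = np.full((4, 4), 0.1)
curr = prev.copy()
curr[1:3, 1:3] = 0.5
xs, ys, pol = frames_to_events(prev, curr)
```

A full simulator would also interpolate event timestamps between frames and add sensor noise; this sketch only captures the thresholded change-detection step.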
Similar Papers
SEBVS: Synthetic Event-based Visual Servoing for Robot Navigation and Manipulation
Robotics
Lets robots see and move better in tough spots.
Drone Detection with Event Cameras
CV and Pattern Recognition
Finds tiny drones in any light.
AquaticVision: Benchmarking Visual SLAM in Underwater Environment with Events and Frames
Robotics
Helps robots see better underwater.