UEOF: A Benchmark Dataset for Underwater Event-Based Optical Flow
By: Nick Truong, Pritam P. Karmokar, William J. Beksi
Underwater imaging is fundamentally challenging due to wavelength-dependent light attenuation, strong scattering from suspended particles, turbidity-induced blur, and non-uniform illumination. These effects impair standard cameras and make ground-truth motion nearly impossible to obtain. In contrast, event cameras offer microsecond temporal resolution and high dynamic range. Nonetheless, progress in applying event cameras to underwater environments has been limited by the lack of datasets that pair realistic underwater optics with accurate ground-truth optical flow. To address this problem, we introduce the first synthetic underwater benchmark dataset for event-based optical flow, derived from physically based, ray-traced RGB-D sequences. By applying a modern video-to-event pipeline to the rendered underwater videos, we produce realistic event streams with dense ground-truth flow, depth, and camera motion. Moreover, we benchmark state-of-the-art learning-based and model-based optical flow prediction methods to understand how underwater light transport affects event formation and motion estimation accuracy. Our dataset establishes a new baseline for the future development and evaluation of underwater event-based perception algorithms. The source code and dataset for this project are publicly available at https://robotic-vision-lab.github.io/ueof.
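For readers unfamiliar with video-to-event conversion, the sketch below illustrates the standard event camera model that simulators of this kind approximate: a pixel emits an event whenever its log intensity has changed by more than a contrast threshold C since that pixel's last event. This is a minimal Python sketch, not the UEOF pipeline; the function frames_to_events, the threshold value, and the one-event-per-crossing simplification are all assumptions made for illustration.

import numpy as np

def frames_to_events(frames, timestamps, C=0.2, eps=1e-6):
    """Convert grayscale video frames (T, H, W) in [0, 1] into a list of
    events (t, x, y, polarity) via per-pixel log-intensity thresholding."""
    log_ref = np.log(frames[0] + eps)  # per-pixel reference log intensity
    events = []
    for t, frame in zip(timestamps[1:], frames[1:]):
        log_cur = np.log(frame + eps)
        diff = log_cur - log_ref
        # A pixel fires when its log-intensity change crosses +/- C.
        for polarity, mask in ((1, diff >= C), (-1, diff <= -C)):
            ys, xs = np.nonzero(mask)
            events.extend((t, int(x), int(y), polarity) for x, y in zip(xs, ys))
            # Simplification: reset the reference to the current intensity,
            # so each pixel emits at most one event per frame.
            log_ref[mask] = log_cur[mask]
    return sorted(events)

# Toy usage: a uniformly brightening 4x4 patch yields positive events.
T, H, W = 10, 4, 4
frames = np.linspace(0.1, 0.9, T)[:, None, None] * np.ones((T, H, W))
ts = np.linspace(0.0, 1.0, T).tolist()
print(len(frames_to_events(frames, ts)), "events")

Real video-to-event simulators additionally model sensor noise, refractory periods, and sub-frame temporal interpolation, all of which the sketch above omits for brevity.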