Finding 3D Positions of Distant Objects from Noisy Camera Movement and Semantic Segmentation Sequences
By: Julius Pesonen, Arno Solin, Eija Honkavaara
Potential Business Impact:
Drones can spot and locate fires using less onboard computing power.
3D object localisation based on a sequence of camera measurements is essential for safety-critical surveillance tasks, such as drone-based wildfire monitoring. Localisation of objects detected with a camera can typically be solved with dense depth estimation or 3D scene reconstruction. However, for distant objects, or for tasks constrained by the available computational resources, neither solution is feasible. In this paper, we show that the task can be solved using particle filters for both single- and multiple-target scenarios. The method was studied using a 3D simulation and a drone-based image segmentation sequence with global navigation satellite system (GNSS)-based camera pose estimates. The results showed that a particle filter can be used to solve practical localisation tasks based on camera poses and image segments in situations where these other solutions fail. The particle filter is independent of the detection method, making it flexible for new tasks. The study also demonstrates that drone-based wildfire monitoring can be conducted using the proposed method paired with a pre-existing image segmentation model.
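To make the idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of single-target bearing-only localisation with a particle filter: each camera pose plus a segmentation centroid yields a noisy bearing ray toward the target, and particles in 3D are weighted by how well they agree with that ray. The simulated camera track, noise levels, and search volume are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed ground truth for the simulation: a fixed target observed by a
# moving camera that reports noisy unit bearings toward it (as would come
# from back-projecting a segmentation centroid through the camera model).
true_target = np.array([50.0, 30.0, 5.0])

def measure_bearing(cam_pos, target, noise=0.01):
    """Unit vector from camera to target, with small angular noise."""
    v = target - cam_pos
    v = v + rng.normal(0.0, noise * np.linalg.norm(v), 3)
    return v / np.linalg.norm(v)

# Initialise particles uniformly over an assumed search volume.
n = 5000
particles = rng.uniform([0, 0, 0], [100, 100, 20], size=(n, 3))
weights = np.full(n, 1.0 / n)

for step in range(30):
    cam_pos = np.array([step * 2.0, 0.0, 10.0])    # simulated camera track
    z = measure_bearing(cam_pos, true_target)      # measurement

    # Weight particles by the angular error between the predicted bearing
    # (camera -> particle) and the measured bearing.
    d = particles - cam_pos
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    ang_err = np.arccos(np.clip(d @ z, -1.0, 1.0))
    weights *= np.exp(-0.5 * (ang_err / 0.05) ** 2)
    weights += 1e-300                               # guard against all-zero weights
    weights /= weights.sum()

    # Resample and add small jitter to keep particle diversity.
    idx = rng.choice(n, size=n, p=weights)
    particles = particles[idx] + rng.normal(0.0, 0.5, (n, 3))
    weights = np.full(n, 1.0 / n)

estimate = particles.mean(axis=0)
print(np.round(estimate, 1))
```

Because the update only needs a camera pose and a bearing, the filter is agnostic to how the detection was produced, which mirrors the paper's point that the method is independent of the segmentation model.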
Similar Papers
Semantic-Aware Particle Filter for Reliable Vineyard Robot Localisation
Robotics
Helps robots find their way in vineyards.
UAV Position Estimation using a LiDAR-based 3D Object Detection Method
Robotics
Helps drones find ground robots without GPS.
Real-Time Navigation for Autonomous Aerial Vehicles Using Video
Robotics
Drones navigate faster and use less power.