Drift-Corrected Monocular VIO and Perception-Aware Planning for Autonomous Drone Racing
By: Maulana Bisyir Azhari, Donghun Han, Je In You, and more
The Abu Dhabi Autonomous Racing League (A2RL) x Drone Champions League (DCL) competition requires teams to perform high-speed autonomous drone racing using only a single camera and a low-quality inertial measurement unit, a minimal sensor set that mirrors that of expert human drone racing pilots. This sensor limitation makes the system susceptible to drift from Visual-Inertial Odometry (VIO), particularly during long, fast flights with aggressive maneuvers. This paper presents the system developed for the championship, which achieved competitive performance. Our approach corrected VIO drift by fusing its output with global position measurements derived from a YOLO-based gate detector using a Kalman filter. A perception-aware planner generated trajectories that balance speed with the need to keep gates visible to the perception system. The system demonstrated high performance, securing podium finishes across multiple categories: third place in the AI Grand Challenge with a top speed of 43.2 km/h, second place in the AI Drag Race at over 59 km/h, and second place in the AI Multi-Drone Race. We detail the complete architecture and present a performance analysis based on experimental data from the competition, contributing our insights on building a successful system for monocular vision-based autonomous drone flight.
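The drift-correction idea described in the abstract, fusing relative VIO motion with occasional global position fixes from a gate detector, can be illustrated with a minimal linear Kalman filter on 3D position. This is a hedged sketch, not the authors' implementation: the class name `DriftCorrector`, the noise values, and the identity measurement model are all illustrative assumptions.

```python
import numpy as np

class DriftCorrector:
    """Illustrative sketch (not the paper's implementation): fuse a
    drifting VIO position estimate with sparse global position fixes,
    such as those derived from detected race gates, via a linear
    Kalman filter on 3D position. Noise magnitudes are assumptions."""

    def __init__(self, q=0.01, r=0.25):
        self.x = np.zeros(3)    # fused position estimate
        self.P = np.eye(3)      # estimate covariance
        self.Q = q * np.eye(3)  # process noise, models accumulating VIO drift
        self.R = r * np.eye(3)  # measurement noise of a global gate-based fix

    def predict(self, vio_delta):
        # Propagate the state with the relative motion increment
        # reported by VIO; uncertainty grows each step.
        self.x = self.x + vio_delta
        self.P = self.P + self.Q
        return self.x

    def update(self, global_fix):
        # Correct with a global position measurement (H = I),
        # pulling the estimate back toward the drift-free fix.
        K = self.P @ np.linalg.inv(self.P + self.R)  # Kalman gain
        self.x = self.x + K @ (global_fix - self.x)
        self.P = (np.eye(3) - K) @ self.P
        return self.x
```

In this toy setup the `predict` step runs at VIO rate, while `update` fires only when a gate detection yields a global fix, which is how a filter of this shape bounds long-horizon drift without requiring gates to be visible continuously.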