YOPOv2-Tracker: An End-to-End Agile Tracking and Navigation Framework from Perception to Action
By: Junjie Lu, Yulin Hui, Xuewei Zhang, and more
Potential Business Impact:
Drones fly faster and dodge obstacles better.
Traditional target-tracking pipelines comprising detection, mapping, navigation, and control are comprehensive but introduce high latency, limiting the agility of quadrotors. In contrast, we follow the design principle of "less is more," striving to simplify the process while maintaining effectiveness. In this work, we propose an end-to-end agile tracking and navigation framework for quadrotors that directly maps sensory observations to control commands. Importantly, leveraging the multimodal nature of the navigation and detection tasks, our network maintains interpretability by explicitly integrating the independent modules of the traditional pipeline rather than performing crude action regression. In detail, we adopt a set of motion primitives as anchors to cover the search space spanning the feasible region and the potential target. We then reformulate trajectory optimization as regression of primitive offsets and associated costs, accounting for safety, smoothness, and other metrics. For the tracking task, the trajectories are expected to approach the target, and additional objectness scores are predicted. Subsequently, the predictions, after compensation for the estimated lumped disturbance, are transformed into thrust and attitude control commands for swift response. During training, we seamlessly integrate traditional motion planning with deep learning by back-propagating the gradients of the trajectory costs directly to the network, eliminating the need for expert demonstrations in imitation learning and providing more direct guidance than reinforcement learning. Finally, we deploy the algorithm on a compact quadrotor and validate it in real-world forest and building environments to demonstrate the efficiency of the proposed method.
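The anchor-based idea described above can be sketched in a few lines: a fixed set of primitive anchors covers the search space, a network regresses a small offset and a scalar cost per anchor, and the lowest-cost refined primitive is the one executed. The sketch below is a minimal illustration under assumed conventions; the anchor layout, the offset/cost variables standing in for network outputs, and all names are hypothetical, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical setup: K motion-primitive anchors, each summarized here by
# just its endpoint (x, y, z), spread laterally to cover the search space.
K = 5
anchors = np.linspace([-2.0, 1.0, 0.0], [2.0, 1.0, 0.0], K)

# Stand-ins for network outputs: a per-anchor endpoint offset (the regressed
# refinement) and a scalar cost folding safety, smoothness, and tracking terms.
rng = np.random.default_rng(0)
offsets = 0.1 * rng.standard_normal((K, 3))
costs = rng.uniform(0.0, 1.0, size=K)

# Refined primitives = anchor endpoints plus regressed offsets.
refined = anchors + offsets

# Execute the lowest-cost primitive; for the tracking task, a per-anchor
# objectness score could gate or reweight these costs in the same way.
best_endpoint = refined[np.argmin(costs)]
```

Because the costs are produced by the network itself, their gradients can be back-propagated directly during training, which is how the paper couples traditional motion planning with deep learning without expert demonstrations.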
Similar Papers
Integrated YOLOP Perception and Lyapunov-based Control for Autonomous Mobile Robot Navigation on Track
Robotics
Lets robots drive themselves on paths.
YOPO-Nav: Visual Navigation using 3DGS Graphs from One-Pass Videos
Robotics
Robot follows paths from videos without maps.
YOLOMG: Vision-based Drone-to-Drone Detection with Appearance and Pixel-Level Motion Fusion
CV and Pattern Recognition
Finds tiny drones in busy skies.