Integrated YOLOP Perception and Lyapunov-based Control for Autonomous Mobile Robot Navigation on Track
By: Mo Chen
Potential Business Impact:
Lets robots follow tracks on their own using only an onboard camera.
This work presents a real-time autonomous track-navigation framework for nonholonomic differential-drive mobile robots that integrates multi-task visual perception with a provably stable tracking controller. The perception pipeline reconstructs lane centerlines via 2D-to-3D camera projection, uniform arc-length resampling of the detected points, and cubic polynomial fitting solved with a robust QR-based least-squares method. The controller regulates the robot's linear and angular velocities through a Lyapunov-based design that guarantees bounded error dynamics and asymptotic convergence of position and heading errors, even in dynamic and partially perceived lane scenarios, without relying on HD prior maps or global satellite localization. Real-world experiments on embedded platforms confirm perception fidelity, real-time execution, trajectory smoothness, and closed-loop stability for reliable autonomous navigation.
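The two core numerical steps the abstract names, centerline fitting (arc-length resampling plus a cubic fit solved by QR least squares) and a Lyapunov-grounded velocity law, can be sketched as below. This is a minimal illustration, not the paper's implementation: the function names are invented, and the control law shown is the classic Kanayama-style tracking law (whose stability is proved with a Lyapunov function), which the paper's exact design may refine.

```python
import numpy as np

def resample_by_arclength(points, n_samples=50):
    """Resample a 2D polyline at positions uniformly spaced in arc length."""
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])          # cumulative arc length
    s_new = np.linspace(0.0, s[-1], n_samples)           # uniform stations
    return np.column_stack([np.interp(s_new, s, points[:, 0]),
                            np.interp(s_new, s, points[:, 1])])

def fit_cubic_qr(points):
    """Fit y = c0 + c1*x + c2*x^2 + c3*x^3 by least squares via QR."""
    x, y = points[:, 0], points[:, 1]
    A = np.vander(x, N=4, increasing=True)   # design matrix [1, x, x^2, x^3]
    Q, R = np.linalg.qr(A)                   # QR is better conditioned than normal equations
    return np.linalg.solve(R, Q.T @ y)       # solve R c = Q^T y

def kanayama_control(e_x, e_y, e_th, v_ref, w_ref,
                     k_x=1.0, k_y=4.0, k_th=2.0):
    """Kanayama-style tracking law for a differential-drive robot.
    (e_x, e_y, e_th) are pose errors in the robot frame; stability follows
    from the Lyapunov function V = (e_x^2 + e_y^2)/2 + (1 - cos e_th)/k_y.
    Gains here are illustrative, not from the paper."""
    v = v_ref * np.cos(e_th) + k_x * e_x
    w = w_ref + v_ref * (k_y * e_y + k_th * np.sin(e_th))
    return v, w
```

With zero pose error the law reduces to the reference velocities `(v_ref, w_ref)`, which is the expected feedforward behavior; the fitted cubic's coefficients feed the reference trajectory that the controller tracks.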
Similar Papers
Robust Model Predictive Control Design for Autonomous Vehicles with Perception-based Observers
Robotics
Makes robots safer by accounting for faulty sensor data.
Real-Time LPV-Based Non-Linear Model Predictive Control for Robust Trajectory Tracking in Autonomous Vehicles
Robotics
Helps self-driving cars steer accurately along their planned paths.
YOPOv2-Tracker: An End-to-End Agile Tracking and Navigation Framework from Perception to Action
Robotics
Drones fly faster and dodge obstacles better.