Integration of Computer Vision with Adaptive Control for Autonomous Driving Using ADORE
By: Abu Shad Ahammed, Md Shahi Amran Hossain, Sayeri Mukherjee, and more
Potential Business Impact:
Helps self-driving cars see and react better.
Ensuring safety in autonomous driving requires a seamless integration of perception and decision-making under uncertain conditions. Although computer vision (CV) models such as YOLO achieve high accuracy in detecting traffic signs and obstacles, their performance degrades in drift scenarios caused by weather variations or unseen objects. This work presents a simulated autonomous driving system that combines a context-aware CV model with adaptive control using the ADORE framework. The CARLA simulator was integrated with ADORE via the ROS bridge, allowing real-time communication between perception, decision, and control modules. A simulated test case was designed in both clear and drift weather conditions to demonstrate the robust detection performance of the perception model, while ADORE successfully adapted vehicle behavior to speed limits and obstacles with low response latency. The findings highlight the potential of coupling deep learning-based perception with rule-based adaptive decision-making to improve automotive safety-critical systems.
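The abstract describes CARLA being connected to ADORE through the ROS bridge so that perception, decision, and control modules exchange data in real time. The sketch below illustrates what such a perception-to-decision link could look like, assuming a ROS 2 setup with rclpy, cv_bridge, and an Ultralytics YOLO model; the topic names, message types, and weights file are illustrative assumptions, not the authors' actual interface.

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from std_msgs.msg import String
from cv_bridge import CvBridge
from ultralytics import YOLO


class PerceptionBridge(Node):
    """Bridges CARLA camera frames to a YOLO detector and republishes
    detections for a downstream decision module (topic names assumed)."""

    def __init__(self):
        super().__init__("perception_bridge")
        self.bridge = CvBridge()
        # Placeholder weights; the paper's trained, context-aware model may differ.
        self.model = YOLO("yolov8n.pt")
        # Camera topic as published by the CARLA ROS bridge (name assumed).
        self.create_subscription(
            Image, "/carla/ego_vehicle/rgb_front/image", self.on_image, 10
        )
        # Detection stream consumed by the decision/control layer (name assumed).
        self.pub = self.create_publisher(String, "/perception/detections", 10)

    def on_image(self, msg: Image):
        # Convert the ROS image to an OpenCV frame and run detection.
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        results = self.model(frame, verbose=False)[0]
        # Serialize class labels and confidences for the downstream module.
        labels = [
            f"{results.names[int(b.cls)]}:{float(b.conf):.2f}" for b in results.boxes
        ]
        self.pub.publish(String(data=",".join(labels)))


def main():
    rclpy.init()
    rclpy.spin(PerceptionBridge())
    rclpy.shutdown()


if __name__ == "__main__":
    main()

In a full pipeline of the kind the abstract outlines, this detection stream would feed ADORE's rule-based decision layer, which adapts speed and maneuvers to detected signs and obstacles, while the resulting control commands return to CARLA over the same ROS bridge.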
Similar Papers
Enhanced Drift-Aware Computer Vision Architecture for Autonomous Driving
CV and Pattern Recognition
Makes self-driving cars see better in bad weather.
Simulating an Autonomous System in CARLA using ROS 2
Robotics
Helps race cars drive themselves super fast.