CP-NCBF: A Conformal Prediction-based Approach to Synthesize Verified Neural Control Barrier Functions
By: Manan Tayal, Aditya Singh, Pushpak Jagtap, and more
Potential Business Impact:
Makes robots safer by checking their actions.
Control Barrier Functions (CBFs) are a practical approach for designing safety-critical controllers, but constructing them for arbitrary nonlinear dynamical systems remains a challenge. Recent efforts have explored learning-based methods, such as neural CBFs (NCBFs), to address this issue. However, ensuring the validity of NCBFs is difficult due to potential learning errors. In this letter, we propose a novel framework, referred to as CP-NCBF, that leverages split-conformal prediction to generate formally verified neural CBFs with probabilistic guarantees at a user-defined error rate. Unlike existing methods that impose Lipschitz constraints on the neural CBF (leading to scalability limitations and overly conservative safe sets), our approach is sample-efficient, scalable, and results in less restrictive safety regions. We validate our framework through case studies on obstacle avoidance in autonomous driving and geo-fencing of aerial vehicles, demonstrating its ability to generate larger and less conservative safe sets compared to conventional techniques.
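The split-conformal step described in the abstract can be sketched as follows. This is a minimal illustration under assumptions of ours, not the paper's implementation: `cbf_residual` and `conformal_quantile` are hypothetical names, and we assume the nonconformity score measures violation of the standard CBF condition h_dot(x) + alpha(h(x)) >= 0 on a held-out calibration set of states.

```python
import numpy as np

def cbf_residual(h, h_dot, alpha_fn, x):
    # Hypothetical nonconformity score: positive when the CBF condition
    # h_dot(x) + alpha(h(x)) >= 0 is violated at state x.
    return -(h_dot(x) + alpha_fn(h(x)))

def conformal_quantile(scores, alpha):
    # Split-conformal quantile with the finite-sample (n + 1) correction:
    # with probability >= 1 - alpha, a fresh calibration-distributed score
    # does not exceed the returned value.
    n = len(scores)
    k = int(np.ceil((n + 1) * (1 - alpha)))
    return np.sort(scores)[min(k, n) - 1]

# Toy usage with synthetic calibration scores (illustrative only).
rng = np.random.default_rng(0)
scores = rng.normal(size=1000)          # stand-in for residuals on held-out states
q = conformal_quantile(scores, alpha=0.05)
# A corrected barrier h(x) - q then satisfies the CBF condition with
# probability at least 1 - alpha over the sampling distribution.
```

The quantile `q` acts as a probabilistic error margin: tightening the learned barrier by `q` trades a small, user-controlled failure probability `alpha` for a formal guarantee, without the Lipschitz constraints the abstract contrasts against.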
Similar Papers
Neural Control Barrier Functions from Physics Informed Neural Networks
Robotics
Makes robots safer by learning rules from physics.
CPED-NCBFs: A Conformal Prediction for Expert Demonstration-based Neural Control Barrier Functions
Robotics
Checks if AI safely follows rules.
ORN-CBF: Learning Observation-conditioned Residual Neural Control Barrier Functions via Hypernetworks
Robotics
Makes robots safer by learning from mistakes.