Statistical Guarantees in Data-Driven Nonlinear Control: Conformal Robustness for Stability and Safety
By: Ting-Wei Hsu, Hiroyasu Tsukamoto
Potential Business Impact:
Lets robots safely learn new tasks from data.
We present a true-dynamics-agnostic, statistically rigorous framework for establishing exponential stability and safety guarantees of closed-loop, data-driven nonlinear control. Central to our approach is the novel concept of conformal robustness, which robustifies the Lyapunov and zeroing barrier certificates of data-driven dynamical systems against model prediction uncertainties using conformal prediction. It quantifies these uncertainties by leveraging rank statistics of prediction scores over system trajectories, without assuming any specific underlying structure of the prediction model or distribution of the uncertainties. With the quantified uncertainty information, we further construct the conformally robust control Lyapunov function (CR-CLF) and control barrier function (CR-CBF), data-driven counterparts of the CLF and CBF, for fully data-driven control with statistical guarantees of finite-horizon exponential stability and safety. The performance of the proposed concept is validated in numerical simulations with four benchmark nonlinear control problems.
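To make the core mechanism concrete, the sketch below shows standard split conformal prediction applied to one-step dynamics-model residuals, the kind of distribution-free, rank-statistic-based uncertainty quantification the abstract describes. This is an illustrative sketch only, not the authors' implementation: the residual score, the function names (conformal_residual_bound, f_hat, f_true), and the toy dynamics are assumptions introduced here for illustration.

```python
# Illustrative sketch (assumed, not the paper's code): split conformal prediction
# on one-step model residuals, yielding a distribution-free error bound that holds
# with probability at least 1 - alpha under exchangeability of the calibration data.
import numpy as np

def conformal_residual_bound(x_cal, u_cal, x_next_cal, f_hat, alpha=0.05):
    """(1 - alpha) conformal bound on the one-step prediction error ||x_next - f_hat(x, u)||."""
    # Nonconformity scores: norms of prediction residuals on held-out calibration transitions.
    scores = np.array([np.linalg.norm(xn - f_hat(x, u))
                       for x, u, xn in zip(x_cal, u_cal, x_next_cal)])
    n = len(scores)
    # Finite-sample conformal quantile: the ceil((n + 1)(1 - alpha))-th smallest score.
    k = min(int(np.ceil((n + 1) * (1 - alpha))), n)
    return np.sort(scores)[k - 1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical learned model vs. "true" dynamics with a small unmodeled term.
    f_hat = lambda x, u: 0.9 * x + 0.1 * u
    f_true = lambda x, u: 0.9 * x + 0.1 * u + 0.05 * np.tanh(x)
    x_cal = rng.normal(size=(200, 2))
    u_cal = rng.normal(size=(200, 1))
    x_next_cal = np.array([f_true(x, u) + 0.01 * rng.normal(size=2)
                           for x, u in zip(x_cal, u_cal)])
    d_bound = conformal_residual_bound(x_cal, u_cal, x_next_cal, f_hat, alpha=0.05)
    print(f"95% conformal bound on one-step model error: {d_bound:.4f}")
```

A bound of this kind can then be used to tighten the certificate conditions, e.g. requiring the Lyapunov decrease or barrier condition of the learned model to hold with an extra margin proportional to the conformal error bound; this is the flavor of the CR-CLF and CR-CBF constructions described in the abstract, though the paper's exact conditions and score functions may differ.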
Similar Papers
Towards Data-Driven Model-Free Safety-Critical Control
Systems and Control
Robots move safely without knowing all rules.
Non-Conservative Data-driven Safe Control Design for Nonlinear Systems with Polyhedral Safe Sets
Systems and Control
Makes machines learn to be safe and fast.
Physics-Informed Data-Driven Control of Nonlinear Polynomial Systems with Noisy Data
Systems and Control
Keeps complex machines safe using less data.