Learning Safety-Compatible Observers for Unknown Systems
By: Juho Bae, Daegyeong Roh, Han-Lim Choi
Potential Business Impact:
Helps robots safely estimate what's happening.
This paper presents a data-driven approach for jointly learning a robust full-state observer and its robustness certificate for systems with unknown dynamics. Leveraging incremental input-to-state stability (delta-ISS) notions, we learn a delta-ISS Lyapunov function that serves as the robustness certificate and prove practical convergence of the estimation error under standard fidelity assumptions on the learned models. This renders the observer safety-compatible: it can be consumed by certificate-based safe controllers so that, when the controller tolerates bounded estimation error, the controller's certificate remains valid under output feedback. We further extend the approach to interconnected systems via the small-gain theorem, yielding a distributed observer design framework. We validate the approach on a variety of nonlinear systems.
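For readers unfamiliar with the certificate the abstract invokes, a minimal sketch of a standard discrete-time delta-ISS Lyapunov condition follows; the notation (V, f, the comparison functions) is illustrative, and the paper's exact formulation may differ:

```latex
% Sketch of a standard discrete-time delta-ISS Lyapunov condition
% (illustrative notation; the paper's exact formulation may differ).
% For a system x^+ = f(x, u), V certifies that any two trajectories
% driven by nearby inputs converge toward each other:
\begin{align*}
  \alpha_1(\|x - x'\|) \;\le\; V(x, x') \;&\le\; \alpha_2(\|x - x'\|),\\
  V\bigl(f(x, u),\, f(x', u')\bigr) - V(x, x')
    \;&\le\; -\alpha_3(\|x - x'\|) + \sigma(\|u - u'\|),
\end{align*}
% where \alpha_1, \alpha_2, \alpha_3 are class-K_\infty functions and
% \sigma is class-K. Instantiated for a learned observer
% \hat{x}^+ = \hat{f}(\hat{x}, u, y), such a V bounds the estimation
% error x - \hat{x} by a decaying transient plus a class-K function of
% the learned model's fidelity error, i.e., practical convergence.
```

Intuitively, the first pair of bounds makes V a distance-like measure between trajectories, and the decrease condition shrinks that distance up to a contribution from input mismatch; this residual bound is what lets a certificate-based safe controller tolerate bounded estimation error under output feedback.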
Similar Papers
Formally Verified Neural Network Controllers for Incremental Input-to-State Stability of Unknown Discrete-Time Systems
Systems and Control
Teaches computers to control machines safely.
Computationally Efficient State and Model Estimation via Interval Observers for Partially Unknown Systems
Systems and Control
Helps computers learn how things work by watching.
Safely Learning Controlled Stochastic Dynamics
Machine Learning (Stat)
Keeps robots safe while learning new tasks.