Score: 1

Conservative Perception Models for Probabilistic Verification

Published: March 23, 2025 | arXiv ID: 2503.18077v3

By: Matthew Cleaveland, Pengyuan Lu, Oleg Sokolsky, and others

BigTech Affiliations: Massachusetts Institute of Technology

Potential Business Impact:

Makes self-driving cars safer by rigorously checking the reliability of their "eyes" (perception components).

Business Areas:
Simulation Software

Verifying the behaviors of autonomous systems with learned perception components is a challenging problem due to the complexity of perception and the uncertainty of operating environments. Probabilistic model checking is a powerful tool for providing guarantees on stochastic models of systems. However, constructing model-checkable abstractions of black-box perception components that support system-level mathematical guarantees has been an enduring challenge. In this paper, we propose a method for constructing provably conservative Interval Markov Decision Process (IMDP) models of closed-loop systems with perception components. We prove that our technique yields conservative abstractions with a user-specified probability. We evaluate our approach in an automatic braking case study using both a synthetic perception component and the object detector YOLO11 in the CARLA driving simulator.
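The abstract describes IMDP abstractions whose transition probabilities are intervals that are conservative with a user-specified probability. A minimal sketch of one common way to obtain such intervals is shown below: Clopper-Pearson confidence bounds on a binomial detection rate, estimated per abstract state. This is an illustration under assumed names (`clopper_pearson`, `samples`, the distance bins), not necessarily the construction used in the paper.

```python
# Illustrative sketch: confidence intervals on a perception component's
# detection probability, usable as interval transition probabilities in an
# IMDP-style abstraction. All data and bin names below are hypothetical.
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Two-sided (1 - alpha) Clopper-Pearson interval for a Bernoulli
    parameter, given k successes out of n trials."""
    lower = 0.0 if k == 0 else beta.ppf(alpha / 2, k, n - k + 1)
    upper = 1.0 if k == n else beta.ppf(1 - alpha / 2, k + 1, n - k)
    return lower, upper

# Toy perception data: per distance bin, (correct detections, total frames).
# In a braking case study these counts would come from simulator runs.
samples = {
    "0-20m":  (196, 200),
    "20-40m": (180, 200),
    "40-60m": (140, 200),
}

# Each abstract state gets an interval [p_lo, p_hi] on "obstacle detected";
# the complementary transition ("missed") gets [1 - p_hi, 1 - p_lo].
for bin_name, (k, n) in samples.items():
    p_lo, p_hi = clopper_pearson(k, n)
    print(f"{bin_name}: detect in [{p_lo:.3f}, {p_hi:.3f}], "
          f"miss in [{1 - p_hi:.3f}, {1 - p_lo:.3f}]")
```

The resulting interval-valued transition probabilities could then be handed to a probabilistic model checker that supports IMDPs to bound, for example, the probability of a failed stop.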

Country of Origin
🇺🇸 United States

Page Count
13 pages

Category
Computer Science:
Formal Languages and Automata Theory