Score: 2

BaB-prob: Branch and Bound with Preactivation Splitting for Probabilistic Verification of Neural Networks

Published: September 30, 2025 | arXiv ID: 2509.25647v1

By: Fangji Wang, Panagiotis Tsiotras

Potential Business Impact:

Computes guaranteed bounds on the probability that a neural network makes the right choices under input uncertainty.

Business Areas:
A/B Testing, Data and Analytics

Branch-and-bound with preactivation splitting has been shown to be highly effective for deterministic verification of neural networks. In this paper, we extend this framework to the probabilistic setting. We propose BaB-prob, which iteratively divides the original problem into subproblems by splitting preactivations and leverages linear bounds computed by linear bound propagation to bound the probability for each subproblem. We prove soundness and completeness of BaB-prob for feedforward ReLU neural networks. Furthermore, we introduce the notion of uncertainty level and design two efficient strategies for preactivation splitting, yielding BaB-prob-ordered and BaB+BaBSR-prob. We evaluate BaB-prob on untrained networks, on MNIST and CIFAR-10 models, and on VNN-COMP 2025 benchmarks. Across these settings, our approach consistently outperforms state-of-the-art approaches on medium- to high-dimensional input problems.
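To make the abstract's loop concrete, below is a minimal Python sketch of branch-and-bound with preactivation splitting on a toy probabilistic verification problem. Everything in it is an illustrative assumption, not the paper's method: the network and property are made up, the input is 1-D so that splitting a first-layer ReLU's preactivation partitions the input interval at a computable point, plain interval arithmetic stands in for the paper's linear bound propagation, and the min(-pre_lo, pre_hi) score is only a crude proxy for the paper's uncertainty-level splitting strategies.

```python
import heapq
import numpy as np

# Hypothetical toy setup: x ~ Uniform[-1, 1], property "f(x) >= 0",
# and a tiny randomly initialized one-hidden-layer ReLU network.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 1)), rng.normal(size=4)
W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)

def bounds(lo, hi):
    """Interval bounds on preactivations and output over x in [lo, hi].
    (A loose stand-in for the linear bound propagation used in the paper.)"""
    w = W1[:, 0]
    pre_lo = np.minimum(w * lo, w * hi) + b1
    pre_hi = np.maximum(w * lo, w * hi) + b1
    h_lo, h_hi = np.maximum(pre_lo, 0.0), np.maximum(pre_hi, 0.0)
    v = W2[0]
    out_lo = v @ np.where(v >= 0, h_lo, h_hi) + b2[0]
    out_hi = v @ np.where(v >= 0, h_hi, h_lo) + b2[0]
    return out_lo, out_hi, pre_lo, pre_hi

def mass(lo, hi):
    """Probability mass of [lo, hi] under Uniform[-1, 1]."""
    return (hi - lo) / 2.0

p_certain, p_possible = 0.0, 1.0  # running lower/upper bounds on P(f(x) >= 0)
queue = [(-1.0, (-1.0, 1.0))]     # max-heap on mass: undecided subproblems

for _ in range(200):
    if not queue or p_possible - p_certain < 1e-3:
        break
    _, (lo, hi) = heapq.heappop(queue)
    out_lo, out_hi, pre_lo, pre_hi = bounds(lo, hi)
    if out_lo >= 0:        # property holds on the whole subdomain
        p_certain += mass(lo, hi)
    elif out_hi < 0:       # property fails on the whole subdomain
        p_possible -= mass(lo, hi)
    else:
        # Split on the "most uncertain" unstable ReLU: its preactivation is
        # linear in x, so it crosses zero at x_star inside [lo, hi].
        unstable = np.flatnonzero((pre_lo < 0) & (pre_hi > 0))
        if unstable.size:
            scores = np.minimum(-pre_lo[unstable], pre_hi[unstable])
            i = unstable[np.argmax(scores)]  # crude proxy for the paper's heuristics
            x_star = float(np.clip(-b1[i] / W1[i, 0], lo, hi))
        else:
            x_star = 0.5 * (lo + hi)         # fall back to plain bisection
        for a, b in ((lo, x_star), (x_star, hi)):
            if b > a:
                heapq.heappush(queue, (-mass(a, b), (a, b)))

print(f"P(f(x) >= 0) in [{p_certain:.4f}, {p_possible:.4f}]")
```

Each iteration either certifies a subdomain's probability mass, falsifies it, or splits on an unstable preactivation, so [p_certain, p_possible] always soundly brackets the true probability and the gap shrinks as subproblems are resolved.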

Country of Origin
🇺🇸 United States

Repos / Data Links

Page Count
20 pages

Category
Computer Science: Machine Learning (CS)