Lipschitz-Based Robustness Certification for Recurrent Neural Networks via Convex Relaxation

Published: September 22, 2025 | arXiv ID: 2509.17898v1

By: Paul Hamelbeck, Johannes Schiffer

Potential Business Impact:

Provides certified robustness guarantees for neural networks used in safety-critical control systems.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Robustness certification against bounded input noise or adversarial perturbations is increasingly important for deploying recurrent neural networks (RNNs) in safety-critical control applications. To address this challenge, we present RNN-SDP, a relaxation-based method that models the RNN's layer interactions as a convex problem and computes a certified upper bound on the Lipschitz constant via semidefinite programming (SDP). We also explore an extension that incorporates known input constraints to further tighten the resulting Lipschitz bounds. RNN-SDP is evaluated on a synthetic multi-tank system, with the certified upper bounds compared to empirical estimates. While incorporating input constraints yields only modest improvements, the general method produces reasonably tight and certifiable bounds, even as sequence length increases. The results also underscore the often underestimated impact of initialization errors, an important consideration for applications where models are frequently re-initialized, such as model predictive control (MPC).
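To illustrate what a certified Lipschitz upper bound for an unrolled RNN means, the sketch below computes a simple product-of-spectral-norms bound for a vanilla RNN with a 1-Lipschitz activation (tanh). This is not the paper's RNN-SDP relaxation, which is tighter; it is a naive baseline under assumed dynamics `h_t = tanh(W_x x_t + W_h h_{t-1})`, with the function name and matrices being illustrative placeholders.

```python
import numpy as np

def lipschitz_upper_bound(W_x: np.ndarray, W_h: np.ndarray, T: int) -> float:
    """Naive certified upper bound on the input-to-final-state Lipschitz
    constant of an RNN h_t = tanh(W_x x_t + W_h h_{t-1}) unrolled T steps.

    Since tanh is 1-Lipschitz, a perturbation of input x_k propagates to
    h_T with gain at most ||W_x||_2 * ||W_h||_2^(T-k).  Summing over all
    inputs k = 1..T gives a geometric series.  The SDP relaxation in the
    paper exploits layer interactions and is typically much tighter.
    """
    sx = np.linalg.norm(W_x, 2)  # spectral norm of the input matrix
    sh = np.linalg.norm(W_h, 2)  # spectral norm of the recurrent matrix
    if np.isclose(sh, 1.0):
        return float(sx * T)  # degenerate geometric series
    return float(sx * (sh**T - 1.0) / (sh - 1.0))
```

With `W_h = 0` the bound reduces to `||W_x||_2` (no recurrence); with `||W_h||_2 > 1` it grows geometrically in the sequence length, which matches the abstract's observation that bound quality as sequence length increases is the key challenge.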

Page Count
10 pages

Category
Electrical Engineering and Systems Science:
Systems and Control