Score: 1

Automated Verification of Soundness of DNN Certifiers

Published: April 6, 2025 | arXiv ID: 2504.04542v1

By: Avaljot Singh, Yasmin Chandini Sarita, Charith Mendis, and more

Potential Business Impact:

Automates the proofs that DNN certification tools are mathematically sound, making certified AI more trustworthy for safety-critical applications.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

The uninterpretability of Deep Neural Networks (DNNs) hinders their use in safety-critical applications. Abstract Interpretation-based DNN certifiers provide promising avenues for building trust in DNNs. Unsoundness in the mathematical logic of these certifiers can lead to incorrect results. However, current approaches to ensuring their soundness rely on manual, expert-driven proofs that are tedious to develop, slowing the creation of new certifiers. Automating the verification process is challenging due to the complexity of verifying certifiers for arbitrary DNN architectures and handling diverse abstract analyses. We introduce ProveSound, a novel verification procedure that automates the soundness verification of DNN certifiers for arbitrary DNN architectures. Our core contribution is the concept of a symbolic DNN, with which ProveSound reduces the soundness property, a universal quantification over arbitrary DNNs, to a tractable symbolic representation, enabling verification with standard SMT solvers. By formalizing the syntax and operational semantics of ConstraintFlow, a DSL for specifying certifiers, ProveSound efficiently verifies both existing and new certifiers, handling arbitrary DNN architectures. Our code is available at https://github.com/uiuc-focal-lab/constraintflow.git
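
To make the idea of SMT-based soundness checking concrete, here is a minimal sketch of the general approach the abstract describes: phrasing a certifier's soundness condition symbolically and discharging it with an SMT solver. This is an assumption-laden illustration, not ProveSound's actual symbolic-DNN encoding or ConstraintFlow syntax; it checks a single interval (box) abstract transformer for ReLU using the z3-solver Python package, whereas ProveSound verifies certifiers over arbitrary DNN architectures.

```python
# Illustrative sketch only: NOT ProveSound's encoding or ConstraintFlow syntax.
# It shows the general pattern of encoding a soundness obligation for one
# abstract transformer (interval abstraction of ReLU) as an SMT query in Z3.
from z3 import Real, If, And, Not, Implies, Solver, unsat

# Symbolic pre-state: an interval [l, u] and an arbitrary concrete input x in it.
l, u, x = Real("l"), Real("u"), Real("x")
in_interval = And(l <= u, l <= x, x <= u)

# Concrete semantics of the DNN operation (ReLU) on the arbitrary input.
relu_x = If(x > 0, x, 0)

# Candidate abstract transformer for ReLU over intervals: [max(l, 0), max(u, 0)].
out_l = If(l > 0, l, 0)
out_u = If(u > 0, u, 0)

# Soundness: every concrete output is contained in the abstract output interval.
soundness = Implies(in_interval, And(out_l <= relu_x, relu_x <= out_u))

s = Solver()
s.add(Not(soundness))  # search for a counterexample to soundness
if s.check() == unsat:
    print("transformer is sound (no counterexample exists)")
else:
    print("unsound; counterexample:", s.model())
```

In this toy setting the quantification is only over the inputs of one operation; the paper's contribution is reducing the much harder quantification over arbitrary DNNs to a comparable, SMT-checkable symbolic form.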

Country of Origin
🇺🇸 United States

Repos / Data Links
https://github.com/uiuc-focal-lab/constraintflow.git

Page Count
111 pages

Category
Computer Science:
Programming Languages