Robust Convolutional Neural ODEs via Contractivity-Promoting Regularization
By: Muhammad Zakwan, Liang Xu, Giancarlo Ferrari-Trecate
Potential Business Impact:
Makes AI image classifiers harder to fool with noisy or manipulated pictures.
Neural networks can be fragile to input noise and adversarial attacks. In this work, we consider Convolutional Neural Ordinary Differential Equations (NODEs), a family of continuous-depth neural networks represented by dynamical systems, and propose to use contraction theory to improve their robustness. For a contractive dynamical system, two trajectories starting from different initial conditions converge to each other exponentially fast. Contractive Convolutional NODEs can enjoy increased robustness, as slight perturbations of the features do not cause a significant change in the output. Contractivity can be induced during training by using a regularization term involving the Jacobian of the system dynamics. To reduce the computational burden, we show that contractivity can also be promoted using carefully selected weight regularization terms for a class of NODEs with slope-restricted activation functions. The performance of the proposed regularizers is illustrated through benchmark image classification tasks on the MNIST and FashionMNIST datasets, where images are corrupted by different kinds of noise and attacks.
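As a rough illustration of the Jacobian-based regularization idea from the abstract, the sketch below checks a standard sufficient condition for contraction: the symmetric part of the vector-field Jacobian, (J + J^T)/2, should have its largest eigenvalue below a negative contraction rate. Everything here (the toy vector field `f(x) = W tanh(x) + b`, the function names, the penalty form) is an illustrative assumption, not the authors' actual implementation:

```python
import numpy as np

def dynamics(x, W, b):
    # Toy NODE vector field: f(x) = W @ tanh(x) + b
    return W @ np.tanh(x) + b

def jacobian(x, W):
    # Analytic Jacobian of f: J = W @ diag(1 - tanh(x)^2),
    # i.e. column i of W scaled by tanh'(x_i)
    return W * (1.0 - np.tanh(x) ** 2)

def contraction_penalty(x, W, rate=0.1):
    # Contractivity (with rate `rate`) is certified at x when
    # (J + J^T)/2 <= -rate * I; penalize any violation of that bound.
    J = jacobian(x, W)
    sym = 0.5 * (J + J.T)
    lam_max = np.max(np.linalg.eigvalsh(sym))
    return max(0.0, lam_max + rate)

rng = np.random.default_rng(0)
# A weight matrix close to -2*I yields a strongly contractive field
W = -2.0 * np.eye(3) + 0.1 * rng.standard_normal((3, 3))
x = rng.standard_normal(3)
print(contraction_penalty(x, W))  # 0.0 when the condition holds
```

During training, a term like this (evaluated on batch features) would be added to the task loss; the paper's cheaper alternative replaces the per-sample eigenvalue computation with conditions on the weights alone, which is possible when activations are slope-restricted so that tanh'(x) stays in a known interval.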
Similar Papers
Neural network based control of unknown nonlinear systems via contraction analysis
Systems and Control
Makes robots learn to move safely without crashing.
Structure-Preserving Neural Ordinary Differential Equations for Stiff Systems
Numerical Analysis
Makes computer models of changing things more stable.
Deep Neural ODE Operator Networks for PDEs
Machine Learning (CS)
Teaches computers to predict how things change.