R2DN: Scalable Parameterization of Contracting and Lipschitz Recurrent Deep Networks
By: Nicholas H. Barbara, Ruigang Wang, Ian R. Manchester
Potential Business Impact:
Makes safe, stable AI controllers train and run up to ten times faster.
This paper presents the Robust Recurrent Deep Network (R2DN), a scalable parameterization of robust recurrent neural networks for machine learning and data-driven control. We construct R2DNs as a feedback interconnection of a linear time-invariant (LTI) system and a 1-Lipschitz deep feedforward network, and directly parameterize the weights so that our models are stable (contracting) and robust to small input perturbations (Lipschitz) by design. Our parameterization uses a structure similar to that of the previously proposed recurrent equilibrium networks (RENs), but without the requirement to iteratively solve an equilibrium layer at each time step. This speeds up model evaluation and backpropagation on GPUs, and makes it computationally feasible to scale up the network size, batch size, and input sequence length in comparison to RENs. We compare R2DNs to RENs on three representative problems in nonlinear system identification, observer design, and learning-based feedback control. We find that training and inference are both up to an order of magnitude faster with similar test-set performance, and that training and inference times scale more favorably with model expressivity.
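To make the architecture concrete, below is a minimal NumPy sketch of the structure the abstract describes: a discrete-time LTI system whose nonlinear feedback path is a deep feedforward network, made 1-Lipschitz here via spectral normalization of each layer. All names (R2DNSketch, LipschitzMLP, the A/B1/B2/C1/C2/D12/D21/D22 matrices) are illustrative assumptions, and the random initialization does not implement the paper's direct parameterization that guarantees contraction and a prescribed Lipschitz bound. The sketch shows only the wiring, and in particular why no iterative equilibrium solve is needed at each time step.

```python
import numpy as np

def spectral_normalize(W, n_iters=20):
    """Scale W so its spectral norm is at most 1 (power-iteration estimate)."""
    u = np.random.default_rng(0).standard_normal(W.shape[0])
    for _ in range(n_iters):
        v = W.T @ u
        v /= np.linalg.norm(v) + 1e-12
        u = W @ v
        u /= np.linalg.norm(u) + 1e-12
    sigma = u @ W @ v  # estimate of the largest singular value
    return W / max(sigma, 1.0)

class LipschitzMLP:
    """1-Lipschitz feedforward network: a composition of linear layers with
    spectral norm <= 1 and 1-Lipschitz ReLU activations."""
    def __init__(self, sizes, rng):
        self.weights = [
            spectral_normalize(rng.standard_normal((m, n)) / np.sqrt(n))
            for n, m in zip(sizes[:-1], sizes[1:])
        ]
        self.biases = [np.zeros(m) for m in sizes[1:]]

    def __call__(self, v):
        for W, b in zip(self.weights[:-1], self.biases[:-1]):
            v = np.maximum(W @ v + b, 0.0)  # ReLU is 1-Lipschitz
        return self.weights[-1] @ v + self.biases[-1]

class R2DNSketch:
    """Feedback interconnection of a discrete-time LTI system and a
    1-Lipschitz feedforward network (structural sketch only; weights are
    NOT parameterized to certify contraction or a Lipschitz bound)."""
    def __init__(self, nx, nv, nu, ny, hidden, rng):
        s = lambda *shape: 0.1 * rng.standard_normal(shape)
        self.A, self.B1, self.B2 = s(nx, nx), s(nx, nv), s(nx, nu)
        self.C1, self.D12 = s(nv, nx), s(nv, nu)
        self.C2, self.D21, self.D22 = s(ny, nx), s(ny, nv), s(ny, nu)
        self.sigma = LipschitzMLP([nv, *hidden, nv], rng)

    def step(self, x, u):
        v = self.C1 @ x + self.D12 @ u   # input to the nonlinear block
        w = self.sigma(v)                # explicit pass: no equilibrium solve
        x_next = self.A @ x + self.B1 @ w + self.B2 @ u
        y = self.C2 @ x + self.D21 @ w + self.D22 @ u
        return x_next, y

# Example: roll the model forward over a short random input sequence.
rng = np.random.default_rng(0)
model = R2DNSketch(nx=4, nv=8, nu=2, ny=1, hidden=[16, 16], rng=rng)
x = np.zeros(4)
for t in range(10):
    x, y = model.step(x, rng.standard_normal(2))
```

Because w is computed by a single explicit forward pass through the 1-Lipschitz network, each step is a fixed sequence of matrix multiplies, which is what lets evaluation and backpropagation batch efficiently on GPUs, in contrast to a REN's per-step equilibrium-layer iteration.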
Similar Papers
The R2D2 Deep Neural Network Series for Scalable Non-Cartesian Magnetic Resonance Imaging
Image and Video Processing
Makes MRI scans faster and clearer.
State dimension reduction of recurrent equilibrium networks with contraction and robustness preservation
Systems and Control
Makes smart computer models smaller, faster, and safer.
Lipschitz-Based Robustness Certification for Recurrent Neural Networks via Convex Relaxation
Systems and Control
Makes AI safer for important jobs.