Robustly Invertible Nonlinear Dynamics and the BiLipREN: Contracting Neural Models with Contracting Inverses
By: Yurui Zhang, Ruigang Wang, Ian R. Manchester
Potential Business Impact:
Builds neural models whose outputs can be reliably traced back to their inputs.
We study the invertibility of nonlinear dynamical systems from the perspective of contraction and incremental stability analysis and propose a new invertible recurrent neural model: the BiLipREN. In particular, we consider a nonlinear state space model to be robustly invertible if an inverse exists with a state space realisation, and both the forward model and its inverse are contracting, i.e. incrementally exponentially stable, and Lipschitz, i.e. have bounded incremental gain. This bi-Lipschitz property implies both robustness, in the sense of bounded sensitivity to input perturbations, and robust distinguishability of different inputs from their corresponding outputs: the inverse model reconstructs the input sequence despite small perturbations to the initial conditions and measured output. Building on this foundation, we propose a parameterization of neural dynamic models: bi-Lipschitz recurrent equilibrium networks (biLipREN), which are robustly invertible by construction. Moreover, biLipRENs can be composed with orthogonal linear systems to construct more general bi-Lipschitz dynamic models, e.g., a nonlinear analogue of minimum-phase/all-pass (inner/outer) factorization. We illustrate the utility of our proposed approach with numerical examples.
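To make the abstract's notion of contraction concrete, here is a minimal numerical sketch (not the paper's biLipREN; a plain linear state-space model chosen for illustration, with all matrices and constants invented for this example). A model x_{t+1} = A x_t + B u_t is contracting when the spectral radius of A is below 1: two trajectories driven by the same input but started from different initial states converge exponentially, which is the property the paper requires of both the forward model and its inverse.

```python
import numpy as np

# Illustrative linear state-space model (assumed, not from the paper):
#   x_{t+1} = A x_t + B u_t,   y_t = C x_t + D u_t
# A is built to have spectral radius < 1, so the model is contracting.
rng = np.random.default_rng(0)
n, m = 4, 2
A = 0.4 * np.eye(n) + 0.05 * rng.standard_normal((n, n))
B = rng.standard_normal((n, m))
C = rng.standard_normal((m, n))
D = np.eye(m)

def simulate(x0, u_seq):
    """Roll the model forward; return outputs and the final state."""
    x, ys = x0, []
    for u in u_seq:
        ys.append(C @ x + D @ u)
        x = A @ x + B @ u
    return np.array(ys), x

T = 200
u = rng.standard_normal((T, m))

# Same input, different initial states: the state gap should shrink
# roughly like rho(A)^T (incremental exponential stability).
_, xT_a = simulate(np.zeros(n), u)
_, xT_b = simulate(rng.standard_normal(n), u)
rho = max(abs(np.linalg.eigvals(A)))
gap = np.linalg.norm(xT_a - xT_b)
print(f"rho(A) = {rho:.3f}, final state gap = {gap:.2e}")
```

The Lipschitz half of the bi-Lipschitz property corresponds to the output difference being bounded above and below by the input difference; the paper's contribution is a parameterization that guarantees both bounds by construction for nonlinear recurrent models.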
Similar Papers
State dimension reduction of recurrent equilibrium networks with contraction and robustness preservation
Systems and Control
Makes smart computer models smaller, faster, and safer.
React to Surprises: Stable-by-Design Neural Feedback Control and the Youla-REN
Systems and Control
Makes robots learn to move safely and surely.
R2DN: Scalable Parameterization of Contracting and Lipschitz Recurrent Deep Networks
Machine Learning (CS)
Makes smart computer programs learn faster and better.