A Systems-Theoretic View on the Convergence of Algorithms under Disturbances
By: Guner Dilsad Er, Sebastian Trimpe, Michael Muehlebach
Algorithms increasingly operate within complex physical, social, and engineering systems where they are exposed to disturbances, noise, and interconnections with other dynamical systems. This article extends known convergence guarantees of an algorithm operating in isolation (i.e., without disturbances) and systematically derives stability bounds and convergence rates in the presence of such disturbances. By leveraging converse Lyapunov theorems, we derive key inequalities that quantify the impact of disturbances. We further demonstrate how our results can be used to assess the effects of disturbances on algorithmic performance in a wide variety of applications, including communication constraints in distributed learning, sensitivity in machine learning generalization, and intentional noise injection for privacy. This underscores the role of our results as a unifying tool for algorithm analysis in the presence of noise, disturbances, and interconnections with other dynamical systems.
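The flavor of the abstract's claim can be illustrated with a minimal numerical sketch (not the paper's method or its actual bounds): gradient descent on a strongly convex quadratic, viewed as a discrete-time dynamical system driven by a bounded disturbance. An input-to-state-stability-style argument predicts the iterates no longer reach the optimum exactly, but converge to a neighborhood whose radius scales with the disturbance bound. All constants and names below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: gradient descent on f(x) = 0.5 * mu * ||x||^2,
# perturbed at each step by a disturbance d_k with ||d_k|| <= delta.
rng = np.random.default_rng(0)
mu = 1.0      # curvature of f (assumed)
eta = 0.5     # step size (assumed)
delta = 0.01  # disturbance bound (assumed)

x = np.array([5.0, -3.0])
for _ in range(200):
    grad = mu * x                                   # gradient of f at x
    d = rng.uniform(-1.0, 1.0, size=2)
    d = delta * d / max(np.linalg.norm(d), 1e-12)   # enforce ||d|| <= delta
    x = x - eta * grad + d                          # disturbed gradient step

# Without disturbances, x would contract to 0 at rate (1 - eta * mu).
# With them, a Lyapunov-style estimate predicts the iterates settle in a
# ball of radius on the order of delta / (eta * mu) around the optimum.
print(np.linalg.norm(x))
```

The point of the sketch is the qualitative behavior the article formalizes: the undisturbed convergence rate survives, but the limit set inflates from a point to a disturbance-dependent neighborhood.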