Dimension-Free Bounds for Generalized First-Order Methods via Gaussian Coupling
By: Galen Reeves
Potential Business Impact:
Shows when common machine-learning algorithms can be trusted to behave predictably, no matter how large the data gets.
We establish non-asymptotic bounds on the finite-sample behavior of generalized first-order iterative algorithms -- including gradient-based optimization methods and approximate message passing (AMP) -- with Gaussian data matrices and full-memory, non-separable nonlinearities. The central result constructs an explicit coupling between the iterates of a generalized first-order method and a conditionally Gaussian process whose covariance evolves deterministically via a finite-dimensional state evolution recursion. This coupling yields tight, dimension-free bounds under mild Lipschitz and moment-matching conditions. Our analysis departs from classical inductive AMP proofs by employing a direct comparison between the generalized first-order method and the conditionally Gaussian comparison process. This approach provides a unified derivation of AMP theory for Gaussian matrices without relying on separability or asymptotics. A complementary lower bound on the Wasserstein distance demonstrates the sharpness of our upper bounds.
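To make the abstract's setting concrete, the sketch below runs a textbook instance of approximate message passing (AMP) for sparse recovery with a Gaussian measurement matrix, alongside the deterministic state evolution recursion that predicts the effective noise level of the iterates. This is a minimal illustration of the class of algorithms the paper analyzes, not the paper's own construction; the problem sizes, soft-threshold denoiser, and threshold multiplier `alpha` are illustrative choices, and the state evolution expectation is estimated by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem (not from the paper): recover a sparse signal x0
# from y = A @ x0 + w, where A has iid N(0, 1/n) Gaussian entries.
N, n = 2000, 1000          # signal dimension, number of measurements
delta = n / N              # sampling ratio
eps, sigma = 0.1, 0.05     # sparsity level, noise standard deviation

x0 = rng.normal(size=N) * (rng.random(N) < eps)
A = rng.normal(size=(n, N)) / np.sqrt(n)
y = A @ x0 + sigma * rng.normal(size=n)

def eta(u, t):
    """Soft-threshold denoiser with threshold t."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def eta_prime(u, t):
    """Derivative of the soft threshold (a.e.)."""
    return (np.abs(u) > t).astype(float)

# AMP iterates with the Onsager correction term, plus the scalar
# state evolution recursion tracking the effective noise variance tau^2.
x = np.zeros(N)
z = y.copy()
tau2 = sigma**2 + np.mean(x0**2) / delta   # state evolution initialization
alpha = 1.5                                # threshold multiplier (a tuning choice)
mse = []
for _ in range(15):
    theta = alpha * np.sqrt(tau2)
    u = x + A.T @ z
    x_new = eta(u, theta)
    # Onsager term: distinguishes AMP from plain iterative thresholding.
    z = y - A @ x_new + (z / delta) * np.mean(eta_prime(u, theta))
    x = x_new
    mse.append(np.mean((x - x0) ** 2))
    # Deterministic state evolution update (Monte Carlo estimate of the
    # expectation over X ~ signal prior, Z ~ N(0, 1)):
    Z = rng.normal(size=200_000)
    X = rng.normal(size=200_000) * (rng.random(200_000) < eps)
    tau2 = sigma**2 + np.mean((eta(X + np.sqrt(tau2) * Z, theta) - X) ** 2) / delta

print(f"MSE: initial {mse[0]:.4f} -> final {mse[-1]:.4f}")
```

The paper's contribution concerns this kind of recursion: it couples the high-dimensional iterates `x` to a conditionally Gaussian process whose covariance follows the finite-dimensional `tau2`-style recursion, with non-asymptotic, dimension-free error bounds.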
Similar Papers
Generalized Orthogonal Approximate Message-Passing for Sublinear Sparsity
Information Theory
Helps computers recover hidden information faster.
Markov Chains Approximate Message Passing
Data Structures and Algorithms
Helps computers find hidden patterns in messy data.