A variational approach to dimension-free self-normalized concentration
By: Ben Chugg, Aaditya Ramdas
Potential Business Impact:
Strengthens the concentration bounds that underlie machine learning with vector-valued data, such as the confidence sets used in linear bandit algorithms.
We study the self-normalized concentration of vector-valued stochastic processes. We focus on bounds for sub-$\psi$ processes, a tail condition that encompasses a wide variety of well-known distributions (including sub-exponential, sub-Gaussian, sub-gamma, and sub-Poisson distributions). Our results recover and generalize the influential bound of Abbasi-Yadkori et al. (2011) and fill a gap in the literature between determinant-based bounds and those based on condition numbers. As applications we prove a Bernstein inequality for random vectors satisfying a moment condition (which is more general than boundedness), and also provide the first dimension-free, self-normalized empirical Bernstein inequality. Our techniques are based on the variational (PAC-Bayes) approach to concentration.
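For orientation, here is a minimal sketch of the two central objects named in the abstract, written in the standard notation of the self-normalized concentration literature (the sub-$\psi$ framework of Howard et al. and Theorem 1 of Abbasi-Yadkori et al., 2011) rather than in this paper's own notation. A pair of adapted processes $(S_t, V_t)$ is sub-$\psi$ if, for each $\lambda$ in some interval $[0, \lambda_{\max})$, the process
$$\exp\{\lambda S_t - \psi(\lambda)\, V_t\}$$
is upper bounded by a nonnegative supermartingale with initial value at most one. Particular choices of $\psi$ recover the named tail conditions, e.g., $\psi(\lambda) = \lambda^2/2$ (sub-Gaussian) and $\psi(\lambda) = \lambda^2/(2(1 - c\lambda))$ (sub-gamma). In the sub-Gaussian case with $S_t = \sum_{s \le t} \eta_s x_s$, where each scalar noise term $\eta_s$ is conditionally $\sigma$-sub-Gaussian, and $V_t = \lambda_0 I + \sum_{s \le t} x_s x_s^\top$ for a regularizer $\lambda_0 > 0$, the Abbasi-Yadkori et al. bound states that with probability at least $1 - \delta$, simultaneously for all $t \ge 0$,
$$\|S_t\|_{V_t^{-1}}^2 \;\le\; 2\sigma^2 \log\!\left(\frac{\det(V_t)^{1/2}\,\det(\lambda_0 I)^{-1/2}}{\delta}\right).$$
The $\det(V_t)$ term is what makes this a determinant-based bound; the dimension-free results above sit between such bounds and condition-number-based alternatives.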
Similar Papers
Vector-valued self-normalized concentration inequalities beyond sub-Gaussianity
Machine Learning (Stat)
Develops self-normalized concentration inequalities for vector-valued processes under tail assumptions weaker than sub-Gaussianity.
On extremes for Gaussian subordination
Probability
Studies extreme-value behavior for Gaussian subordination, i.e., sequences obtained as functions of Gaussian processes.
Sub-Poisson distributions: Concentration inequalities, optimal variance proxies, and closure properties
Probability
Derives concentration inequalities, optimal variance proxies, and closure properties for sub-Poisson distributions.