Vector-valued self-normalized concentration inequalities beyond sub-Gaussianity
By: Diego Martinez-Taboada, Tomas Gonzalez, Aaditya Ramdas
Potential Business Impact:
Helps systems that learn from streaming data, such as recommenders and bandit-based decision tools, act reliably with less data.
The study of self-normalized processes plays a crucial role in a wide range of applications, from sequential decision-making to econometrics. While self-normalized concentration has been widely investigated for scalar-valued processes, vector-valued processes remain comparatively underexplored, especially outside of the sub-Gaussian framework. In this contribution, we provide concentration bounds for self-normalized processes with light tails beyond sub-Gaussianity, such as Bennett- or Bernstein-type bounds. We illustrate the relevance of our results in the context of online linear regression, with applications to (kernelized) linear bandits.
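For context, the classical sub-Gaussian baseline that results of this kind go beyond is the self-normalized bound of Abbasi-Yadkori, Pál, and Szepesvári (2011). A minimal LaTeX sketch is below; the notation ($x_s$, $\varepsilon_s$, $\lambda$, $\delta$) is the standard one from that literature, not necessarily the paper's.

```latex
% Classical sub-Gaussian self-normalized bound (Abbasi-Yadkori et al., 2011),
% the baseline that bounds of the kind studied here extend.
% Let $(x_s)_{s \ge 1}$ be a predictable $\mathbb{R}^d$-valued process and
% $(\varepsilon_s)_{s \ge 1}$ conditionally $\sigma$-sub-Gaussian noise. Define
\[
  S_t = \sum_{s=1}^{t} \varepsilon_s x_s ,
  \qquad
  V_t = \lambda I_d + \sum_{s=1}^{t} x_s x_s^\top , \quad \lambda > 0 .
\]
% Then, with probability at least $1-\delta$, simultaneously for all $t \ge 1$,
\[
  \|S_t\|_{V_t^{-1}}
  \;\le\;
  \sigma \sqrt{\, 2\log\frac{1}{\delta} + \log\frac{\det V_t}{\lambda^d} \,} .
\]
```

And a short, hypothetical Python sketch of how such a bound is used in online (ridge) linear regression: it computes the radius of the confidence ellipsoid around the ridge estimate under the sub-Gaussian baseline above, assuming the unknown parameter satisfies ||theta*|| <= 1. The function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def confidence_radius(V, sigma=1.0, lam=1.0, delta=0.05):
    """Radius beta_t of the self-normalized confidence ellipsoid
    ||theta_hat_t - theta*||_{V_t} <= beta_t for ridge regression,
    under the classical sub-Gaussian bound above (a baseline, not
    this paper's new Bennett/Bernstein-type bounds).
    Assumes ||theta*||_2 <= 1, which contributes the sqrt(lam) term."""
    d = V.shape[0]
    # log(det V_t / lambda^d), computed stably via slogdet
    log_det_ratio = np.linalg.slogdet(V)[1] - d * np.log(lam)
    return sigma * np.sqrt(2.0 * np.log(1.0 / delta) + log_det_ratio) + np.sqrt(lam)

# Example: 100 rounds of 3-dimensional covariates observed online.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
V = 1.0 * np.eye(3) + X.T @ X   # regularized Gram matrix V_t
print(confidence_radius(V))     # ellipsoid width used by, e.g., LinUCB-style bandits
```

Tighter (e.g., Bernstein-type) bounds of the sort the paper develops would shrink this radius when the noise variance is small, which is what makes them useful for linear bandits.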
Similar Papers
A variational approach to dimension-free self-normalized concentration
Probability
Makes math work better for smart computer learning.
Online Policy Learning via a Self-Normalized Maximal Inequality
Machine Learning (Stat)
Helps computers learn better from changing information.
Matrix Rosenthal and Concentration Inequalities for Markov Chains with Applications in Statistical Learning
Probability
Improves computer learning with tricky data.