Vector-valued self-normalized concentration inequalities beyond sub-Gaussianity

Published: November 5, 2025 | arXiv ID: 2511.03606v1

By: Diego Martinez-Taboada, Tomas Gonzalez, Aaditya Ramdas

Potential Business Impact:

Tighter self-normalized confidence bounds let sequential decision-making systems (e.g., bandit-based recommendation and online experimentation) learn reliably from less data.

Business Areas:
Image Recognition, Data and Analytics, Software

The study of self-normalized processes plays a crucial role in a wide range of applications, from sequential decision-making to econometrics. While the behavior of self-normalized concentration has been widely investigated for scalar-valued processes, vector-valued processes remain comparatively underexplored, especially outside of the sub-Gaussian framework. In this contribution, we provide concentration bounds for self-normalized processes with light tails beyond sub-Gaussianity (such as Bennett or Bernstein bounds). We illustrate the relevance of our results in the context of online linear regression, with applications in (kernelized) linear bandits.
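For context, the classical vector-valued self-normalized result in the sub-Gaussian setting, the baseline this paper moves beyond, takes the following form. This is a sketch of the standard bound from the linear-bandit literature, not the paper's new result. Let $(\eta_t)$ be conditionally $R$-sub-Gaussian noise, $(X_t)$ predictable vectors in $\mathbb{R}^d$, $S_t = \sum_{s=1}^{t} \eta_s X_s$, and $\bar V_t = \lambda I + \sum_{s=1}^{t} X_s X_s^\top$ with $\lambda > 0$. Then, with probability at least $1-\delta$, simultaneously for all $t \ge 1$,

\[
  \|S_t\|_{\bar V_t^{-1}}^2 \;\le\; 2 R^2 \log\!\left( \frac{\det(\bar V_t)^{1/2}\, \det(\lambda I)^{-1/2}}{\delta} \right).
\]

Plugging such a bound into ridge-regression confidence ellipsoids is what drives regret guarantees for (kernelized) linear bandits; the paper's contribution is analogous bounds under light-tailed assumptions beyond sub-Gaussianity, yielding Bennett- and Bernstein-type inequalities.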

Country of Origin
🇺🇸 United States

Page Count
25 pages

Category
Statistics: Machine Learning (Stat)