Simple and Sharp Generalization Bounds via Lifting

Published: August 26, 2025 | arXiv ID: 2508.18682v2

By: Jingbo Liu

Potential Business Impact:

Provides sharper theoretical guarantees on how well machine-learning models generalize, supporting more accurate and efficient learning methods.

Business Areas:
Data and Analytics

We develop an information-theoretic framework for bounding the expected supremum and tail probabilities of stochastic processes, offering a simpler and sharper alternative to classical chaining and slicing arguments for generalization bounds. The key idea is a lifting argument that produces information-theoretic analogues of empirical process bounds, such as Dudley's entropy integral. The lifting introduces symmetry, yielding sharp bounds even when the classical Dudley integral is loose. As a by-product, we obtain a concise proof of the majorizing measure theorem, providing explicit constants. The information-theoretic approach provides a soft version of classical localized complexity bounds in generalization theory, but is more concise and does not require the slicing argument. We apply this approach to empirical risk minimization over Sobolev ellipsoids and weak $\ell_q$ balls, obtaining sharper convergence rates or extensions to settings not covered by classical methods.
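For orientation, the classical bound that the lifting argument refines is Dudley's entropy integral. A standard statement (not taken from this paper, included here only as background) is:

```latex
% Dudley's entropy integral bound (classical statement, for reference):
% Let (X_t)_{t \in T} be a centered process that is subgaussian with
% respect to a metric d on the index set T. Then
\[
  \mathbb{E}\,\sup_{t \in T} X_t
  \;\le\;
  C \int_0^{\operatorname{diam}(T)}
      \sqrt{\log N(T, d, \varepsilon)}\, d\varepsilon,
\]
% where N(T, d, \varepsilon) is the \varepsilon-covering number of T
% and C > 0 is a universal constant.
```

The abstract's claim is that the information-theoretic lifting yields analogues of this bound that remain sharp in regimes where the integral above is loose.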

Country of Origin
🇺🇸 United States

Page Count
74 pages

Category
Mathematics:
Statistics Theory