A Framework for Bounding Deterministic Risk with PAC-Bayes: Applications to Majority Votes

Published: October 29, 2025 | arXiv ID: 2510.25569v1

By: Benjamin Leblanc, Pascal Germain

Potential Business Impact:

Provides reliability guarantees for deploying a single, deterministic model, instead of requiring randomized predictions at test time.

Business Areas:
A/B Testing, Data and Analytics

PAC-Bayes is a popular and efficient framework for obtaining generalization guarantees in settings involving uncountable hypothesis spaces. Unfortunately, in its classical formulation, it only provides guarantees on the expected risk of a randomly sampled hypothesis. This requires stochastic predictions at test time, making PAC-Bayes unusable in many practical situations where a single deterministic hypothesis must be deployed. We propose a unified framework for extracting guarantees that hold for a single hypothesis from stochastic PAC-Bayesian guarantees. We present a general oracle bound and derive from it a numerical bound and a specialization to majority votes. We empirically show that our approach consistently outperforms popular baselines (by up to a factor of 2) on generalization bounds for deterministic classifiers.
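
For context on the kind of guarantee the abstract refers to, the sketch below computes the classical stochastic PAC-Bayes-kl bound on the Gibbs (randomized) risk and then applies the standard factor-of-two conversion to get a deterministic majority-vote bound, one of the popular baselines this paper presumably improves on. This is a minimal Python sketch with made-up numbers, not the authors' method; the function names and inputs are assumptions for illustration.

```python
import math

def binary_kl(q, p):
    """Binary KL divergence kl(q||p) between Bernoulli(q) and Bernoulli(p)."""
    eps = 1e-12
    q = min(max(q, eps), 1 - eps)
    p = min(max(p, eps), 1 - eps)
    return q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))

def pac_bayes_kl_bound(emp_gibbs_risk, kl_qp, n, delta):
    """Invert the classical PAC-Bayes-kl bound: return the largest risk r with
    kl(emp_gibbs_risk || r) <= (KL(Q||P) + ln(2*sqrt(n)/delta)) / n,
    found by bisection on r in [emp_gibbs_risk, 1)."""
    rhs = (kl_qp + math.log(2.0 * math.sqrt(n) / delta)) / n
    lo, hi = emp_gibbs_risk, 1.0 - 1e-9
    for _ in range(100):  # bisection; kl(q||r) is increasing in r for r >= q
        mid = 0.5 * (lo + hi)
        if binary_kl(emp_gibbs_risk, mid) <= rhs:
            lo = mid
        else:
            hi = mid
    return lo

# Illustrative numbers (not taken from the paper).
gibbs_bound = pac_bayes_kl_bound(emp_gibbs_risk=0.10, kl_qp=5.0, n=10000, delta=0.05)
mv_bound = 2.0 * gibbs_bound  # classical factor-of-two majority-vote baseline
print(f"Stochastic (Gibbs) risk bound: {gibbs_bound:.4f}")
print(f"Deterministic majority-vote bound (2x baseline): {mv_bound:.4f}")
```

The doubling in the last step is exactly the kind of looseness that a tighter bound holding directly for a single deterministic hypothesis would aim to avoid.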

Page Count
21 pages

Category
Computer Science:
Machine Learning (CS)