PAC-Bayesian Bounds on Constrained f-Entropic Risk Measures
By: Hind Atbir, Farah Cherfaoui, Guillaume Metzler, and more
Potential Business Impact:
Gives AI models performance guarantees for every subgroup, not just on average.
PAC generalization bounds on the risk, when expressed in terms of the expected loss, are often insufficient to capture imbalances between subgroups in the data. To overcome this limitation, we introduce a new family of risk measures, called constrained f-entropic risk measures, which enable finer control over distributional shifts and subgroup imbalances via f-divergences and include the well-known Conditional Value at Risk (CVaR). We derive both classical and disintegrated PAC-Bayesian generalization bounds for this family of risks, providing the first disintegrated PAC-Bayesian guarantees beyond standard risks. Building on this theory, we design a self-bounding algorithm that minimizes our bounds directly, yielding models with guarantees at the subgroup level. Finally, we empirically demonstrate the usefulness of our approach.
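For a concrete sense of the kind of risk measure involved, the sketch below computes the Conditional Value at Risk named in the abstract, using the standard Rockafellar-Uryasev variational form CVaR_alpha(L) = min_lambda { lambda + E[(L - lambda)_+] / alpha }, i.e. the mean loss over the worst alpha-fraction of examples. This is illustrative only: `empirical_cvar` is our name, not the paper's code, and the paper's constrained f-entropic family (which generalizes CVaR via f-divergences) is not reproduced here.

```python
import numpy as np

def empirical_cvar(losses, alpha):
    """Empirical CVaR at level alpha: mean of the worst alpha-fraction of losses.

    Uses the Rockafellar-Uryasev variational form
        CVaR_alpha(L) = min_lambda { lambda + E[(L - lambda)_+] / alpha },
    whose minimizer is the (1 - alpha)-quantile of the losses (the VaR).
    """
    losses = np.asarray(losses, dtype=float)
    lam = np.quantile(losses, 1.0 - alpha)  # value-at-risk at level alpha
    return lam + np.mean(np.maximum(losses - lam, 0.0)) / alpha

# Toy per-example losses of some model on a validation set.
rng = np.random.default_rng(0)
losses = rng.exponential(scale=1.0, size=10_000)

print(f"mean loss       : {losses.mean():.3f}")
print(f"CVaR (alpha=0.1): {empirical_cvar(losses, 0.1):.3f}")
```

On heavy-tailed losses like these, CVaR at alpha = 0.1 is far larger than the mean, which illustrates the abstract's point: a bound on the expected loss alone can miss how badly a model does on its worst-off examples or subgroups.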
Similar Papers
Some theoretical improvements on the tightness of PAC-Bayes risk certificates for neural networks
Machine Learning (CS)
Tightens the certificates that say how reliable a neural network is.
PAC-Bayesian Reinforcement Learning Trains Generalizable Policies
Machine Learning (CS)
Trains reinforcement-learning policies with guarantees that they generalize.
A Framework for Bounding Deterministic Risk with PAC-Bayes: Applications to Majority Votes
Machine Learning (CS)
Gives guarantees for a single deterministic model rather than a randomized vote.