A Generalized Bias-Variance Decomposition for Bregman Divergences
By: David Pfau
Potential Business Impact:
Provides a principled way to separate the sources of a model's prediction error (noise, bias, and variance) for a broad class of loss functions, which can guide model selection and error diagnosis.
The bias-variance decomposition is a central result in statistics and machine learning, but it is typically presented only for squared-error loss. We present a generalization of the bias-variance decomposition in which the prediction error is a Bregman divergence, which is relevant to maximum likelihood estimation with exponential families. While the result is already known, there was not previously a clear, standalone derivation, so we provide one for pedagogical purposes. A version of this note previously appeared on the author's personal website without context; here we provide additional discussion and references to the relevant prior literature.
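For reference, the generalized decomposition can be sketched as follows; the specific notation (the generator $F$, the mean label $\bar{y}$, and the dual mean $\mathring{y}$) is a convention assumed for this summary rather than taken from the note itself. For the Bregman divergence

$$D_F(y, \hat{y}) = F(y) - F(\hat{y}) - \langle y - \hat{y}, \nabla F(\hat{y}) \rangle$$

generated by a strictly convex $F$, with the label $y$ and the estimator $\hat{y}$ independent, let $\bar{y} = \mathbb{E}[y]$ be the mean label and $\mathring{y} = (\nabla F)^{-1}\big(\mathbb{E}[\nabla F(\hat{y})]\big)$ the estimator's mean taken in the dual coordinates. The expected error then splits into noise, bias, and variance terms:

$$\mathbb{E}_{y,\hat{y}}\big[D_F(y, \hat{y})\big] = \underbrace{\mathbb{E}_{y}\big[D_F(y, \bar{y})\big]}_{\text{noise}} + \underbrace{D_F(\bar{y}, \mathring{y})}_{\text{bias}} + \underbrace{\mathbb{E}_{\hat{y}}\big[D_F(\mathring{y}, \hat{y})\big]}_{\text{variance}}.$$

Taking $F(u) = \lVert u \rVert^2$ recovers the classical squared-error decomposition, since $\nabla F$ is then linear and the dual mean coincides with the ordinary mean.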
Similar Papers
Geometric Convergence Analysis of Variational Inference via Bregman Divergences
Machine Learning (Stat)
Analyzes how quickly variational inference converges, using Bregman divergences.
Direct Debiased Machine Learning via Bregman Divergence Minimization
Econometrics
Builds debiased machine learning estimators by minimizing a Bregman divergence.