Robust inference using density-powered Stein operators
By: Shinto Eguchi
Potential Business Impact:
Helps machine learning ignore bad data.
We introduce a density-power weighted variant of the Stein operator, called the $\gamma$-Stein operator. This is a novel class of operators derived from the $\gamma$-divergence, designed to build robust inference methods for unnormalized probability models. The operator's construction (weighting by the model density raised to a positive power $\gamma$) inherently down-weights the influence of outliers, providing a principled mechanism for robustness. Applying this operator yields a robust generalization of score matching that retains the crucial property of being independent of the model's normalizing constant. We extend this framework to develop two key applications: the $\gamma$-kernelized Stein discrepancy for robust goodness-of-fit testing, and $\gamma$-Stein variational gradient descent for robust Bayesian posterior approximation. Empirical results on contaminated Gaussian and quartic potential models show our methods significantly outperform standard baselines in both robustness and statistical efficiency.
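The abstract describes the idea but not the operator's exact form, so the following is a minimal sketch in one dimension, assuming the density-power weighting enters as a multiplicative factor $p(x)^\gamma$ on a Langevin-type Stein operator; the paper's precise $\gamma$-Stein operator may differ. The standard Gaussian model, the test function $\sin(x)$, and $\gamma = 0.5$ are illustrative assumptions, not choices taken from the paper.

```python
# Sketch only: one natural density-power weighted Stein operator is
#     A_gamma f(x) = p(x)^{-1} d/dx [ p(x)^{1+gamma} f(x) ]
#                  = p(x)^gamma * [ (1+gamma) * score(x) * f(x) + f'(x) ],
# which keeps the Stein identity E_p[A_gamma f] = 0 while the factor
# p(x)^gamma damps contributions from points far in the tails (outliers).
# Replacing p by an unnormalized density only rescales the operator by a
# constant, so the identity, and hence normalizing-constant-free inference,
# is preserved.
import numpy as np

rng = np.random.default_rng(0)
gamma = 0.5                               # illustrative value

# Standard normal model: log-density, score, and a bounded test function.
log_p   = lambda x: -0.5 * x**2 - 0.5 * np.log(2 * np.pi)
score   = lambda x: -x                    # d/dx log p(x)
f       = lambda x: np.sin(x)             # test function
f_prime = lambda x: np.cos(x)

def gamma_stein_op(x):
    """Density-power weighted Stein operator applied to f (1D sketch)."""
    weight = np.exp(gamma * log_p(x))     # p(x)^gamma down-weights outliers
    return weight * ((1 + gamma) * score(x) * f(x) + f_prime(x))

# Monte Carlo check of the Stein identity under the model itself.
x = rng.standard_normal(200_000)
print("E_p[A_gamma f] ~", gamma_stein_op(x).mean())    # close to 0

# An outlier at x = 8 enters with weight p(8)^gamma, which is negligible,
# whereas the unweighted (gamma = 0) operator would give it full weight 1.
print("weight at outlier:", np.exp(gamma * log_p(8.0)))
```

Running the sketch, the Monte Carlo average sits near zero while the weight at the outlier is on the order of $10^{-8}$, which is the qualitative behavior the abstract attributes to the $\gamma$-weighting; the paper's actual estimators (robust score matching, $\gamma$-kernelized Stein discrepancy, $\gamma$-SVGD) build on the same principle with their own objective functions.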
Similar Papers
Copula-Stein Discrepancy: A Generator-Based Stein Operator for Archimedean Dependence
Machine Learning (Stat)
Finds hidden patterns in how things are connected.
Tight Bounds for Schrödinger Potential Estimation in Unpaired Image-to-Image Translation Problems
Machine Learning (CS)
Makes pictures look like other pictures.
Fast Wasserstein rates for estimating probability distributions of probabilistic graphical models
Statistics Theory
Helps computers learn from less information.