The Minimax Lower Bound of Kernel Stein Discrepancy Estimation
By: Jose Cribeiro-Ramallo, Agnideep Aich, Florian Kalinke, and more
Potential Business Impact:
Shows that existing ways of measuring how well data fits a model are already as fast as statistically possible.
Kernel Stein discrepancies (KSDs) have emerged as a powerful tool for quantifying goodness-of-fit over the last decade, featuring numerous successful applications. To the best of our knowledge, all existing KSD estimators with known rate achieve $\sqrt n$-convergence. In this work, we present two complementary results (with different proof strategies), establishing that the minimax lower bound of KSD estimation is $n^{-1/2}$ and settling the optimality of these estimators. Our first result focuses on KSD estimation on $\mathbb R^d$ with the Langevin-Stein operator; our explicit constant for the Gaussian kernel indicates that the difficulty of KSD estimation may increase exponentially with the dimensionality $d$. Our second result settles the minimax lower bound for KSD estimation on general domains.
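For context on the $\sqrt n$-rate estimators whose optimality the paper settles, below is a minimal sketch (not the paper's construction) of the standard V-statistic KSD estimator with the Langevin-Stein operator and a Gaussian kernel. The choice of a standard Gaussian target on $\mathbb R^d$ (score $s(x)=-x$), the bandwidth `sigma`, and the sample sizes are illustrative assumptions, not details from the paper.

```python
# Sketch of a V-statistic KSD^2 estimator for a standard Gaussian target
# (illustrative assumptions: Gaussian RBF kernel, Langevin-Stein operator,
# target N(0, I) so that the score is s(x) = -x).
import numpy as np


def ksd_squared_v_statistic(X, sigma=1.0):
    """V-statistic estimate of KSD^2 against a standard Gaussian target.

    Uses the closed-form Stein kernel
      u_p(x, y) = s(x)^T s(y) k(x, y) + s(x)^T grad_y k(x, y)
                  + s(y)^T grad_x k(x, y) + trace(grad_x grad_y k(x, y)),
    with k the Gaussian RBF kernel and s(x) = -x.
    """
    n, d = X.shape
    diffs = X[:, None, :] - X[None, :, :]            # x_i - x_j, shape (n, n, d)
    sq_dists = np.sum(diffs ** 2, axis=-1)           # ||x_i - x_j||^2
    K = np.exp(-sq_dists / (2.0 * sigma ** 2))       # Gaussian kernel matrix
    scores = -X                                      # score of N(0, I) at each sample

    # s(x_i)^T s(x_j) k(x_i, x_j)
    term1 = (scores @ scores.T) * K
    # s(x_i)^T grad_{x_j} k = s(x_i)^T (x_i - x_j) / sigma^2 * k
    term2 = np.einsum("id,ijd->ij", scores, diffs) / sigma ** 2 * K
    # s(x_j)^T grad_{x_i} k = -s(x_j)^T (x_i - x_j) / sigma^2 * k
    term3 = -np.einsum("jd,ijd->ij", scores, diffs) / sigma ** 2 * K
    # trace(grad_{x_i} grad_{x_j} k) = (d / sigma^2 - ||x_i - x_j||^2 / sigma^4) * k
    term4 = (d / sigma ** 2 - sq_dists / sigma ** 4) * K

    return np.mean(term1 + term2 + term3 + term4)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Samples from the target itself: the estimate should be close to zero.
    X_match = rng.standard_normal((500, 2))
    # Samples from a shifted distribution: the estimate should be clearly positive.
    X_shift = rng.standard_normal((500, 2)) + 1.0
    print("KSD^2 (matching samples):", ksd_squared_v_statistic(X_match))
    print("KSD^2 (shifted samples): ", ksd_squared_v_statistic(X_shift))
```

Estimators of this form converge at the $n^{-1/2}$ rate; the paper's contribution is the matching minimax lower bound, which shows this rate cannot be improved.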
Similar Papers
Copula-Stein Discrepancy: A Generator-Based Stein Operator for Archimedean Dependence
Machine Learning (Stat)
Finds hidden patterns in how things are connected.
A Computable Measure of Suboptimality for Entropy-Regularised Variational Objectives
Computation
Helps computers learn from data without full information.