The Minimax Lower Bound of Kernel Stein Discrepancy Estimation

Published: October 16, 2025 | arXiv ID: 2510.15058v1

By: Jose Cribeiro-Ramallo, Agnideep Aich, Florian Kalinke, and more

Potential Business Impact:

Establishes the fundamental speed limit for measuring how well data fits a model, confirming that existing estimators are already as fast as possible.

Business Areas:
A/B Testing, Data and Analytics

Kernel Stein discrepancies (KSDs) have emerged over the last decade as a powerful tool for quantifying goodness-of-fit, with numerous successful applications. To the best of our knowledge, all existing KSD estimators with a known rate achieve $\sqrt n$-convergence. In this work, we present two complementary results (with different proof strategies) establishing that the minimax lower bound of KSD estimation is $n^{-1/2}$, thereby settling the optimality of these estimators. Our first result concerns KSD estimation on $\mathbb R^d$ with the Langevin-Stein operator; our explicit constant for the Gaussian kernel indicates that the difficulty of KSD estimation may increase exponentially with the dimensionality $d$. Our second result settles the minimax lower bound for KSD estimation on general domains.
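Schematically, and in standard minimax notation rather than the paper's exact statement, the lower bound says that no estimator built from $n$ samples can uniformly beat the $n^{-1/2}$ rate:

$$\inf_{\widehat{\mathrm{KSD}}_n}\ \sup_{q \in \mathcal Q}\ \mathbb E_{x_1,\dots,x_n \sim q}\Big|\widehat{\mathrm{KSD}}_n(x_1,\dots,x_n) - \mathrm{KSD}(p,q)\Big| \gtrsim n^{-1/2},$$

where the infimum ranges over all estimators and the supremum over a suitable class $\mathcal Q$ of sampling distributions.

For concreteness, here is a minimal NumPy sketch (not code from the paper) of the classical U-statistic KSD estimator that attains this rate, using the Langevin-Stein operator and a Gaussian kernel $k(x,y)=\exp(-\|x-y\|^2/(2\sigma^2))$; the function names, the fixed bandwidth `sigma`, and the standard-normal target are illustrative assumptions.

```python
import numpy as np

def stein_kernel_gaussian(X, score, sigma=1.0):
    """Langevin-Stein kernel matrix u_p(x_i, x_j) for a Gaussian base kernel.

    u_p(x,y) = s(x)^T s(y) k(x,y) + s(x)^T grad_y k(x,y)
               + s(y)^T grad_x k(x,y) + tr(grad_x grad_y k(x,y)),
    where s(x) = grad log p(x) is the score of the target density p.
    """
    n, d = X.shape
    S = score(X)                               # (n, d) score at each sample
    diff = X[:, None, :] - X[None, :, :]       # (n, n, d) pairwise x_i - x_j
    sqdist = np.sum(diff**2, axis=-1)          # (n, n) squared distances
    K = np.exp(-sqdist / (2 * sigma**2))       # Gaussian kernel matrix

    term1 = (S @ S.T) * K
    # grad_y k = (x - y)/sigma^2 * k,  grad_x k = -(x - y)/sigma^2 * k
    sx_diff = np.einsum('id,ijd->ij', S, diff)   # s(x_i)^T (x_i - x_j)
    sy_diff = np.einsum('jd,ijd->ij', S, diff)   # s(x_j)^T (x_i - x_j)
    term2 = (sx_diff / sigma**2) * K
    term3 = (-sy_diff / sigma**2) * K
    term4 = (d / sigma**2 - sqdist / sigma**4) * K
    return term1 + term2 + term3 + term4

def ksd_squared_u(X, score, sigma=1.0):
    """U-statistic estimate of KSD^2: average of u_p over off-diagonal pairs."""
    n = X.shape[0]
    U = stein_kernel_gaussian(X, score, sigma)
    return (U.sum() - np.trace(U)) / (n * (n - 1))

# Target p = N(0, I_2), so score(x) = -x; samples come from a shifted Gaussian,
# so the estimate should be positive and concentrate at the n^{-1/2} rate.
rng = np.random.default_rng(0)
X = rng.normal(loc=0.5, size=(500, 2))
print(ksd_squared_u(X, score=lambda x: -x))
```

Under the alternative ($p \neq q$), this U-statistic is non-degenerate and classical U-statistic theory gives $n^{-1/2}$ fluctuations; the paper's contribution is the matching lower bound showing that no estimator can do better.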

Page Count
25 pages

Category
Statistics: Machine Learning (stat.ML)