A Generalized Cramér-Rao Bound Using Information Geometry
By: Satyajit Dhadumia, M. Ashok Kumar
Potential Business Impact:
Gives sharper limits on how accurately a program can estimate quantities from noisy data, which helps in building robust estimators.
In information geometry, statistical models are treated as differentiable manifolds, where each probability distribution is a point on the manifold. A Riemannian metric can be obtained systematically from a divergence function via Eguchi's theory (1992); the well-known Fisher-Rao metric arises in this way from the Kullback-Leibler (KL) divergence. The geometric derivation of the classical Cramér-Rao Lower Bound (CRLB) by Amari and Nagaoka (2000) is based on this metric. In this paper, we study the Riemannian metric obtained by applying Eguchi's theory to the Basu-Harris-Hjort-Jones (BHHJ) divergence (1998) and derive a generalized Cramér-Rao bound using Amari and Nagaoka's approach. This bound has potential applications in robust estimation.
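As a concrete illustration of the classical bound that the paper generalizes: for n i.i.d. Bernoulli(p) observations the Fisher information per observation is I(p) = 1/(p(1-p)), so any unbiased estimator satisfies Var(p̂) ≥ 1/(n·I(p)) = p(1-p)/n, and the sample mean attains this bound. A minimal numerical sketch (the model and the parameter values are illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, trials = 0.3, 200, 5000

# Fisher information of a single Bernoulli(p) observation: I(p) = 1/(p(1-p))
fisher_info = 1.0 / (p * (1 - p))

# Classical CRLB for an unbiased estimator based on n observations
crlb = 1.0 / (n * fisher_info)  # equals p(1-p)/n

# Monte Carlo variance of the MLE (the sample mean), which attains the bound here
estimates = rng.binomial(n, p, size=trials) / n
empirical_var = estimates.var()

print(f"CRLB = {crlb:.6f}, empirical Var(p_hat) = {empirical_var:.6f}")
```

The empirical variance of the sample mean matches the bound up to Monte Carlo error; the generalized bound studied in the paper replaces the Fisher-Rao metric here with the metric induced by the BHHJ divergence.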
Similar Papers
An approach to Fisher-Rao metric for infinite dimensional non-parametric information geometry
Machine Learning (Stat)
Extends the Fisher-Rao metric to infinite-dimensional non-parametric models.
Cartan meets Cramér-Rao
Statistics Theory
Revisits Cramér-Rao-type bounds through Cartan's geometric framework.
Improving Cramér-Rao Bound And Its Variants: An Extrinsic Geometry Perspective
Statistics Theory
Tightens the Cramér-Rao bound and its variants using the extrinsic geometry of statistical models.