A Generalized Cramér-Rao Bound Using Information Geometry

Published: July 28, 2025 | arXiv ID: 2507.21022v1

By: Satyajit Dhadumia, M. Ashok Kumar

Potential Business Impact:

Gives a theoretical limit on how accurately computers can estimate quantities from data, with potential use in robust estimation methods that tolerate noisy or outlying observations.

Business Areas:
Analytics, Data and Analytics

In information geometry, statistical models are treated as differentiable manifolds, where each probability distribution corresponds to a unique point on the manifold. A Riemannian metric can be obtained systematically from a divergence function using Eguchi's theory (1992); the well-known Fisher-Rao metric, for instance, arises from the Kullback-Leibler (KL) divergence. The geometric derivation of the classical Cramér-Rao Lower Bound (CRLB) by Amari and Nagaoka (2000) is based on this metric. In this paper, we study the Riemannian metric obtained by applying Eguchi's theory to the Basu-Harris-Hjort-Jones (BHHJ) divergence (1998) and derive a generalized Cramér-Rao bound using Amari and Nagaoka's approach. This bound has potential applications in robust estimation.
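For context, the standard objects mentioned in the abstract can be written out explicitly. The formulas below are the textbook definitions of Eguchi's induced metric, the Fisher-Rao metric with the classical CRLB, and the BHHJ (density power) divergence with robustness parameter \alpha; they are background material, not the paper's new bound.

```latex
% Eguchi's recipe: a divergence D on the model {p_theta} induces a Riemannian
% metric by differentiating once in each argument and setting theta' = theta.
g^{(D)}_{ij}(\theta)
  = -\left.\frac{\partial}{\partial \theta^{i}}
           \frac{\partial}{\partial \theta'^{\,j}}
           D\!\left(p_{\theta},\, p_{\theta'}\right)\right|_{\theta'=\theta}.

% Applied to the Kullback-Leibler divergence, this recovers the Fisher-Rao
% metric, which underlies the classical CRLB for an unbiased estimator:
g^{(\mathrm{KL})}_{ij}(\theta)
  = \mathbb{E}_{\theta}\!\left[\partial_{i}\log p_{\theta}(X)\,
                               \partial_{j}\log p_{\theta}(X)\right],
\qquad
\operatorname{Var}_{\theta}\!\bigl(\hat{\theta}\bigr)
  \succeq \bigl[g^{(\mathrm{KL})}(\theta)\bigr]^{-1}.

% The BHHJ (density power) divergence with robustness parameter \alpha > 0;
% the paper applies Eguchi's recipe to this divergence in place of KL.
D_{\alpha}(p, q)
  = \int \Bigl\{ q^{1+\alpha}(x)
      - \tfrac{1+\alpha}{\alpha}\, p(x)\, q^{\alpha}(x)
      + \tfrac{1}{\alpha}\, p^{1+\alpha}(x) \Bigr\}\, dx .
```

Since D_\alpha reduces to the KL divergence as \alpha \to 0, the classical CRLB would be expected to appear as a limiting case of the generalized bound; the exact form of that bound is given in the paper itself.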

Page Count
6 pages

Category
Mathematics: Statistics Theory