Semiparametric plug-in estimation, sup-norm risk bounds, marginal optimization, and inference in BTL model

Published: March 19, 2025 | arXiv ID: 2503.15045v2

By: Vladimir Spokoiny

Potential Business Impact:

Enables reliable ranking of many items from far fewer pairwise comparisons.

Business Areas:
A/B Testing, Data and Analytics

The recent paper \cite{GSZ2023} on estimation and inference for the top-ranking problem in the Bradley-Terry-Luce (BTL) model presented a surprising result: component-wise estimation and inference can be done under much weaker conditions on the number of comparisons than is required for full-dimensional estimation. The present paper revisits this finding from a completely different viewpoint. Namely, we show how a theoretical study of \emph{estimation in sup-norm} can be reduced to the analysis of \emph{plug-in semiparametric estimation}. For the latter, we adopt and extend the general approach of \cite{Sp2024} to high-dimensional estimation and inference. The main tool of the analysis is a theory of \emph{perturbed marginal optimization}, in which the objective function depends on a low-dimensional target parameter along with a high-dimensional nuisance parameter. A particular focus of the study is the critical dimension condition. Full-dimensional estimation generally requires the condition \( N \gg p \) between the effective parameter dimension \( p \) and the effective sample size \( N \) corresponding to the smallest eigenvalue of the Fisher information matrix \( F \). Inference on the estimated parameter is even more demanding: the condition \( N \gg p^{2} \) cannot be avoided in general; see \cite{Sp2024}. However, for sup-norm estimation, the critical dimension condition can be relaxed to \( N \geq C \log p \).
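To make the setting concrete, the sketch below simulates pairwise comparisons from a BTL model, where item \( i \) beats item \( j \) with probability \( e^{\theta_i}/(e^{\theta_i}+e^{\theta_j}) \), and fits the score vector by plain maximum likelihood via gradient ascent. It only illustrates the model and the sup-norm error that the paper studies, not the semiparametric plug-in estimator or the marginal-optimization analysis developed there; the function names, step size, and toy comparison schedule are assumptions of this sketch.

```python
import numpy as np

def fit_btl_mle(wins, n_iters=500, lr=0.5):
    """Minimal BTL maximum-likelihood fit by gradient ascent (illustrative sketch).

    wins[i, j] = number of comparisons in which item i beat item j.
    Returns scores normalized to sum to zero (the usual identifiability
    constraint, since only score differences matter in the BTL model).
    """
    p = wins.shape[0]
    theta = np.zeros(p)
    totals = wins + wins.T                     # total comparisons per pair
    for _ in range(n_iters):
        diff = theta[:, None] - theta[None, :]
        probs = 1.0 / (1.0 + np.exp(-diff))    # P(i beats j) under current scores
        grad = (wins - totals * probs).sum(axis=1)   # observed minus expected wins
        theta += lr * grad / max(totals.sum(), 1)
        theta -= theta.mean()                  # enforce sum-to-zero normalization
    return theta

# Toy example: 4 items, 30 comparisons per pair (hypothetical schedule).
rng = np.random.default_rng(0)
true_theta = np.array([1.0, 0.3, -0.3, -1.0])
p = len(true_theta)
wins = np.zeros((p, p))
for i in range(p):
    for j in range(i + 1, p):
        n_ij = 30
        p_ij = 1.0 / (1.0 + np.exp(-(true_theta[i] - true_theta[j])))
        w = rng.binomial(n_ij, p_ij)
        wins[i, j], wins[j, i] = w, n_ij - w

theta_hat = fit_btl_mle(wins)
print("estimated scores:", np.round(theta_hat, 2))
print("sup-norm error:  ", np.max(np.abs(theta_hat - true_theta)))
```

The sup-norm error printed at the end is exactly the component-wise accuracy measure discussed in the abstract: the paper's point is that controlling this quantity requires a far weaker sample-size condition (on the order of \( \log p \)) than controlling the full-dimensional estimation error.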

Page Count
41 pages

Category
Mathematics: Statistics Theory