On the Optimality of Tracking Fisher Information in Adaptive Testing with Stochastic Binary Responses

Published: October 9, 2025 | arXiv ID: 2510.07862v1

By: Sanghwa Kim, Dohyun Ahn, Seungki Min

Potential Business Impact:

Adaptive tests can certify a person's ability estimate to a desired margin of error using fewer questions.

Business Areas:
A/B Testing, Data and Analytics

We study the problem of estimating a continuous ability parameter from sequential binary responses by actively asking questions with varying difficulties, a setting that arises naturally in adaptive testing and online preference learning. Our goal is to certify that the estimate lies within a desired margin of error, using as few queries as possible. We propose a simple algorithm that adaptively selects questions to maximize Fisher information and updates the estimate using a method-of-moments approach, paired with a novel test statistic to decide when the estimate is accurate enough. We prove that this Fisher-tracking strategy achieves optimal performance in both fixed-confidence and fixed-budget regimes, which are commonly investigated in the best-arm identification literature. Our analysis overcomes a key technical challenge in the fixed-budget setting -- handling the dependence between the evolving estimate and the query distribution -- by exploiting a structural symmetry in the model and combining large deviation tools with Ville's inequality. Our results provide rigorous theoretical support for simple and efficient adaptive testing procedures.
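To make the Fisher-tracking idea concrete, here is a minimal sketch assuming a logistic (Rasch-type) response model, P(correct) = sigmoid(theta - b). In that model the Fisher information p(1-p) contributed by a question is maximized when its difficulty b matches the current ability estimate, so the selector simply tracks the estimate. The stochastic-approximation update, the standard-error stopping rule, and all names below are illustrative stand-ins, not the paper's exact method-of-moments estimator or test statistic.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def adaptive_test(true_theta, margin=0.1, confidence_z=1.96,
                  max_queries=2000, seed=0):
    """Illustrative Fisher-tracking adaptive test under a Rasch-type model."""
    rng = np.random.default_rng(seed)
    theta_hat = 0.0      # current ability estimate
    total_info = 0.0     # accumulated Fisher information
    for t in range(1, max_queries + 1):
        b = theta_hat                    # Fisher info p(1-p) is largest at b = theta_hat
        p_true = sigmoid(true_theta - b)
        y = rng.random() < p_true        # stochastic binary response
        p_hat = sigmoid(theta_hat - b)   # model-implied success probability (= 0.5 here)
        total_info += p_hat * (1.0 - p_hat)
        # Robbins-Monro-style correction; a stand-in for the paper's
        # method-of-moments update.
        theta_hat += (float(y) - p_hat) / max(total_info, 1e-8)
        # Stop once the approximate standard error certifies the margin.
        if confidence_z / np.sqrt(total_info) < margin:
            return theta_hat, t
    return theta_hat, max_queries

est, n = adaptive_test(true_theta=0.7)
print(f"estimate={est:.3f} after {n} queries")
```

The key design choice mirrored here is that question selection depends only on the evolving estimate, which is exactly the estimate-query dependence the paper's fixed-budget analysis has to control.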

Page Count
47 pages

Category
Statistics:
Machine Learning (stat.ML)