On the Optimal Construction of Unbiased Gradient Estimators for Zeroth-Order Optimization

Published: October 22, 2025 | arXiv ID: 2510.19953v1

By: Shaocong Ma, Heng Huang

Potential Business Impact:

Enables optimization of systems where gradients are unavailable or expensive, using only function evaluations.

Business Areas:
A/B Testing, Data and Analytics

Zeroth-order optimization (ZOO) is an important framework for stochastic optimization when gradients are unavailable or expensive to compute. A key limitation of existing ZOO methods is the bias inherent in most gradient estimators unless the perturbation stepsize vanishes. In this paper, we overcome this bias by proposing a novel family of unbiased gradient estimators based solely on function evaluations. By reformulating directional derivatives as a telescoping series and sampling from carefully designed distributions, we construct estimators that eliminate bias while maintaining favorable variance. We analyze their theoretical properties, derive optimal scaling distributions and perturbation stepsizes for four specific constructions, and prove that SGD using the proposed estimators achieves optimal complexity for smooth non-convex objectives. Experiments on synthetic tasks and language model fine-tuning confirm the superior accuracy and convergence of our approach compared to standard methods.
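The core idea of the abstract — rewriting a directional derivative as a telescoping series of finite differences and randomizing which terms are evaluated — can be illustrated with a minimal sketch. The snippet below is an illustrative Russian-roulette-style construction, not the paper's exact estimator: the stepsize schedule `mu0 * decay**k`, the geometric truncation rule, and the helper names are assumptions for exposition. The key property is that reweighting each surviving correction term by its survival probability makes the estimate unbiased for the directional derivative (when the telescoping series converges), without the stepsize ever having to vanish.

```python
import numpy as np

def unbiased_dir_derivative(f, x, v, mu0=1.0, decay=0.5, p_stop=0.5, rng=None):
    """Randomized-telescoping estimate of the directional derivative D_v f(x).

    Writes D_v f(x) = FD(mu_0) + sum_k [FD(mu_k) - FD(mu_{k-1})] with
    mu_k = mu0 * decay**k, truncates the sum at a geometric random index,
    and reweights each surviving correction by 1 / P(K >= k) so the
    estimator is unbiased in expectation (assuming the series converges).
    All parameter choices here are illustrative, not the paper's optimal ones.
    """
    rng = rng or np.random.default_rng()
    fd = lambda mu: (f(x + mu * v) - f(x)) / mu  # forward finite difference
    est = fd(mu0)
    k, survive = 1, 1.0
    while rng.random() < (1 - p_stop):      # continue with prob. 1 - p_stop
        survive *= (1 - p_stop)             # survive == P(K >= k)
        est += (fd(mu0 * decay**k) - fd(mu0 * decay**(k - 1))) / survive
        k += 1
    return est

def grad_estimate(f, x, rng=None):
    """One-sample gradient estimate: Gaussian direction v scaled by the
    unbiased directional-derivative estimate along v, so E[estimate] = grad f(x)."""
    rng = rng or np.random.default_rng()
    v = rng.standard_normal(x.shape)
    return v * unbiased_dir_derivative(f, x, v, rng=rng)
```

For a linear objective every finite difference is exact, so all correction terms vanish and the estimator is exact along each direction; for smooth nonlinear objectives the reweighted corrections cancel the finite-difference bias in expectation, which is the property the paper's optimal scaling distributions are designed to exploit at minimal variance.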

Country of Origin
🇺🇸 United States

Page Count
29 pages

Category
Computer Science:
Machine Learning (CS)