
Efficient Curvature-Aware Hypergradient Approximation for Bilevel Optimization

Published: May 4, 2025 | arXiv ID: 2505.02101v1

By: Youran Dong, Junfeng Yang, Wei Yao, and more

Potential Business Impact:

Speeds up machine learning workflows such as hyperparameter tuning and meta-learning by reducing the computational cost of bilevel optimization.

Business Areas:
A/B Testing, Data and Analytics

Bilevel optimization is a powerful tool for many machine learning problems, such as hyperparameter optimization and meta-learning. Estimating hypergradients (also known as implicit gradients) is crucial for developing gradient-based methods for bilevel optimization. In this work, we propose a computationally efficient technique for incorporating curvature information into the approximation of hypergradients and present a novel algorithmic framework based on the resulting enhanced hypergradient computation. We provide convergence rate guarantees for the proposed framework in both deterministic and stochastic scenarios, particularly showing improved computational complexity over popular gradient-based methods in the deterministic setting. This improvement in complexity arises from a careful exploitation of the hypergradient structure and the inexact Newton method. In addition to the theoretical speedup, numerical experiments demonstrate the significant practical performance benefits of incorporating curvature information.
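The abstract refers to curvature-aware hypergradient approximation. As a minimal sketch (not the authors' algorithm), the snippet below illustrates the standard implicit-function-theorem hypergradient for a bilevel problem, where curvature enters through Hessian-vector products of the lower-level objective and a truncated conjugate-gradient solve; the toy objectives `inner_obj` and `outer_obj`, the step sizes, and the iteration counts are illustrative assumptions only.

```python
# Hedged sketch: implicit hypergradient with curvature via Hessian-vector
# products and truncated CG. Not the paper's exact method.
import jax
import jax.numpy as jnp

def inner_obj(x, y):
    # Toy lower-level objective f(x, y); strongly convex in y (assumption).
    return 0.5 * jnp.sum((y - x) ** 2) + 0.1 * jnp.sum(y ** 2)

def outer_obj(x, y):
    # Toy upper-level objective F(x, y); placeholder for illustration.
    return 0.5 * jnp.sum(y ** 2) + jnp.sum(x * y)

def hypergradient(x, y, cg_iters=20):
    # Gradient of the upper-level objective w.r.t. the lower-level variable.
    gy_F = jax.grad(outer_obj, argnums=1)(x, y)

    # Matrix-free Hessian-vector product v -> (d^2 f / dy^2) v.
    def hvp(v):
        return jax.jvp(lambda yy: jax.grad(inner_obj, argnums=1)(x, yy),
                       (y,), (v,))[1]

    # Truncated CG approximately solves (d^2 f / dy^2) q = grad_y F;
    # this is where second-order (curvature) information is used.
    q, _ = jax.scipy.sparse.linalg.cg(hvp, gy_F, maxiter=cg_iters)

    # Cross-derivative term (d^2 f / dx dy)^T q via a VJP of grad_y f.
    _, vjp_fun = jax.vjp(lambda xx: jax.grad(inner_obj, argnums=1)(xx, y), x)
    cross = vjp_fun(q)[0]

    # Implicit-function-theorem hypergradient: grad_x F minus the cross term.
    return jax.grad(outer_obj, argnums=0)(x, y) - cross

x = jnp.array([1.0, -2.0])
# In practice y would come from (approximately) solving the inner problem;
# here a few gradient steps stand in for that solve.
y = x
for _ in range(100):
    y = y - 0.1 * jax.grad(inner_obj, argnums=1)(x, y)

print(hypergradient(x, y))
```

The CG solve plays the role of an inexact linear-system (Newton-type) step: it uses only Hessian-vector products, never the full Hessian, which is the usual way curvature information is kept computationally affordable in hypergradient estimation.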

Page Count
29 pages

Category
Mathematics:
Optimization and Control