Score: 1

Differentially Private Bilevel Optimization: Efficient Algorithms with Near-Optimal Rates

Published: June 15, 2025 | arXiv ID: 2506.12994v1

By: Andrew Lowy, Daogao Liu

Potential Business Impact:

Protects sensitive training data in machine-learning systems built on nested (bilevel) optimization, such as meta-learning and hyperparameter tuning.

Business Areas:
A/B Testing Data and Analytics

Bilevel optimization, in which one optimization problem is nested inside another, underlies many machine learning applications with a hierarchical structure -- such as meta-learning and hyperparameter optimization. Such applications often involve sensitive training data, raising pressing concerns about individual privacy. Motivated by this, we study differentially private bilevel optimization. We first focus on settings where the outer-level objective is convex, and provide novel upper and lower bounds on the excess risk for both pure and approximate differential privacy, covering both empirical and population-level loss. These bounds are nearly tight and essentially match the optimal rates for standard single-level differentially private ERM and stochastic convex optimization (SCO), up to additional terms that capture the intrinsic complexity of the nested bilevel structure. The bounds are achieved in polynomial time via efficient implementations of the exponential and regularized exponential mechanisms. A key technical contribution is a new method and analysis of log-concave sampling under inexact function evaluations, which may be of independent interest. In the non-convex setting, we develop novel algorithms with state-of-the-art rates for privately finding approximate stationary points. Notably, our bounds do not depend on the dimension of the inner problem.
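To make the nested structure concrete, here is a minimal sketch of hyperparameter optimization phrased as a bilevel problem: the inner level fits a one-dimensional ridge-regression weight on training data, and the outer level picks the regularization strength that minimizes validation loss. This is an illustrative toy (synthetic data, grid search over the outer variable), not the paper's private algorithm.

```python
# Toy bilevel problem: hyperparameter tuning as nested optimization.
# Outer level: choose regularization strength lam to minimize validation loss.
# Inner level: fit a 1-D ridge-regression weight w on the training data.
# (Illustrative sketch only; data and grid are made up for the example.)

def inner_solution(x_train, y_train, lam):
    """Closed-form minimizer of sum_i (w*x_i - y_i)^2 + lam * w^2."""
    num = sum(x * y for x, y in zip(x_train, y_train))
    den = sum(x * x for x in x_train) + lam
    return num / den

def outer_loss(x_val, y_val, w):
    """Validation loss evaluated at the inner solution w."""
    return sum((w * x - y) ** 2 for x, y in zip(x_val, y_val))

# Synthetic data: y is roughly 2x with noise on train, clean on validation.
x_train, y_train = [1.0, 2.0, 3.0], [2.1, 3.9, 6.2]
x_val, y_val = [1.5, 2.5], [3.0, 5.0]

# Outer optimization by grid search over the hyperparameter.
best_lam = min(
    [0.0, 0.1, 1.0, 10.0],
    key=lambda lam: outer_loss(x_val, y_val,
                               inner_solution(x_train, y_train, lam)),
)
print(best_lam)  # the grid point with the smallest validation loss
```

In a differentially private version, the sensitive training data enter through the inner solve, so the privacy mechanism must account for how the outer objective depends on that nested solution, which is exactly the extra complexity the paper's bounds capture.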

Country of Origin
🇺🇸 United States

Page Count
32 pages

Category
Computer Science:
Machine Learning (CS)