Nonconvex Decentralized Stochastic Bilevel Optimization under Heavy-Tailed Noises

Published: September 19, 2025 | arXiv ID: 2509.15543v1

By: Xinwen Zhang, Yihan Zhang, Hongchang Gao

Potential Business Impact:

Lets distributed machine-learning systems train reliably on messy, outlier-heavy data.

Business Areas:
A/B Testing, Data and Analytics

Existing decentralized stochastic optimization methods assume that the lower-level loss function is strongly convex and that the stochastic gradient noise has finite variance. These strong assumptions are typically not satisfied in real-world machine learning models. To address these limitations, we develop a novel decentralized stochastic bilevel optimization algorithm for nonconvex bilevel problems under heavy-tailed noise. Specifically, we develop a normalized stochastic variance-reduced bilevel gradient descent algorithm that does not rely on any clipping operation. Moreover, we establish its convergence rate by innovatively bounding the interdependent gradient sequences that arise in nonconvex decentralized bilevel optimization under heavy-tailed noise. To the best of our knowledge, this is the first decentralized bilevel optimization algorithm with rigorous theoretical guarantees under heavy-tailed noise. Extensive experimental results confirm the effectiveness of our algorithm in handling heavy-tailed noise.
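
The core mechanism, replacing gradient clipping with a normalized variance-reduced update, can be illustrated with a minimal sketch. Below is a single-node, single-level analogue assuming a STORM-style momentum estimator and a toy quadratic objective; the paper's actual algorithm is decentralized and bilevel, so the objective, step sizes, and estimator here are illustrative assumptions, not the authors' exact method:

```python
# Minimal sketch (assumptions, not the paper's exact algorithm):
# normalized variance-reduced SGD on a toy objective with heavy-tailed noise.
import numpy as np

rng = np.random.default_rng(0)
dim = 10

def stoch_grad(x, xi):
    # Stochastic gradient of the toy quadratic f(x) = 0.5 * ||x||^2 with
    # additive heavy-tailed noise xi (Student-t, 2 dof: infinite variance).
    return x + xi

def normalized_vr_gd(x, eta=0.05, alpha=0.1, steps=3000):
    xi = rng.standard_t(df=2, size=dim)
    d = stoch_grad(x, xi)                  # initial gradient estimator
    for _ in range(steps):
        x_prev = x.copy()
        # Normalized step: the update length is always eta, so one
        # heavy-tailed sample cannot blow up the iterate -- no clipping.
        x = x - eta * d / (np.linalg.norm(d) + 1e-12)
        # STORM-style variance reduction: evaluate old and new iterates
        # on the SAME fresh sample xi so the noise partially cancels.
        xi = rng.standard_t(df=2, size=dim)
        d = stoch_grad(x, xi) + (1 - alpha) * (d - stoch_grad(x_prev, xi))
    return x

x_final = normalized_vr_gd(rng.normal(size=dim))
# Iterate norm should be far smaller than at initialization despite
# the infinite-variance noise.
print("||x_final|| =", np.linalg.norm(x_final))
```

Because the step length is fixed at eta regardless of the gradient estimator's magnitude, a single extreme noise draw cannot destabilize training, which is the intuition behind dispensing with the clipping operation.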

Country of Origin
🇺🇸 United States

Page Count
14 pages

Category
Computer Science:
Machine Learning (CS)