Score: 3

HVAdam: A Full-Dimension Adaptive Optimizer

Published: November 25, 2025 | arXiv ID: 2511.20277v1

By: Yiheng Zhang, Shaowu Wu, Yuanzhuo Xu, and more

Potential Business Impact:

Speeds up the training of machine-learning models and can improve the quality of the resulting models.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Adaptive optimizers such as Adam have achieved great success in training large-scale models like large language models and diffusion models. However, they often generalize worse than non-adaptive methods such as SGD on classical architectures like CNNs. We identify a key cause of this performance gap: restricted adaptivity in the pre-conditioner, which limits the optimizer's ability to adapt to diverse optimization landscapes. To address this, we propose Anon (Adaptivity Non-restricted Optimizer with Novel convergence technique), a novel optimizer with continuously tunable adaptivity, allowing it to interpolate between SGD-like and Adam-like behaviors and even extrapolate beyond both. To ensure convergence across the entire adaptivity spectrum, we introduce the incremental delay update (IDU), a mechanism that is more flexible than AMSGrad's hard max-tracking strategy and enhances robustness to gradient noise. We establish theoretical convergence guarantees in both convex and non-convex settings. Empirically, Anon consistently outperforms state-of-the-art optimizers on representative image classification, diffusion, and language modeling tasks. These results demonstrate that adaptivity can serve as a valuable, tunable design principle, and that Anon provides the first unified and reliable framework capable of bridging the gap between classical and modern optimizers while surpassing the advantageous properties of each.
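
The abstract's core idea, treating adaptivity as a continuous knob rather than a fixed design choice, can be illustrated with a small sketch. The snippet below is not the paper's Anon/HVAdam update rule or its IDU mechanism; it assumes a hypothetical adaptivity exponent `p` applied to an Adam-style second-moment preconditioner, so that p = 0 behaves like SGD with momentum, p = 0.5 behaves like Adam, and other values interpolate or extrapolate between the two.

```python
import numpy as np

def tunable_step(param, grad, m, v, t, lr=1e-2,
                 beta1=0.9, beta2=0.999, eps=1e-8, p=0.5):
    """One parameter update with a hypothetical adaptivity exponent p.

    p = 0.0 -> preconditioner is ~1: behaves like SGD with momentum.
    p = 0.5 -> divide by sqrt(v_hat): behaves like Adam.
    Values in between interpolate; values above 0.5 extrapolate.
    Illustrative only; not the Anon/HVAdam update or the IDU mechanism.
    """
    m = beta1 * m + (1 - beta1) * grad            # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2       # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                  # bias corrections
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (v_hat ** p + eps)  # adaptivity set by p
    return param, m, v

# Toy usage: minimize f(x) = ||x||^2 at the two endpoint settings of p.
for p in (0.0, 0.5):
    x = np.array([3.0, -2.0])
    m = np.zeros_like(x)
    v = np.zeros_like(x)
    for t in range(1, 201):
        grad = 2 * x                              # gradient of ||x||^2
        x, m, v = tunable_step(x, grad, m, v, t, p=p)
    print(f"p={p}: final x = {x}")
```

The loop above only demonstrates the two endpoint behaviors; per the abstract, the paper's contribution is making this kind of adaptivity parameter safe to tune across the whole range, with the IDU mechanism providing the convergence guarantee in place of AMSGrad's hard max-tracking.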

Country of Origin
🇨🇳 China, 🇬🇧 United Kingdom, 🇨🇦 Canada

Repos / Data Links

Page Count
26 pages

Category
Computer Science:
Machine Learning (CS)