Insights from Gradient Dynamics: Gradient Autoscaled Normalization

Published: September 3, 2025 | arXiv ID: 2509.03677v1

By: Vincent-Daniel Yun

Potential Business Impact:

Makes training deep neural networks more stable while maintaining or improving accuracy.

Business Areas:
A/B Testing
Data and Analytics

Gradient dynamics play a central role in determining the stability and generalization of deep neural networks. In this work, we provide an empirical analysis of how variance and standard deviation of gradients evolve during training, showing consistent changes across layers and at the global scale in convolutional networks. Motivated by these observations, we propose a hyperparameter-free gradient normalization method that aligns gradient scaling with their natural evolution. This approach prevents unintended amplification, stabilizes optimization, and preserves convergence guarantees. Experiments on the challenging CIFAR-100 benchmark with ResNet-20, ResNet-56, and VGG-16-BN demonstrate that our method maintains or improves test accuracy even under strong generalization pressure. Beyond practical performance, our study highlights the importance of directly tracking gradient dynamics, aiming to bridge the gap between theoretical expectations and empirical behaviors, and to provide insights for future optimization research.
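To make the idea concrete, below is a minimal, illustrative PyTorch-style sketch of gradient autoscaling based only on the abstract's description: gradients are rescaled so their global standard deviation follows its observed trend instead of being amplified between steps. The function name, the single-value state dict, and the global (rather than per-layer) statistic are assumptions for illustration, not the paper's exact algorithm.

```python
# Illustrative sketch only, inferred from the abstract; not the authors' published code.
import torch

def autoscale_gradients(model, prev_std_state):
    """Rescale gradients so their global std does not grow past its previous value.

    prev_std_state: dict holding the global gradient std from the previous step
    (hypothetical bookkeeping; the paper may track per-layer statistics instead).
    """
    grads = [p.grad for p in model.parameters() if p.grad is not None]
    if not grads:
        return
    flat = torch.cat([g.flatten() for g in grads])
    cur_std = flat.std()
    prev_std = prev_std_state.get("std")
    if prev_std is not None and cur_std > prev_std:
        # Scale down so the gradient scale follows its natural (typically decaying)
        # evolution rather than being amplified at this step.
        scale = prev_std / (cur_std + 1e-12)
        for g in grads:
            g.mul_(scale)
        cur_std = prev_std
    prev_std_state["std"] = cur_std.detach()

# Usage inside an ordinary training loop:
# state = {}
# loss.backward()
# autoscale_gradients(model, state)
# optimizer.step()
# optimizer.zero_grad()
```

Because the rescaling is driven entirely by measured gradient statistics, this style of normalization introduces no tunable coefficients, which is consistent with the abstract's claim of a hyperparameter-free method.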

Country of Origin
πŸ‡ΊπŸ‡Έ United States

Page Count
13 pages

Category
Computer Science:
Machine Learning (CS)