Muon Optimizes Under Spectral Norm Constraints
By: Lizhang Chen, Jonathan Li, Qiang Liu
Potential Business Impact:
Explains the math behind a promising AI training method, which could make model training more reliable.
The pursuit of faster optimization algorithms remains an active and important research direction in deep learning. Recently, the Muon optimizer [JJB+24] has demonstrated promising empirical performance, but its theoretical foundation remains less understood. In this paper, we bridge this gap and provide a theoretical analysis of Muon by placing it within the Lion-$\mathcal{K}$ family of optimizers [CLLL24]. Specifically, we show that Muon corresponds to Lion-$\mathcal{K}$ when equipped with the nuclear norm, and we leverage the theoretical results of Lion-$\mathcal{K}$ to establish that Muon (with decoupled weight decay) implicitly solves an optimization problem that enforces a constraint on the spectral norm of weight matrices. This perspective not only demystifies the implicit regularization effects of Muon but also leads to natural generalizations through varying the choice of convex map $\mathcal{K}$, allowing for the exploration of a broader class of implicitly regularized and constrained optimization algorithms.
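The core mechanism can be sketched in a few lines. Muon replaces the raw gradient momentum with its matrix sign (the nearest semi-orthogonal matrix, in practice approximated with Newton-Schulz iterations), so every update has spectral norm 1; combined with decoupled weight decay, this is what the paper's Lion-$\mathcal{K}$ analysis identifies as implicitly constraining the spectral norm of the weights. The sketch below uses an exact SVD for clarity; the function names, hyperparameter values, and shapes are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def msign(M):
    # Matrix sign via SVD: U V^T is the nearest semi-orthogonal matrix to M.
    # Production Muon approximates this with Newton-Schulz iterations instead.
    U, _, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ Vt

def muon_step(W, grad, momentum, lr=0.02, beta=0.95, wd=0.01):
    # One illustrative Muon update with decoupled weight decay.
    momentum = beta * momentum + (1 - beta) * grad
    update = msign(momentum)            # all singular values of `update` are 1
    W = (1 - lr * wd) * W - lr * update # decoupled weight decay, then the step
    return W, momentum

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))
m = np.zeros_like(W)
W, m = muon_step(W, rng.standard_normal((4, 3)), m)
```

Because each step shrinks $W$ by a factor $(1 - \eta\lambda)$ while adding a direction of spectral norm exactly $\eta$, the iterates are drawn toward a spectral-norm ball, which is the constrained-optimization view the paper formalizes.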
Similar Papers
NorMuon: Making Muon more efficient and scalable
Machine Learning (CS)
Makes AI learn faster and better.
LiMuon: Light and Fast Muon Optimizer for Large Models
Machine Learning (CS)
Makes AI models train faster with less memory.
The Ky Fan Norms and Beyond: Dual Norms and Combinations for Matrix Optimization
Optimization and Control
Makes AI learn better and faster.