mHC-lite: You Don't Need 20 Sinkhorn-Knopp Iterations
By: Yongyi Yang, Jianyang Gao
Potential Business Impact:
Makes training deep neural networks faster and more stable.
Hyper-Connections (HC) generalizes residual connections by introducing dynamic residual matrices that mix information across multiple residual streams, accelerating convergence in deep neural networks. However, unconstrained residual matrices can compromise training stability. To address this, DeepSeek's Manifold-Constrained Hyper-Connections (mHC) approximately projects these matrices onto the Birkhoff polytope via iterative Sinkhorn--Knopp (SK) normalization. We identify two limitations of this approach: (i) finite SK iterations do not guarantee exact doubly stochasticity, leaving an approximation gap that can accumulate through network depth and undermine stability; (ii) efficient SK implementation requires highly specialized CUDA kernels, raising engineering barriers and reducing portability. Motivated by the Birkhoff--von Neumann theorem, we propose mHC-lite, a simple reparameterization that explicitly constructs doubly stochastic matrices as convex combinations of permutation matrices. This approach guarantees exact doubly stochasticity by construction and can be implemented using only native matrix operations. Extensive experiments demonstrate that mHC-lite matches or exceeds mHC in performance while achieving higher training throughput with a naive implementation and eliminating the residual instabilities observed in both HC and mHC. The code is publicly available at https://github.com/FFTYYY/mhc-lite.
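For intuition, here is a minimal PyTorch sketch contrasting the two constructions the abstract describes: iterative Sinkhorn--Knopp normalization, which leaves a small approximation gap after finitely many iterations, and a Birkhoff--von Neumann-style convex combination of permutation matrices, which is doubly stochastic exactly by construction. This is not the authors' implementation (see the linked repository for that); the function names, the softmax parameterization of the convex weights, and the enumeration of all n! permutations are illustrative assumptions.

```python
import itertools
import torch

def sinkhorn_knopp(A: torch.Tensor, n_iters: int = 20) -> torch.Tensor:
    # Approximate projection onto the Birkhoff polytope by alternating
    # row/column normalization (the scheme mHC uses).
    for _ in range(n_iters):
        A = A / A.sum(dim=1, keepdim=True)  # make row sums 1
        A = A / A.sum(dim=0, keepdim=True)  # make column sums 1 (rows drift again)
    return A

def birkhoff_mixture(logits: torch.Tensor, perms: torch.Tensor) -> torch.Tensor:
    # Exactly doubly stochastic matrix as a convex combination of
    # permutation matrices (Birkhoff--von Neumann), in the spirit of mHC-lite.
    w = torch.softmax(logits, dim=-1)        # convex weights on the simplex
    return torch.einsum("k,kij->ij", w, perms)

n = 4  # number of residual streams (kept small for illustration)

# Sinkhorn--Knopp after finitely many iterations: column sums are 1,
# but row sums retain a nonzero approximation gap.
S = sinkhorn_knopp(torch.rand(n, n) + 0.1)
print("SK row-sum gap:", (S.sum(dim=1) - 1).abs().max().item())

# mHC-lite-style construction: stack all n! permutation matrices and mix them.
perms = torch.stack([torch.eye(n)[list(p)]
                     for p in itertools.permutations(range(n))])
M = birkhoff_mixture(torch.randn(perms.shape[0]), perms)
print("BvN row-sum gap:", (M.sum(dim=1) - 1).abs().max().item())
print("BvN col-sum gap:", (M.sum(dim=0) - 1).abs().max().item())
```

Enumerating all n! permutations is only viable here because the number of residual streams is small; the paper's actual parameterization may differ, but mixing any fixed set of permutation matrices with simplex weights yields exact double stochasticity in the same way, using only native matrix operations.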
Similar Papers
mHC: Manifold-Constrained Hyper-Connections
Computation and Language
Constrains residual mixing to stabilize deep model training at scale.
mHC-GNN: Manifold-Constrained Hyper-Connections for Graph Neural Networks
Machine Learning (CS)
Helps graph neural networks train reliably at greater depth.