Score: 1

Frac-Connections: Fractional Extension of Hyper-Connections

Published: March 18, 2025 | arXiv ID: 2503.14125v1

By: Defa Zhu, Hongzhi Huang, Jundong Zhou, and more

BigTech Affiliations: ByteDance

Potential Business Impact:

Reduces the memory cost of training large language models while preserving much of the training benefit of Hyper-Connections.

Business Areas:
Artificial Intelligence / Machine Learning

Residual connections are central to modern deep learning architectures, enabling the training of very deep networks by mitigating gradient vanishing. Hyper-Connections recently generalized residual connections by introducing multiple connection strengths at different depths, thereby addressing the seesaw effect between gradient vanishing and representation collapse. However, Hyper-Connections increase memory access costs by expanding the width of hidden states. In this paper, we propose Frac-Connections, a novel approach that divides hidden states into multiple parts rather than expanding their width. Frac-Connections retain partial benefits of Hyper-Connections while reducing memory consumption. To validate their effectiveness, we conduct large-scale experiments on language tasks, with the largest being a 7B MoE model trained on up to 3T tokens, demonstrating that Frac-Connections significantly outperform residual connections.
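To make the idea concrete, below is a minimal PyTorch sketch of a fractional connection wrapped around a transformer sublayer. It follows only what the abstract states: the width-d hidden state is divided into m fractions rather than expanded to n copies, so per-token memory stays at d. The specific mixing weights (alpha, beta, depth_mix) and the chunk-wise construction of the layer input are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn


class FracConnection(nn.Module):
    """Illustrative sketch of a fractional connection (not the paper's exact code).

    The width-d hidden state is viewed as m fractions of width d/m instead of
    being expanded to n copies of width d as in Hyper-Connections. The static
    mixing weights below are assumptions, initialized so that the module
    reduces to a plain residual connection before training.
    """

    def __init__(self, sublayer: nn.Module, d_model: int, m: int = 4):
        super().__init__()
        assert d_model % m == 0, "d_model must be divisible by the fraction rate m"
        self.sublayer = sublayer  # e.g. an attention or FFN block on width d_model
        self.m = m
        self.alpha = nn.Parameter(torch.eye(m))      # fractions -> layer-input chunks
        self.beta = nn.Parameter(torch.ones(m))      # output chunks -> fractions
        self.depth_mix = nn.Parameter(torch.eye(m))  # fraction-to-fraction carry

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        b, s, d = h.shape
        frac = h.view(b, s, self.m, d // self.m)  # (b, s, m, d/m)
        # Build the width-d layer input: chunk k is a learned mix of the fractions.
        x = torch.einsum("km,bsmf->bskf", self.alpha, frac).reshape(b, s, d)
        y = self.sublayer(x)  # (b, s, d)
        y_frac = y.view(b, s, self.m, d // self.m)
        # Write the output back per fraction and mix fractions across depth.
        carried = torch.einsum("km,bsmf->bskf", self.depth_mix, frac)
        out = carried + self.beta.view(1, 1, self.m, 1) * y_frac
        return out.reshape(b, s, d)
```

With alpha and depth_mix initialized to the identity and beta to ones, the forward pass computes exactly h + sublayer(h), i.e. a standard residual block; larger fraction rates m then let the network learn per-fraction connection strengths without growing the hidden state, which is the memory saving the abstract describes relative to Hyper-Connections.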

Country of Origin
🇨🇳 China

Page Count
16 pages

Category
Computer Science:
Machine Learning (CS)