Fusion Matters: Length-Aware Analysis of Positional-Encoding Fusion in Transformers
By: Mohamed Amine Hallam, Kuo-Kun Tseng
Potential Business Impact:
Improves AI understanding of long texts.
Transformers require positional encodings to represent sequence order, yet most prior work focuses on designing new positional encodings rather than examining how positional information is fused with token embeddings. In this paper, we study whether the fusion mechanism itself affects performance, particularly in long-sequence settings. We conduct a controlled empirical study comparing three canonical fusion strategies (element-wise addition, concatenation with projection, and scalar gated fusion) under identical Transformer architectures, data splits, and random seeds. Experiments on three text classification datasets spanning short (AG News), medium (IMDB), and long (ArXiv) sequences show that fusion choice has negligible impact on short texts but produces consistent gains on long documents. To verify that these gains are structural rather than stochastic, we perform paired-seed analysis and cross-dataset comparison across sequence-length regimes. Additional experiments on the ArXiv dataset indicate that the benefit of learnable fusion generalizes across multiple positional encoding families. Finally, we explore a lightweight convolutional gating mechanism that introduces local inductive bias at the fusion level, evaluated on long documents only. Our results indicate that positional-encoding fusion is a non-trivial design choice for long-sequence Transformers and should be treated as an explicit modeling decision rather than a fixed default.
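Since the abstract names each fusion mechanism concretely, the following is a minimal PyTorch sketch of what the three canonical strategies and the convolutional gate typically look like. This is an illustration under common conventions, not the authors' released code: the module names, the depthwise kernel size, and the sigmoid gating details are assumptions.

```python
# Minimal sketch of the fusion strategies described in the abstract.
# Module names and hyperparameters are illustrative, not the paper's code.
import torch
import torch.nn as nn


class AddFusion(nn.Module):
    """Element-wise addition: the standard Transformer default."""
    def forward(self, tok, pos):
        return tok + pos


class ConcatFusion(nn.Module):
    """Concatenate token and positional embeddings along the feature
    axis, then project back to the model dimension with a learned map."""
    def __init__(self, d_model):
        super().__init__()
        self.proj = nn.Linear(2 * d_model, d_model)

    def forward(self, tok, pos):
        return self.proj(torch.cat([tok, pos], dim=-1))


class ScalarGatedFusion(nn.Module):
    """Scalar gated fusion: a single learned scalar g in (0, 1)
    interpolates between token and positional information."""
    def __init__(self):
        super().__init__()
        self.gate = nn.Parameter(torch.zeros(1))  # sigmoid(0) = 0.5

    def forward(self, tok, pos):
        g = torch.sigmoid(self.gate)
        return g * tok + (1.0 - g) * pos


class ConvGatedFusion(nn.Module):
    """Lightweight convolutional gate: a depthwise 1D convolution over
    the sequence produces position-wise gates, adding local inductive
    bias at the fusion level (kernel size is an assumption)."""
    def __init__(self, d_model, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(d_model, d_model, kernel_size,
                              padding=kernel_size // 2, groups=d_model)

    def forward(self, tok, pos):
        # (batch, seq, d_model) -> (batch, d_model, seq) for Conv1d
        g = torch.sigmoid(self.conv(tok.transpose(1, 2))).transpose(1, 2)
        return g * tok + (1.0 - g) * pos


# All four modules map (tok, pos) of shape (batch, seq, d_model) to a
# fused tensor of the same shape, so they can be swapped at the
# embedding layer without touching the rest of the Transformer.
tok = torch.randn(2, 128, 512)
pos = torch.randn(2, 128, 512)
fused = ConcatFusion(512)(tok, pos)  # shape: (2, 128, 512)
```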
Similar Papers
Positional Encoding in Transformer-Based Time Series Models: A Survey
Machine Learning (CS)
Helps computers understand time patterns better.
Theoretical Analysis of Positional Encodings in Transformer Models: Impact on Expressiveness and Generalization
Machine Learning (CS)
Helps AI understand longer stories better.
The Role of Sparsity for Length Generalization in Transformers
Machine Learning (CS)
Helps AI understand longer stories by focusing on key parts.