Learning Advanced Self-Attention for Linear Transformers in the Singular Value Domain
By: Hyowon Wi, Jeongwhan Choi, Noseong Park
Potential Business Impact:
Helps computers understand complex patterns better.
Transformers have demonstrated remarkable performance across diverse domains. The key component of Transformers is self-attention, which learns the relationship between any two tokens in the input sequence. Recent studies have revealed that self-attention can be understood as a normalized adjacency matrix of a graph. Notably, from the perspective of graph signal processing (GSP), self-attention can be equivalently defined as a simple graph filter that applies GSP using the value vector as the signal. However, this graph filter is defined with only the first order of the polynomial matrix and acts as a low-pass filter, preventing the effective use of information from various frequencies. Consequently, existing self-attention mechanisms are designed in a rather simplified manner. We therefore propose a novel method, called Attentive Graph Filter (AGF), which interprets self-attention as learning a graph filter in the singular value domain, from the perspective of graph signal processing for directed graphs, with linear complexity w.r.t. the input length $n$, i.e., $\mathcal{O}(nd^2)$. In our experiments, we demonstrate that AGF achieves state-of-the-art performance on various tasks, including the Long Range Arena benchmark and time series classification.
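The GSP reading of self-attention in the abstract can be made concrete with a small sketch. The NumPy snippet below (an illustrative sketch, not the paper's implementation) treats the softmax attention matrix as a row-normalized adjacency matrix, applies it to the value signal as a first-order graph filter, and then filters the same signal in the singular value domain of that matrix. The filter response g and the explicit n-by-n SVD are hypothetical stand-ins for clarity; AGF itself is parameterized so that it runs in $\mathcal{O}(nd^2)$ without materializing the full attention matrix.

import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
n, d = 8, 4                                # sequence length, head dimension
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))

# Self-attention as a row-normalized adjacency matrix of a directed graph.
A = softmax(Q @ K.T / np.sqrt(d))          # each row sums to 1
first_order_out = A @ V                    # ordinary self-attention: first-order graph filter

# Filtering in the singular value domain of A (GSP for directed graphs):
# A = U diag(s) Vt; a filter reshapes the spectrum s instead of being fixed
# to the low-pass first-order filter above.
U, s, Vt = np.linalg.svd(A)
g = lambda s: s**2                         # hypothetical filter response, learnable in practice
spectral_out = (U * g(s)) @ Vt @ V         # U diag(g(s)) Vt applied to the value signal

print(np.allclose((U * s) @ Vt, A))        # sanity check: the SVD reconstructs A

Choosing g as the identity recovers the standard attention output, which is one way to see why plain self-attention corresponds to a single fixed point in the space of spectral filters that AGF learns over.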
Similar Papers
GraphTARIF: Linear Graph Transformer with Augmented Rank and Improved Focus
CV and Pattern Recognition
Makes AI understand complex data better.
The Origin of Self-Attention: Pairwise Affinity Matrices in Feature Selection and the Emergence of Self-Attention
Machine Learning (CS)
Helps computers learn by seeing connections.
Gaussian Equivalence for Self-Attention: Asymptotic Spectral Analysis of Attention Matrix
Machine Learning (Stat)
Makes AI understand words better by analyzing their connections.