Graph Laplacian Wavelet Transformer via Learnable Spectral Decomposition

Published: May 9, 2025 | arXiv ID: 2505.07862v1

By: Andrew Kiruluta, Eric Lundy, Priscilla Burity

Potential Business Impact:

Could make language-processing models substantially faster and more memory-efficient by replacing quadratic self-attention with an efficient multi-scale spectral transform.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Existing sequence-to-sequence models for structured language tasks rely heavily on the dot-product self-attention mechanism, which incurs quadratic complexity in both computation and memory for input length N. We introduce the Graph Wavelet Transformer (GWT), a novel architecture that replaces this bottleneck with a learnable, multi-scale wavelet transform defined over an explicit graph Laplacian derived from syntactic or semantic parses. Our analysis shows that multi-scale spectral decomposition offers an interpretable, efficient, and expressive alternative to quadratic self-attention for graph-structured sequence modeling.
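To make the mechanism concrete, here is a minimal sketch (in PyTorch, not the authors' released code) of a spectral mixing layer in this spirit: token features are filtered at several learnable scales in the eigenbasis of a normalized graph Laplacian built from a parse-derived adjacency. The class name GraphWaveletMixing, the heat-kernel filter family, and all hyperparameters are illustrative assumptions, not the paper's exact architecture.

```python
# Hypothetical sketch of a graph-wavelet mixing layer in the spirit of the
# GWT abstract: learnable multi-scale spectral filters replace self-attention.
import torch
import torch.nn as nn


class GraphWaveletMixing(nn.Module):
    """Mixes token features via learnable heat-kernel wavelets on a graph."""

    def __init__(self, d_model: int, num_scales: int = 4):
        super().__init__()
        # One learnable scale per wavelet band; the heat kernel
        # g_s(lam) = exp(-s * lam) is a common graph-wavelet choice.
        self.log_scales = nn.Parameter(torch.linspace(-2.0, 1.0, num_scales))
        self.mix = nn.Linear(num_scales * d_model, d_model)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (N, d_model) token features; adj: (N, N) symmetric adjacency
        # derived from a syntactic or semantic parse.
        deg = adj.sum(dim=-1)
        d_inv_sqrt = torch.where(deg > 0, deg.rsqrt(), torch.zeros_like(deg))
        # Normalized graph Laplacian L = I - D^{-1/2} A D^{-1/2}.
        lap = torch.eye(adj.size(0)) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
        evals, evecs = torch.linalg.eigh(lap)
        x_hat = evecs.T @ x                        # graph Fourier transform
        bands = []
        for s in self.log_scales.exp():            # one band per learnable scale
            filt = torch.exp(-s * evals)[:, None]  # g_s(lam) per frequency
            bands.append(evecs @ (filt * x_hat))   # inverse transform per band
        return self.mix(torch.cat(bands, dim=-1))


# Toy usage: a 5-token chain graph (e.g., a linearized dependency parse).
if __name__ == "__main__":
    n, d = 5, 16
    adj = torch.zeros(n, n)
    idx = torch.arange(n - 1)
    adj[idx, idx + 1] = adj[idx + 1, idx] = 1.0
    layer = GraphWaveletMixing(d_model=d)
    out = layer(torch.randn(n, d), adj)
    print(out.shape)  # torch.Size([5, 16])
```

The full eigendecomposition above costs O(N^3) and is used only to keep the sketch short; an efficient implementation would approximate the wavelet filters with low-order polynomials of L so that no dense N x N spectral basis is ever materialized, which is the efficiency argument the abstract makes against quadratic attention.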

Page Count
11 pages

Category
Computer Science:
Computation and Language