Numerical Investigation of Sequence Modeling Theory using Controllable Memory Functions

Published: June 6, 2025 | arXiv ID: 2506.05678v2

By: Haotian Jiang, Zeyu Bao, Shida Wang, and more

Potential Business Impact:

Benchmarks how well sequence models capture and retain information from past inputs.

Business Areas:
Simulation Software

The evolution of sequence modeling architectures, from recurrent neural networks and convolutional models to Transformers and structured state-space models, reflects ongoing efforts to address the diverse temporal dependencies inherent in sequential data. Despite this progress, systematically characterizing the strengths and limitations of these architectures remains a fundamental challenge. In this work, we propose a synthetic benchmarking framework to evaluate how effectively different sequence models capture distinct temporal structures. The core of this approach is to generate synthetic targets, each characterized by a memory function and a parameter that determines the strength of temporal dependence. This setup allows us to produce a continuum of tasks that vary in temporal complexity, enabling fine-grained analysis of model behavior concerning specific memory properties. We focus on four representative memory functions, each corresponding to a distinct class of temporal structures. Experiments on several sequence modeling architectures confirm existing theoretical insights and reveal new findings. These results demonstrate the effectiveness of the proposed method in advancing theoretical understanding and highlight the importance of using controllable targets with clearly defined structures for evaluating sequence modeling architectures.
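The abstract describes generating synthetic targets defined by a memory function and a parameter controlling the strength of temporal dependence. A minimal sketch of that idea, assuming a linear-functional target with an exponentially decaying memory function (the function names, the specific decay form, and the parameter values here are illustrative assumptions, not the paper's exact setup):

```python
import numpy as np

def exponential_memory(lags, decay):
    """Hypothetical memory function rho(s) = exp(-decay * s).

    Smaller decay => slower forgetting => stronger temporal dependence.
    """
    return np.exp(-decay * lags)

def make_target(x, memory_fn, decay):
    """Build y_t = sum_{s=0}^{t} rho(s) * x_{t-s}, a linear functional
    of the input weighted by the memory function (a causal convolution)."""
    T = len(x)
    rho = memory_fn(np.arange(T), decay)
    # Full convolution truncated to the input length keeps the target causal.
    return np.convolve(x, rho)[:T]

rng = np.random.default_rng(0)
x = rng.standard_normal(128)

# Varying the decay parameter produces a continuum of tasks, from
# short-memory (fast decay) to long-memory (slow decay) targets.
y_fast = make_target(x, exponential_memory, decay=2.0)
y_slow = make_target(x, exponential_memory, decay=0.05)
```

Sweeping the decay parameter and measuring how a given architecture's fit degrades as memory lengthens is one way such controllable targets enable the fine-grained analysis the abstract refers to.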

Country of Origin
πŸ‡ΈπŸ‡¬ Singapore

Page Count
30 pages

Category
Computer Science:
Machine Learning (CS)