Characterizing Mamba's Selective Memory using Auto-Encoders
By: Tamanna Hossain, Robert L. Logan, Ganesh Jagadeesan, and more
Potential Business Impact:
Shows what AI models forget, so they can be made to remember math and names better.
State space models (SSMs) are a promising alternative to transformers for language modeling because they use fixed memory during inference. However, this fixed memory usage requires some information loss in the hidden state when processing long sequences. While prior work has studied the sequence length at which this information loss occurs, it does not characterize the types of information SSM language models (LMs) tend to forget. In this paper, we address this knowledge gap by identifying the types of tokens (e.g., parts of speech, named entities) and sequences (e.g., code, math problems) that are more frequently forgotten by SSM LMs. We achieve this by training an auto-encoder to reconstruct sequences from the SSM's hidden state and measuring information loss by comparing inputs with their reconstructions. We perform experiments using the Mamba family of SSM LMs (130M--1.4B) on sequences of 4--256 tokens. Our results show significantly higher rates of information loss on math-related tokens (e.g., numbers, variables), mentions of organization entities, and dialects other than Standard American English. We then examine the frequency with which these tokens appear in Mamba's pretraining data and find that less prevalent tokens tend to be the ones Mamba is most likely to forget. By identifying these patterns, our work provides clear direction for future research to develop methods that better control Mamba's ability to retain important information.
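The probing recipe described in the abstract can be illustrated with a small PyTorch sketch. Everything below is an illustrative assumption rather than the authors' exact setup: the Hugging Face checkpoint `state-spaces/mamba-130m-hf`, the GRU-based decoder, the learning rate, and the use of the last-position hidden activation as a stand-in for Mamba's compressed recurrent state are all stand-ins. The overall idea matches the abstract: freeze the SSM, train a probe to reconstruct the input tokens from a fixed-size summary, and read per-token reconstruction loss as a measure of what the state failed to retain.

```python
# Sketch of an auto-encoder-style forgetting probe for a Mamba LM.
# Checkpoint, decoder design, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, MambaForCausalLM

device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained("state-spaces/mamba-130m-hf")
mamba = MambaForCausalLM.from_pretrained("state-spaces/mamba-130m-hf").to(device).eval()


class ReconstructionProbe(nn.Module):
    """Decoder probe: reconstruct token logits for every position
    from a single fixed-size summary vector of the sequence."""

    def __init__(self, hidden_size: int, vocab_size: int, max_len: int = 256):
        super().__init__()
        self.pos_emb = nn.Embedding(max_len, hidden_size)
        self.decoder = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, summary: torch.Tensor, seq_len: int) -> torch.Tensor:
        # summary: (batch, hidden) -- the compressed state after reading the sequence
        batch = summary.size(0)
        positions = torch.arange(seq_len, device=summary.device)
        queries = self.pos_emb(positions).unsqueeze(0).repeat(batch, 1, 1)
        decoded, _ = self.decoder(queries, summary.unsqueeze(0).contiguous())
        return self.out(decoded)  # (batch, seq_len, vocab)


probe = ReconstructionProbe(mamba.config.hidden_size, len(tokenizer)).to(device)
optimizer = torch.optim.AdamW(probe.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss(reduction="none")


def reconstruction_loss(text: str) -> torch.Tensor:
    """One probe training step; returns per-token cross-entropy,
    where higher values indicate tokens the summary failed to retain."""
    ids = tokenizer(text, return_tensors="pt", truncation=True,
                    max_length=256).input_ids.to(device)
    with torch.no_grad():
        hidden = mamba(input_ids=ids, output_hidden_states=True).hidden_states[-1]
    # Assumption: the last-position activation stands in for Mamba's recurrent state.
    summary = hidden[:, -1, :]
    logits = probe(summary, ids.size(1))
    per_token = loss_fn(logits.transpose(1, 2), ids)  # (1, seq_len)
    per_token.mean().backward()
    optimizer.step()
    optimizer.zero_grad()
    return per_token.detach().squeeze(0)


losses = reconstruction_loss("Compute 37 * 12 and report the result.")
print(losses)  # in practice, aggregate these losses over many sequences per token category
```

Aggregating these per-token losses by part of speech, entity type, or sequence domain (e.g., math versus prose) is how the forgetting patterns reported in the abstract would then be surfaced.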
Similar Papers
PerfMamba: Performance Analysis and Pruning of Selective State Space Models
Machine Learning (CS)
Makes computer models run faster and use less memory.
From S4 to Mamba: A Comprehensive Survey on Structured State Space Models
Machine Learning (CS)
Makes computers understand long stories faster.
Trajectory Mamba: Efficient Attention-Mamba Forecasting Model Based on Selective SSM
Computer Vision and Pattern Recognition
Helps self-driving cars predict where others will go.