CodeSSM: Towards State Space Models for Code Understanding

Published: May 2, 2025 | arXiv ID: 2505.01475v3

By: Shweta Verma, Abhinav Anand, Mira Mezini

Potential Business Impact:

Helps computers understand code better, faster, and cheaper.

Business Areas:
Semantic Search, Internet Services

Although transformers dominate many code-specific tasks, they have significant limitations. This paper explores State Space Models (SSMs) as a promising alternative for code understanding tasks such as retrieval, classification, and clone detection. We introduce CodeSSM, the first SSM-based model trained on code corpora, to assess its effectiveness. Our results demonstrate that SSMs are more sample-efficient and can extrapolate to contexts longer than the pretraining length. Extensive experiments show that SSMs offer a viable alternative to transformers, addressing several of their limitations. Additionally, CodeSSM reduces memory usage by up to 64% compared to transformers at a context length of 2048, with greater savings as context length grows.
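The memory and extrapolation claims rest on the fact that an SSM processes a sequence with a fixed-size recurrent state rather than attending over all previous tokens. As a rough illustration only (this is a generic discrete-time linear state space scan, not the paper's CodeSSM architecture, and the parameter names A, B, C are assumptions), a minimal sketch looks like this:

```python
import numpy as np

def ssm_scan(A, B, C, u):
    """Minimal discrete-time linear state space recurrence:
        h_t = A @ h_{t-1} + B @ u_t
        y_t = C @ h_t
    The sequence is processed step by step with a fixed-size hidden
    state, so memory stays constant in sequence length (in contrast
    to attention, whose cost grows with context length).
    """
    d_state = A.shape[0]
    h = np.zeros(d_state)          # hidden state carried across tokens
    ys = []
    for u_t in u:                  # u: (seq_len, d_input)
        h = A @ h + B @ u_t        # state update
        ys.append(C @ h)           # readout for this position
    return np.stack(ys)            # (seq_len, d_output)

# Toy usage with random parameters and a 2048-token sequence.
rng = np.random.default_rng(0)
d_state, d_in, d_out, seq_len = 16, 8, 8, 2048
A = rng.normal(scale=0.1, size=(d_state, d_state))
B = rng.normal(size=(d_state, d_in))
C = rng.normal(size=(d_out, d_state))
u = rng.normal(size=(seq_len, d_in))
print(ssm_scan(A, B, C, u).shape)  # (2048, 8)
```

Because the state h has fixed size regardless of how many tokens have been seen, the same scan can in principle run over contexts longer than those seen during pretraining, which is the intuition behind the extrapolation result reported above.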

Country of Origin
🇩🇪 Germany

Page Count
17 pages

Category
Computer Science:
Software Engineering