PerfMamba: Performance Analysis and Pruning of Selective State Space Models
By: Abdullah Al Asif, Mobina Kashaniyan, Sixing Yu, and more
Potential Business Impact:
Makes computer models run faster and use less memory.
Recent advances in sequence modeling have introduced selective SSMs as promising alternatives to Transformer architectures, offering theoretical advantages in computational efficiency and sequence processing. However, a comprehensive understanding of selective SSMs' runtime behavior, resource utilization patterns, and scaling characteristics is still lacking, which obstructs their optimal deployment and further architectural improvements. This paper presents a thorough empirical study of Mamba-1 and Mamba-2, systematically profiling their performance to assess the design principles that contribute to their efficiency in state-space modeling. We analyze computation patterns, memory access, I/O characteristics, and scaling properties for sequence lengths ranging from 64 to 16384 tokens. Our findings show that the SSM component, the core of the selective SSM architecture, consumes a significant share of computational resources relative to the other components of the Mamba block. Based on these insights, we propose a pruning technique that selectively removes low-activity states within the SSM component, achieving measurable throughput and memory gains while maintaining accuracy under moderate pruning. The approach yields performance improvements across varying sequence lengths, delivering a 1.14x speedup and reducing memory usage by 11.50%. These results offer practical guidance for designing more efficient SSM architectures applicable to a wide range of real-world applications.
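The abstract describes pruning low-activity states inside the SSM component but does not spell out the mechanism. As a rough illustration only (not the paper's implementation), the sketch below scores each SSM state dimension by its mean activation magnitude and keeps the most active fraction; the function name, the keep_ratio knob, and the scoring rule are all assumptions introduced here for clarity.

```python
import torch

def prune_low_activity_states(ssm_states: torch.Tensor, keep_ratio: float = 0.75):
    """Rank SSM states by average activation magnitude and keep the most active.

    ssm_states: hidden-state tensor of shape (batch, seq_len, d_state)
    keep_ratio: fraction of states to retain (hypothetical knob, not from the paper)

    Returns the indices of the retained states and a pruned view of the tensor.
    """
    # Per-state activity score: mean |activation| over batch and sequence dims.
    activity = ssm_states.abs().mean(dim=(0, 1))  # shape: (d_state,)

    # Keep the top-k most active states; drop the low-activity remainder.
    k = max(1, int(keep_ratio * activity.numel()))
    keep_idx = torch.topk(activity, k).indices.sort().values

    return keep_idx, ssm_states[..., keep_idx]

# Example: simulate 16 states where the last 8 carry almost no signal.
x = torch.randn(2, 128, 16)
x[..., 8:] *= 0.01  # low-activity states
idx, x_pruned = prune_low_activity_states(x, keep_ratio=0.5)
print(idx)           # mostly indices 0..7
print(x_pruned.shape)  # torch.Size([2, 128, 8])
```

Under this sketch, throughput and memory gains would come from carrying a smaller d_state through the recurrence; how the paper selects, schedules, or fine-tunes after pruning is not detailed in the abstract.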
Similar Papers
SparseSSM: Efficient Selective Structured State Space Models Can Be Pruned in One-Shot
Machine Learning (CS)
Makes big AI models smaller without losing smarts.
Efficient Unstructured Pruning of Mamba State-Space Models for Resource-Constrained Environments
Machine Learning (CS)
Makes smart computer programs smaller and faster.
Characterizing the Behavior of Training Mamba-based State Space Models on GPUs
Machine Learning (CS)
Makes AI faster at understanding long texts.