How fast are algorithms reducing the demands on memory? A survey of progress in space complexity
By: Hayden Rome, Jayson Lynch, Jeffery Li, and more
Potential Business Impact:
Makes computers use less memory for big tasks.
Algorithm research focuses primarily on how many operations processors need to perform (time complexity). But for many problems, both the runtime and the energy used are dominated by memory accesses. In this paper, we present the first broad survey of how algorithmic progress has improved memory usage (space complexity). We analyze 118 of the most important algorithmic problems in computer science, reviewing the 800+ algorithms used to solve them. Our results show that space complexity has become much more important in recent years as concerns have arisen about memory access bottlenecking performance (the "memory wall"). In 20% of cases we find that space complexity improvements for large problems (n = 1 billion) outpaced improvements in DRAM access speed, suggesting that for these problems algorithmic progress played a larger role than hardware progress in minimizing memory access delays. Increasingly, we also see the emergence of algorithmic Pareto frontiers, where achieving better asymptotic time complexity for a problem requires accepting worse asymptotic space complexity, and vice versa. This tension implies that programmers will increasingly need to consider multiple algorithmic options to understand which is best for their particular problem. To help theorists and practitioners alike weigh these trade-offs, we have created a reference at https://algorithm-wiki.csail.mit.edu.
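To make the time-space trade-off concrete, here is a minimal illustrative sketch (not taken from the paper): two solutions to the same problem, duplicate detection in a list, that sit at different points on a time-space Pareto frontier. Neither dominates the other, so the right choice depends on the problem size and the memory budget.

```python
# Illustrative example of an algorithmic time-space Pareto frontier:
# both functions detect whether a list contains a duplicate, but one
# trades extra memory for speed and the other trades speed for memory.

def has_duplicate_fast(items):
    """O(n) time, O(n) extra space: remember every element seen so far."""
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False

def has_duplicate_small(items):
    """O(n^2) time, O(1) extra space: recompare pairs instead of remembering."""
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False

if __name__ == "__main__":
    data = [3, 1, 4, 1, 5]
    assert has_duplicate_fast(data) is True
    assert has_duplicate_small(data) is True
```

For small inputs the constant-space version may be perfectly adequate, while at n = 1 billion the quadratic runtime is prohibitive and the hash-set version wins despite its memory footprint; this is exactly the kind of option-weighing the survey argues programmers will increasingly face.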
Similar Papers
Fast polynomial computations with space constraints
Symbolic Computation
Makes computers solve math problems with less memory.
Space Efficient Algorithms for Parameterised Problems
Data Structures and Algorithms
Solves big computer puzzles with less memory.