Prefix Trees Improve Memory Consumption in Large-Scale Continuous-Time Stochastic Models
By: Landon Taylor, Joshua Jeppson, Ahmed Irfan, and more
Highly-concurrent system models with vast state spaces like Chemical Reaction Networks (CRNs) that model biological and chemical systems pose a formidable challenge to cutting-edge formal analysis tools. Although many symbolic approaches have been presented, transient probability analysis of CRNs, modeled as Continuous-Time Markov Chains (CTMCs), requires explicit state representation. For that purpose, current cutting-edge methods use hash maps, which boast constant average time complexity and linear memory complexity. However, hash maps often suffer from severe memory limitations on models with immense state spaces. To address this, we propose using prefix trees to store states for large, highly concurrent models (particularly CRNs) for memory savings. We present theoretical analyses and benchmarks demonstrating the favorability of prefix trees over hash maps for very large state spaces. Additionally, we propose using a Bounded Model Checking (BMC) pre-processing step to impose a variable ordering to further improve memory usage along with preliminary evaluations suggesting its effectiveness. We remark that while our work is motivated primarily by the challenges posed by CRNs, it is generalizable to all CTMC models.
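The abstract contrasts hash maps (constant average lookup, but keyed on full state vectors) with prefix trees, where states that share a leading subsequence of variable values share storage. The sketch below is a minimal illustration of that idea, not the authors' implementation: states of a hypothetical CRN are tuples of species counts, and each tree level corresponds to one variable, so common prefixes are stored once.

```python
class PrefixTreeStateSet:
    """Illustrative prefix tree (trie) over integer state vectors.

    Each level of the tree corresponds to one state variable; states
    sharing a prefix of variable values share nodes, unlike a hash map,
    which stores every full state vector as a separate key.
    """

    _END = object()  # sentinel marking a complete state

    def __init__(self):
        self.root = {}
        self.size = 0

    def insert(self, state):
        """Insert a state (tuple of ints); return True if newly added."""
        node = self.root
        for count in state:
            node = node.setdefault(count, {})
        if self._END not in node:
            node[self._END] = True
            self.size += 1
            return True
        return False

    def contains(self, state):
        node = self.root
        for count in state:
            if count not in node:
                return False
            node = node[count]
        return self._END in node


# States of a hypothetical two-reaction CRN with three species:
# (10, 0, 3) and (10, 0, 4) share the prefix (10, 0) in the tree.
states = PrefixTreeStateSet()
states.insert((10, 0, 3))
states.insert((10, 0, 4))
print(states.contains((10, 0, 3)))  # True
print(states.contains((9, 1, 3)))   # False
```

The memory advantage grows with state-space size and prefix overlap, which is also why a variable ordering (as the abstract's BMC pre-processing step imposes) matters: orderings that put highly correlated, low-variability variables first increase prefix sharing.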