DynamicRTL: RTL Representation Learning for Dynamic Circuit Behavior
By: Ruiyang Ma, Yunhao Zhou, Yipeng Wang, and more
Potential Business Impact:
Helps computers understand how circuits behave over time, not just how they are wired.
There is a growing body of work on using Graph Neural Networks (GNNs) to learn representations of circuits, focusing primarily on their static characteristics. However, these models fail to capture circuit runtime behavior, which is crucial for tasks like circuit verification and optimization. To address this limitation, we introduce DR-GNN (DynamicRTL-GNN), a novel approach that learns RTL circuit representations by incorporating both static structures and multi-cycle execution behaviors. DR-GNN leverages an operator-level Control Data Flow Graph (CDFG) to represent Register Transfer Level (RTL) circuits, enabling the model to capture dynamic dependencies and runtime execution. To train and evaluate DR-GNN, we build the first comprehensive dynamic circuit dataset, comprising over 6,300 Verilog designs and 63,000 simulation traces. Our results demonstrate that DR-GNN outperforms existing models in branch hit prediction and toggle rate prediction. Furthermore, its learned representations transfer effectively to related dynamic circuit tasks, achieving strong performance in power estimation and assertion prediction.
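The core idea in the abstract is message passing over an operator-level Control Data Flow Graph (CDFG): each node is an RTL operator (register, adder, mux, etc.), each edge a data or control dependency, and a GNN propagates information along those edges so downstream nodes can learn dynamic behavior such as toggle rates. The sketch below is a minimal, hypothetical illustration of that propagation pattern, assuming toy node features and a simple mean-aggregation rule; it is not the authors' DR-GNN implementation.

```python
# Hypothetical sketch of GNN-style message passing over an operator-level
# CDFG, in the spirit of DR-GNN. Node names, features, and the aggregation
# rule are illustrative assumptions, not the paper's actual model.

# Each node is an RTL operator with a small feature vector;
# each directed edge is a data/control dependency.
nodes = {
    "a_reg": [1.0, 0.0],  # register
    "b_reg": [1.0, 0.0],  # register
    "adder": [0.0, 1.0],  # combinational add
    "mux":   [0.0, 1.0],  # combinational mux (control-dependent)
}
edges = [("a_reg", "adder"), ("b_reg", "adder"), ("adder", "mux")]

def message_pass(feats, edges):
    """One round of mean aggregation: each node's new embedding is the
    average of its own features and those of its predecessors."""
    out = {}
    for node, f in feats.items():
        preds = [feats[src] for src, dst in edges if dst == node]
        msgs = [f] + preds
        out[node] = [sum(vals) / len(msgs) for vals in zip(*msgs)]
    return out

# Two rounds: after the first, "mux" sees "adder"; after the second,
# information from both registers has reached "mux" via "adder".
h = message_pass(nodes, edges)
h = message_pass(h, edges)
print(round(h["mux"][0], 3))  # → 0.333
```

A real model would replace the mean with learned aggregation and attach task heads (e.g., for toggle-rate or branch-hit prediction) on the resulting node embeddings; the key structural point is that multi-hop propagation lets a node's embedding reflect the upstream logic that drives its runtime behavior.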
Similar Papers
Beyond Tokens: Enhancing RTL Quality Estimation via Structural Graph Learning
Machine Learning (CS)
Helps computer chips get made faster and better.
Recurrent Deep Differentiable Logic Gate Networks
Machine Learning (CS)
Makes computers learn by thinking like simple switches.
GROOT: Graph Edge Re-growth and Partitioning for the Verification of Large Designs in Logic Synthesis
Machine Learning (CS)
Makes computer chips get checked much faster.