
DynamicRTL: RTL Representation Learning for Dynamic Circuit Behavior

Published: November 12, 2025 | arXiv ID: 2511.09593v1

By: Ruiyang Ma, Yunhao Zhou, Yipeng Wang, and more

Potential Business Impact:

Helps machine-learning models understand how circuits behave over time, supporting tasks such as chip verification, power estimation, and optimization.

Business Areas:
Simulation Software

There is a growing body of work on using Graph Neural Networks (GNNs) to learn representations of circuits, focusing primarily on their static characteristics. However, these models fail to capture circuit runtime behavior, which is crucial for tasks like circuit verification and optimization. To address this limitation, we introduce DR-GNN (DynamicRTL-GNN), a novel approach that learns RTL circuit representations by incorporating both static structures and multi-cycle execution behaviors. DR-GNN leverages an operator-level Control Data Flow Graph (CDFG) to represent Register Transfer Level (RTL) circuits, enabling the model to capture dynamic dependencies and runtime execution. To train and evaluate DR-GNN, we build the first comprehensive dynamic circuit dataset, comprising over 6,300 Verilog designs and 63,000 simulation traces. Our results demonstrate that DR-GNN outperforms existing models in branch hit prediction and toggle rate prediction. Furthermore, its learned representations transfer effectively to related dynamic circuit tasks, achieving strong performance in power estimation and assertion prediction.
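To make the core data structure concrete, here is a minimal sketch of an operator-level Control Data Flow Graph (CDFG) for a toy RTL fragment, together with a toggle-rate computation over a per-cycle simulation trace (one of the prediction targets the abstract mentions). All class, node, and edge names are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of an operator-level CDFG and a toggle-rate metric.
# Names and structure are assumptions for illustration, not the paper's code.
from dataclasses import dataclass, field

@dataclass
class CDFG:
    nodes: dict = field(default_factory=dict)   # name -> operator type
    edges: list = field(default_factory=list)   # (src, dst, kind), kind in {"data", "control"}

    def add_op(self, name, op):
        self.nodes[name] = op

    def add_edge(self, src, dst, kind="data"):
        self.edges.append((src, dst, kind))

# Toy RTL: always @(posedge clk) if (en) q <= a + b;
g = CDFG()
g.add_op("a", "input"); g.add_op("b", "input"); g.add_op("en", "input")
g.add_op("add", "+"); g.add_op("q", "register")
g.add_edge("a", "add"); g.add_edge("b", "add")
g.add_edge("add", "q")
g.add_edge("en", "q", kind="control")   # control dependency from the if-guard

def toggle_rate(trace):
    """Fraction of adjacent-cycle pairs where the signal changes value."""
    flips = sum(1 for x, y in zip(trace, trace[1:]) if x != y)
    return flips / (len(trace) - 1)

q_trace = [0, 0, 3, 3, 5, 5, 5, 2]   # per-cycle values of q from simulation
print(round(toggle_rate(q_trace), 3))  # 3 flips over 7 transitions -> 0.429
```

A GNN would consume such a graph (nodes typed by operator, edges typed by data vs. control dependency) and regress per-node dynamic labels like toggle rates derived from simulation traces.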


Page Count
16 pages

Category
Computer Science:
Machine Learning (CS)