Score: 2

GraphNet: A Large-Scale Computational Graph Dataset for Tensor Compiler Research

Published: October 28, 2025 | arXiv ID: 2510.24035v1

By: Xinqi Li, Yiqun Liu, Shan Jiang, and more

BigTech Affiliations: Baidu

Potential Business Impact:

Helps make deep learning programs run faster while verifying that compiled code still produces correct results.

Business Areas:
Big Data, Data and Analytics

We introduce GraphNet, a dataset of 2.7K real-world deep learning computational graphs with rich metadata, spanning six major task categories across multiple deep learning frameworks. To evaluate tensor compiler performance on these samples, we propose the benchmark metric Speedup Score S(t), which jointly considers runtime speedup and execution correctness under tunable tolerance levels, offering a reliable measure of general optimization capability. Furthermore, we extend S(t) to the Error-aware Speedup Score ES(t), which incorporates error information and helps compiler developers identify key performance bottlenecks. In this report, we benchmark the default tensor compilers, CINN for PaddlePaddle and TorchInductor for PyTorch, on computer vision (CV) and natural language processing (NLP) samples to demonstrate the practicality of GraphNet. The full construction pipeline with graph extraction and compiler evaluation tools is available at https://github.com/PaddlePaddle/GraphNet.
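The abstract does not reproduce the S(t) formula, but the idea of a correctness-gated speedup metric can be sketched in a few lines. Everything below — the function name, the correctness gate, and the log-scaled scoring — is an illustrative assumption, not the paper's actual definition:

```python
import math

def speedup_score(baseline_time: float, compiled_time: float,
                  max_abs_error: float, tolerance: float) -> float:
    """Hypothetical speedup-score-style metric: credit runtime speedup
    only when compiled outputs stay within the numeric tolerance t.
    (Illustrative sketch, not GraphNet's published S(t) definition.)"""
    if max_abs_error > tolerance:
        # Incorrect result under tolerance t: no credit for speed.
        return 0.0
    speedup = baseline_time / compiled_time
    # Log scale keeps extreme speedups from dominating an aggregate score.
    return math.log2(max(speedup, 1e-9))

# A 2x speedup within tolerance scores positively; a fast but
# numerically wrong result scores zero.
print(speedup_score(2.0, 1.0, max_abs_error=1e-6, tolerance=1e-3))
print(speedup_score(2.0, 1.0, max_abs_error=0.5, tolerance=1e-3))
```

An error-aware variant like ES(t) would presumably replace the hard zero with a term that reflects how far the error exceeded the tolerance, helping developers locate the failing kernels rather than just penalizing them.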

Country of Origin
🇨🇳 China

Repos / Data Links
https://github.com/PaddlePaddle/GraphNet

Page Count
17 pages

Category
Computer Science:
Machine Learning (CS)