Autocomp: LLM-Driven Code Optimization for Tensor Accelerators

Published: May 24, 2025 | arXiv ID: 2505.18574v3

By: Charles Hong, Sahil Bhatia, Alvin Cheung, and more

BigTech Affiliations: University of California, Berkeley

Potential Business Impact:

Automatically optimizes tensor accelerator code, making programs run several times faster than vendor-provided libraries and even expert hand-tuned code.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Hardware accelerators, especially those designed for tensor processing, have become ubiquitous in today's computing landscape. However, even with significant efforts in building compilers, programming these tensor accelerators remains challenging, leaving much of their potential underutilized. Recently, large language models (LLMs), trained on large amounts of code, have shown significant promise in code generation and optimization tasks, but generating code for low-resource languages, such as specialized tensor accelerator code, still poses a significant challenge. We tackle this challenge with Autocomp, an approach that empowers accelerator programmers to leverage domain knowledge and hardware feedback to optimize code via an automated LLM-driven search. We accomplish this by: 1) formulating each optimization pass as a structured two-phase prompt, divided into planning and code generation phases, 2) inserting domain knowledge during planning via a concise and adaptable optimization menu, and 3) integrating correctness and performance metrics from hardware as feedback at each search iteration. Across three categories of representative workloads and two different accelerators, we demonstrate that Autocomp-optimized code runs 5.6x (GEMM) and 2.7x (convolution) faster than the vendor-provided library, and outperforms expert-level hand-tuned code by 1.4x (GEMM), 1.1x (convolution), and 1.3x (fine-grained linear algebra). Additionally, we demonstrate that optimization schedules generated by Autocomp can be reused across similar tensor operations, improving speedups by up to 24% under a fixed sample budget.
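
To make the search concrete, below is a minimal Python sketch of such an LLM-driven loop, combining the two-phase prompting (plan, then generate), the optimization menu, and hardware feedback described in the abstract. All names here (llm, run_on_hardware, OPTIMIZATION_MENU, Candidate) and the menu entries are illustrative assumptions, not the paper's actual interface.

```python
# A minimal sketch of an Autocomp-style two-phase optimization search.
# All helper names and menu entries are hypothetical stand-ins.

from dataclasses import dataclass
from typing import Callable

# Hypothetical optimization menu injecting domain knowledge into planning.
OPTIMIZATION_MENU = [
    "tile loops to fit the accelerator's scratchpad",
    "reorder loops to improve data reuse",
    "double-buffer DMA transfers to overlap compute and memory",
    "vectorize the innermost loop over the systolic array width",
]

@dataclass
class Candidate:
    code: str
    latency_cycles: float  # hardware-reported performance
    correct: bool          # hardware-reported functional correctness

def optimize(
    seed_code: str,
    llm: Callable[[str], str],                    # prompt -> completion
    run_on_hardware: Callable[[str], Candidate],  # code -> measured result
    iterations: int = 10,
    samples_per_iter: int = 4,
) -> Candidate:
    best = run_on_hardware(seed_code)
    for _ in range(iterations):
        for _ in range(samples_per_iter):
            # Phase 1: planning, guided by the optimization menu.
            plan = llm(
                "You are optimizing tensor accelerator code.\n"
                f"Current code:\n{best.code}\n"
                f"Current latency: {best.latency_cycles} cycles.\n"
                "Choose ONE optimization from this menu and describe a plan:\n"
                + "\n".join(f"- {opt}" for opt in OPTIMIZATION_MENU)
            )
            # Phase 2: code generation, conditioned on the plan.
            new_code = llm(
                "Apply this optimization plan to the code.\n"
                f"Plan:\n{plan}\nCode:\n{best.code}\n"
                "Return only the rewritten code."
            )
            # Hardware feedback: keep only correct, faster candidates.
            result = run_on_hardware(new_code)
            if result.correct and result.latency_cycles < best.latency_cycles:
                best = result
    return best
```

Keeping only candidates that are both correct and measurably faster at each iteration is what turns raw LLM sampling into a feedback-driven search over accelerator code.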

Country of Origin
πŸ‡ΊπŸ‡Έ United States

Repos / Data Links

Page Count
33 pages

Category
Computer Science:
Programming Languages