TrackCore-F: Deploying Transformer-Based Subatomic Particle Tracking on FPGAs

Published: September 30, 2025 | arXiv ID: 2509.26335v1

By: Arjan Blankestijn, Uraz Odyurt, Amirreza Yousefzadeh

Potential Business Impact:

Enables Transformer-based AI models to run with low latency on FPGA accelerator chips, supporting real-time data processing.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

The Transformer Machine Learning (ML) architecture has been gaining considerable momentum in recent years. In particular, computational High-Energy Physics tasks such as jet tagging and particle track reconstruction (tracking) have either achieved proper solutions or reached considerable milestones using Transformers. On the other hand, the use of specialised hardware accelerators, especially FPGAs, is an effective method to achieve online or pseudo-online latencies. The development and integration of Transformer-based ML on FPGAs is still ongoing, and support from current tools ranges from very limited to non-existent. Additionally, FPGA resources present a significant constraint. Considering model size alone, smaller models can be deployed directly, while larger models must be partitioned in a meaningful and, ideally, automated way. We aim to develop methodologies and tools for monolithic or partitioned Transformer synthesis, specifically targeting inference. Our primary use case involves two machine learning model designs for tracking, derived from the TrackFormers project. We elaborate on our development approach, present preliminary results, and provide comparisons.
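
To make the partitioning idea concrete, below is a minimal, hypothetical sketch in PyTorch: a toy Transformer encoder over per-hit features is split into sequential stages (embedding, individual encoder layers, output head), each of which could in principle be exported and synthesised as a separate FPGA kernel. All module names, dimensions, and the `partition` helper are assumptions for illustration; they are not the actual TrackFormers or TrackCore-F designs.

```python
# Minimal, hypothetical sketch (PyTorch): splitting a toy Transformer
# encoder into sequential stages so that each stage could be synthesised
# as a separate FPGA kernel when the whole model exceeds chip resources.
# Names and dimensions are illustrative, not the actual TrackFormers models.
import torch
import torch.nn as nn

class TinyTrackingEncoder(nn.Module):
    """A small encoder over per-hit features, e.g. (x, y, z) coordinates."""
    def __init__(self, d_in=3, d_model=32, n_heads=4, n_layers=4, n_out=16):
        super().__init__()
        self.embed = nn.Linear(d_in, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=64, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_out)  # per-hit output, e.g. track label logits

    def forward(self, hits):  # hits: (batch, n_hits, d_in)
        return self.head(self.encoder(self.embed(hits)))

def partition(model: TinyTrackingEncoder) -> list[nn.Module]:
    """Split the model into a pipeline of stages. Each stage could be
    exported individually (e.g. to ONNX) and synthesised on its own,
    streaming intermediate activations between partitions."""
    return [model.embed, *model.encoder.layers, model.head]

model = TinyTrackingEncoder().eval()
hits = torch.randn(1, 128, 3)  # one event with 128 hits

with torch.no_grad():
    monolithic = model(hits)        # single forward pass
    x = hits
    for stage in partition(model):  # chained per-stage execution
        x = stage(x)

# Partitioning must not change the numerics of the monolithic model.
assert torch.allclose(monolithic, x, atol=1e-5)
```

The final assertion checks that chaining the per-stage outputs reproduces the monolithic forward pass, which is the correctness condition any such partitioning scheme must preserve before synthesis.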

Country of Origin
🇳🇱 Netherlands

Page Count
7 pages

Category
Physics:
High Energy Physics - Experiment