Template-Free Retrosynthesis with Graph-Prior Augmented Transformers

Published: December 11, 2025 | arXiv ID: 2512.10770v1

By: Youjun Zhao

Potential Business Impact:

Helps chemists invent new medicines faster.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Retrosynthesis reaction prediction seeks to infer plausible reactant molecules for a given product and is a central problem in computer-aided organic synthesis. Despite recent progress, many existing models still fall short of the accuracy and robustness required for practical deployment. This work studies a template-free, Transformer-based framework that eliminates reliance on handcrafted reaction templates or additional chemical rule engines. The model injects molecular graph information into the attention mechanism to jointly exploit SMILES sequences and structural cues, and further applies a paired data augmentation strategy to enhance training diversity and scale. On the USPTO-50K benchmark, the proposed approach achieves state-of-the-art performance among template-free methods and substantially outperforms a vanilla Transformer baseline.
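One common way to inject graph structure into attention, which the abstract's description is consistent with, is to add a bond-adjacency bias to the attention logits before the softmax. The sketch below is a minimal illustration of that idea in NumPy; the function name `graph_biased_attention`, the `bias_weight` parameter, and the exact injection point are assumptions for illustration, not the paper's verified formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_biased_attention(Q, K, V, adj, bias_weight=1.0):
    """Scaled dot-product attention with an additive graph prior.

    adj[i, j] = 1.0 if SMILES tokens i and j correspond to bonded atoms,
    else 0.0. The bias nudges attention toward structurally connected
    positions. (Hypothetical scheme; the paper's exact mechanism may differ.)
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)        # standard attention logits
    scores = scores + bias_weight * adj  # inject molecular-graph prior
    return softmax(scores, axis=-1) @ V

# Toy example: 4 atom tokens, 8-dim embeddings, a small linear bond graph.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
out = graph_biased_attention(Q, K, V, adj)
print(out.shape)  # (4, 8)
```

With `bias_weight=0.0` this reduces to vanilla attention, which makes the graph prior easy to ablate against the Transformer baseline the abstract compares to.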

Page Count
6 pages

Category
Computer Science:
Machine Learning (CS)