Template-Free Retrosynthesis with Graph-Prior Augmented Transformers
By: Youjun Zhao
Potential Business Impact:
Helps chemists invent new medicines faster.
Retrosynthesis reaction prediction seeks to infer plausible reactant molecules for a given product and is a central problem in computer-aided organic synthesis. Despite recent progress, many existing models still fall short of the accuracy and robustness required for practical deployment. This work studies a template-free, Transformer-based framework that eliminates reliance on handcrafted reaction templates or additional chemical rule engines. The model injects molecular graph information into the attention mechanism to jointly exploit SMILES sequences and structural cues, and further applies a paired data augmentation strategy to enhance training diversity and scale. On the USPTO-50K benchmark, the proposed approach achieves state-of-the-art performance among template-free methods and substantially outperforms a vanilla Transformer baseline.
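The abstract does not spell out how the graph information enters the attention mechanism. A common realization of such a graph prior in SMILES Transformers is an additive attention bias keyed on the graph distance between the atoms underlying each pair of tokens. The sketch below illustrates that idea only; the class name GraphBiasedSelfAttention, the graph_dist input, and the max_graph_dist bucketing are hypothetical and not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphBiasedSelfAttention(nn.Module):
    """Single-head self-attention over SMILES token embeddings with an
    additive bias derived from a molecular graph prior (here: bucketed
    shortest-path distances between the atoms behind each token).
    Illustrative sketch, not the paper's exact injection scheme."""

    def __init__(self, d_model: int, max_graph_dist: int = 8):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # One learned scalar bias per distance bucket; the largest bucket can
        # also absorb token pairs with no structural relation (non-atom tokens).
        self.dist_bias = nn.Embedding(max_graph_dist + 1, 1)
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor, graph_dist: torch.Tensor) -> torch.Tensor:
        # x:          (batch, seq_len, d_model) SMILES token embeddings
        # graph_dist: (batch, seq_len, seq_len) integer graph distances,
        #             clipped to max_graph_dist
        scores = torch.einsum("bid,bjd->bij", self.q(x), self.k(x)) * self.scale
        scores = scores + self.dist_bias(graph_dist).squeeze(-1)  # inject graph prior
        attn = F.softmax(scores, dim=-1)
        return torch.einsum("bij,bjd->bid", attn, self.v(x))


if __name__ == "__main__":
    layer = GraphBiasedSelfAttention(d_model=64)
    tokens = torch.randn(2, 10, 64)
    dists = torch.randint(0, 9, (2, 10, 10))
    print(layer(tokens, dists).shape)  # torch.Size([2, 10, 64])
```

The paired data augmentation mentioned in the abstract is likewise compatible with standard practice of enumerating randomized SMILES for the product and reactant sides together, though the exact pairing scheme used here is not detailed in this summary.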
Similar Papers
Retro3D: A 3D-aware Template-free Method for Enhancing Retrosynthesis via Molecular Conformer Information
Machine Learning (CS)
Finds better building blocks for new medicines.
Retro-Expert: Collaborative Reasoning for Interpretable Retrosynthesis
Machine Learning (CS)
AI finds easier ways to make new medicines.
Fast and scalable retrosynthetic planning with a transformer neural network and speculative beam search
Machine Learning (CS)
Speeds up finding new medicines with computers.