Score: 1

Context-Free Recognition with Transformers

Published: January 5, 2026 | arXiv ID: 2601.01754v1

By: Selim Jerad, Anej Svete, Sophie Hao, and more

Potential Business Impact:

Helps computers check whether text or code follows complex grammar rules.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Transformers excel on tasks that process well-formed inputs according to some grammar, such as natural language and code. However, it remains unclear how they can process grammatical syntax. In fact, under standard complexity conjectures, standard transformers cannot recognize context-free languages (CFLs), a canonical formalism for describing syntax, or even regular languages, a subclass of CFLs (Merrill et al., 2022). Merrill & Sabharwal (2024) show that $\mathcal{O}(\log n)$ looping layers (w.r.t. input length $n$) allow transformers to recognize regular languages, but the question of context-free recognition remained open. In this work, we show that looped transformers with $\mathcal{O}(\log n)$ looping layers and $\mathcal{O}(n^6)$ padding tokens can recognize all CFLs. However, training and inference with $\mathcal{O}(n^6)$ padding tokens is potentially impractical. Fortunately, we show that, for natural subclasses such as unambiguous CFLs, the recognition problem for transformers becomes more tractable, requiring only $\mathcal{O}(n^3)$ padding. We empirically validate our results and show that looping helps on a language that provably requires logarithmic depth. Overall, our results shed light on the intricacy of CFL recognition by transformers: while general recognition may require an intractable amount of padding, natural constraints such as unambiguity yield efficient recognition algorithms.
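As background for the $\mathcal{O}(n^3)$ figure, the classical CYK algorithm recognizes any CFL in $\mathcal{O}(n^3)$ time given a grammar in Chomsky normal form. The sketch below is that standard textbook recognizer, not the paper's looped-transformer construction; the grammar and identifiers are illustrative assumptions.

```python
def cyk_recognize(grammar, start, tokens):
    """Classical CYK recognition for a grammar in Chomsky normal form (CNF).

    `grammar` maps each nonterminal to a list of productions, where a
    production is either a terminal symbol (str) or a pair of nonterminals
    (tuple). Runs in O(n^3) time for an input of n tokens.
    """
    n = len(tokens)
    if n == 0:
        return False  # the empty string needs separate handling in CNF
    # table[i][j] = set of nonterminals that derive tokens[i..j] inclusive
    table = [[set() for _ in range(n)] for _ in range(n)]
    # Base case: length-1 spans via terminal productions A -> a
    for i, tok in enumerate(tokens):
        for lhs, rhss in grammar.items():
            if tok in rhss:
                table[i][i].add(lhs)
    # Inductive case: longer spans via binary productions A -> B C
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):  # split point between the B part and the C part
                for lhs, rhss in grammar.items():
                    for rhs in rhss:
                        if (isinstance(rhs, tuple)
                                and rhs[0] in table[i][k]
                                and rhs[1] in table[k + 1][j]):
                            table[i][j].add(lhs)
    return start in table[0][n - 1]


# Illustrative CNF grammar for non-empty balanced parentheses (a canonical CFL):
#   S -> S S | L R | L X,   X -> S R,   L -> "(",   R -> ")"
grammar = {
    "S": [("S", "S"), ("L", "R"), ("L", "X")],
    "X": [("S", "R")],
    "L": ["("],
    "R": [")"],
}
print(cyk_recognize(grammar, "S", list("(()())")))  # True
print(cyk_recognize(grammar, "S", list("(()")))     # False
```

The cubic cost comes from the three nested loops over span length, start position, and split point; the paper's result for unambiguous CFLs matches this classical budget in padding tokens, whereas the general construction needs $\mathcal{O}(n^6)$.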

Country of Origin
🇺🇸 🇨🇭 United States, Switzerland

Page Count
20 pages

Category
Computer Science: Machine Learning (CS)