Neural Induction of Finite-State Transducers

Published: January 16, 2026 | arXiv ID: 2601.10918v1

By: Michael Ginn, Alexis Palmer, Mans Hulden

Potential Business Impact:

Automatically learns fast, interpretable rules for transforming words (e.g., inflecting verbs or predicting pronunciations), reducing the need for hand-built rule systems.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Finite-State Transducers (FSTs) are effective models for string-to-string rewriting tasks, often providing the efficiency necessary for high-performance applications, but constructing transducers by hand is difficult. In this work, we propose a novel method for automatically constructing unweighted FSTs that follow the hidden-state geometry learned by a recurrent neural network. We evaluate our method on real-world datasets for morphological inflection, grapheme-to-phoneme prediction, and historical normalization, showing that the constructed FSTs are highly accurate and robust across many datasets, outperforming classical transducer learning algorithms by up to 87% in accuracy on held-out test sets.
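The listing does not spell out the extraction procedure, but the core idea it describes is turning the hidden-state geometry of a trained RNN into discrete FST states. The sketch below is a minimal illustration of one plausible pipeline, not the authors' algorithm: a toy (untrained) numpy RNN stands in for a trained model, k-means clustering stands in for the paper's treatment of hidden-state geometry, and the training pairs are assumed to be character-aligned. All names and data here are hypothetical.

```python
# Illustrative sketch: derive FST states by clustering RNN hidden states,
# then read off labeled transitions from consecutive states along each
# aligned input/output pair. Assumptions (not from the paper): toy RNN,
# k-means clustering, pre-aligned input/output symbols.

import numpy as np
from sklearn.cluster import KMeans
from collections import defaultdict

rng = np.random.default_rng(0)
H = 16  # hidden size (assumed)

# Toy character-aligned training pairs (input symbol -> output symbol per step).
pairs = [
    (list("walk") + ["+PST"], list("walk") + ["ed"]),
    (list("jump") + ["+PST"], list("jump") + ["ed"]),
]

vocab = sorted({c for xs, _ in pairs for c in xs})
emb = {c: rng.normal(size=H) for c in vocab}            # input embeddings
W_in = rng.normal(size=(H, H)) * 0.3
W_h = rng.normal(size=(H, H)) * 0.3

def rnn_states(xs):
    """Return the sequence of hidden states h_1..h_n for input symbols xs."""
    h, states = np.zeros(H), []
    for c in xs:
        h = np.tanh(W_in @ emb[c] + W_h @ h)
        states.append(h.copy())
    return states

# 1) Collect hidden states over the training data.
all_states, traces = [], []
for xs, ys in pairs:
    hs = rnn_states(xs)
    traces.append((xs, ys, hs))
    all_states.extend(hs)

# 2) Cluster hidden states; each cluster becomes one FST state.
k = 4
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(np.array(all_states))

# 3) Read off transitions (state, in_symbol) -> (next_state, out_symbol).
transitions = defaultdict(set)
START = -1  # designated start state
for xs, ys, hs in traces:
    labels = km.predict(np.array(hs))
    prev = START
    for x_sym, y_sym, q in zip(xs, ys, labels):
        transitions[(prev, x_sym)].add((int(q), y_sym))
        prev = int(q)

for (q, x), arcs in sorted(transitions.items(), key=str):
    for (q2, y) in sorted(arcs):
        print(f"{q} --{x}:{y}--> {q2}")
```

In this toy version the resulting transition table is only as good as the clustering; with a trained RNN, states that the network treats alike should fall in the same cluster, which is what makes the extracted transducer compact and deterministic enough to use.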

Country of Origin
πŸ‡ΊπŸ‡Έ United States

Page Count
15 pages

Category
Computer Science:
Computation and Language