A First Context-Free Grammar Applied to Nawatl Corpora Augmentation

Published: October 6, 2025 | arXiv ID: 2510.04945v1

By: Juan-José Guzmán-Landa, Juan-Manuel Torres-Moreno, Miguel Figueroa-Saavedra and others

Potential Business Impact:

Enables language models to be trained for Nawatl, a low-resource Indigenous language, by generating grammatically correct artificial sentences to augment scarce training corpora.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

In this article we introduce a context-free grammar (CFG) for the Nawatl language. Nawatl (or Nahuatl) is an Amerindian language of the π-language type, i.e. a language with few digital resources, for which the corpora available for machine learning are virtually non-existent. The objective is to generate a significant number of grammatically correct artificial sentences in order to increase the corpora available for language model training. We show that a grammar enables us to significantly expand a Nawatl corpus which we call π-YALLI. The corpus, thus enriched, allows us to train algorithms such as FastText and to evaluate them on sentence-level semantic tasks. Preliminary results show that using the grammar yields improvements over some LLMs. However, achieving more substantial gains will require grammars that model the Nawatl language even more faithfully.

Country of Origin
🇲🇽 🇫🇷 Mexico, France

Page Count
11 pages

Category
Computer Science:
Computation and Language