Score: 1

Probability Distribution Collapse: A Critical Bottleneck to Compact Unsupervised Neural Grammar Induction

Published: September 25, 2025 | arXiv ID: 2509.20734v1

By: Jinwook Park, Kangil Kim

Potential Business Impact:

**Teaches computers grammar without labeled examples.**

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Unsupervised neural grammar induction aims to learn interpretable hierarchical structures from language data. However, existing models face an expressiveness bottleneck, often resulting in unnecessarily large yet underperforming grammars. We identify a core issue, *probability distribution collapse*, as the underlying cause of this limitation. We analyze when and how the collapse emerges across key components of neural parameterization and introduce a targeted solution, *collapse-relaxing neural parameterization*, to mitigate it. Our approach substantially improves parsing performance while enabling the use of significantly more compact grammars across a wide range of languages, as demonstrated through extensive empirical analysis.
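To make the abstract's central symptom concrete, here is a minimal sketch (not the authors' code, and not their proposed collapse-relaxing parameterization) of how such a collapse can be observed in a neurally parameterized PCFG: when many nonterminal embeddings pass through a shared softmax head, their rule distributions can end up nearly identical, wasting grammar capacity. All names (`RuleHead`, `n_nonterminals`, the dimensions) are hypothetical choices for illustration.

```python
# Hypothetical sketch: detecting "probability distribution collapse" in a
# neural PCFG-style rule head. Near-zero mean pairwise KL between the rule
# distributions of different nonterminals indicates collapse.
import torch
import torch.nn as nn
import torch.nn.functional as F

n_nonterminals, n_rules, dim = 30, 900, 256  # illustrative sizes

class RuleHead(nn.Module):
    """Maps each nonterminal embedding to a distribution over rules
    through a shared MLP + softmax (a common neural parameterization)."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(n_nonterminals, dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(),
            nn.Linear(dim, n_rules),
        )

    def forward(self):
        logits = self.mlp(self.embed.weight)   # (n_nonterminals, n_rules)
        return F.log_softmax(logits, dim=-1)   # log rule probabilities

def mean_pairwise_kl(log_p: torch.Tensor) -> torch.Tensor:
    """Average KL(p_i || p_j) over all ordered pairs i != j.
    Values near zero mean the nonterminals' distributions have collapsed."""
    p = log_p.exp()
    # kl[i, j] = sum_k p_i[k] * (log p_i[k] - log p_j[k])
    kl = (p.unsqueeze(1) * (log_p.unsqueeze(1) - log_p.unsqueeze(0))).sum(-1)
    n = kl.size(0)
    return kl.sum() / (n * (n - 1))  # diagonal terms are zero

head = RuleHead()
with torch.no_grad():
    print(f"mean pairwise KL: {mean_pairwise_kl(head()):.4f}")
```

Tracking a diagnostic like this during training shows whether distinct nonterminals retain distinct rule distributions; the paper's contribution is a parameterization change that keeps this diversity from collapsing, which is what allows smaller grammars to match or beat larger ones.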

Country of Origin
🇰🇷 Korea, Republic of

Repos / Data Links

Page Count
12 pages

Category
Computer Science:
Computation and Language