Score: 2

Can maiBERT Speak for Maithili?

Published: September 18, 2025 | arXiv ID: 2509.15048v1

By: Sumit Yadav, Raju Kumar Yadav, Utsav Maskey, and more

Potential Business Impact:

Helps computers understand Maithili, opening the language to digital and AI applications such as news classification, sentiment analysis, and named entity recognition.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Natural Language Understanding (NLU) for low-resource languages remains a major challenge in NLP due to the scarcity of high-quality data and language-specific models. Maithili, despite being spoken by millions, lacks adequate computational resources, limiting its inclusion in digital and AI-driven applications. To address this gap, we introduce maiBERT, a BERT-based language model pre-trained specifically for Maithili using the Masked Language Modeling (MLM) technique. Our model is trained on a newly constructed Maithili corpus and evaluated through a news classification task. In our experiments, maiBERT achieved an accuracy of 87.02%, outperforming existing regional models such as NepBERTa and HindiBERT, with a 0.13% overall accuracy gain and a 5–7% improvement across various classes. We have open-sourced maiBERT on Hugging Face, enabling further fine-tuning for downstream tasks such as sentiment analysis and Named Entity Recognition (NER).
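Because the checkpoint is released on Hugging Face, it can be loaded through the standard transformers API. The sketch below is a minimal, hedged example of querying the model's MLM head: the repository ID is a placeholder (the abstract does not give the exact name), and the Maithili probe sentence is purely illustrative.

```python
# Minimal sketch: probing maiBERT's masked-language-modeling head.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

MODEL_ID = "maiBERT"  # placeholder; substitute the actual Hugging Face repo ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)

# The "fill-mask" pipeline exercises the same MLM objective used during
# pre-training: the model ranks candidate tokens for the masked position.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Illustrative sentence: "Maithili is a [MASK] language."
for pred in fill_mask(f"मैथिली एक {tokenizer.mask_token} भाषा अछि।"):
    print(pred["token_str"], round(pred["score"], 3))
```

For downstream tasks such as the paper's news classification benchmark, the same checkpoint would instead be loaded with AutoModelForSequenceClassification and fine-tuned on labeled data.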

Country of Origin
🇦🇺 Australia


Page Count
10 pages

Category
Computer Science:
Computation and Language