SinLlama -- A Large Language Model for Sinhala
By: H. W. K. Aravinda, Rashad Sirajudeen, Samith Karunathilake and more
Potential Business Impact:
Helps computers understand and use the Sinhala language.
Low-resource languages such as Sinhala are often overlooked by open-source Large Language Models (LLMs). In this research, we extend an existing multilingual LLM (Llama-3-8B) to better serve Sinhala. We enhance the LLM tokenizer with Sinhala-specific vocabulary and perform continual pre-training on a cleaned 10 million Sinhala corpus, resulting in the SinLlama model. This is the first decoder-based open-source LLM with explicit Sinhala support. When SinLlama was instruction fine-tuned for three text classification tasks, it outperformed the base and instruct variants of Llama-3-8B by a significant margin.
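The tokenizer-extension step described above can be illustrated with a minimal sketch, assuming the Hugging Face transformers API. The token-list file `sinhala_tokens.txt` and the exact set of added tokens are illustrative assumptions, not the authors' actual setup; the key idea is adding language-specific vocabulary and resizing the embedding matrix before continual pre-training.

```python
# Minimal sketch of vocabulary extension for Sinhala (not the authors' code).
# Assumes access to the base Llama-3-8B checkpoint via Hugging Face transformers
# and a hypothetical file of Sinhala subword pieces mined from the corpus.
from transformers import AutoTokenizer, AutoModelForCausalLM

base_model = "meta-llama/Meta-Llama-3-8B"

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

# Hypothetical list of Sinhala-specific tokens, one per line.
with open("sinhala_tokens.txt", encoding="utf-8") as f:
    new_tokens = [line.strip() for line in f if line.strip()]

# add_tokens skips pieces already in the vocabulary and returns how many were added;
# the embedding matrix is then resized so continual pre-training can learn the new rows.
num_added = tokenizer.add_tokens(new_tokens)
model.resize_token_embeddings(len(tokenizer))
print(f"Added {num_added} Sinhala tokens; new vocabulary size: {len(tokenizer)}")
```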
Similar Papers
SinhalaMMLU: A Comprehensive Benchmark for Evaluating Multitask Language Understanding in Sinhala
Computation and Language
Measures how well computers understand Sinhala across many tasks.
Llama-3-Nanda-10B-Chat: An Open Generative Large Language Model for Hindi
Computation and Language
Helps computers understand and speak Hindi better.