From Smør-re-brød to Subwords: Training LLMs on Danish, One Morpheme at a Time
By: Mikkel Wildner Kildeberg, Emil Allerslev Schledermann, Nicolaj Larsen, and more
Potential Business Impact:
Helps computers understand Danish words better.
The best-performing transformer-based language models use subword tokenization techniques such as Byte-Pair Encoding (BPE). However, these approaches often overlook linguistic principles such as morphological segmentation, which we believe is fundamental for capturing language-specific word structure. In this study, we leverage an annotated Danish morphological dataset to train a semi-supervised model for morphological segmentation, enabling the development of tokenizers optimized for Danish morphology. We evaluate four distinct tokenizers, including two custom morphological tokenizers, by analyzing their performance in morphologically segmenting Danish words. Additionally, we train two generative transformer models, CerebrasGPT-111M and LLaMA-3.2 1B, with these tokenizers and evaluate their downstream performance. Our findings reveal that our custom-developed tokenizers substantially improve morphological segmentation, achieving an F1 score of 58.84, compared to 39.28 for a Danish BPE tokenizer. In downstream tasks, models trained with our morphological tokenizers outperform those using BPE tokenizers across evaluation metrics. These results show that incorporating Danish morphological segmentation strategies into tokenizers improves the performance of generative transformer models on Danish language tasks.
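The abstract compares tokenizers by how well they segment Danish words into morphemes, reported as an F1 score. As an illustration only (the paper does not publish its evaluation code, so function names and the exact metric variant here are assumptions), a common way to score segmentation is boundary-level F1: each segmentation is reduced to its set of internal split positions, and precision/recall are computed over those positions against a gold standard.

```python
# Hypothetical sketch of boundary-level F1 for morphological segmentation.
# A segmentation is a list of morphemes, e.g. ["smør", "re", "brød"].

def boundaries(morphemes):
    """Return the set of internal split positions of a segmentation."""
    positions, offset = set(), 0
    for piece in morphemes[:-1]:  # the word's end is not a boundary
        offset += len(piece)
        positions.add(offset)
    return positions

def segmentation_f1(predicted, gold):
    """Micro-averaged boundary F1 over parallel lists of segmentations."""
    tp = fp = fn = 0
    for pred, ref in zip(predicted, gold):
        p, g = boundaries(pred), boundaries(ref)
        tp += len(p & g)   # boundaries found in both
        fp += len(p - g)   # spurious predicted boundaries
        fn += len(g - p)   # gold boundaries the tokenizer missed
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Example: a tokenizer splits "smørrebrød" as smørre+brød,
# while the gold analysis is smør+re+brød.
pred = [["smørre", "brød"]]
gold = [["smør", "re", "brød"]]
score = segmentation_f1(pred, gold)  # one correct boundary, one missed
```

Under this metric, a BPE tokenizer can score poorly even when its subwords are frequent strings, because frequent merges need not align with morpheme boundaries — which is the gap the paper's morphological tokenizers are designed to close.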
Similar Papers
MorphBPE: A Morpho-Aware Tokenizer Bridging Linguistic Complexity for Efficient LLM Training Across Morphologies
Computation and Language
Teaches computers to understand word parts better.
MorphTok: Morphologically Grounded Tokenization for Indian Languages
Computation and Language
Helps computers understand languages better.
Subword Tokenization Strategies for Kurdish Word Embeddings
Computation and Language
Helps computers understand Kurdish words better.