Score: 1

Enhancing Burmese News Classification with Kolmogorov-Arnold Network Head Fine-tuning

Published: November 26, 2025 | arXiv ID: 2511.21081v1

By: Thura Aung, Eaint Kay Khaing Kyaw, Ye Kyaw Thu, and more

Potential Business Impact:

Helps computers understand languages with fewer examples.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

In low-resource languages like Burmese, classification tasks often fine-tune only the final classification layer, keeping pre-trained encoder weights frozen. While Multi-Layer Perceptrons (MLPs) are commonly used as this head, their fixed non-linearity can limit expressiveness and increase computational cost. This work explores Kolmogorov-Arnold Networks (KANs) as alternative classification heads, evaluating Fourier-based FourierKAN, Spline-based EfficientKAN, and Grid-based FasterKAN across diverse embeddings, including TF-IDF, fastText, and multilingual transformers (mBERT, Distil-mBERT). Experimental results show that KAN-based heads are competitive with or superior to MLPs. EfficientKAN with fastText achieved the highest F1-score (0.928), while FasterKAN offered the best trade-off between speed and accuracy. On transformer embeddings, EfficientKAN matched or slightly outperformed MLPs with mBERT (0.917 F1). These findings highlight KANs as expressive, efficient alternatives to MLPs for low-resource language classification.
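To make the setup concrete: the key idea of a KAN head is that, instead of a linear map followed by a fixed activation (as in an MLP), each input-output edge carries its own learnable univariate function. The sketch below is a minimal, hedged illustration of a Fourier-based KAN layer sitting on top of frozen embeddings; it is not the paper's implementation. The class name `FourierKANLayer`, the grid size, the initialization, and the dimensions (768 for mBERT-style embeddings, 5 illustrative news labels) are all assumptions made here for clarity.

```python
import torch
import torch.nn as nn

class FourierKANLayer(nn.Module):
    """Illustrative KAN layer (not the paper's code): each edge learns a
    truncated Fourier series instead of applying a fixed non-linearity."""

    def __init__(self, in_dim: int, out_dim: int, grid_size: int = 5):
        super().__init__()
        self.grid_size = grid_size
        # Cosine and sine coefficients per (output, input, frequency) edge.
        self.coeffs = nn.Parameter(
            torch.randn(2, out_dim, in_dim, grid_size)
            / (in_dim * grid_size) ** 0.5
        )
        self.bias = nn.Parameter(torch.zeros(out_dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_dim); frequencies k = 1..grid_size
        k = torch.arange(1, self.grid_size + 1, device=x.device, dtype=x.dtype)
        xk = x.unsqueeze(-1) * k              # (batch, in_dim, grid_size)
        cos, sin = torch.cos(xk), torch.sin(xk)
        # Sum the learned series over all inputs and frequencies.
        y = torch.einsum("big,oig->bo", cos, self.coeffs[0]) \
          + torch.einsum("big,oig->bo", sin, self.coeffs[1])
        return y + self.bias

# Head-only fine-tuning: the encoder stays frozen, so only this layer trains.
head = FourierKANLayer(in_dim=768, out_dim=5)
embeddings = torch.randn(32, 768)             # stand-in for frozen mBERT output
logits = head(embeddings)                     # (32, 5), feed to cross-entropy loss
```

Because only the head's coefficients receive gradients, the trainable parameter count stays small, which is the regime the paper targets for low-resource Burmese classification.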

Country of Origin
🇹🇭 Thailand

Repos / Data Links

Page Count
6 pages

Category
Computer Science: Computation and Language