Multilingual MFA: Forced Alignment on Low-Resource Related Languages
By: Alessio Tosolini, Claire Bowern
Potential Business Impact:
Helps speech software work with new languages faster, using less training data.
We compare the outcomes of multilingual and crosslingual training for related and unrelated Australian languages with similar phonological inventories. We use the Montreal Forced Aligner to train acoustic models from scratch and to adapt a large English model, evaluating results on seen data, on unseen data from seen languages, and on unseen data from unseen languages. Results indicate benefits of adapting the English baseline model for previously unseen languages.
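To make the train-from-scratch versus adapt-from-English comparison concrete, here is a minimal sketch of the two MFA workflows the abstract describes, driven from Python. The `mfa train`, `mfa adapt`, `mfa model download`, and `mfa align` subcommands are the standard Montreal Forced Aligner 2.x CLI; the corpus, dictionary, and model paths are hypothetical placeholders, not the paper's actual data.

```python
# Sketch of the two MFA workflows described in the abstract, assuming MFA 2.x
# is installed (e.g. via `pip install montreal-forced-aligner`). All paths
# below are hypothetical placeholders.
import subprocess

def run(cmd):
    """Run an MFA CLI command, echoing it and failing loudly on error."""
    print(">>", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Workflow 1: train an acoustic model from scratch on the target-language corpus.
run([
    "mfa", "train",
    "corpus/target_lang",             # hypothetical corpus of .wav + .TextGrid files
    "dictionaries/target_lang.dict",  # hypothetical pronunciation dictionary
    "models/target_lang_scratch.zip",
])

# Workflow 2: adapt the pretrained English acoustic model to the target language.
run(["mfa", "model", "download", "acoustic", "english_mfa"])
run([
    "mfa", "adapt",
    "corpus/target_lang",
    "dictionaries/target_lang.dict",
    "english_mfa",                     # pretrained English baseline model
    "models/target_lang_adapted.zip",
])

# Align held-out data with either model to compare the seen/unseen conditions.
run([
    "mfa", "align",
    "corpus/heldout",
    "dictionaries/target_lang.dict",
    "models/target_lang_adapted.zip",
    "alignments/heldout",
])
```

The same `mfa align` call can be repeated with the from-scratch model to compare alignments across the three evaluation conditions.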
Similar Papers
Data Augmentation and Hyperparameter Tuning for Low-Resource MFA
Computation and Language
Improves computer understanding of rare languages.
Can you map it to English? The Role of Cross-Lingual Alignment in Multilingual Performance of LLMs
Computation and Language
Helps computers understand many languages without extra training.
Fluent Alignment with Disfluent Judges: Post-training for Lower-resource Languages
Computation and Language
Teaches computers to speak less common languages well.