Do Construction Distributions Shape Formal Language Learning In German BabyLMs?
By: Bastian Bunzeck, Daniel Duran, Sina Zarrieß
Potential Business Impact:
Helps computers learn language the way children do.
We analyze the influence of utterance-level construction distributions in German child-directed/child-available speech on the resulting word-level, syntactic, and semantic competence (and the underlying learning trajectories) of small LMs, which we train on a novel collection of developmentally plausible language data for German. We find that learning is surprisingly robust across markedly different distributions of constructions in the training data, which have little effect on final accuracies and almost no effect on global learning trajectories. While syntax learning benefits from more complex utterances, word-level learning achieves better scores with more fragmentary utterances. We argue that LMs trained on developmentally plausible data can contribute to debates on how conducive different kinds of linguistic stimuli are to language learning.
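To make the notion of an "utterance-level construction distribution" concrete, here is a minimal, hypothetical Python sketch of how one might measure such a distribution over a corpus of child-directed utterances. The coarse categories (fragment, question, clause) and the surface heuristics used to assign them are illustrative assumptions for this sketch, not the annotation scheme actually used in the paper.

```python
# Hypothetical sketch: measuring an utterance-level construction
# distribution in child-directed speech. Categories and heuristics
# are illustrative assumptions, not the authors' actual scheme.
from collections import Counter

def classify_utterance(utterance: str) -> str:
    """Assign a coarse, heuristic construction label to one utterance."""
    tokens = utterance.strip().rstrip("?!.").split()
    if not tokens:
        return "empty"
    if utterance.strip().endswith("?"):
        return "question"
    if len(tokens) <= 2:
        return "fragment"   # e.g. "da!", "der Ball"
    return "clause"         # longer, potentially full clauses

def construction_distribution(utterances: list[str]) -> dict[str, float]:
    """Return the relative frequency of each construction label."""
    counts = Counter(classify_utterance(u) for u in utterances)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}

if __name__ == "__main__":
    sample = ["Der Ball!", "Wo ist der Ball?", "Schau mal, der Hund rennt."]
    print(construction_distribution(sample))
    # {'fragment': 0.33..., 'question': 0.33..., 'clause': 0.33...}
```

Distributions computed this way over different training corpora could then be compared, or deliberately skewed toward more fragmentary or more clausal utterances, to vary the linguistic stimulus in the manner the abstract describes.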
Similar Papers
BabyLM's First Constructions: Causal probing provides a signal of learning
Computation and Language
Models learn language rules from less data.
Findings of the BabyLM Challenge: Sample-Efficient Pretraining on Developmentally Plausible Corpora
Computation and Language
Teaches computers to learn language the way babies do.
Do Syntactic Categories Help in Developmentally Motivated Curriculum Learning for Language Models?
Computation and Language
Teaches computers language rules from baby talk.