Construction Identification and Disambiguation Using BERT: A Case Study of NPN
By: Wesley Scivetti, Nathan Schneider
Potential Business Impact:
Shows that computers can recognize and correctly interpret tricky word patterns like "day to day."
Construction Grammar hypothesizes that knowledge of a language consists chiefly of knowledge of form-meaning pairs ("constructions") that include vocabulary, general grammar rules, and even idiosyncratic patterns. Recent work has shown that transformer language models represent at least some constructional patterns, including ones where the construction is rare overall. In this work, we probe BERT's representation of the form and meaning of a minor construction of English, the NPN (noun-preposition-noun) construction -- exhibited in such expressions as face to face and day to day -- which is known to be polysemous. We construct a benchmark dataset of semantically annotated corpus instances (including distractors that superficially resemble the construction). With this dataset, we train and evaluate probing classifiers. They achieve decent discrimination of the construction from distractors, as well as sense disambiguation among true instances of the construction, revealing that BERT embeddings carry indications of the construction's semantics. Moreover, artificially permuting the word order of true construction instances causes them to be rejected, indicating sensitivity to matters of form. We conclude that BERT does latently encode at least some knowledge of the NPN construction going beyond a surface syntactic pattern and lexical cues.
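
To make the probing setup concrete, here is a minimal sketch of how such a classifier could be trained. This is not the authors' released code: the model choice (bert-base-uncased), the mean-pooled span embedding, and the toy sentences and labels are illustrative assumptions rather than details taken from the paper's benchmark.

import torch
from sklearn.linear_model import LogisticRegression
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def span_embedding(sentence: str, span: str) -> torch.Tensor:
    """Mean-pool BERT's last-layer vectors over the tokens of `span`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (seq_len, 768)
    span_ids = tokenizer(span, add_special_tokens=False)["input_ids"]
    ids = enc["input_ids"][0].tolist()
    for i in range(len(ids) - len(span_ids) + 1):  # first exact match
        if ids[i:i + len(span_ids)] == span_ids:
            return hidden[i:i + len(span_ids)].mean(dim=0)
    raise ValueError(f"{span!r} not found in {sentence!r}")

# Toy data: true NPN instances (label 1) vs. look-alike distractors (label 0)
# in which "to face" / "to hand" are infinitive verbs, not the construction.
data = [
    ("They argued face to face for an hour.", "face to face", 1),
    ("The workload varies day to day.", "day to day", 1),
    ("Troops fought hand to hand in the streets.", "hand to hand", 1),
    ("He turned his face to face the crowd.", "face to face", 0),
    ("She raised her hand to hand him the keys.", "hand to hand", 0),
]
X = torch.stack([span_embedding(s, sp) for s, sp, _ in data]).numpy()
y = [label for _, _, label in data]

probe = LogisticRegression(max_iter=1000).fit(X, y)
print(probe.predict(X))  # with real data, evaluate on a held-out split

With the actual benchmark, the probe would of course be evaluated on held-out instances; the paper's form-sensitivity check corresponds to embedding artificially permuted spans (e.g. "to face face") and verifying that the probe rejects them.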
Similar Papers
BabyLM's First Constructions: Causal probing provides a signal of learning
Computation and Language
Models learn language rules from less data.
Evaluating CxG Generalisation in LLMs via Construction-Based NLI Fine Tuning
Computation and Language
Tests whether language models truly generalize grammar patterns to new sentences.
On the Geometry of Semantics in Next-token Prediction
Computation and Language
Maps how word meanings are organized inside next-word prediction models.