A Fundamental Algorithm for Dependency Parsing (With Corrections)
By: Michael A. Covington
Potential Business Impact:
Describes a way for computers to work out sentence structure one word at a time, much as people are thought to do.
This paper presents a fundamental algorithm for parsing natural language sentences into dependency trees. Unlike phrase-structure (constituency) parsers, the algorithm operates one word at a time, attaching each word as soon as it can be attached, which corresponds to properties claimed for the human sentence parser. Like phrase-structure parsing, its worst-case complexity is $O(n^3)$, but in human language the worst case arises only for small $n$.
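The word-by-word attachment strategy the abstract describes can be illustrated with a short sketch. The Python below is not the paper's own pseudocode: the toy grammar table, the `can_attach` predicate, and the restriction to a single-head check are assumptions made only to keep the example self-contained, and the cycle and crossing-link checks a full parser would need are omitted.

```python
# A minimal sketch of Covington-style incremental linking: each new word is
# compared against every earlier word and attached as soon as a link is allowed.
# TOY_GRAMMAR and can_attach are hypothetical stand-ins for a real grammar check.

TOY_GRAMMAR = {("barks", "dog"), ("dog", "the"), ("barks", "loudly")}

def can_attach(head, dependent):
    """Hypothetical grammar predicate: may `dependent` attach to `head`?"""
    return (head, dependent) in TOY_GRAMMAR

def parse(words):
    """Map each word index to its head index (None for unattached roots).

    Only the single-head constraint is enforced here; cycle and
    crossing-link (projectivity) checks are left out for brevity.
    """
    head = {i: None for i in range(len(words))}
    for i in range(len(words)):            # process words left to right
        for j in range(i - 1, -1, -1):     # scan leftward over earlier words
            # earlier word j as head of the new word i
            if head[i] is None and can_attach(words[j], words[i]):
                head[i] = j
            # new word i as head of a still-unattached earlier word j
            if head[j] is None and can_attach(words[i], words[j]):
                head[j] = i
    return head

if __name__ == "__main__":
    # {0: 1, 1: 2, 2: None, 3: 2}: "barks" is the root, "dog" and "loudly"
    # attach to it, and "the" attaches to "dog".
    print(parse(["the", "dog", "barks", "loudly"]))
```

Because each new word is compared against every earlier word, the sketch alone examines $O(n^2)$ word pairs; any per-link well-formedness checks add to that cost.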
Similar Papers
Counting trees: A treebank-driven exploration of syntactic variation in speech and writing across languages
Computation and Language
Shows how spoken and written language differ in sentence structure across languages.
Hierarchical Bracketing Encodings Work for Dependency Graphs
Computation and Language
Shows that bracket-based encodings of sentence structure also work for dependency graphs, not just trees.
Step-by-step Instructions and a Simple Tabular Output Format Improve the Dependency Parsing Accuracy of LLMs
Computation and Language
Shows that clear instructions and a simple table format make language models noticeably better at parsing sentences.