"Words in a sentence are not merely strung together... they interconnect in a hierarchical manner."
Unlike phrase structure grammars that group words into constituents, Dependency Grammar focuses on the pairwise relationships between individual words. Each word is linked to a word it depends on, establishing a web of head-dependent relations.
Lucien Tesnière, the French linguist who revolutionized syntax with "Éléments de syntaxe structurale" (1959), introduced "stemmas" (tree diagrams) in which the verb is the central pivot governing the other words.
"John loves Mary" - The verb loves governs both nouns: John (subject) and Mary (object).
The Head determines the syntactic behavior of the pair; the Dependent modifies or complements the head. In "John eats", eats is the head and John is its dependent.
Relationships are asymmetric: they flow from Head to Dependent. This directionality, together with the label on each arc, defines the role each word plays (e.g., Subject vs. Object).
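A minimal sketch (plain Python, no parser library) of how the asymmetric head-to-dependent arcs for "John loves Mary" can be represented. Each arc is a (head, relation, dependent) triple; the relation names (nsubj, obj) follow the widely used Universal Dependencies labels, assumed here for illustration.

```python
# Directed arcs for "John loves Mary": (head, relation, dependent).
# "ROOT" is a conventional pseudo-node marking the sentence head.
arcs = [
    ("ROOT", "root", "loves"),   # the verb is the structural center
    ("loves", "nsubj", "John"),  # subject depends on the verb
    ("loves", "obj", "Mary"),    # object depends on the verb
]

def dependents(head, arcs):
    """Return the (relation, word) pairs governed by `head`."""
    return [(rel, dep) for h, rel, dep in arcs if h == head]

print(dependents("loves", arcs))  # -> [('nsubj', 'John'), ('obj', 'Mary')]
```

Because arcs are directed, asking for the dependents of "loves" and asking for the head of "John" are different queries over the same structure.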
Dependency grammar is excellent for languages with flexible word order because it focuses on relations rather than linear placement.
Transition-based parsing, also known as Shift-Reduce parsing, builds the tree step by step using a stack and a buffer: words are shifted from the buffer onto the stack, and arcs are created between stack items.
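The stack-and-buffer mechanism can be sketched in a few lines. The action sequence below is hand-written for "John loves Mary" (an oracle); a real transition-based parser predicts each action with a trained classifier.

```python
# Minimal arc-standard shift-reduce dependency parser.
# SHIFT moves the next buffer word onto the stack; LEFT-ARC and
# RIGHT-ARC attach one of the top two stack items to the other.

def parse(words, actions):
    stack, buffer, arcs = [], list(words), []
    for action in actions:
        if action == "SHIFT":
            stack.append(buffer.pop(0))
        elif action == "LEFT-ARC":    # second-from-top depends on top
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif action == "RIGHT-ARC":   # top depends on second-from-top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

arcs = parse(
    ["John", "loves", "Mary"],
    ["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "RIGHT-ARC"],
)
print(arcs)  # -> [('loves', 'John'), ('loves', 'Mary')]
```

Each action is cheap and local, which is why transition-based parsers run in linear time over the sentence.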
Graph-based parsing models the task as a graph optimization problem: it scores every possible head-dependent edge and then searches for the highest-scoring tree, typically with a maximum-spanning-tree algorithm.
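A sketch of the graph-based idea: score every candidate head-dependent edge, then pick the best head for each word. The scores are hand-set for illustration (real parsers learn them), and the greedy argmax below is a simplification: production parsers use a maximum-spanning-tree algorithm (Chu-Liu/Edmonds) to guarantee a well-formed tree.

```python
# Every possible edge gets a score; parsing = finding the best tree.
words = ["ROOT", "John", "loves", "Mary"]

# scores[(head, dependent)] = plausibility of that edge (illustrative values)
scores = {
    ("ROOT", "loves"): 9.0, ("ROOT", "John"): 1.0, ("ROOT", "Mary"): 1.0,
    ("loves", "John"): 8.0, ("loves", "Mary"): 7.0,
    ("John", "loves"): 2.0, ("Mary", "loves"): 2.0,
    ("John", "Mary"): 0.5, ("Mary", "John"): 0.5,
}

def best_heads(words, scores):
    """Greedily choose the highest-scoring head for each non-ROOT word."""
    tree = {}
    for dep in words[1:]:
        candidates = [h for h in words if h != dep and (h, dep) in scores]
        tree[dep] = max(candidates, key=lambda h: scores[(h, dep)])
    return tree

print(best_heads(words, scores))
# -> {'John': 'loves', 'loves': 'ROOT', 'Mary': 'loves'}
```

Note the contrast with the transition-based approach: instead of committing to local actions one at a time, the graph-based parser reasons over all edges globally.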
From decoding tweets to translating languages, dependency relations are the backbone of modern NLP.
Structuring formal text. Identifying "Who did what to whom".
Handling informal grammar, fragments, and hashtags.
Resolving long-distance dependencies in complex sentences.
Named Entity Recognition (NER) and Event Extraction.
Mapping question structure to potential answers.
Parsing → Transfer → Generation. Handling word order differences.
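The "who did what to whom" applications above can be sketched as a triple extractor over labeled arcs. The parse here is given by hand as (head, relation, dependent) triples; in practice it would come from a trained parser, and the relation labels (nsubj, obj) follow Universal Dependencies naming, assumed for illustration.

```python
# Extract (subject, verb, object) triples from a labeled dependency parse.

def extract_svo(arcs):
    """Return (subject, verb, object) for every verb with both a subject
    and an object among its dependents."""
    subj = {h: d for h, rel, d in arcs if rel == "nsubj"}
    obj = {h: d for h, rel, d in arcs if rel == "obj"}
    return [(subj[v], v, obj[v]) for v in subj if v in obj]

arcs = [
    ("ROOT", "root", "sold"),
    ("sold", "nsubj", "Acme"),      # who
    ("sold", "obj", "widgets"),     # to whom / what
    ("widgets", "amod", "cheap"),   # modifier, ignored by the extractor
]
print(extract_svo(arcs))  # -> [('Acme', 'sold', 'widgets')]
```

Because the extractor reads relations rather than word positions, the same code works regardless of where the subject and object appear in the surface order, which is exactly the flexibility dependency grammar offers.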
In dependency grammar, what is the 'Head' of the sentence usually considered to be?