Title :
Probabilistic dependency grammar, and its application in constructing language models for speech applications
Author_Institution :
IBM UK Labs., Winchester, UK
Abstract :
Dependency grammar defines grammatical notions in terms of direct links between words at the lexical level. These lexical dependency relations are considered more fundamental than other representations, such as phrase structure analyses, which can be derived from them. In simple cases there is a direct correspondence between dependencies and phrase structures. The dependencies of a given word can be viewed as conditioning it on its context, in much the same way that prior history conditions the occurrence of words in n-gram modelling methods. Dependency grammar can be shown to have a probabilistic interpretation in which conditional word probabilities are evaluated by following the dependency links rather than the strict lexical word order. An experiment is proposed to test the assumption that dependencies between words are likely to be local, with their strength decreasing monotonically with distance. Two fundamental problems are then posed: parsing, in which the dependency relationships for a given sentence must be established when a dependency grammar is given, and estimation, in which the parameters defining those dependency relationships must be determined.
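A minimal illustrative sketch (not taken from the paper): writing h(i) for the position of the assumed head of word w_i under a given dependency analysis, the two factorisations can be contrasted as

  P(w_1 \dots w_m) \approx \prod_{i=1}^{m} P(w_i \mid w_{i-n+1}, \dots, w_{i-1})    (n-gram: conditioning on prior lexical history)
  P(w_1 \dots w_m) \approx \prod_{i=1}^{m} P(w_i \mid w_{h(i)})                     (dependency: conditioning follows the links)

so that in the dependency case each conditional probability is evaluated along a link rather than along strict word order; h(i) and the single-head conditioning are assumptions made here for illustration only.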
Keywords :
computational linguistics; grammars; probability; speech analysis and processing; conditional word probabilities; dependency links; estimation; grammatical notions; lexical dependency relations; lexical level; monotonically decreasing; n-gram modelling methods; parsing; phrase structure analyses; prior history; probabilistic dependency grammar; probabilistic interpretation; speech applications; strict lexical word order;
Conference_Title :
IEE Colloquium on Grammatical Inference: Theory, Applications and Alternatives
Conference_Location :
Colchester