In this chapter, we take a look at parsing in the framework of dependency grammar. The chapter is called "Dependency Parsing" rather than, say, "Parsing with Dependency Grammars" because, for reasons of space and time, it is concerned only with the creation of dependency trees for input sentences and not with other issues, such as word order, which are also essential for determining grammaticality. Also, the coverage of the grammar will be minimal: it is only meant to illustrate the application of the techniques presented here.
Dependency parsing is particularly interesting because it exhibits, in a very simple way, two fundamental forms of ambiguity that commonly arise in parsing: lexical ambiguity and structural ambiguity. For example, "saw" may be either a noun or a verb (lexical ambiguity), while in "she saw the man with the telescope" the prepositional phrase may attach either to "saw" or to "man" (structural ambiguity). Dependency parsing thus allows us to showcase convincingly how constraint-based techniques can effectively handle such ambiguities.
Maruyama [Mar90] was the first to propose a complete treatment of dependency grammar as a CSP and described parsing as a process of incremental disambiguation. Harper [HHW99] continued this line of research and proposed several algorithmic improvements within the MUSE CSP framework [HH93]. Menzel [Men98, HKMS98, MS98] advocates the use of soft graded constraints for robustness, e.g. when parsing spoken language. His proposal turns parsing into a more expensive optimization problem, but adapts gracefully to constraint violations.
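To make the CSP view of dependency parsing concrete, here is a minimal sketch, not Maruyama's actual formulation: each word position gets a variable ranging over its candidate governors (with 0 denoting an artificial root), and a solution is an assignment that forms a tree and satisfies the grammar's constraints. The three-word sentence, the toy grammar encoded in `grammatical`, and all function names are illustrative assumptions; brute-force enumeration stands in for the incremental domain filtering a real constraint solver would perform.

```python
from itertools import product

words = ["she", "saw", "stars"]      # word positions 1..3; 0 is the root
POSITIONS = range(len(words) + 1)

def is_tree(heads):
    """heads maps each word position to its governor; check that exactly
    one word depends on the root and that walking up never cycles."""
    if list(heads.values()).count(0) != 1:
        return False
    for i in heads:
        seen, j = set(), i
        while j != 0:
            if j in seen:
                return False
            seen.add(j)
            j = heads[j]
    return True

def grammatical(heads):
    """Hypothetical toy grammar: the verb (position 2) depends on the
    root, and both nominals depend on the verb."""
    return heads[2] == 0 and heads[1] == 2 and heads[3] == 2

# Initial domains: any position other than the word itself.
domains = {i: [j for j in POSITIONS if j != i] for i in (1, 2, 3)}

solutions = [assignment
             for combo in product(*domains.values())
             for assignment in [dict(zip(domains, combo))]
             if is_tree(assignment) and grammatical(assignment)]

print(solutions)   # one tree survives: she <- saw -> stars
```

A constraint solver reaches the same unique solution without enumerating all assignments: propagation removes unsupported governor candidates from each domain, which is exactly the "incremental disambiguation" described above.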
The material in this chapter is based on our paper [Duc99a]. Our presentation has one notable advantage over Maruyama's in that it follows modern linguistic practice: the grammar is specified by a lexicon and a collection of principles. The formulation in this chapter is also a little simpler than the one in [Duc99a] because it takes advantage of the new selection union constraint (see Section 5.4.6 and Section 6.9.4).
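The idea behind selection constraints can be sketched as follows; this is an illustrative propagator in Python, not the Mozart/Oz implementation, and the function name and example domains are assumptions. For a constraint X = <S1 ... Sn>[I], the domain of X is bounded by the union of the domains of the candidates still selectable through I, and an index i is dropped from the domain of I once dom(Si) no longer intersects dom(X):

```python
def select_propagate(candidates, dom_i, dom_x):
    """One fixpoint computation for a selection constraint X = <S1..Sn>[I].
    candidates: list of sets dom(S1)..dom(Sn); dom_i: set of 1-based
    indices, the domain of I; dom_x: domain of X.  Returns the narrowed
    pair (dom_i, dom_x)."""
    changed = True
    while changed:
        changed = False
        # drop indices whose candidate can no longer be equal to X
        new_i = {i for i in dom_i if candidates[i - 1] & dom_x}
        # X must lie in the union of the remaining candidates' domains
        union = set().union(*(candidates[i - 1] for i in new_i))
        new_x = dom_x & union
        if new_i != dom_i or new_x != dom_x:
            dom_i, dom_x, changed = new_i, new_x, True
    return dom_i, dom_x

# e.g. three candidate lexical entries with these category domains:
cands = [{"noun"}, {"verb"}, {"noun", "adj"}]
# learning that X cannot be a noun prunes entry 1 from the selector:
print(select_propagate(cands, {1, 2, 3}, {"verb", "adj"}))
```

Propagation is thus bidirectional: narrowing X prunes the selector I, and narrowing I tightens the bound on X, which is what makes selection constraints effective for lexical disambiguation.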