<?xml version="1.0" standalone="yes"?>
<Paper uid="C96-1058">
  <Title>Three New Probabilistic Models for Dependency Parsing: An Exploration*</Title>
  <Section position="2" start_page="0" end_page="0" type="intro">
    <SectionTitle>
1 Introduction
</SectionTitle>
    <Paragraph position="0"> In recent years, the statistical parsing community has begun to reach out; for syntactic formalisms that recognize the individuality of words, l,ink grammars (Sleator and 'Pemperley, 1991) and lexicalized tree-adjoining granunars (Schabes, 1992) have now received stochastic treatments. Other researchers, not wishing to abandon context-flee grammar (CI&amp;quot;G) but disillusioned with its lexica\] blind spot, have tried to re-parameterize stochastic CI&amp;quot;G in context-sensitive ways (Black et al., 1992) or have augmented the formalism with lexical headwords (Magerman, 1995; Collins, 11996).</Paragraph>
    <Paragraph position="1"> In this paper, we 1)resent a \[lexible l)robat)ilistic parser that simultaneously assigns both part-ofsl)eech tags and a bare-bones dependency structure (illustrate.d in l!'igure 1). The choice ot'a simple syntactic structure is deliberate: we would like to ask some basic questions about where h'xical relationships al)pear and how best, to exploit *This materia.l is based upon work supported under a National Science I%undation Graduate Fellowship, and has benefited greatly from discussions with  word points to a single t)arent, the word it modities; the head of the sentence points to the EOS (end-of: sentence) ma.rk. Crossing links and cycles arc not allowed. (b) Constituent structure and sub(:ategorization may be highlighted by displaying the same dependencies as a lexical tree.</Paragraph>
    <Paragraph position="2"> them. It is uscflfl to look into thes0 basic questions before trying to tine-tmm the performance of systems whose behavior is harder to understand. 1 The main contribution of' the work is to I)ropose three distin('t, lexiealist hyl)otheses abou(. (,he probability space underlying seHl\]ence structure.</Paragraph>
    <Paragraph position="3"> We il\]ustrate how each hypothesis is (:xl)ressed in a depemteney framework, and how each can be used to guide our parser toward its favored solution. Finally, we point to experimental resul(;s that compare the three hypotheses' parsing performance on sentences fi:om the Wall ,b'treel dourhal. \]'he parser is trained on an annol,ated corpus; no hand-written grammar is required.</Paragraph>
  </Section>
class="xml-element"></Paper>