<?xml version="1.0" standalone="yes"?>
<Paper uid="P04-1062">
  <Title>Annealing Techniques for Unsupervised Statistical Language Learning</Title>
  <Section position="1" start_page="0" end_page="0" type="abstr">
    <SectionTitle>
Abstract
</SectionTitle>
    <Paragraph position="0"> Exploiting unannotated natural language data is hard largely because unsupervised parameter estimation is hard. We describe deterministic annealing (DA; Rose et al., 1990) as an appealing alternative to the Expectation-Maximization (EM) algorithm (Dempster et al., 1977). Seeking to avoid search error, DA begins by globally maximizing an easy concave function and maintains a local maximum as it gradually morphs that function into the desired non-concave likelihood function. Applying DA to parsing and tagging models is shown to be straightforward, and it yields significant improvements over EM on a part-of-speech tagging task. We also describe a variant, skewed DA, which can incorporate a good initializer when one is available, and show significant improvements over EM on a grammar induction task.</Paragraph>
  </Section>
</Paper>
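The annealing idea summarized in the abstract can be illustrated with a minimal sketch. The example below applies deterministic-annealing EM to a toy two-component Gaussian mixture rather than the paper's parsing and tagging models; the function name, the annealing schedule, and all hyperparameters are illustrative assumptions, not taken from the paper. At a low inverse temperature beta the annealed responsibilities are nearly uniform, so the objective is easy to optimize; beta is then raised toward 1, where the update reduces to ordinary EM.

```python
import numpy as np

def da_em_gmm(x, n_iter=50, betas=(0.1, 0.25, 0.5, 0.75, 1.0)):
    """Deterministic-annealing EM for a two-component 1-D Gaussian mixture.

    At a low inverse temperature beta the annealed responsibilities are
    nearly uniform, so the objective is easy to optimize; beta is raised
    toward 1, where each inner loop reduces to ordinary EM.
    """
    # Crude initialization; DA is meant to be robust to it.
    mu = np.array([x.mean() - 1.0, x.mean() + 1.0])
    var = np.array([x.var(), x.var()])
    w = np.array([0.5, 0.5])

    for beta in betas:                    # annealing schedule: beta -> 1
        for _ in range(n_iter):
            # E-step at temperature 1/beta: component log-likelihoods scaled by beta.
            logp = (np.log(w)
                    - 0.5 * np.log(2.0 * np.pi * var)
                    - 0.5 * (x[:, None] - mu) ** 2 / var)
            logr = beta * logp
            logr -= logr.max(axis=1, keepdims=True)
            r = np.exp(logr)
            r /= r.sum(axis=1, keepdims=True)     # annealed responsibilities

            # M-step: ordinary weighted maximum-likelihood updates.
            nk = r.sum(axis=0)
            w = nk / nk.sum()
            mu = (r * x[:, None]).sum(axis=0) / nk
            var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    return w, mu, var

# Usage: recover two well-separated clusters from unlabeled data.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2.0, 1.0, 500), rng.normal(3.0, 1.0, 500)])
print(da_em_gmm(data))
```

Per the abstract, the skewed-DA variant would additionally bias the low-temperature solution toward a supplied initializer instead of an uninformed starting point; that refinement is not shown in this sketch.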