<?xml version="1.0" standalone="yes"?>
<Paper uid="N03-2002">
  <Title>Factored Language Models and Generalized Parallel Backoff</Title>
  <Section position="1" start_page="0" end_page="0" type="abstr">
    <SectionTitle>
Abstract
</SectionTitle>
    <Paragraph position="0"> We introduce factored language models (FLMs) and generalized parallel backoff (GPB). An FLM represents words as bundles of features (e.g., morphological classes, stems, data-driven clusters), and induces a probability model covering sequences of bundles rather than just words. GPB extends standard backoff to general conditional probability tables where variables might be heterogeneous types, where no obvious natural (temporal) backoff order exists, and where multiple dynamic backoff strategies are allowed. These methodologies were implemented during the JHU 2002 workshop as extensions to the SRI language modeling toolkit. This paper provides initial perplexity results on both CallHome Arabic and Penn Treebank Wall Street Journal articles. Notably, FLMs with GPB can produce bigrams with significantly lower perplexity, sometimes lower than highly-optimized baseline trigrams. These results are highly relevant in a multi-pass speech recognition context, where bigrams are used to create first-pass bigram lattices or N-best lists.</Paragraph>
  </Section>
</Paper>