<?xml version="1.0" standalone="yes"?>
<Paper uid="W06-3113">
  <Title>How Many Bits Are Needed To Store Probabilities for Phrase-Based Translation?</Title>
  <Section position="1" start_page="0" end_page="0" type="abstr">
    <SectionTitle>
Abstract
</SectionTitle>
    <Paragraph position="0"> State of the art in statistical machine translation is currently represented by phrase-based models, which typically incorporate a large number of probabilities of phrase-pairs and word n-grams. In this work, we investigate data compression methods for efficiently encoding n-gram and phrase-pair probabilities, which are usually encoded in 32-bit floating-point numbers. We measured the impact of compression on translation quality through a phrase-based decoder trained on two distinct tasks: the translation of European Parliament speeches from Spanish to English, and the translation of news agency text from Chinese to English. We show that with a very simple quantization scheme all probabilities can be encoded in just 4 bits with a relative loss in BLEU score of only 1.0% and 1.6% on the two tasks, respectively.</Paragraph>
  </Section>
</Paper>
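The abstract's "very simple quantization scheme" can be illustrated with a minimal sketch: uniformly bin log-probabilities into 2^4 = 16 levels and decode each 4-bit code to its bin centroid. This is an illustrative assumption, not the paper's exact algorithm; all function names (`build_codebook`, `encode`, `decode`) are hypothetical.

```python
import math

def build_codebook(probs, bits=4):
    """Uniformly bin log-probabilities into 2**bits levels.

    A sketch under the assumption of uniform binning in log space;
    the paper's actual quantizer may differ.
    """
    logs = [math.log(p) for p in probs]
    lo, hi = min(logs), max(logs)
    n = 2 ** bits
    step = (hi - lo) / n
    # Decode table: centroid of each bin, mapped back to probability space.
    centers = [math.exp(lo + (i + 0.5) * step) for i in range(n)]
    return lo, step, centers

def encode(p, lo, step, bits=4):
    """Map a probability to its 4-bit bin index."""
    n = 2 ** bits
    idx = int((math.log(p) - lo) / step)
    return min(max(idx, 0), n - 1)  # clamp to the valid code range

def decode(code, centers):
    """Recover the approximate probability from a 4-bit code."""
    return centers[code]
```

Storing the 4-bit code instead of a 32-bit float gives the 8x compression discussed in the abstract, at the cost of the within-bin approximation error.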