<?xml version="1.0" standalone="yes"?>
<Paper uid="N06-1013">
<Title>A Maximum Entropy Approach to Combining Word Alignments</Title>
<Section position="8" start_page="102" end_page="102" type="concl">
<SectionTitle>7 Conclusions</SectionTitle>
<Paragraph position="0">We presented a new approach, ACME, to combining the outputs of different word alignment systems by reducing the combination problem to the level of alignment links and using a maximum entropy model to learn whether a particular alignment link should be included in the final alignment.</Paragraph>
<Paragraph position="1">Our results indicate that ACME yields significant relative error reduction over the input alignments and their heuristic-based combinations on three different language pairs. Moreover, ACME provides similar relative improvements for different sizes of training data for the input alignment systems. We have also shown that further improvements are obtained by using a larger number of input alignments and by partitioning the training data into disjoint subsets, learning a separate model for each partition.</Paragraph>
<Paragraph position="2">We have tested the impact of the reduced AER on MT and have shown that alignments generated by ACME yield statistically significant improvements in BLEU scores in two different languages, even when we do not employ a POS tagger on the FL side.</Paragraph>
<Paragraph position="3">However, additional studies are needed to investigate why large improvements in AER result in relatively smaller improvements in BLEU scores.</Paragraph>
<Paragraph position="4">Because ACME is a supervised learning approach, it requires annotated data; however, our experiments have shown that significant improvements can be obtained using a small set of annotated data.</Paragraph>
<Paragraph position="5">Acknowledgments This work has been supported, in part, under ONR MURI Contract FCPO.810548265 and the GALE program of the Defense Advanced Research Projects Agency, Contract No. HR0011-06-2-0001. We also thank the anonymous reviewers for their helpful comments.</Paragraph>
</Section>
</Paper>
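
The conclusions describe ACME as a link-level combiner: the candidate links from all input alignments are pooled, and a maximum entropy classifier decides, per link, whether it belongs in the final alignment. The following is a minimal illustrative sketch of that idea, not the authors' implementation. It assumes scikit-learn's LogisticRegression as the MaxEnt learner and uses only "which input aligner proposed this link" indicator features; the paper's actual model uses a richer feature set (the conclusions mention, for example, POS information on the FL side), and all function names and data structures here are hypothetical.

```python
# Hypothetical sketch of link-level alignment combination with a
# maximum entropy (logistic regression) classifier.  Not the authors' code.
from sklearn.linear_model import LogisticRegression

def link_features(link, input_alignments):
    """Binary indicator features: does each input aligner propose this link?

    `link` is a (source_index, target_index) pair; `input_alignments` is a
    list of sets of such pairs, one set per input alignment system.
    """
    return [1.0 if link in a else 0.0 for a in input_alignments]

def train_combiner(training_examples):
    """Fit the MaxEnt model on hand-annotated data (the approach is supervised).

    `training_examples` is a list of (link, input_alignments, gold_label)
    tuples, where gold_label is 1 if the link is in the gold alignment.
    """
    X = [link_features(link, aligns) for link, aligns, _ in training_examples]
    y = [label for _, _, label in training_examples]
    # Binary logistic regression is equivalent to a conditional MaxEnt model
    # over these indicator features.
    return LogisticRegression().fit(X, y)

def combine(model, input_alignments):
    """Keep every candidate link the classifier predicts as a true link."""
    candidates = set().union(*input_alignments)
    return {link for link in candidates
            if model.predict([link_features(link, input_alignments)])[0] == 1}
```

With such indicator features alone, the learned combiner behaves like a data-driven voting scheme over the input aligners; the gains reported in the paper come from conditioning on additional, richer features as well.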