<?xml version="1.0" standalone="yes"?>
<Paper uid="H94-1048">
  <Title>A Maximum Entropy Model for Prepositional Phrase Attachment</Title>
  <Section position="6" start_page="253" end_page="254" type="concl">
    <SectionTitle>
5. Conclusion
</SectionTitle>
    <Paragraph position="0"> The Maximum Entropy model predicts prepositional phrase attachment 10 percentage points less accurately than a treebanker, but it performs comparably to a non-expert, assuming that only the head words of the history are available in both cases. The biggest improvements to the ME model will come from better utilization of classes, and a larger history.</Paragraph>
    <Paragraph position="1"> Currently, the use of the mutual information class bits gives us a few percentage points in performance, but the ME model should gain more from other word classing schemes which are better tuned to the PP-attachment problem. A scheme in which the word classes are built from the observed attachment preferences of words ought to outperform the mutual information clustering method, which uses only word bigram distributions [10].</Paragraph>
    <Paragraph position="2"> [Table: 200 Events of Computer Manuals Data] Secondly, the ME model does not use information contained in the rest of the sentence, although that information is apparently useful in predicting the attachment, as evidenced by a 5% average gain in the treebankers' accuracy. Any implementation of this model using the rest of the sentence would require features on other words, and perhaps features on the sentence's parse tree structure, coupled with an efficient incremental search. Such improvements should boost the performance of the model to that of treebankers. Already, the ME model outperforms a decision tree confronted with the same task. We hope to use Maximum Entropy to predict other linguistic phenomena that hinder the performance of most natural language parsers.</Paragraph>
  </Section>
</Paper>