<?xml version="1.0" standalone="yes"?>
<Paper uid="P05-1012">
<Title>Online Large-Margin Training of Dependency Parsers</Title>
<Section position="6" start_page="96" end_page="97" type="concl">
<SectionTitle>
4 Summary
</SectionTitle>
<Paragraph position="0"> We described a successful new method for training dependency parsers. We use simple linear parsing models trained with margin-sensitive online training algorithms, achieving state-of-the-art performance with relatively modest training times and no need for pruning heuristics. We evaluated the system on both English and Czech data, demonstrating state-of-the-art performance without any language-specific enhancements. Furthermore, the model can be augmented to include features over lexicalized phrase-structure parsing decisions to increase dependency accuracy over those parsers.</Paragraph>
<Paragraph position="1"> We plan to extend our parser in two ways.</Paragraph>
<Paragraph position="2"> First, we will add labels to dependencies to represent grammatical roles. Such labels are very important for using parser output in tasks like information extraction or machine translation. Second, we are looking at model extensions to allow non-projective dependencies, which occur in languages such as Czech, German and Dutch.</Paragraph>
<Paragraph position="3"> Acknowledgments: We thank Jan Hajič for answering queries on the Prague treebank, and Joakim Nivre for providing the Yamada and Matsumoto (2003) head rules for English that allowed for a direct comparison with our systems. This work was supported by NSF ITR grants 0205456, 0205448, and 0428193.</Paragraph>
</Section>
</Paper>