<?xml version="1.0" standalone="yes"?>
<Paper uid="P02-1025">
  <Title>A Study on Richer Syntactic Dependencies for Structured Language Modeling</Title>
  <Section position="5" start_page="0" end_page="0" type="concl">
    <SectionTitle>
5 Conclusion and Future Work
</SectionTitle>
    <Paragraph position="0"> We have presented a study on enriching the syntactic dependency structures in the SLM. We have built and evaluated the performance of seven different models. All of our models improve on the baseline SLM in either PPL or WER or both. We have shown that adding the NT tag of the third most-recent exposed head in the parser model improves the parsing performance significantly. The improvement in parsing accuracy carries over to enhancing language model performance, as evaluated by both WER and PPL. Furthermore, our best result shows that an uninterpolated grammar-based language model can outperform a 3-gram model. The best model achieved an overall WER improvement of 10% relative to the 3-gram baseline.</Paragraph>
    <Paragraph position="1"> Although conditioning on more contextual information helps, we should note that some of our models suffer from over-parameterization. One solution would be to apply the maximum entropy estimation technique (MaxEnt (Berger et al., 1996)) to all of the three components of the SLM, or at least to the CONSTRUCTOR. That would also allow for fine-tuning of the particular syntactic dependencies used in the model rather than the template based method we have used. Along these lines, the Max-Ent model has already shown promising improvements by combining syntactic dependencies in the WORD-PREDICTOR of the SLM (Wu and Khudanpur, 1999).</Paragraph>
  </Section>
class="xml-element"></Paper>