<?xml version="1.0" standalone="yes"?>
<Paper uid="W01-0712">
  <Title>Learning Computational Grammars</Title>
  <Section position="7" start_page="0" end_page="0" type="concl">
    <SectionTitle>
4 Prospects
</SectionTitle>
    <Paragraph position="0"> The project has proven to be successful in its results for applying machine learning techniques to all three of its selected tasks: chunking, NP chunking and NP bracketing. We are looking forward to applying these techniques to other NLP tasks. Three of our project members will take part in the CoNLL-2001 shared task, 'clausing', hopefully with good results. Two more have started working on the challenging task of full parsing, in particular by starting with a chunker and building a bottom-up arbitrary phrase recogniser on top of that. The preliminary results are encouraging though not as good as advanced statistical parsers like those of Charniak (2000) and Collins (2000).</Paragraph>
    <Paragraph position="1"> It is fair to characterise LCG's goals as primarily technical in the sense that we sought to maximise performance rates, esp. the recognition of different levels of NP structure. Our view in the project is certainly broader, and most project members would include learning as one of the language processes one ought to study from a computational perspective--like parsing or generation. This suggest several further avenues, e.g., one might compare the learning progress of simulations to humans (mastery as a function of experience). One might also be interested in the exact role of supervision, in the behaviour (and availability) of incremental learning algorithms, and also in comparing the simulation's error functions to those of human learners (wrt to phrase length or construction frequency or similarity).</Paragraph>
    <Paragraph position="2"> This would add an interesting cognitive perspective to the work, along the lines begun by Brent (1997), but we note it here only as a prospect for future work.</Paragraph>
  </Section>
class="xml-element"></Paper>