<?xml version="1.0" standalone="yes"?> <Paper uid="P97-1028"> <Title>Applying Explanation-based Learning to Control and Speeding-up Natural Language Generation</Title> <Section position="4" start_page="214" end_page="214" type="intro"> <SectionTitle> 2 Foundations </SectionTitle> <Paragraph position="0"> The main focus of this paper is tactical generation, i.e., the mapping of structures (usually representing semantic information, possibly decorated with some functional features) to strings using a lexicon and a grammar. Thus stated, we view tactical generation as the inverse process of parsing. Informally, EBL can be considered an intelligent storage unit for example-based generalized parts of the grammatical search space, determined via training by the tactical generator.¹ Processing of similar new input is then reduced to simple lookup and matching operations, which circumvent re-computation of this already known search space.</Paragraph> <Paragraph position="1"> We concentrate on constraint-based grammar formalisms that follow a sign-based approach, considering linguistic objects (i.e., words and phrases) as utterance-meaning associations (Pollard and Sag, 1994). Thus viewed, a grammar is a formal statement of the relation between utterances in a natural language and representations of their meanings in some logical or other artificial language, where such representations are usually called logical forms (Shieber, 1993). The result of the tactical generator is a feature structure (or a set of such structures in the case of multiple paraphrases) containing, among other things, the input logical form, the computed string, and a representation of the derivation.</Paragraph> <Paragraph position="2"> In our current implementation we are using TDL, a typed feature-based language and inference system for constraint-based grammars (Krieger and Schäfer, 1994). TDL allows the user to define hierarchically ordered types consisting of type and feature constraints.
As shown later, a systematic use of type information leads to a very compact representation of the extracted data and supports an elegant yet efficient generalization step.</Paragraph> <Paragraph position="3"> We are adopting a &quot;flat&quot; representation of logical forms as described in (Kay, 1996; Copestake et al., 1996). This is a minimally structured but descriptively adequate means of representing semantic information, which allows for various types of under-/overspecification, facilitates generation, and eases the specification of semantic transfer equivalences used for machine translation (Copestake et al., 1996; Shemtov, 1996).² ¹ In case a reversible grammar is used, the parser can even be used for processing the training corpus.</Paragraph> <Paragraph position="4"> Informally, a flat representation is obtained by the use of extra variables which explicitly represent the relationship between the entities of a logical form and scope information. In our current system we are using the framework called minimal recursion semantics (MRS) described in (Copestake et al., 1996). Using their typed feature structure notation, figure 1 displays a possible MRS of the string &quot;Sandy gives a chair to Kim&quot; (abbreviated where convenient).</Paragraph> <Paragraph position="5"> The value of the feature LISZT is actually treated like a set, i.e., the relative order of the elements is immaterial. The feature HANDEL is used to represent scope information, and INDEX plays much the same role as a lambda variable in conventional representations (for more details see (Copestake et al., 1996)).</Paragraph> </Section></Paper>