<?xml version="1.0" standalone="yes"?>
<Paper uid="C00-1007">
  <Title>Exploiting a Probabilistic Hierarchical Model for Generation</Title>
  <Section position="3" start_page="0" end_page="0" type="intro">
    <SectionTitle>1 Introduction</SectionTitle>
    <Paragraph position="0">For many applications in natural language generation (NLG), the range of linguistic expressions that must be generated is quite restricted, and a grammar for generation can be fully specified by hand. Moreover, in many cases it is very important not to deviate from certain linguistic standards in generation, in which case hand-crafted grammars give excellent control. However, in other applications for NLG the variety of the output is much bigger, and the demands on the quality of the output somewhat less stringent. A typical example is NLG in the context of (interlingua- or transfer-based) machine translation. Another reason for relaxing the quality of the output may be that not enough time is available to develop a full grammar for a new target language in NLG. In all these cases, stochastic (&quot;empiricist&quot;) methods provide an alternative to hand-crafted (&quot;rationalist&quot;) approaches to NLG. To our knowledge, the first to use stochastic techniques in NLG were Langkilde and Knight (1998a) and (1998b). In this paper, we present FERGUS (Flexible Empiricist/Rationalist Generation Using Syntax).</Paragraph>
    <Paragraph position="1">FERGUS follows Langkilde and Knight's seminal work in using an n-gram language model, but we augment it with a tree-based stochastic model and a traditional tree-based syntactic grammar.</Paragraph>
    <Paragraph position="2">More recent work on aspects of stochastic generation includes (Langkilde and Knight, 2000), (Malouf, 1999), and (Ratnaparkhi, 2000).</Paragraph>
    <Paragraph position="3">Before we describe in more detail how we use stochastic models in NLG, we recall the basic tasks in NLG (Rambow and Korelsky, 1992; Reiter, 1994). During text planning, content and structure of the target text are determined to achieve the overall communicative goal. During sentence planning, linguistic means - in particular, lexical and syntactic means - are determined to convey smaller pieces of meaning.</Paragraph>
    <Paragraph position="4">During realization, the specification chosen in sentence planning is transformed into a surface string, by linearizing and inflecting words in the sentence (and, typically, adding function words).</Paragraph>
    <Paragraph position="5">As in the work by Langkilde and Knight, our work ignores the text planning stage, but it does address the sentence planning and the realization stages.</Paragraph>
    <Paragraph position="6">The structure of the paper is as follows. In Section 2, we present the underlying grammatical formalism, lexicalized tree-adjoining grammar (LTAG). In Section 3, we describe the architecture of the system and some of the modules. In Section 4, we discuss three experiments.</Paragraph>
    <Paragraph position="7">In Section 5, we compare our work to that of Langkilde and Knight (1998a). We conclude with a summary of ongoing work.</Paragraph>
  </Section>
</Paper>
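<!--
Editor's sketch (not part of the paper): the introduction credits Langkilde and Knight with
using an n-gram language model to choose among candidate realizations. The minimal Python
sketch below illustrates that general idea only: a bigram model trained on a toy corpus ranks
alternative linearizations of the same words. The function names, the toy corpus, and the
add-one smoothing are illustrative assumptions, not FERGUS internals or Langkilde and
Knight's actual system.

import math
from collections import Counter

def train_bigram_lm(sentences):
    # Collect unigram and bigram counts from tokenized training sentences.
    unigrams, bigrams = Counter(), Counter()
    for sent in sentences:
        tokens = ["<s>"] + sent + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def log_prob(candidate, unigrams, bigrams, vocab_size):
    # Add-one-smoothed bigram log-probability of one candidate linearization.
    tokens = ["<s>"] + candidate + ["</s>"]
    score = 0.0
    for prev, word in zip(tokens, tokens[1:]):
        score += math.log((bigrams[(prev, word)] + 1) /
                          (unigrams[prev] + vocab_size))
    return score

# Toy training corpus and two competing word orders for the same content words.
corpus = [["the", "dog", "chased", "the", "cat"],
          ["the", "cat", "slept", "on", "the", "mat"]]
unigrams, bigrams = train_bigram_lm(corpus)
vocab_size = len(unigrams)

candidates = [["the", "dog", "chased", "the", "cat"],
              ["dog", "the", "the", "cat", "chased"]]
best = max(candidates, key=lambda c: log_prob(c, unigrams, bigrams, vocab_size))
print(" ".join(best))  # the fluent order scores higher

In FERGUS, by contrast, the n-gram model is only the final step; a tree-based stochastic
model and an LTAG grammar constrain the candidate set first, as described in Section 3.
-->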