<?xml version="1.0" standalone="yes"?>
<Paper uid="W00-1430">
  <Title>From Context to Sentence Form</Title>
  <Section position="6" start_page="227" end_page="228" type="evalu">
    <SectionTitle>
3 Scenarios
</SectionTitle>
    <Paragraph position="0"> This section illustrates (with examples from simulated conference data) how the attention space is derived from the context and how rules for topic/focus assignment are applied. In each scenario the previous utterance of the parrot to the user, if recent enough, constitutes the linguistic context (lcv); the user's current activity activates entities via the physical context (elcv). The tree diagram shows the corresponding attention space map. The propositional content (input to the NLG process) consists of the message type, an instance of event, person,</Paragraph>
    <Paragraph position="1"> keyword(s) and possibly a time expression. Finally we compare the context-sensitive NLG output with the default output.</Paragraph>
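The derivation described above can be sketched in code. This is an illustrative sketch only, not the paper's implementation; the function and parameter names (derive_attention_space, lcv_entities, elcv_entities, lcv_recent) are assumptions.

```python
# Illustrative sketch (not the paper's code): the attention space is the
# set of discourse referents activated by the linguistic context (lcv,
# the parrot's previous utterance, counted only if recent enough) and by
# the physical context (elcv, entities in the user's current activity).

def derive_attention_space(lcv_entities, elcv_entities, lcv_recent=True):
    """Union of activated referents; the linguistic context contributes
    only if the previous utterance is recent enough."""
    space = set(elcv_entities)
    if lcv_recent:
        space |= set(lcv_entities)
    return space

# Scenario 1 contexts from the text:
space = derive_attention_space(
    lcv_entities={"knowledge systems and AI", "Amanda Huggenkiss"},
    elcv_entities={"machine learning", "Penn Sill"},
)
print(sorted(space))
```

A map like this is what the tree diagrams of figures 4 and 5 visualise, with nesting reflecting relative salience.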
    <Section position="1" start_page="227" end_page="228" type="sub_section">
      <SectionTitle>
3.1 Scenario 1: topic-focus structure
</SectionTitle>
      <Paragraph position="0"> At the moment of utterance, the hearer's context can be characterised as follows: linguistic context: &amp;quot;There will be an interesting presentation on 'knowledge systems and AI', by Amanda Huggenkiss, this afternoon.&amp;quot; physical context: user is attending a presentation on 'machine learning' by Penn Sill.</Paragraph>
      <Paragraph position="1"> This situation may be analysed as activating discourse referents in the hearer's mind as represented in the attention space map of figure 4.</Paragraph>
      <Paragraph position="2"> Note that, in case the object marked for high interest is the person, a more abbreviated sentence construction is appropriate: 'JOSEP ARCOS is in your neighbourhood'. Since the user indicated herself that she is interested in this person, no need to further characterise him.</Paragraph>
      <Paragraph position="3"> This situation leads to the application of rule 2: 'machine learning' will be assigned the role of topic, while the other entities of the input structure ('appointment proposal' and 'Enric Plaza') will receive the role of focus. This yields the following output: &amp;quot;MACHINE LEARNING, it's also the subject of an APPOINTMENT PROPOSAL by ENRIC PLAZA.&amp;quot; Compare with the default sentence construction: &amp;quot;ENRIC PLAZA proposes an APPOINTMENT to talk about MACHINE LEARNING.&amp;quot;</Paragraph>
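The application of rule 2 described above can be sketched as a small assignment function. This is a hedged sketch under assumed names (assign_topic_focus), not the paper's system: an input entity already active in the attention space becomes the topic; the remaining input entities receive focus.

```python
# Illustrative sketch of rule 2 (names are assumptions): an input entity
# that is already activated in the hearer's attention space is assigned
# the topic role; all other input entities are assigned focus.

def assign_topic_focus(input_entities, attention_space):
    """Map each input entity to 'topic' (if active) or 'focus'."""
    return {
        e: ("topic" if e in attention_space else "focus")
        for e in input_entities
    }

# Scenario 1: 'machine learning' is active via the physical context.
roles = assign_topic_focus(
    ["machine learning", "appointment proposal", "Enric Plaza"],
    {"machine learning", "Penn Sill"},
)
print(roles)
# {'machine learning': 'topic', 'appointment proposal': 'focus', 'Enric Plaza': 'focus'}
```

The topic role then drives fronting in the surface form ("MACHINE LEARNING, it's also the subject of ..."), while the default construction ignores the attention space.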
    </Section>
    <Section position="2" start_page="228" end_page="228" type="sub_section">
      <SectionTitle>
3.2 Scenario 2: topic shift
</SectionTitle>
      <Paragraph position="0"> linguistic context: &amp;quot;There will be an interesting presentation on 'knowledge systems' and 'AI', by Amanda Huggenkiss, this afternoon.&amp;quot; physical context: user is leaving a presentation, on her way to a meeting with Richard Benjamins on Machine Learning.</Paragraph>
      <Paragraph position="1"> This situation leads to the attention space map of figure 5.</Paragraph>
      <Paragraph position="2"> propositional content: proximity alert, Josep Arcos, agents (p_pv. 5). The profile value annotation indicates that this keyword is of high interest to the user (as indicated by herself, e.g. at conference registration).</Paragraph>
      <Paragraph position="3"> The physical context is such that it allows for a shift of topic (user is not yet talking to Richard Benjamins), which makes rule 3 applicable: 'agents' will be introduced as a new topic, followed by an argu- Compare to the default expression: &amp;quot;JOSEP ARCOS, who's interested in 'AGENTS', is close to you.&amp;quot;</Paragraph>
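The precondition of rule 3 described above can be sketched as a simple check. This is an illustrative sketch under assumed names (topic_shift_allowed, the threshold value), not the paper's rule formalisation: a shift is licensed when the physical context permits it and the keyword carries a high profile value.

```python
# Illustrative sketch of the rule 3 precondition (names and threshold are
# assumptions): a topic shift is allowed when the physical context leaves
# room for it (the user is not yet engaged in the upcoming activity) and
# the candidate keyword has a high user-declared profile value.

def topic_shift_allowed(shift_possible, profile_value, threshold=4):
    """True when a new topic may be introduced."""
    return shift_possible and profile_value >= threshold

# Scenario 2: user is between activities; 'agents' has profile value 5.
if topic_shift_allowed(shift_possible=True, profile_value=5):
    print("introduce 'agents' as new topic")
```

When the check fails (e.g. the user is already in conversation), the generator falls back on the default construction shown in the text.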
    </Section>
  </Section>
</Paper>