<?xml version="1.0" standalone="yes"?>
<Paper uid="W00-1432">
  <Title>Sentence generation and neural networks</Title>
  <Section position="3" start_page="242" end_page="243" type="intro">
    <SectionTitle>
3 Implementation
</SectionTitle>
    <Paragraph position="0"> The following features are used to describe the information in the database:  The feature selector fetches the necessary values (determined by the discourse model) and inputs them to NN I. The input vector is eleven units long. Ten units are the local representations of the features in the database and the last unit represents the generalizer feature from the discourse model. The Stuttgart Neural Networks Simulator (SNNS 2) which will be used for the implementation only allows values between -1 and 1, so the numerical values will be normalized to fit into the vector. This is also necessary so the relative importance of the different features are not out of proportion.</Paragraph>
    <Paragraph position="1"> The event space in the output will consist of the tbllowing elements: (see table 1 at the end) The vocabulary needed for the generation task is represented by binary codes, e.g. based on the alphabetical order of the forms. If we let the subject/theme part of the vector be 7 units long I At the moment we deal only with hotel info.</Paragraph>
    <Paragraph position="2"> http:l/www.informatik.unistuttgart.de/ipvr/bv/projekt e/snns/snns/html  we can represent 27 (128) different word with numerical values. 0000001 is concept number 1, 00000 ! 0 is concept number 2 and so on.</Paragraph>
  </Section>
</Paper>