<?xml version="1.0" standalone="yes"?>
<Paper uid="W91-0106">
  <Title>REVERSIBLE NLP BY DERIVING THE GRAMMARS FROM THE KNOWLEDGE BASE</Title>
  <Section position="7" start_page="43" end_page="43" type="concl">
    <SectionTitle>
STATE OF DEVELOPMENT
</SectionTitle>
    <Paragraph position="0"> The parsing side of this architecture for reversible NLP is implemented and running in an operational system, CTI-1. It has a mature domain model for personnel changes, and the parsing grammar projected from that model has been run on hundreds of articles from the Wall Street Journal (&quot;Who's News&quot;).</Paragraph>
    <Paragraph position="1"> The generation side of the architecture is in its infancy, awaiting a suitable domain and task in which the reasons for speaking and the situation models are rich enough to motivate subtle nuances of phrasing.</Paragraph>
    <Paragraph position="2"> At the same time, my prior experience with generation leads me to believe that the design of the linguistic annotations is well-founded for generation, and that this side of the reversal will fall out once the opportunity for implementation arises. When that happens, the &quot;raw material&quot; supplied by the mappings discussed here will be fed to a text planner such as the RAVEL orchestrator in Meteer's SPOKESMAN system, and will then drive a TAG realization component along the lines of Mumble-86.</Paragraph>
  </Section>
</Paper>