<?xml version="1.0" standalone="yes"?>
<Paper uid="A00-1023">
<Title>A Question Answering System Supported by Information Extraction*</Title>
<Section position="7" start_page="170" end_page="171" type="concl">
<SectionTitle> 4 Future Work: Multi-level IE Supported QA </SectionTitle>
<Paragraph position="0"> A new QA architecture is under development; it will exploit all levels of the IE system, including CE and GE.</Paragraph>
<Paragraph position="1"> The first issue is how much CE can contribute to better support of QA. We find that some frequently seen questions can be better answered once the CE information is provided. These questions are of two types: (i) what/who questions about an NE; (ii) relationship questions.</Paragraph>
<Paragraph position="2"> Questions of the format &quot;who/what is NE?&quot; require CE templates as their best answers. For example: Who is Julian Hill? Who is Bill Clinton? What is Du Pont? What is Cymfony? To answer these questions, the system can simply retrieve the corresponding CE template to provide an &quot;assembled&quot; answer, as shown below.
Q: Who is Julian Hill?
A: name: Julian Werner Hill
   staff: Julian Hill; Wallace Carothers.
[Footnote 1: For example, How did one make a chocolate cake? How+Adjective questions (e.g. how long, how big, how old, etc.) are handled fairly well.]</Paragraph>
<Paragraph position="3"> Questions specifically about a CE relationship include:
For which company did Julian Hill work? (affiliation relationship)
Who are employees of Du Pont Co.? (staff relationship)
What does Julian Hill do? (position/profession relationship)
Which university did Julian Hill graduate from? (education relationship), etc.</Paragraph>
<Paragraph position="4"> The next issue is the relationship between GE and QA. It is our belief that GE technology will lead to a breakthrough for QA. In order to extract GE templates, the text goes through a series of linguistic processing steps, as shown in Figure 1. Question processing is designed to go through parallel processes and to share the same NLP resources, up to the point of matching and ranking. The merging of question templates and GE templates in the Template Matcher is fairly straightforward: as both undergo the same NLP processing, the resulting semantic templates are of the same form. Both question templates and GE templates correspond to fairly standard, predictable patterns (the PREDICATE value is open-ended, but the structure remains stable). More precisely, a user can ask questions about general events themselves (did what), about the participants of an event (who, whom, what), and/or about the time, frequency and place of events (when, how often, where). This covers by far the most common types of general questions from a potential user.</Paragraph>
<Paragraph position="5"> For example, if a user is interested in company acquisition events, he can ask questions like: Which companies were acquired by Microsoft in 1999? Which companies did Microsoft acquire in 1999? Our system then parses these questions into the corresponding question templates. If the user wants to know when some acquisition happened, he can ask: When was Netscape acquired? Our system then translates this question into a template in which WHEN is the open slot. Note that WHO, WHAT and WHEN in such templates are variables to be instantiated. Such question templates serve as search constraints to filter the events in our extracted GE template database.</Paragraph>
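To make the filtering step concrete, the following is a minimal sketch of how a question template with uninstantiated variables could act as a search constraint over a database of extracted GE templates. The slot names (PREDICATE, WHO, WHAT, WHEN) follow the paper's description, but the concrete data structures, the example templates and the matching functions are assumptions for illustration only, not the implementation described in the paper.

# Illustrative sketch only: slot names follow the paper's description
# (PREDICATE, WHO, WHAT, WHEN); the data structures and example
# contents are assumed, not taken from the actual system.

WILDCARD = None  # marks an uninstantiated variable such as WHO, WHAT or WHEN

# A couple of hypothetical GE templates extracted from processed text.
ge_template_db = [
    {"PREDICATE": "acquire", "WHO": "Microsoft", "WHAT": "Visio Corp.",
     "WHEN": "1999"},
    {"PREDICATE": "acquire", "WHO": "AOL", "WHAT": "Netscape",
     "WHEN": "1998"},
]

def matches(question, ge):
    """A question template matches a GE template when every instantiated
    slot of the question agrees with the corresponding GE slot."""
    return all(ge.get(slot) == value
               for slot, value in question.items()
               if value is not WILDCARD)

def answer(question):
    """Fill the uninstantiated slots of the question from every matching
    GE template in the database (the merging step)."""
    open_slots = [s for s, v in question.items() if v is WILDCARD]
    return [{slot: ge.get(slot) for slot in open_slots}
            for ge in ge_template_db if matches(question, ge)]

# "Which companies did Microsoft acquire in 1999?" -- WHAT is the variable
print(answer({"PREDICATE": "acquire", "WHO": "Microsoft",
              "WHAT": WILDCARD, "WHEN": "1999"}))
# "When was Netscape acquired?" -- WHEN is the variable
print(answer({"PREDICATE": "acquire", "WHAT": "Netscape", "WHEN": WILDCARD}))

Because both sides come out of the same NLP pipeline and share the same slot structure, the merge reduces to this kind of slot comparison plus slot filling, which is what the next paragraph refers to as a simple merging operation.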
<Paragraph position="6"> Because the question templates and the extracted GE templates share the same structure, a simple merging operation would suffice. Nevertheless, two important questions remain to be answered: (i) what if the question uses a different verb with the same meaning as the one used in the processed text? (ii) what if the question asks about something beyond the GE (or CE) information? These are issues that we are currently researching.</Paragraph>
</Section>
</Paper>
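As a purely illustrative aside on open question (i), one conceivable direction, not the approach taken in this paper, would be to canonicalize the PREDICATE slot of both question and GE templates through a small synonym lexicon before matching. The lexicon contents and the function below are assumptions made for the sketch.

# Illustration only: a hand-built lexicon mapping predicate variants to a
# canonical form, applied to both question and GE templates before matching.
PREDICATE_SYNONYMS = {
    "buy": "acquire",
    "purchase": "acquire",
    "take over": "acquire",
}

def normalize_predicate(template):
    """Return a copy of the template whose PREDICATE slot is canonicalized,
    so that 'Which companies did Microsoft buy in 1999?' can still match a
    GE template whose predicate is 'acquire'."""
    normalized = dict(template)
    predicate = normalized.get("PREDICATE")
    if predicate:
        normalized["PREDICATE"] = PREDICATE_SYNONYMS.get(predicate, predicate)
    return normalized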