<?xml version="1.0" standalone="yes"?> <Paper uid="T78-1019"> <Title>MICS CONTROL-FLOW</Title> <Section position="2" start_page="143" end_page="144" type="metho"> <SectionTitle> PRISON INMATE 2: Me too! Everything here </SectionTitle> <Paragraph position="0"> tastes like cardboard.</Paragraph> <Paragraph position="1"> The utterance &quot;I want a juicy hamburger&quot; is interpreted differently in each dialog fragment. The difference in the interpretations is based on the different social relations existing between the two conversational participants. In Dialog (9) the utterance was interpreted as a direct order to the staff aide: &quot;Get me a hamburger and make sure it is juicy!&quot; In Dialog (10), the 7-year-old was expressing a request to his mother, hoping that his mother might comply. In Dialog (11), the same statement was interpreted as nothing more than wishful thinking. The first inmate made no order or request to the second inmate. Hence, the first utterance of each dialog fragment implies a different conversational goal depending upon the differences in the social relations of the conversational participants. The social context and the relationship between the two speakers generate expectations that guide the course of the conversation. A staff aide expects to be ordered about by a general. A mother expects her son to ask her for favors. Prison inmates cannot expect each other to do things that are made impossible by their incarceration. These expectations lead to the formulation of different conversational goals for the utterance &quot;I want a juicy hamburger&quot; in each conversational fragment. 
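The dependence of interpretation on social relation can be pictured as a small lookup from speaker/hearer roles to the inferred conversational goal. This is an illustrative reconstruction, not code from MICS; every role and goal name below is invented for the example.

```python
# Hypothetical sketch: the social relation between the participants
# selects the conversational goal attributed to "I want a juicy hamburger".
GOAL_BY_RELATION = {
    ("general", "staff-aide"): "order",        # Dialog (9)
    ("child", "mother"): "request",            # Dialog (10)
    ("inmate", "inmate"): "wishful-thinking",  # Dialog (11)
}

def conversational_goal(speaker_role, hearer_role):
    """Return the goal the hearer attributes to the utterance,
    defaulting to a literal statement when no expectation applies."""
    return GOAL_BY_RELATION.get((speaker_role, hearer_role), "statement")
```

The same surface utterance thus maps to three different goals purely as a function of the relation between the two speakers.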
The conversational principle exemplified in our discussion is summarized as Conversational Rules (7) and (8): RULE 7: The social relationship between the participants in a conversation generates expectations about the intentional meaning of utterances in the conversation. These expectations are used to determine the conversational goals of each participant.</Paragraph> <Paragraph position="2"> RULE 8: Each speaker's perception of the conversational goals of the other speaker determines his interpretation of the other speaker's utterances.</Paragraph> <Paragraph position="3"> Differences in understanding of conversational goals lead to different responses in a dialog, as illustrated in Conversation Fragments (9), (10) and (11). We saw how a social relationship between two people can influence their interpretation of each other's conversational goals. Two strangers can also make assumptions about each other's conversational goals based on appearances, social circumstances and each other's occupation. Consider, for instance, the various responses to John's question in the example below: Scenario: John walked up to a person on the corner and asked: &quot;Do you know how to get to Elm Street?&quot; 12.1) The stranger replied: &quot;You go two blocks toward that tall building and turn right.&quot; 12.2) The cab driver on the corner replied: &quot;Sure! Hop in. Where on Elm do you want to go?&quot; 12.3) The person, who was holding up a map and a piece of paper with an Elm Street address on it, replied: &quot;No, could you tell me how to get there?&quot; 12.4) The child answered: &quot;Yes, I know how to get there!&quot; The question was interpreted to mean four different things, depending on whom John spoke to. If a stranger asks, &quot;Do you know how to get to X,&quot; the listener usually interprets this to mean &quot;I want to go to X, but I do not know how to get there. 
Please give me directions.&quot; Since the occupation of a cab driver is to take people to their destination, it is perfectly legitimate for him to interpret the question as: &quot;If you know how to get to X, please take me there.&quot; The person who is visibly lost and trying to find his way may interpret John's question as: &quot;You seem to be lost. Can I help you find your way?&quot; Response (12.3) illustrates that the responder did not infer that John's goal was to go to Elm Street, in contrast with the two previous responses. A child often interprets questions of the form &quot;Do you know Y&quot; literally, possibly inferring that the person asking the question is quizzing him. As in our previous examples, the differences in interpretation can be explained in terms of differences in the perceived goals of the participants in the conversation.</Paragraph> <Paragraph position="4"> II) MICS: A process model of human conversation. The phenomenon of human conversation is too complex for any single study to do justice to more than a narrow aspect of the problem. In order to fully understand human conversations we may have to understand all human cognitive reasoning processes. Our research approach can be outlined as follows: 1) Study many sample conversations; 2) try to establish some relatively general rules of conversation; 3) encode these rules into a process model; 4) see if this model accounts for certain aspects of human conversation; 5) realize that we have solved hardly more than a minute part of the problem; and 6) reiterate the research process in a (hopefully positive) feedback loop.</Paragraph> <Paragraph position="5"> The conversational rules discussed in the first section address problems that need to be considered if one is to understand human conversations. 
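The four readings of John's question can likewise be sketched as a selection keyed on the perceived goals of the listener. Again this is only an illustrative reconstruction under invented names, not part of any implementation described here.

```python
def interpret_route_question(listener):
    """Hypothetical sketch: the listener's perceived role selects a
    reading of 'Do you know how to get to Elm Street?'."""
    readings = {
        "stranger": "give-directions",   # (12.1) supply the route
        "cab-driver": "take-me-there",   # (12.2) occupation implies transport
        "lost-person": "offer-of-help",  # (12.3) asker seems to be helping
        "child": "literal-yes-no",       # (12.4) question taken literally
    }
    # Default to the most common adult reading when the role is unknown.
    return readings.get(listener, "give-directions")
```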
There is little doubt, as demonstrated by countless examples, that conversational goals, shared knowledge between speakers, social relationships between speakers, and the conversational import of each utterance in a dialog are aspects of human discourse that need to be analyzed if one is to understand how human conversations work. Analyzing these aspects, however, solves only a small subset of the larger problem of how conversations function. For instance, the problem of topic selection in a conversation needs to be addressed. How does a person change the topic in a conversation? How are new topics chosen? These questions are analyzed in Schank \[1977\]. Here we propose some additional ideas on the impact of shared knowledge and interests on topic selection.</Paragraph> </Section> <Section position="3" start_page="144" end_page="145" type="metho"> <SectionTitle> MICS (Mixed-Initiative Conversational System) </SectionTitle> <Paragraph position="0"> is a fully implemented computer program that generates one side of a natural-language conversation. MICS embodies the conversational rules discussed in this paper, a topic transition mechanism based on Schank \[1977\], and the idea of a conversational syntax. Conversational syntax is a set of rules that help to characterize well-formed conversations. 
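Rules of conversational syntax act as a filter on candidate utterances, as Rules 9 and 10 below illustrate. The following is a minimal sketch of such a filter, not MICS's actual mechanism; the function and argument names are ours.

```python
def violates_syntax(candidate, said_already, hearer_knows):
    """Hypothetical sketch of two conversational syntax rules as a
    filter: reject a candidate utterance that repeats earlier content
    (Rule 9) or states what the hearer already knows (Rule 10).
    Returns the violated rule's label, or None if the candidate passes."""
    if candidate in said_already:
        return "rule-9"
    if candidate in hearer_knows:
        return "rule-10"
    return None
```

A generator guarded by such a filter would simply discard any candidate for which the function returns a rule label before speaking.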
For instance, the following four rules are among the twenty conversational syntax rules in MICS: RULE 9: Do not repeat information in the course of a conversation unless explicitly requested.</Paragraph> <Paragraph position="1"> RULE 10: Do not say things that the other speaker already knows about.</Paragraph> <Paragraph position="2"> RULE 11: If the other speaker says something that violates one's beliefs, then respond in kind by re-asserting the violated belief.</Paragraph> <Paragraph position="3"> RULE 12: If insulted, return the insult or end the conversation.</Paragraph> <Paragraph position="4"> MICS illustrates that the conversational rules, topic transition rules and conversational syntax rules provide a sound, if possibly incomplete, approach to the modeling of human conversation. MICS is able to generate reasonable conversations about domains on which it has some knowledge, but there are still many shortcomings, as illustrated in the examples that follow.</Paragraph> <Paragraph position="5"> MICS &quot;knows&quot; about itself as a person (e.g. it believes itself to be a computer programmer) and uses a model of the other conversational participant that contains certain information about the speaker, such as his social status, his occupation, his relationship with other people and his areas of special interest and more detailed knowledge. The model of the other speaker is a necessary part of the conversational model, since it enables MICS to apply the conversational rules. When MICS talks to a stranger, it builds a model of the speaker; in fact, one of its conversational goals is to learn about the person with whom it is conversing. In the dialog that follows (an actual computer run), MICS starts out knowing nothing about Dave.</Paragraph> <Paragraph position="6"> I have to go. 
Bye Mics.</Paragraph> <Paragraph position="7"> NICE TALKING TO YOU, DAVE As a result of the conversation, MICS knows something about Dave, in particular that one of his life goals is to become a professor. MICS's conversational goals cause it to establish its own conversational topics. When a new topic is introduced by the human speaker, MICS stays on the chosen topic or chooses a closely related topic. Another rule of conversation is invoked in the above dialog: RULE 13: Each participant in the conversation has the conversational goal of saying things that are important to the other participant.</Paragraph> <Paragraph position="8"> When the human participant says that he likes to smoke marijuana, MICS finds in its memory that smoking marijuana is illegal and can lead to the smoker getting stoned. According to MICS's beliefs, getting stoned is undesirable, and doing something illegal leads to undesirable consequences. MICS determines that it is important for Dave to know about the bad consequences of his acts. Therefore, applying Rule (13), MICS warns Dave about the evils of marijuana. It is evident that MICS should have known that Dave knows that smoking marijuana leads to getting stoned, and that, in fact, getting stoned is Dave's primary purpose for smoking marijuana. If MICS knew this fact, then Conversational Syntax Rule (10) would prevent MICS from warning Dave about getting stoned. Therefore, MICS needs a better mechanism for deciding what knowledge it shares with the other speaker. It is very difficult to determine what the other person already knows and what he is able to infer, but this is a problem that cannot be circumvented if we are to model human discourse. 
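The missing inference can be stated compactly: if the speaker voluntarily pursues an activity, assume he already knows that activity's consequences, so that Rule (10) suppresses warning him about them. The sketch below is our illustration of that idea, not MICS's code; the activity and consequence names are invented.

```python
def inferred_knowledge(speaker_activities, consequences):
    """Hypothetical sketch of the shared-knowledge inference the text
    calls for: collect the intended consequences of every activity the
    speaker voluntarily pursues, and treat them as already known to him."""
    known = set()
    for activity in speaker_activities:
        known.update(consequences.get(activity, ()))
    return known
```

Under this inference, Dave's stated habit implies he knows smoking marijuana leads to getting stoned, so a Rule (10) check against the returned set would block the warning.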
The conversational syntax rule of countering a belief violation (Rule 11) caused MICS to object when Dave said that playing chess (MICS's favorite pastime) was not exciting. Since the two participants could not find a common area of interest, the conversation ended without any substantive dialog. This illustrates Conversational Rule (4), discussed in the previous section. Let us analyze a conversation between MICS and Pete, someone whom MICS believes to be its friend.</Paragraph> </Section> <Section position="4" start_page="145" end_page="145" type="metho"> <SectionTitle> THAT'S GOING TO MAKE IT HARD FOR DAVE TO BECOME </SectionTitle> <Paragraph position="0"> A PROFESSOR.</Paragraph> <Paragraph position="1"> Too bad, Dave is not too smart anyways. YES, DAVE LIKES TO SMOKE DOPE. I've got to go and study.</Paragraph> </Section> <Section position="5" start_page="145" end_page="145" type="metho"> <SectionTitle> YOU WILL DO WELL IN THE EXAM. </SectionTitle> <Paragraph position="0"> The above conversation illustrates the usefulness of shared knowledge between the two participants. Pete and MICS both know about taking exams, and they have a common acquaintance, Dave. MICS is able to use what it learned about Dave in its previous conversation. MICS's response to Dave possibly flunking the exam illustrates Conversational Rules (3), (5) and (7). The continuity of the conversation is preserved.</Paragraph> <Paragraph position="1"> The chosen topic transition is probably of interest to Pete, as inferred by MICS in the following manner: Pete introduced Dave into the conversation; therefore, he probably wants to talk about Dave. Not fulfilling a life goal (e.g. becoming a professor) is an inherently important consequence to Dave; therefore, Pete is probably interested in knowing this consequence. 
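The two importance criteria just described can be combined into a simple score over candidate topics. This is only our illustrative rendering of the inference, with invented names and weights, not the mechanism actually used by MICS.

```python
def topic_interest(introduced_by_other, affects_life_goal):
    """Hypothetical sketch of the importance criteria: a topic the other
    speaker introduced is interesting, and one bearing on a life goal of
    someone under discussion is weighted more heavily still."""
    score = 0
    if introduced_by_other:
        score += 1   # Pete introduced Dave, so Dave is a live topic
    if affects_life_goal:
        score += 2   # flunking threatens Dave's goal of becoming a professor
    return score
```

A topic selector would then pursue the candidate with the highest score, which here favors telling Pete about the consequence for Dave's life goal.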
Rule 7 applies because in the social relationship between Pete and MICS (they are friends) it is OK to talk about mutual acquaintances. (It is NOT necessarily OK for a judge and a defendant, for instance, to discuss mutual acquaintances.) MICS's utterance about it being difficult for Dave to become a professor illustrates the inferencing that proceeds in the course of a conversation. The inferences are directed at the most serious consequences of the preceding utterance: flunking the exam. Focusing the inference process by importance criteria was found to be necessary in order to keep conversations from digressing into uninteresting topics.</Paragraph> <Paragraph position="3"> When Pete states that Dave is not too smart, MICS reacts by verifying this evaluation. Since in the previous conversation Dave chose to ignore MICS's warnings about smoking marijuana, MICS concluded that Dave was not too bright. This conclusion is used as supporting evidence to back Pete's evaluation. The process of responding to opinions is captured in Conversational Rule 14: RULE 14: If a speaker in a conversation states an opinion, the other speaker should consider whether or not he agrees with the opinion.</Paragraph> <Paragraph position="4"> The first speaker usually has the conversational goal of wanting feedback (usually agreement) on his opinion.</Paragraph> <Paragraph position="5"> The last exchange in the above dialog illustrates a deficiency in MICS's determination of conversational goals. When Pete says, &quot;I've got to go and study,&quot; he means, &quot;I want to end the conversation,&quot; but inferring this conversational goal requires understanding what it means to study. One cannot infer from the phrase &quot;I've got to go ...&quot; alone that the conversation should terminate. 
For instance, if Pete says, &quot;I've got to go to Europe,&quot; he probably wants to introduce his upcoming trip as a new topic of conversation.</Paragraph> <Paragraph position="6"> The knowledge that studying and talking cannot proceed simultaneously is necessary in order to conclude that Pete wanted to end the conversation.</Paragraph> <Paragraph position="7"> The primary usefulness of the MICS system is that it sheds light on problems about the rules of conversation that might otherwise remain unnoticed. The feedback from testing the implementation of our conversational model forces us to be more precise and complete in our formulation of our ideas about conversation. The following is a conversation fragment from a later version of MICS, which has a more sophisticated inference capability for determining the knowledge that it shares with the other speaker. For instance, MICS infers that if a person voluntarily chooses to pursue a certain activity, then that person is aware of the intended consequences of this activity.</Paragraph> <Paragraph position="8"> Ashtrays.</Paragraph> </Section> <Section position="6" start_page="145" end_page="145" type="metho"> <SectionTitle> WHAT ABOUT ASHTRAYS? </SectionTitle> <Paragraph position="0"> Ashtrays are useful.</Paragraph> </Section> <Section position="7" start_page="145" end_page="145" type="metho"> <SectionTitle> DO YOU SMOKE? </SectionTitle> <Paragraph position="0"> Yes.</Paragraph> </Section> </Paper>