<?xml version="1.0" standalone="yes"?>
<Paper uid="E87-1040">
  <Title>A STRUCTURED REPRESENTATION OF WORD-SENSES FOR SEMANTIC ANALYSIS.</Title>
  <Section position="3" start_page="249" end_page="249" type="metho">
    <SectionTitle>
THE CONCEPTUAL GRAPH MODEL
</SectionTitle>
    <Paragraph position="0"> The conceptual graph formalism unifies in a powerful and versatile model many of the ideas that have been around in the last few years on natural language processing. Conceptual graphs add new features to the well-known semantic nets formalism, and make it a viable model to express the richness and complexity of natural language.</Paragraph>
    <Paragraph position="1"> The meaning of a sentence or word is represented by a directed graph of concepts and conceptual relations. In a graph, concepts are enclosed in boxes, and conceptual relations in circles; in the linear form, adopted in this paper, boxes and circles are replaced by brackets and parentheses. Arrows indicate the direction of the relations among concepts.</Paragraph>
    <Paragraph position="2"> Concepts are the generalization of physical perceptions (MAN, CAT, NOISE) or abstract categories (FREEDOM, LOVE). A concept has the general form: \[NAME: referent\] The referent indicates a specific occurrence of the concept NAME (for example \[DOG: Fido\]).</Paragraph>
    <Paragraph position="3"> Conceptual relations express the semantic links between concepts. For example, the phrase &amp;quot;John eats&amp;quot; is represented as follows: \[PERSON: John\] &lt;--(AGNT) &lt;--\[EAT\] where (AGNT) is a dyadic relation used to make explicit the active role of the entity John with respect to the action of eating. In order to describe word meanings, several types of conceptual graphs are introduced in \[SOW84\]:  1. Type definitions.</Paragraph>
    <Paragraph position="4">  The type of a concept is the name of the class to which the concept belongs. Type labels are structured in a hierarchy: the expression C&gt;C' means that the type C is more general than C' (for example, ANIMAL &gt; MAN); C is called the supertype of C'.</Paragraph>
    <Paragraph position="5"> A type C is defined in terms of species, that is the more general class to which it belongs, and differentia, that is what distinguishes C from the other types of the same species. The type definition for MAN is: \[ANIMAL\] --(CHRC)--&gt; \[RATIONAL\] where (CHRC) is the characteristic relation.</Paragraph>
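The supertype ordering and the species/differentia style of definition described above can be sketched in a few lines of code. The hierarchy fragment and helper names below are illustrative assumptions; only ANIMAL &gt; MAN and the MAN definition come from the text.

```python
# Sketch of a type hierarchy with a supertype test (C > C').
# Hierarchy contents are illustrative; only ANIMAL > MAN is from the paper.
SUPERTYPE = {
    "MAN": "ANIMAL",
    "DOG": "ANIMAL",
    "ANIMAL": "ENTITY",
}

def is_supertype(c, c_prime):
    """True if c is a strict supertype of c_prime (written C > C')."""
    t = SUPERTYPE.get(c_prime)
    while t is not None:
        if t == c:
            return True
        t = SUPERTYPE.get(t)
    return False

# Type definition of MAN as species plus differentia:
# [ANIMAL] --(CHRC)--> [RATIONAL]
MAN_DEFINITION = ("ANIMAL", ("CHRC", "RATIONAL"))
```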
  </Section>
  <Section position="4" start_page="249" end_page="251" type="metho">
    <SectionTitle>
2. Canonical graphs.
</SectionTitle>
    <Paragraph position="0"> Canonical graphs express the semantic constraints (or semantic expectations) ruling the use of a concept.</Paragraph>
    <Paragraph position="1"> For example, the canonical graph for GO is:</Paragraph>
    <Paragraph position="3"> Many of the ideas contained in \[SOW84\] have been used in our work. The original contribution of this paper can be summarized by the following items: * find a clear correspondence between the words of natural language and conceptual categories (concepts and relations);</Paragraph>
    <Paragraph position="4"> * provide a lexicon of conceptual relations to express the semantic formation rules of sentences; * use a pragmatic rather than semantic expectation approach to represent word-senses. As discussed later, the latter seems not to provide sufficient information to analyze non-trivial sentences;</Paragraph>
    <Paragraph position="5"> * make a clear distinction between word-sense concepts and abstract types. It is not viable to arrange word-senses in a type hierarchy and to preserve at the same time the richness and consistency of the knowledge base.</Paragraph>
    <Paragraph position="6"> The following sections discuss the above listed items. Concepts, relations and words.</Paragraph>
    <Paragraph position="7"> The problem analyzed in this section concerns the translation of a word dictionary into a concept-relation dictionary. Which words are concepts? Which are relations? Which, if any, are redundant for meaning representation? Concepts and relations are semantic categories which have been adopted under different names in many models. (Word definitions in linear form are represented by writing in the first line the name of the word W, concept or relation, to be defined, and in the following lines a list of graphs, linked on their left side to W.)</Paragraph>
    <Paragraph position="8">  Besides conceptual graphs, Schank's conceptual dependency \[SHA72\] and semantic nets in their various implementations \[BRA79\] \[GRI76\] represent sentences as a net of concepts and semantic links.</Paragraph>
    <Paragraph position="9"> The ambiguity between concepts and relations is solved in the conceptual dependency theory, where a set of primitive acts and conceptual dependencies are employed. The use of primitives is however questionable due to the potential loss of expressive power.</Paragraph>
    <Paragraph position="10"> In the semantic net model, relations can be role words (father, actor, organization etc.) or verbs (eat, is-a, possess etc.) or position words (on, over, left etc.), depending on the particular implementation.</Paragraph>
    <Paragraph position="11"> In \[SOW84\] a dictionary of conceptual relations is provided, containing role words (mother, child, successor), modal or temporal markers (past, possible, cause etc.), adverbs (until).</Paragraph>
    <Paragraph position="12"> In our system, it was decided to derive some clear guidelines for the definition of a conceptual relation lexicon. As suggested by Fillmore in \[F1L68\], the existence of semantic links between words seems to be suggested by lexical surface structures, such as word endings, prepositions, syntactic roles (subject, object etc.), conjunctions etc. These structures do not convey a meaning per se, but rather are used to relate words to each other in a meaningful pattern.</Paragraph>
    <Paragraph position="13"> In the following, three correspondence rules between words, lexical surface structures and semantic categories are proposed.</Paragraph>
    <Paragraph position="14"> Correspondence between words and concepts.</Paragraph>
    <Paragraph position="15"> Words are nouns, verbs, adjectives, pronouns, non-prepositional adverbs. Each word can have synonyms or multiple meanings.</Paragraph>
    <Paragraph position="16"> R1: A biunivocal correspondence is assigned between main word meanings and concept names. Proper names (John, Fido) are translated into the referent field of the entity type they belong to (\[PERSON: John\]).</Paragraph>
    <Paragraph position="17"> Correspondence between determiners and referents Determiners (the, a, etc.) specify whether a word refers to an individual or to a generic instance.</Paragraph>
    <Paragraph position="18"> R2: Determiners are mapped into a specific or generic concept referent.</Paragraph>
    <Paragraph position="19"> For example &amp;quot;a dog&amp;quot; and &amp;quot;the dog&amp;quot; are translated respectively into \[DOG: *\] and \[DOG: *x\], where * and *x mean &amp;quot;a generic instance&amp;quot; and &amp;quot;a specific instance&amp;quot;. The problem of concept instantiation is however far more complex; this will be the object of further study.</Paragraph>
    <Paragraph position="20"> Correspondence between lexical surface structures and conceptual relations The role of prepositions, conjunctions, prepositional adverbs (before, under, without etc.), word endings (nice-st, gold-en), verb endings and auxiliary verbs is to relate words, as in &amp;quot;I go by bus&amp;quot;, modify the meaning of a name, as in &amp;quot;she is the nicest&amp;quot;, determine the tenses of verbs as in &amp;quot;I was going&amp;quot;, etc.</Paragraph>
    <Paragraph position="21"> Like words, functional signs may have multiple roles (e.g. by, to etc.), derivable from an analysis of grammar cases. (The term case is here intended in its extended meaning, as for Fillmore.)</Paragraph>
    <Paragraph position="22"> R3: A biunivocal correspondence is assumed between roles played by functional signs and conceptual relations. Conceptual relation occurrences which have a linguistic correspondent in the sentence (as the ones listed above) are called explicit. This does not exhaust the set of conceptual relations; there are in fact syntactic roles which are not expressed by signs. For example, in the phrase &amp;quot;John eats&amp;quot; there exists a subject-verb relation between &amp;quot;John&amp;quot; and &amp;quot;eats&amp;quot;; in the sentence &amp;quot;the nice girl&amp;quot;, the adjective &amp;quot;nice&amp;quot; is a quality complement of the noun &amp;quot;girl&amp;quot;. Conceptual relations which correspond to these syntactic roles are called implicit. A conceptual relation is identified only by its role and may have implicit or explicit occurrences. For example, the phrases &amp;quot;a book about history&amp;quot; and &amp;quot;a history book&amp;quot; both embed the argument (ARG) relation: \[BOOK\] --(ARG)--&gt; \[HISTORY\] The translation of surface lexical structures into conceptual relations makes it possible to represent in the same way phrases with the same meaning but different syntactic structure, as in the latter example.</Paragraph>
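The ARG example above (a book about history, explicit, versus a history book, implicit) can be illustrated with a toy mapping that sends both surface forms to the same graph. The tiny normalizer below is a hand-written assumption for illustration, not the paper's grammar.

```python
# Sketch: two surface forms with the same meaning map to one conceptual graph.
# The phrase handling is deliberately naive and only covers this ARG example.
def to_graph(phrase):
    """Map a topic noun phrase to the triple (HEAD, 'ARG', TOPIC)."""
    words = [w for w in phrase.lower().split() if w not in ("a", "an", "the")]
    if "about" in words:          # explicit occurrence: "book about history"
        head, topic = words[0], words[-1]
    else:                         # implicit occurrence: "history book"
        topic, head = words[0], words[-1]
    return (head.upper(), "ARG", topic.upper())
```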
    <Paragraph position="23"> Conceptual relations also make explicit the meaning of syntactic roles. For example, the subject relation, which expresses the active role of an entity in some action, corresponds to different semantic relations, like agent (AGNT) as in &amp;quot;John reads&amp;quot;, initiator (INIT) as in &amp;quot;John boils potatoes&amp;quot; (John starts the process of boiling), participant (PART) as in &amp;quot;John flies to Roma&amp;quot; (John participates in a flight), instrument (INST) as in &amp;quot;the knife cuts&amp;quot;. The genitive case, expressed explicitly by the preposition &amp;quot;of&amp;quot; or by the ending &amp;quot;'s&amp;quot;, indicates a social relation (SOC_REL) as in &amp;quot;the doctor of John&amp;quot; or in &amp;quot;the father of my friend&amp;quot;, part-of (PART_OF) as in &amp;quot;John's arm&amp;quot;, a real or metaphorical possession (POSS) as in &amp;quot;John's book&amp;quot; and &amp;quot;Dante's poetry&amp;quot;, etc. (see Appendix). The idea of ordering concepts in a type hierarchy was extended to conceptual relations. To understand the need for a relation hierarchy, consider the following graphs: \[BUILDING\]--&gt; (AGE)--&gt; \[YEAR: #50\]  \[BUILDING\]--&gt; (EXTEN)--&gt; \[HEIGHT: #130\] \[BUILDING\]--&gt; (PRICE)--&gt; \[LIRE: #5.000\] (AGE), (EXTEN) and (PRICE) represent respectively the age, extension and price relations. By  defining a supertype (MEAS) relation, the three statements above could be generalized as follows: \[BUILDING\]--&gt; (MEAS)--&gt; \[MEASURE: *x\] Appendix 1 lists the set of hierarchically ordered relation types. At the top level, three relation categories have been defined:  1. Role. These relations specify the role of a concept with respect to an action (John (AGNT) eats), to a function (building for (MEANS) residence) or to an event (a delay for (CAUSE) a traffic jam).</Paragraph>
    <Paragraph position="24">  2. Complement. Complement relations link an entity to a description of its structure (a golden (MATTER) ring) or an action to a description of its occurrence (going to (DEST) Roma).</Paragraph>
    <Paragraph position="25"> 3. Link. Links are entity-entity or action-action type of  relations, describing how two or more kindred concepts relate with respect to an action or a way of being. For example, they express a social relation (the mother of (SOC_REL) Mary), a comparison (John is more (MAJ) handsome than Bill), a time sequence (the sun after (AFTER) the rain), etc.</Paragraph>
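The generalization step motivating the relation hierarchy (collapsing (AGE), (EXTEN) and (PRICE) under the supertype (MEAS)) can be sketched as follows. The dictionary below is a hand-made fragment used as an assumption, not the paper's full Appendix.

```python
# Sketch of a relation hierarchy: AGE, EXTEN and PRICE share the supertype
# MEAS, so three specific graphs collapse into one generalized graph.
REL_SUPERTYPE = {"AGE": "MEAS", "EXTEN": "MEAS", "PRICE": "MEAS"}

def generalize(graph):
    """Replace the relation of (concept, relation, concept) by its supertype
    and the specific measure by the generic referent [MEASURE: *x]."""
    src, rel, _dst = graph
    return (src, REL_SUPERTYPE.get(rel, rel), "MEASURE: *x")

facts = [
    ("BUILDING", "AGE",   "YEAR: #50"),
    ("BUILDING", "EXTEN", "HEIGHT: #130"),
    ("BUILDING", "PRICE", "LIRE: #5.000"),
]
# all three facts generalize to [BUILDING] --(MEAS)--> [MEASURE: *x]
generalized = {generalize(g) for g in facts}
```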
  </Section>
  <Section position="5" start_page="251" end_page="251" type="metho">
    <SectionTitle>
STRUCTURED REPRESENTATION OF CONCEPTS.
</SectionTitle>
    <Paragraph position="0"> This section describes the structure of the semantic knowledge base. Many natural language processing systems express semantic knowledge in the form of selection restrictions or deep case constraints. In the first case, semantic expectations are associated with the words employed, as for canonical graphs; in the second case, they are associated with some abstraction of a word, as for example in Wilks' formulas \[WIL73\] and in Schank's primitive conceptual cases \[SHA72\].</Paragraph>
    <Paragraph position="1"> Semantic expectations however do not provide enough knowledge to solve many language phenomena.</Paragraph>
    <Paragraph position="2"> Consider for example the following problems, encountered during the analysis of our text data base (press agency releases of economics):</Paragraph>
  </Section>
  <Section position="6" start_page="251" end_page="251" type="metho">
    <SectionTitle>
1. Metonymies
</SectionTitle>
    <Paragraph position="0"> &amp;quot;The state department, the ACE and the trade unions sign an agreement&amp;quot; &amp;quot;The meeting was held at the ACE of Roma&amp;quot; In the first sentence, ACE designates a human  organization; it is some delegate of the ACE who actually sign the agreement. In the second sentence, ACE designates a plant, or the head office where a meeting took place.</Paragraph>
    <Paragraph position="1"> 2. Syntactic ambiguity &amp;quot;The Prime Minister Craxi went to Milano for a meeting&amp;quot; &amp;quot;President Cossiga went to a residence for handicapped&amp;quot;  In the first case, the meeting is the purpose of the act go; in the second case, &amp;quot;handicapped&amp;quot; specifies the destination of a building. In both examples, syntactic rules are unable to determine whether the prepositional phrase should be attached to the noun or to the verb. Semantic expectations cannot solve this ambiguity either: for example, the canonical graph for GO (see Section 2) does not say anything about the semantic validity of the conceptual relation PURPOSE.</Paragraph>
  </Section>
  <Section position="7" start_page="251" end_page="252" type="metho">
    <SectionTitle>
3. Conjunctions
</SectionTitle>
    <Paragraph position="0"> &amp;quot;The slate department, the ACE and the trade unions sign an agreement&amp;quot; &amp;quot;A meeting between trade unionists and the Minister of tne Interior, Scalfaro&amp;quot; In the first sentence, the comma links to different human chillies; in the second, it specifies the name of a Minister.</Paragraph>
    <Paragraph position="1"> The above phenomena, plus many others, like metaphors, vagueness, ill-formed sentences etc., can only be solved by adopting a pragmatic approach to the semantic knowledge base. Pragmatics is the knowledge about word uses, contexts, figures of speech; it is potentially unlimited, but makes it possible to handle the richness of natural language without severe restrictions. The definition of this semantic encyclopedia is a challenging objective that will require a joint effort of linguists and computer scientists. However, we do not believe in short-cut solutions to the natural language processing problem.</Paragraph>
    <Paragraph position="2"> Within our project, the following guidelines were adopted for the definition of a semantic encyclopedia:  1. Each word-sense has an entry in the semantic data base; this entry is called in the following a concept definition.  2. A concept definition is a detailed description of its semantic expectations and of its semantically permitted uses (for example, a car is included as a possible subject of drink, as in &amp;quot;my car drinks gasoline&amp;quot;; a purpose and a manner are included as possible relations for go).  3. Each word use or expectation is represented by an elementary graph:  (1) \[W\] &lt;-&gt; (REL_CONC) &lt;-&gt; \[C\] where W is the concept to be defined, C some other concept type, and &lt;-&gt; is either a left or a right arrow.</Paragraph>
    <Paragraph position="3"> Partitioning a definition in elementary graphs makes it easy for the verification algorithm to determine whether a specific link between two words is semantically permitted or not. In fact, given two word-senses W1 and W2, these are semantically related by a conceptual relation REL_CONC if  there exists a concept W in the knowledge base including the graph: \[W\] &lt;-&gt; (REL_CONC) &lt;-&gt; \[C\] where W&gt;=W1 and C&gt;=W2. To reduce the extent of the knowledge base, C in (1) should be the most general type in the hierarchy for which (1) holds. The problem of defining a concept hierarchy is however a complex one. The following subsection deals with type hierarchies.</Paragraph>
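The verification test just described (W1 and W2 are related by REL_CONC if some concept W with W &gt;= W1 carries an elementary graph whose constraint C satisfies C &gt;= W2) can be sketched as below. The hierarchy and knowledge-base contents are illustrative assumptions, not the paper's actual data base.

```python
# Sketch of the semantic verification step over elementary graphs.
# Both the hierarchy fragment and the single elementary graph are made up.
SUP = {"John": "person", "person": "HUMAN_ENTITY", "drink": "ACT"}

def ge(general, specific):
    """general >= specific in the concept hierarchy (reflexive)."""
    while specific is not None:
        if specific == general:
            return True
        specific = SUP.get(specific)
    return False

# Elementary graphs as (concept, relation, constraint) triples:
# [drink] --(AGNT)--> [HUMAN_ENTITY]
KB = [("drink", "AGNT", "HUMAN_ENTITY")]

def permitted(w1, rel, w2):
    """Is the link (w1, rel, w2) semantically permitted by some graph in KB?"""
    return any(ge(w, w1) and ge(c, w2) for (w, r, c) in KB if r == rel)
```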
    <Section position="1" start_page="252" end_page="252" type="sub_section">
      <SectionTitle>
Word-senses and Abstract Classes
</SectionTitle>
      <Paragraph position="0"> Many knowledge representation formalisms for natural language order linguistic entities in a type hierarchy. This is used to deduce the properties of less general concepts from higher level concepts (property inheritance). For example, if a proposition like the one expressed by graph (1) is true, then all the propositions obtained by substitution of C with any of its subtypes must be true. However, generalization of properties is not strictly valid for linguistic entities; for example the graphs:  (2) \[GO\]--&gt; (OBJ)--&gt; \[CONCRETE\] (3) \[WATCH\]--&gt; (AGNT)--&gt; \[BLIND\] are both false, even though they are specializations respectively of the following graphs: (4) \[MOVE\]--&gt; (OBJ)--&gt; \[CONCRETE\] (5) \[WATCH\]--&gt; (AGNT)--&gt; \[ANIMATE\]  In fact, the sentences &amp;quot;to go something&amp;quot; and &amp;quot;a blind watches&amp;quot; violate semantic constraints and meaning postulates: generalization does not preserve both completeness and consistency of definitions. In addition, if a pragmatic approach is pursued, one quickly realizes that no word-sense definition really includes any other; each word has its own specific uses and only partially overlaps with other words. The conclusion is that it is not possible to arrange word-senses in a hierarchy; on the other hand, it is impractical to replace in the graph (1) the concept type C with all the possible word-senses Wi for which (1) is valid. A compromise solution has hence been adopted. The hierarchy of concepts is structured as follows:  1. There are two levels of concepts: word-senses and abstract classes; 2. Concepts associated to word-senses (indicated by italic  characters) are the leaves of the hierarchy; 3. Abstract conceptual classes, as MOVE_ACTS, HUMAN_ENTITIES, SOCIAL_ACTS etc. (upper case) are the non-terminal nodes.</Paragraph>
      <Paragraph position="1"> In this hierarchy word-sense concepts are never linked by supertype relations to each other, but at most by brotherhood. Definitions are provided only for word-senses; abstract classes are only used to generalize elementary graphs on word uses.</Paragraph>
      <Paragraph position="2"> This solution does not avoid inconsistencies; for example, the graph (included in the definition of the  word-sense person): (6) \[person\] &lt;--(AGNT) &lt;--\[MOVE_ACT\] is a semantic representation of expressions like: John moves, goes, jumps, runs etc., but also states the validity of the expression &amp;quot;John is the agent of flying&amp;quot;, which is instead not valid if John is a person. However, the definition of fly will include: (7) \[fly\]--&gt; (AGNT)--&gt; \[WINGED_ANIMATES\] (8) \[fly\]--&gt; (PARTICIPANT)--&gt; \[HUMAN\] The semantic algorithm (described in \[PAZ87\]) asserts the validity of a link between two words W1 and W2 only if there exists a conceptual relation to represent the meaning of that link. In order for a conceptual relation to be accepted: 1. the relation must be included in some elementary graph of W1 and W2; 2. the type constraints imposed by the elementary graphs  must be satisfied for both W1 and W2.</Paragraph>
      <Paragraph position="3"> In conclusion, it is possible to write general conditions on word uses without worrying about exceptions. The following section gives an example of concept definition. Concept definitions Concept definitions have two descriptors: classification and definition.</Paragraph>
      <Paragraph position="4">  1. Classification.</Paragraph>
      <Paragraph position="5">  Besides the supertype name, this descriptor also includes a type definition, introduced in Section 2. For example, the type definition for house is &amp;quot;building for residence&amp;quot;, which in terms of conceptual graphs is:</Paragraph>
      <Paragraph position="7"> where BUILDING represents the species, or supertype, and (MEANS)&lt;--\[RESIDENCE\] the differentia.</Paragraph>
    </Section>
  </Section>
  <Section position="8" start_page="252" end_page="255" type="metho">
    <SectionTitle>
2. Definition.
</SectionTitle>
    <Paragraph position="0"> This descriptor gives the structure and functions of a concept. The definition is partitioned into three subareas, corresponding to the three conceptual relation categories introduced in the previous section.</Paragraph>
    <Paragraph position="1">  a. Role. For an entity, this field lists the actions, functions and events, and for an action the subjects, objects and proposition types that can be related to it by means of role type relations. For example, the role subgraph for think would be: (AGNT)--&gt; \[HUMAN\] (OBJ)--&gt; \[PROP\]  b. Complement. This field describes the structure of an entity or the occurrence (place, time etc.) of an action. This is obtained by listing the concept types that can be linked to the given concept by means of complement type relations. A complement subgraph for EAT is:</Paragraph>
    <Paragraph position="3"> Note that some elementary graphs express a relation between two terminal nodes (as for example the opposite of eat); in most cases however conditions are more general.</Paragraph>
    <Paragraph position="4"> AN OVERVIEW OF THE SYSTEM.</Paragraph>
    <Paragraph position="5"> This paper focused on semantic knowledge representation issues. However, many other issues related to natural language processing have been dealt with. The purpose of this section is to give a brief overview of the text understanding system and its current status of implementation. Figure 1 shows the three modules of the text analyzer.</Paragraph>
    <Paragraph position="6">  All the modules are implemented in VM/PROLOG and run on an IBM 3812 mainframe. The morphology associates at least one lemma to each word; in Italian this task is particularly complex due to the presence of recursive generation mechanisms, such as alterations, nominalization of verbs, etc. For example, from the lemma casa (home) it is possible to derive the words cas-etta (little home), cas-ett-ina (nice little home), cas-ett-in-accia (ugly nice little home) and so on. At present, the morphology is complete, and uses for its analysis a lexicon of 7000 lemmata \[ANT87\].</Paragraph>
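The recursive alteration mechanism just described (casa, cas-etta, cas-ett-ina, cas-ett-in-accia) can be sketched as a suffix-stripping routine. The suffix table and the stem-plus-a recomposition rule below are simplified assumptions for illustration, not the actual 7000-lemmata analyzer.

```python
# Sketch of recursive alteration stripping for Italian diminutives/pejoratives.
# Only three illustrative suffixes are handled; real morphology is far richer.
ALTERATIONS = ["accia", "ina", "etta"]

def lemma_of(word, lemmata=("casa",)):
    """Strip stacked alteration suffixes until a known lemma is reached,
    recomposing each stem with the feminine ending -a; None if no lemma found."""
    if word in lemmata:
        return word
    for suffix in ALTERATIONS:
        if word.endswith(suffix):
            stem = word[: -len(suffix)]
            candidate = lemma_of(stem + "a", lemmata)
            if candidate:
                return candidate
    return None
```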
    <Paragraph position="7"> The syntactic analysis determines syntactic attachments between words by verifying grammar rules and forms agreement; the system is based on a context-free grammar \[ANT87\]. Italian syntax is also more complex than English: in fact, sentences are usually composed of nested hypotactic phrases, rather than linked paratactically. For example, a sentence like &amp;quot;John goes with his girl friend Mary to the house by the river to meet a friend for a pizza party&amp;quot; might sound odd in English but is a common sentence structure in Italian.</Paragraph>
    <Paragraph position="8"> Syntactic relations only reveal the surface structure of a sentence. A main problem is to determine the correct prepositional attachments between words: it is the task of semantics to make explicit the meaning of prepositions and to detect the relations between words.</Paragraph>
    <Paragraph position="9"> The task of disambiguating word-senses and relating them to each other is automatic for a human being but is the hardest for a computer based natural language system. The semantic knowledge representation model presented in this paper does not claim to solve the natural language processing problem, but seems to give promising results, in combination with the other system components.</Paragraph>
    <Paragraph position="10"> The semantic processor consists of a semantic knowledge base and a parsing algorithm. The semantic data base presently consists of 850 word-sense definitions; each definition includes on average 20 elementary graphs. Each graph is represented by a pragmatic rule, with the form:</Paragraph>
    <Paragraph position="12"> The above has the reading: &amp;quot;*x modifies the word-sense W by the relation CONC_REL if *x is a Y&amp;quot;. For example, the PR: AGNT(think,*x) &lt;- COND(HUMAN_ENTITY,*x).</Paragraph>
    <Paragraph position="13"> corresponds to the elementary graph: \[think\]--&gt; (AGNT)--&gt; \[HUMAN_ENTITY\] The rule COND(Y,*x) requires in general a more complex computation than a simple supertype test, as detailed in \[PAZ87\]. The short term objective is to enlarge the dictionary to 1000 words. A concept editor has been developed to facilitate this task. The editor also makes it possible to visualize, for each word-sense, a list of all the occurrences of the corresponding words within the press agency releases data base (about 10000 news releases).</Paragraph>
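A pragmatic rule in the form shown above can be sketched as a small predicate. Here COND is reduced to a plain supertype test, whereas in the actual system it may require a more complex computation (\[PAZ87\]); the class table and all names below are illustrative assumptions.

```python
# Sketch of a pragmatic rule PR and its evaluation:
#   AGNT(think, *x)  holds if  COND(HUMAN_ENTITY, *x)  holds.
# CLASS_OF is a made-up hierarchy fragment; COND is simplified to a
# supertype walk, unlike the richer computation described in the paper.
CLASS_OF = {"John": "person", "person": "HUMAN_ENTITY"}

def cond(y, x):
    """COND(Y, *x): does *x belong to class Y?"""
    while x is not None:
        if x == y:
            return True
        x = CLASS_OF.get(x)
    return False

# One pragmatic rule: the AGNT of think must be a HUMAN_ENTITY.
PRAGMATIC_RULES = {("AGNT", "think"): "HUMAN_ENTITY"}

def relation_holds(rel, word_sense, filler):
    """Test whether rel(word_sense, filler) is licensed by some PR."""
    y = PRAGMATIC_RULES.get((rel, word_sense))
    return y is not None and cond(y, filler)
```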
    <Paragraph position="14"> The algorithm takes as input one or more parse trees, as produced by the syntactic analyzer. The syntactic surface structures are used to derive, for each couple of possibly related words or phrases, an initial set of hypotheses for the corresponding semantic structure. For example, a noun phrase (NP) followed by a verb phrase (VP) could be represented by a subset of the LINK relations listed in the Appendix. The specific relation is selected by verifying type constraints, expressed in the definitions of the corresponding concepts. For example, the phrase &amp;quot;John opens (the door)&amp;quot; gives the parse:</Paragraph>
    <Paragraph position="16"> A subject-verb relation as the above could be interpreted by one of the following conceptual relations: AGNT, PARTICIPANT, INSTRUMENT etc. Each relation is tested for semantic plausibility by the rule: (2) REL_CONC(x,y) &lt;- (x: REL_CONC(x,*y = y)) &amp; (y: REL_CONC(*x = x,y)).</Paragraph>
    <Paragraph position="17"> The (2) is proved by rewriting the conditions expressed on the right-hand side in terms of COND(Y,*x) predicates, as in (1), and then attempting to verify these conditions. In the above example, (1) is proved true for the relation AGNT, because: AGNT(open,person: John) &lt;- (open: AGNT(open,*x = person: John)) &amp; (person: AGNT(*y = open,person: John)).</Paragraph>
    <Paragraph position="18"> (open: AGNT(open,*x) &lt;- COND(HUMAN_ENTITY,*x)).</Paragraph>
    <Paragraph position="19"> (person: AGNT(*y,person) &lt;- COND(MOVE_ACT,*y)).</Paragraph>
    <Paragraph position="20"> The conceptual graph will be: \[PERSON: John\] &lt;--(AGNT) &lt;--\[OPEN\] For a detailed description of the algorithm, refer to \[PAZ87\]. At the end of the semantic analysis, the system produces two possible outputs. The first is a set of short paraphrases of the input sentence: for example, the sentence &amp;quot;The ACE signs an agreement with the government&amp;quot; gives: The Society ACE is the agent of the act SIGN.</Paragraph>
    <Paragraph position="21"> AGREEMENT is the result of the act SIGN.</Paragraph>
    <Paragraph position="22"> The GOVERNMENT participates in the AGREEMENT.</Paragraph>
    <Paragraph position="23"> The second output is a conceptual graph of the sentence, generated using a graphic facility. An example is shown in the figure. Conceptual graphs are stored in a database for future analysis (query answering, deductions etc.).</Paragraph>
    <Paragraph position="24"> As far as the semantic analysis is concerned, current efforts are directed towards the development of a query answering system and a language generator. Future studies will concentrate on discourse analysis.</Paragraph>
    <Paragraph position="26"/>
  </Section>
  <Section position="9" start_page="255" end_page="255" type="metho">
    <SectionTitle>
APPENDIX
CONCEPTUAL RELATION HIERARCHY.
</SectionTitle>
    <Paragraph position="0"> This Appendix provides a list of the three conceptual relation hierarchies (role, complement and link) introduced in Section 3. For each relation type, the following are provided:  1. The level number in the hierarchy.</Paragraph>
    <Paragraph position="1"> 2. The complete name.</Paragraph>
    <Paragraph position="2"> 3. The corresponding abbreviation.</Paragraph>
    <Paragraph position="3"> 3. SIMILARITY (SIMIL) 2. ORDERING (ORD) 3. TIME SPACE ORDERING (POS) 4. VICINITY (NEAR) The house near the lake. 4. PRECEDENCE (BEFORE) 4. ACCOMPANIMENT (ACCOM) Mary went with John 4. SUPPORT (ON) The book on the table 4. INCLUSION (IN) 3. LOGIC ORDERING (LOGIC) 4. CONJUNCTION (AND) I eat and drink. 4. DISJUNCTION (OR) Either you or me. 4. CONTRAPOSITION (OPPOSITE) 3. NUMERIC ORDERING (NUMERIC) 4. ENUMERATION (ENUM) Five political parties 4. PARTITION (PARTITION) Two of us 4. ADDITION (ADD) He owns a pen and also a book.  For some of the lower level relation types, an example sentence is also given. In the sentence, the concepts linked by the relation are highlighted, and the relation is cited, if explicit. Bold characters are used for non-terminal nodes of the hierarchy.</Paragraph>
    <Paragraph position="4"> The set of conceptual relations has been derived by an analysis of Italian grammar cases (the term &amp;quot;case&amp;quot; is here intended as for \[FIL68\]) and by a careful study of examples found in the analyzed domain. The final set is a trade-off between two competing requirements:</Paragraph>
    <Paragraph position="5">  1. A large number of conceptual relations improves the expressiveness of the representation model and allows a &amp;quot;fine&amp;quot; interpretation; 2. A small number of conceptual relations simplifies the task of semantic verification, i.e. replacing syntactic relations between words by conceptual relations between concepts.</Paragraph>
    <Paragraph position="6">  3. POSSESSION (POSS) The house of John 3. SOCIAL RELATION (SOC_REL) The mother of John 3. KIND_OF (KIND_OF) The minister of the Interiors 2. COMPARISON (COMP) 3. MAJORITY (MAJ) He is nicer than me  5. MOVE FROM (SOURCE) 3. TIME (TIME) 4. DETERMINED TIME (PTIME) I arrived at five 4. TIME LENGTH (TLENGTH) The movie lasted for three hours 4. STARTING TIME (START) The skyscraper was built since 1940 4. ENDING TIME (END) 4. PHASE (PHASE) 3. CONTEXT (CONTEXT) 4. STATEMENT (STATEMENT) I will surely come</Paragraph>
  </Section>
  <Section position="10" start_page="255" end_page="256" type="metho">
    <SectionTitle>
4. POSSIBILITY (POSSIBLE)
4. NEGATION (NOT)
4. QUERY (QUERY)
</SectionTitle>
    <Paragraph position="0"> 4. BELIEF (BELIEF) I think that she will arrive 3. QUALITY (QUALITY) 3. QUANTITY (QUANTITY) 3. INITIAL VALUE (IVAL) The shares increased their value from 1000 dollars 3. FINAL VALUE (FVAL) to 1500 2. STRUCTURES (STRUCT) 3. SUBSTANCE (SUBST)</Paragraph>
  </Section>
  <Section position="11" start_page="256" end_page="256" type="metho">
    <SectionTitle>
4. MATTER (MATTER) Wooden window
4. ARGUMENT (ARG)
4. PART_OF (PART_OF) John's arm.
</SectionTitle>
    <Paragraph position="0"> 3. AGENT (AGNT) The escape of the enemies 3. PARTICIPANT (PART) John flies to Roma.</Paragraph>
    <Paragraph position="1"> 3. INITIATOR (INIT) John boils eggs.</Paragraph>
    <Paragraph position="2"> 3. PRODUCER (PRODUCER) John's advice 3. EXPERIENCER (EXPER) John is cold.</Paragraph>
    <Paragraph position="3"> 3. BENEFIT (BENEFIT) Parents sacrifice themselves for their sons. 3. DISADVANTAGE (DISADV) 3. PATIENT (PATIENT) Mary loves John 3. RECIPIENT (RCPT) I give an apple to him.</Paragraph>
    <Paragraph position="4"> 2. EVENT_ROLES (EV_ROL) 3. CAUSE (CAUSE) He shivers with cold.</Paragraph>
    <Paragraph position="5"> 3. MEANS (MEANS) Profits increase investments 3. PURPOSE (PURPOSE) 3. CONDITION (COND) If you come then you will enjoy. 3. RESULT (RESULT) He was condemned to damages. 2. OBJECT_ROLES (OB_ROL) 3. INSTRUMENT (INST) The key opens the door. 3. SUBJECT (SUBJ) The ball rolls.</Paragraph>
    <Paragraph position="6"> 3. OBJECT (OBJ) John eats the apple.</Paragraph>
  </Section>
</Paper>