<?xml version="1.0" standalone="yes"?> <Paper uid="C90-1003"> <Title>Semantic Abstraction and Anaphora</Title> <Section position="1" start_page="0" end_page="0" type="abstr"> <SectionTitle> Abstract </SectionTitle> <Paragraph position="0"> This paper describes a way of expressing syntactic rules that associate semantic formulae with strings, but in a manner that is independent of the syntactic details of these formulae. In particular we show how the same rules construct predicate argument formulae in the style of Montague grammar [13], representations reminiscent of situation semantics (Barwise and Perry [2]) and of the event logic of Davidson [5], or representations inspired by the discourse representations proposed by Kamp [9]. The idea is that semantic representations are specified indirectly using semantic construction operators, which enforce an abstraction barrier between the grammar and the semantic representations themselves. First we present a simple grammar which is compatible with the three different sets of constructors for the three formalisms.</Paragraph> <Paragraph position="1"> We then extend the grammar to provide one treatment that accounts for quantifier raising in the three different semantic formalisms. Introduction Grammars specifying the relationship between strings and semantic representations often have details of these representations embedded in them. We show how grammar rules can be written in a form which, by abstracting away from details of the semantic representation, acquires greater modularity and hence theoretical perspicuity and practical robustness. In particular, we believe that the approach helps clarify the relationship between apparently disparate theories of semantic representation.1 The basis of our proposal is that each grammatical rule should contain, or be paired with, an expression written in terms of semantic construction operators. Different operations can be associated with these operators and, depending on the set in force at a given time, the effect of interpreting the expression will be to construct a representation in one semantic formalism or another. The set of operators contains members corresponding to such notions as composition, conjunction, etc. The set is small and independent of the semantic formalism. The operations are associated with the operators independently of the grammar and they determine the form of the semantic representation.</Paragraph> <Paragraph position="2"> We present three different sets of semantic constructors here, which we have dubbed the predicate-logic, the sets-of-infons and the discourse-representation constructors. We begin by introducing the constructors used in this paper: no claims are made for their general sufficiency. Not all of the constructors are relevant to all semantic theories and those not needed for a particular one are given degenerate definitions. The simplest kind of construction operator is the identity function which maps every input i onto just one output, namely i.</Paragraph>
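<Paragraph> Since the three constructor sets define the same predicates, a grammar can be moved from one formalism to another without touching its rules. The paper does not prescribe a particular loading mechanism; one minimal arrangement, with purely illustrative file names, is to consult the shared grammar together with exactly one constructor set, which then is the set "in force":
% A sketch only: file names are hypothetical, not taken from the paper.
:- [grammar].                  % the formalism-independent DCG rules
:- [predlogic_constructors].   % or: infons_constructors / drt_constructors
</Paragraph>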
<Paragraph position="3"> The operators are the following: external(S, EF) relates a semantic representation S and an external form EF, e.g. a representation that constitutes the parser's output. The internal and external forms are distinguished because the (internal) representation S will, in general, contain information that plays a role in the process of analyzing a sentence (e.g. for anaphora tracking) but that is not part of the logical form (EF) of the sentence as a whole.</Paragraph> <Paragraph position="4"> atom(S, Prop) specifies that the content of the (internal) semantic representation, S, is the atomic proposition Prop. This is used to construct the semantic values for lexical entries, for example.</Paragraph> <Paragraph position="5"> conjoin(S1, S2, S12) relates three semantic representations. It specifies that the content of S12 is constructed by conjoining S1 and S2. This operator occurs crucially in the semantics of indefinite determiners.</Paragraph> <Paragraph position="6"> new_index(S, I) specifies that the content of S is I, a referential index for a non-anaphoric NP. The form of a referential index is defined by the particular semantic theory.</Paragraph> <Paragraph position="7"> accessible_index(S, I) specifies that the content of S is a referential index I of some noun phrase that is a potential antecedent of an anaphor. Constraints on accessible indices are defined by the particular semantic theory.</Paragraph> <Paragraph position="8"> While the primitives discussed in this paper have relatively simple definitions, in other more elaborate theories they may involve non-trivial computation. For example, the compose primitive might impose certain discourse-consistency requirements arising from a more restrictive theory of discourse structure than those described here.</Paragraph> <Paragraph position="9"> A key insight of the Discourse Representation and Situation Semantics accounts, but originating with Karttunen [10], is that anaphoric and quantificational domains coincide. Thus, in (1), it can be co-indexed with a donkey only if a donkey is interpreted as having wide scope.</Paragraph> <Paragraph position="10"> (1) Every man kicked a donkey. It developed blue bumps.</Paragraph> <Paragraph position="11"> The relationship between these sentences is one of (semantic) precedence, and we call the operator that relates the corresponding semantic representations compose: compose(S1, S2, S12) specifies that the information in the representation S12 is the information in S1 followed by the information in S2. Compose defines an ordering of semantic operations that particular semantic theories may or may not be sensitive to. (In this paper, the Montague constructors are not sensitive to this ordering, while the other two types of semantic representations are.)</Paragraph> <Paragraph position="12"> When a donkey is interpreted as having narrow scope with respect to every man in (1), the reference marker introduced by a donkey is located in a context subordinate to the sentence as a whole, and hence not accessible to anaphors in the following discourse. To provide for this, we introduce the following operator: subordinate(S, SubName, Sub) specifies that S contains an anaphorically and quantificationally subordinate representation Sub, which has the &quot;name&quot; SubName. The SubName would be distinguished from Sub in non-extensional theories of meaning, where a meaning is distinguished from its propositional content (say), as in the sets-of-infons representation described below.</Paragraph>
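<Paragraph> For ease of reference, the constructor interface just described can be summarised as the following predicate signatures (argument orders are those of the prose above; note that the clauses given later in the paper write the proposition argument of atom first, and the concrete definitions are supplied separately by each of the three formalisms):
%   external(S, EF)              % EF is the external (output) form of S
%   atom(S, Prop)                % the content of S is the atomic proposition Prop
%   conjoin(S1, S2, S12)         % S12 conjoins the contents of S1 and S2
%   new_index(S, I)              % S records the (possibly new) referential index I
%   accessible_index(S, I)       % I is an accessible antecedent index in S
%   compose(S1, S2, S12)         % S12 is S1 followed by S2 (semantic precedence)
%   subordinate(S, SubName, Sub) % S contains a subordinate representation Sub named SubName
</Paragraph>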
<Paragraph position="13"> We turn now to the grammar without quantifier-raising. We formulate both the grammar and the semantic constructors in pure Prolog (exploiting the syntactic sugar of Definite Clause Grammar (Pereira and Shieber [12], pp. 70-79)) because it is expressive enough for our purposes and is widely used in work of this kind (see, inter alia, Colmerauer [3], Abramson and Dahl [1] and [11]).</Paragraph> <Paragraph position="14"> A Grammar using Semantic Constructors The grammar generates simple transitive clauses and subject-relative clauses that do not involve long-distance dependencies. It is based on the Montague-style grammars presented in Chapter 4 of Pereira and Shieber [12], and the treatments of agreement, Wh-dependencies, etc., presented there could also be incorporated without difficulty.</Paragraph> <Paragraph position="16"> compose(S1, S2, S), subordinate(Res, ResName, S1), compose(Res0, Res1, Res), subordinate(Scope, ScopeName, Res1), atom(ResName ==> ScopeName, S2).</Paragraph> <Paragraph position="17"> Most of the grammar should be familiar, even if it is somewhat more pedantically expressed than is usual. Following Pereira and Shieber (who were in turn inspired by Montague), VP and N meanings are represented by terms of the form X^S, where X represents a referential index and S represents an S meaning. NP meanings are represented by terms of the form VP^S (or equivalently, (X^S0)^S), where VP represents a VP meaning, X a referential index, and S and S0 represent S meanings. All manipulation of semantic values is performed by constructor primitives, rather than by explicit construction of terms. For example, the N1 production that introduces relative clauses invokes conjoin explicitly to conjoin the semantic values of the N and the relative clause to yield the semantic value of the N1. The sharing of the referential index X between the N and the VP is performed in the grammar alone, since it is a syntactic rather than semantic property of the construction.</Paragraph> <Paragraph position="18"> The semantic component of the production that introduces lexical nouns has two parts. S0 represents the atomic predicate Pred associated with the lexical meaning of the noun. S2 represents the fact that X is a (possibly new) referential index. The component S of the semantic value associated with the noun contains all of the information in S0 and S2.</Paragraph> <Paragraph position="19"> The production introducing (lexical) pronouns requires that the referential index X of the pronoun be accessible in S0, and specifies that the S component is the composition of S0 and the S0 component of the VP meaning. (Recall that the semantic representations of pronouns, like all NPs, are terms of the form VP^S, so the S0 is a component of the meaning of the VP or V phrase that this pronoun is an argument of.)</Paragraph>
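<Paragraph> The full grammar listing does not survive in this text; the following is a minimal sketch of rules of the kind just described. The category names, the auxiliary nonterminals np/1, vp/1 and relclause/1, and the exact decomposition are illustrative reconstructions, not the authors' own rules; the constructor predicates are assumed to be supplied by whichever constructor set is loaded.
% A sketch only: nonterminal names and rule shapes are reconstructions.
s(S) --> np(VP^S), vp(VP).          % an NP meaning VP^S applied to the VP meaning

n1(X^S) --> n(X^S).
n1(X^S) -->                          % N1 with a subject-relative clause:
    n(X^S0), relclause(X^S1),        % the index X is shared by the grammar alone,
    { conjoin(S0, S1, S) }.          % the two meanings are conjoined by the semantics

n(X^S) -->                           % lexical noun: S0 is the lexical predication,
    [man],                           % S2 records the (possibly new) index X,
    { atom(man(X), S0),              % argument order as in the paper's own clauses
      new_index(S2, X),
      conjoin(S0, S2, S) }.          % S contains all the information in S0 and S2
</Paragraph>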
<Paragraph position="20"> Undoubtedly the most complex component of the grammar is the lexical entry for every. Because the structure of the lexical entries for all anaphoric scope-inducing quantifiers will be similar to the entry for every, we explain it in some detail.</Paragraph> <Paragraph position="21"> The quantification induced by the determiner every is described in terms of the determiner's restriction, which defines the entities that the quantification ranges over, and its scope, the component of the expression quantified over. (2) indicates the components of the utterance corresponding to the restriction and the scope of the quantifier every in the absence of quantifier-raising.</Paragraph> <Paragraph position="22"> (2) Every man that saw a donkey kicked it.</Paragraph> <Section position="1" start_page="0" end_page="0" type="sub_section"> <SectionTitle> Restrictor Scope </SectionTitle> <Paragraph position="0"> The grammars presented here identify the restrictor and the scope of a determiner in the syntax; e.g. quantifier-raising arises from the grammar permitting multiple assignments of components of the utterance to the restrictions and scopes of the determiners of that utterance.</Paragraph> <Paragraph position="1"> The semantic value associated with the lexical entry for a determiner in the grammars presented here is a term of the form Res^Scope^Sentence, where Res is the semantic value associated with the restrictor and Scope is the semantic value associated with the scope. A grammar directly constructing predicate-logic style semantic representations would assign the lexical entry in (3) to the determiner every, where '==>' is interpreted as the implication operator in semantic representations (see Pereira and Shieber [12]).</Paragraph> <Paragraph position="3"> This lexical entry does not suffice for our purposes, since it provides no information about the relative anaphoric scope relationships between the restrictor, the scope, and that portion of the utterance external to the quantificational expression as a whole.</Paragraph> <Paragraph position="4"> Anaphors in opaque quantificational expressions can refer to entities superordinate to the quantificational expression, but in general anaphors outside of an opaque quantificational expression cannot refer to entities introduced in either the restriction or scope of the quantificational expression.2 Anaphors in the scope of an opaque quantificational expression can refer to entities introduced in the restriction of that expression (e.g. as in (2) above), but anaphors in the restriction cannot refer to entities introduced in the scope. 2 There are exceptions to this: for example, anaphors can refer to proper names introduced in the restrictor or scope of opaque determiners. Within the framework described below, this can be treated by adding a new semantic construction operator add_top_level, which adds a referential index to the most superordinate level.</Paragraph> <Paragraph position="5"> The compose and subordinate predicates in the lexical entry for every in the grammar presented above express subordination relationships that describe the behavior of opaque determiners. The semantic representation S is the composition of S1 and S2, where S2 is the semantic atom ResName ==> ScopeName. Res is subordinate to S1, and is itself the composition of Res0 and Res1, where Res0 is the semantic representation of the restrictor. Scope is subordinate to Res1, and is the semantic representation of the scope.</Paragraph> <Paragraph position="6"> The diagram on the following page sketches the relationship between the various semantic entities mentioned in the lexical entry for every. Subordination relationships are depicted by vertical lines (the name of the subordinate space is written alongside the line), and composition relationships are indicated by V-shaped diagonals.</Paragraph>
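<Paragraph> Putting the pieces together, the constructor calls shown earlier and the Res^Scope^Sentence form of a determiner meaning yield a lexical entry along the following lines. This is a sketch: the nonterminal name det, the way the semantic value is wired into it, and the operator declaration (and its priority) are our choices, not reproduced from the paper; the constructor calls themselves are those given above.
:- op(700, xfx, ==>).     % implication in semantic representations; priority is our choice

det(Res0^Scope^S) -->
    [every],
    { compose(S1, S2, S),                   % S is S1 followed by S2
      subordinate(Res, ResName, S1),        % Res is subordinate to S1, named ResName
      compose(Res0, Res1, Res),             % Res is Res0 (the restrictor) followed by Res1
      subordinate(Scope, ScopeName, Res1),  % Scope is subordinate to Res1, named ScopeName
      atom(ResName ==> ScopeName, S2) }.    % S2 is the semantic atom ResName ==> ScopeName
</Paragraph>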
</Section> <Section position="2" start_page="0" end_page="0" type="sub_section"> <SectionTitle> The Predicate-Logic Constructors </SectionTitle> <Paragraph position="0"> These constructors build a predicate-logic type of semantic representation in a fairly transparent fashion. Pronouns are treated as free variables, there are no constraints on their distribution, and anaphoric binding is not treated. Thus the constructors new_index, accessible_index, compose and subordinate have degenerate definitions.</Paragraph> <Paragraph position="1"> A property is identical with the term representing it: atom(Prop, Prop).</Paragraph> <Paragraph position="2"> The conjunction of P and Q is represented by the term P&Q.</Paragraph> <Paragraph position="3"> conjoin(P, Q, P&Q).</Paragraph> <Paragraph position="4"> There are no constraints on new indices.</Paragraph> <Paragraph position="5"> new_index(_, _).</Paragraph> <Paragraph position="6"> There are no constraints on accessible indices. accessible_index(_, _).</Paragraph> <Paragraph position="7"> Sequencing is unimportant.</Paragraph> <Paragraph position="8"> compose(P, P, P).</Paragraph> <Paragraph position="9"> A subordinate space can be introduced freely. subordinate(_, Sub, Sub).</Paragraph> <Paragraph position="10"> Internal and external forms are identical. external(P, P).</Paragraph> <Paragraph position="11"> The grammar described above and the predicate-logic constructors yield analyses such as the following:</Paragraph> <Paragraph position="13"> Roughly this latter form might be interpreted as: if X is a man and Y is a donkey and X owns Y, then there is a Z such that X beats Z.</Paragraph>
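<Paragraph> For convenience, the definitions above can be collected into a single loadable constructor set. The operator declarations for &amp; and ==> are our additions (the paper assumes them implicitly), and the particular priorities are a choice, not taken from the paper.
% The predicate-logic ("Montague-style") constructor set, collected from the text above.
:- op(600, xfy, &).        % conjunction in semantic representations
:- op(700, xfx, ==>).      % implication in semantic representations

atom(Prop, Prop).          % a property is identical with the term representing it
conjoin(P, Q, P&Q).        % the conjunction of P and Q is the term P&Q
new_index(_, _).           % no constraints on new indices
accessible_index(_, _).    % no constraints on accessible indices
compose(P, P, P).          % sequencing is unimportant
subordinate(_, Sub, Sub).  % a subordinate space can be introduced freely
external(P, P).            % internal and external forms are identical
</Paragraph>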
<Paragraph position="14"> The Sets-of-Infons Constructors The sets-of-infons and the discourse-representation constructors both constrain anaphora by requiring that the referential indices provided by the accessible_index constructor be indices that were introduced by new_index in some earlier representation (where precedence is defined by the compose constructor). This entails that the internal form of these semantic representations encodes information about preceding representations. Both constructors thread this information using the difference-list technique described in [8].</Paragraph> <Paragraph position="15"> The primitive element of the sets-of-infons representation is inspired by the infons of Situation Semantics [2]. We represent an infon as a term of the form Sit:P, which means that P is true in the situation Sit. For example, Kim's sleeping in situation s0 is represented by s0:sleep(kim). For simplicity, arbitrarily named constants (like the gensyms of Lisp) are used as the names of situations in this representation: this has the disadvantage that the definitions of the external and subordinate constructors are not declaratively specified.3 The internal form of a sets-of-infons representation has three components. We represent them in Prolog with a term of the form @(Sits, InfonsIn, InfonsOut). The first is a stack whose top element is the situation currently being defined, and whose other elements are the situations superordinate to this one (as defined by the subordinate constructor). The second component is the set of all infons introduced in representations preceding this one. The infons in this list associated with the current or a superordinate situation provide the information needed for the accessible_index constructor. 3 All that is required is that there is an infinite stock of situation names, so e.g. integers could have been used as situation names at the expense of a slight complication of the representation's data structures.</Paragraph> <Paragraph position="17"> The third component of the representation is the set of infons introduced in preceding representations with the addition of any infons added to the representation by the semantic construction operators. In describing the term @(Sits, InfonsIn, InfonsOut), we use the names InfonsIn and InfonsOut to stress the fact that they constitute a difference list.</Paragraph> <Paragraph position="19"> The atom constructor introduces a new atomic proposition P as an infon Sit:P, where Sit is the situation currently being constructed. Notice that InfonsOut is the same as InfonsIn but for the addition of (Sit:P).</Paragraph> <Paragraph position="20"> The compose constructor threads the difference list of infons through both of the representations, so the composed representation contains all of the infons added to the sets of infons composed. The conjoin constructor is equivalent to the compose constructor.</Paragraph> <Paragraph position="21"> The subordinate constructor introduces a new subordinate representation by pushing a new situation name Sit onto the list of (now superordinate) situations. The difference list of infons is threaded through the subordinate representation so that any infons added to it will appear in the superordinate representation as well.</Paragraph> <Paragraph position="22"> The new_index constructor adds an atom of the form i(Index) to the representation S: no constraints are placed on Index.</Paragraph> <Paragraph position="23"> The accessible_index constructor4 is satisfied for a referential index Index if Index was introduced by new_index in a preceding non-subordinate representation, i.e. if the context contains an infon Sit:i(Index), where Sit is the current or a superordinate situation name. 4 The predicate member used here, and elsewhere in this paper, has its standard logical definition, viz. member(X, [X|_]). member(X, [_|L]) :- member(X, L). If this definition is used with the grammars and constructors given in this paper, the SLD selection rule of Prolog may lead to non-termination. It is in general necessary to delay the evaluation of the member predicate until its second argument is instantiated, which can be done using the freeze primitive of Prolog II.</Paragraph>
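<Paragraph> The delaying just mentioned can be arranged, for example, as follows. This is a sketch: the predicate name delayed_member is ours, and freeze/2 is the coroutining primitive of Prolog II, also provided by modern systems such as SICStus and SWI-Prolog.
% A delayed membership test in the sense of the footnote above.
delayed_member(X, L) :- freeze(L, delayed_member_(X, L)).   % wait until L is instantiated

delayed_member_(X, [X|_]).
delayed_member_(X, [_|L]) :- delayed_member(X, L).          % keep delaying on the tail
</Paragraph>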
<Paragraph position="26"> The external(Internal, External) predicate initializes Internal to have no superordinate situations and no preceding context, and returns the list of infons associated with this Internal representation as its external form.</Paragraph> <Paragraph position="27"> When these constructors are used with the grammar defined above, the following analyses are obtained: ?- p([a, man, owns, a, donkey], S).</Paragraph> <Paragraph position="29"> This can be paraphrased as: Situation s0 contains individuals X and Y; in s0, X is a man, Y is a donkey and X owns Y.</Paragraph> <Paragraph position="31"> This can be paraphrased as: Situation s0 contains the fact that all situations of type s1 are also situations of type s2. A situation is of type s1 if it contains individuals X and Y, and X is a man and Y is a donkey. A situation is of type s2 if X owns Y.</Paragraph> <Paragraph position="33"> This can be paraphrased as: Situation s0 contains the fact that all situations of type s1 are also situations of type s2. A situation is of type s1 if it contains individuals X and Y, X is a man, Y is a donkey and X owns Y. A situation is of type s2 if X beats Y.5 5 The grammar and the sets-of-infons constructors also generate an additional reading in which the man that owns the donkey beats himself; i.e. it is taken as anaphorically dependent on every man. Simple extensions to the grammar (e.g. requiring the index of a pronoun to differ from the index of all c-commanding NPs) or the semantics (e.g. requiring the gender of the pronoun to agree with its antecedent's gender) would rule out this spurious analysis.</Paragraph> </Section> <Section position="3" start_page="0" end_page="0" type="sub_section"> <SectionTitle> The Discourse-Representation Constructors </SectionTitle> <Paragraph position="0"> The representations built by these constructors are inspired by the &quot;box representations&quot; of Kamp's (1981) Discourse Representation Theory [9]. A discourse representation &quot;box&quot; is represented by the list of items that constitute its contents. A representation is a difference-pair of the lists of the representations of the currently open boxes (i.e. the current box and all superordinate boxes), as in Johnson and Klein [8]. In Prolog, we use the binary '-' operator to separate the two members of the pair.</Paragraph> <Paragraph position="1"> atom(P, [B|Bs]-[[P|B]|Bs]).</Paragraph> <Paragraph position="3"> The atom constructor introduces a new atomic proposition P by adding it to the current box, i.e. the first element of the list of open boxes.</Paragraph> <Paragraph position="4"> The compose constructor threads the difference list representing the open boxes through both representations of the items being composed, in the same way that the compose constructor of the sets-of-infons representation does. The conjoin constructor is equivalent to the compose constructor.</Paragraph> <Paragraph position="5"> The subordinate constructor introduces an empty subordinate box onto the list of currently open boxes. The &quot;name&quot; B of the subordinate box is the list of atoms it contains.</Paragraph> <Paragraph position="6"> The new_index constructor adds an atom of the form i(Index) to the semantic representation: no constraints are placed on Index (as in the sets-of-infons representation).</Paragraph> <Paragraph position="7"> The accessible_index constructor is satisfied by a referential index Index if Index is introduced by new_index in a preceding non-subordinate representation, i.e. if one of the superordinate boxes contains i(Index).</Paragraph> <Paragraph position="9"> The external(Internal, External) predicate initializes Internal to have exactly one open box (empty), and returns the contents of that box as its external form.</Paragraph>
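<Paragraph> For concreteness, the threading just described for compose (and the equivalent conjoin) can be written as follows. This is a reconstruction from the prose: the paper's own clauses for these two constructors are not reproduced in this text.
% A representation is a difference pair BoxesIn-BoxesOut over the list of
% currently open boxes; compose threads that pair through both representations.
compose(Bs0-Bs1, Bs1-Bs2, Bs0-Bs2).
conjoin(S1, S2, S) :- compose(S1, S2, S).   % conjoin is equivalent to compose
</Paragraph>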
<Paragraph position="10"> With these constructors, the parser yields the following semantic values for the test sentences.</Paragraph> <Paragraph position="12"> This representation is true just in case there are two individuals X and Y, X is a man and Y is a donkey, and X owns Y.</Paragraph> <Paragraph position="14"> This representation is true just in case for all individuals X such that X is a man there is an individual Y such that Y is a donkey and X owns Y.</Paragraph> <Paragraph position="15"> ?- p([every, man, that, owns, a, donkey,</Paragraph> <Paragraph position="17"> This representation is true just in case for all individuals X and Y such that X is a man and Y a donkey and X owns Y, it is also true that X beats Y.</Paragraph> <Paragraph position="18"> Extending the Grammar to handle Quantifier-Raising In this section we sketch a syntactic account of quantifier-raising inspired by the implementation of Cooper storage (Cooper [4]) presented in Pereira and Shieber [12], to which we refer the reader for details. Each syntactic constituent is associated with a list of quantifiers that are &quot;in storage&quot; (this corresponds in an LF-movement account of quantifier-scope to being raised out of this constituent). Quantificational determiners add items to the quantifier store, and at S nodes, quantifiers are removed from the store and applied to the semantic representation. The quantifier-store of a node at which quantifiers are neither added nor removed is the shuffle of the quantifier-stores of its children.6 The grammar presented below is simply the grammar presented above with the addition of quantifier-storage. The lexical entries for this grammar are the same as the above, and so are not listed here.</Paragraph> <Paragraph position="20"> 6 Treating the quantifier-store as a syntactic feature can express many properties of LF-movement accounts, such as quantificational islandhood, etc., without the explicit construction of additional representations.</Paragraph> <Paragraph position="22"> The proposition shuffle(L1, L2, L3) is true just in case L3 is a list that can be seen as having been constructed in a sequence of steps in each of which the next available item is taken from either L1 or L2 and added to the end. So long as items remain on both L1 and L2, it is immaterial which of them supplies the next member of L3. What is important is that the members of L1 and L2 should all be on L3, and in their original order. This relationship is assured by three Prolog clauses, sketched below.</Paragraph> <Paragraph position="24"> The first clause asserts that the proposition is true of three empty lists, and serves to terminate the recursion implicit in the other two.</Paragraph> <Paragraph position="25"> The second clause says that, if Q2s and Q3s are suffixes of a pair of lists to be shuffled, and shuffling them gives Q1s, then the item that precedes Q1s in the final result can come from the first list, that is, it can be the item preceding Q2s. The third clause says that, alternatively, the item preceding Q1s can come from the second list.</Paragraph> <Paragraph position="26"> The grammar also makes use of the predicate apply_some(Quants, OldSemanticValue, UnappliedQuants, NewSemanticValue), which is true if applying zero or more quantifiers from the beginning of the list Quants to a given OldSemanticValue yields NewSemanticValue and leaves a suffix of that list of quantifiers, namely UnappliedQuants, still unapplied. It can be defined with a pair of clauses, included in the sketch below: the first terminates the sequence of applications and the second applies the next quantifier in sequence.</Paragraph>
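<Paragraph> The two predicates just described might be rendered as follows. The shuffle/3 clauses follow the prose clause by clause; for apply_some/4 we additionally assume that a stored quantifier is a term Sem0^Sem1 pairing the value it applies to with the result of the application (the paper's own representation of stored quantifiers is not shown in this text).
% shuffle(L1, L2, L3): L3 interleaves L1 and L2, preserving their order.
shuffle([], [], []).
shuffle([Q|Q2s], Q3s, [Q|Q1s]) :- shuffle(Q2s, Q3s, Q1s).   % next item from the first list
shuffle(Q2s, [Q|Q3s], [Q|Q1s]) :- shuffle(Q2s, Q3s, Q1s).   % next item from the second list

% apply_some(Quants, OldSem, UnappliedQuants, NewSem): a sketch under the
% assumption that each stored quantifier has the form OldSem^NewSem.
apply_some(Quants, Sem, Quants, Sem).                        % apply no (further) quantifiers
apply_some([Sem0^Sem1|Quants], Sem0, Unapplied, Sem) :-      % apply the next quantifier
    apply_some(Quants, Sem1, Unapplied, Sem).
</Paragraph>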
<Paragraph position="28"> The new grammar can be used with the three different semantic constructors presented above. Using the Predicate-Logic constructors, it yields results like the following:</Paragraph> <Paragraph position="30"> This example has two (semantically-equivalent) representations corresponding to the two scope possibilities for the two existentially quantified NPs.</Paragraph> <Paragraph position="31"> ?- q([every, man, owns, a, donkey], S).</Paragraph> <Paragraph position="33"> In this example the two non-equivalent representations correspond to the two different scope possibilities for the quantified NPs.</Paragraph> <Paragraph position="34"> These readings paraphrase as: There is a donkey Y and for each man X, X owns Y, and For each man X there is a donkey Y and X owns Y.</Paragraph> <Paragraph position="35"> ?- q([every, man, that, owns,</Paragraph> <Paragraph position="37"> In this example the two non-equivalent representations correspond to the two different scope possibilities for the quantified NPs. These readings paraphrase as: There is a donkey Y and for each man X such that X owns Y it is the case that X beats Y, and For each man X and donkey Y such that X owns Y, it is the case that X beats Y.</Paragraph> <Paragraph position="38"> Using the sets-of-infons constructors, we get the following results: ?- q([every, man, owns, a, donkey], S).</Paragraph> <Paragraph position="39"> S = s0:[s0:s1==>s2, s2:own(X,Y), s1:i(X), s1:man(X), s0:i(Y),</Paragraph> <Paragraph position="41"> The scope possibilities are indicated here by the situation in which the noun phrases are interpreted. The first reading displayed corresponds to the quantifier-raised interpretation, which paraphrases as: Situation s0 contains the individual Y, the fact that Y is a donkey, and the fact that for all ways of making s1 true, s2 is also true, where s1 contains the individual X and the fact that X is a man, and s2 contains the fact that X owns Y. Since Y is in s0, under this reading it is a potential antecedent for anaphors in following sentences.</Paragraph> <Paragraph position="42"> The second reading differs from the first in that the NP a donkey is interpreted in the subordinate situation s1 instead of s0. As well as causing a donkey to be quantificationally subordinate to every man, this also makes a donkey unavailable as a potential antecedent for anaphors in following sentences.</Paragraph> <Paragraph position="43"> We can therefore account for the fact that under normal intonation a donkey is interpreted as having wide scope over every man in the following discourse fragment (3).</Paragraph> <Paragraph position="44"> (3) Every man saw a donkey. 
It had a bushy tail. We now consider one of the famous &quot;donkey&quot; sentences: ?- q([every, man, that, owns, a, donkey,</Paragraph> <Paragraph position="46"> The first reading displayed again corresponds to the quantifier-raised interpretation, which paraphrases as: Situation s0 contains an individual Y, and the facts that Y is a donkey and that every way of making s1 true also makes s2 true, where s1 contains the individual X and the facts that X is a man and X owns Y, and s2 contains the fact that X beats Y.</Paragraph> <Paragraph position="47"> Finally, the discourse-representation constructors yield the following: These representations are direct notational variants of the two sets-of-infons representations of this sentence given above. The truth conditions of the first reading correspond to the wide-scope interpretation of a donkey, and can be paraphrased as: There is a donkey Y, and for every man X, X owns Y.</Paragraph> <Paragraph position="48"> ?- q([every, man, that, owns, a, donkey, beats, it], S).</Paragraph> <Paragraph position="50"> man(X)] ==> [beat(X,Y)]] Again, these representations are direct notational variants of the two sets-of-infons representations of this sentence given above. The truth conditions of the first reading correspond to the wide-scope interpretation of a donkey, and can be paraphrased as: There is a donkey Y, and for every man X such that X owns Y, X beats Y.</Paragraph> <Paragraph position="51"> The same correlation between quantificational scope and anaphoric scope holds with these constructors, as expected.</Paragraph> <Paragraph position="52"> Conclusion We have worked out a scheme for computing the logical forms of sentences incrementally in the course of parsing them which we believe achieves an unprecedented level of abstraction of the semantic from the syntactic parts of the grammar. The very incrementality of the scheme might be used to argue against it. Given the prevalence of scope ambiguities, the interests of computational efficiency may be best served by a scheme that delays all semantic computation until the parsing is complete so as not to work unnecessarily on phrases that turn out not to be capable of incorporation in a complete analysis of the sentence. Hobbs and Shieber [7] adopt such a scheme apparently on the grounds of greater perspicuity. In any case, the modifications that need to be made to our scheme are entirely trivial, requiring only the introduction of a modest amount of symbolic computation. Basically, the idea is to use operations which, instead of returning pieces of the final logical form incrementally and nondeterministically, return expressions that will exhibit this nondeterministic behavior when evaluated later. The later evaluation will, of course, be as specified by the definitions we have given. In short, we believe that the abstractions we have created effectively isolate the syntactic rules both from the corresponding semantic formalism and from the architecture of the system by which both of them will be interpreted.</Paragraph> </Section> </Section> </Paper>