<?xml version="1.0" standalone="yes"?>
<Paper uid="E93-1029">
  <Title>Mathematical Aspects of Command Relations</Title>
  <Section position="3" start_page="240" end_page="243" type="metho">
    <SectionTitle>
2 Grammatical Relations on Trees
</SectionTitle>
    <Paragraph position="0"/>
    <Section position="1" start_page="240" end_page="241" type="sub_section">
      <SectionTitle>
2.1 Definitions
</SectionTitle>
      <Paragraph position="0"> A tree is an object T = (T, &lt;, r) with r the root and &lt; a tree ordering. We write x ≺ y if x is immediately dominated by y; in mathematical jargon y is said to cover x. A leaf is an element which covers no element; x is interior if it is neither a leaf nor the root. int(T) is the set of interior nodes of T. We put ↓x = {y | y ≤ x} and ↑x = {y | y ≥ x}. ↓x is called the lower and ↑x the upper cone of x. If R ⊆ T² is a binary relation we write Rx = {y | xRy} and call Rx the R-domain of x. A function f : T → T is called monotone if x ≤ y implies f(x) ≤ f(y), increasing if x ≤ f(x) for all x, and strictly increasing if x &lt; f(x) for all x &lt; r.</Paragraph>
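      <Paragraph> The following sketch is not part of the paper; it is a minimal Python illustration, with a hypothetical five-node tree encoded as a child-to-parent map (the root maps to itself), of how the cones and the properties just defined can be computed.
PARENT = {"r": "r", "a": "r", "b": "r", "c": "a", "d": "a"}

def upper_cone(x):
    """All y with y ≥ x: x itself and everything dominating it, listed bottom-up."""
    cone = [x]
    while PARENT[cone[-1]] != cone[-1]:
        cone.append(PARENT[cone[-1]])
    return cone

def dominates(y, x):
    """y ≥ x in the tree ordering."""
    return y in upper_cone(x)

def lower_cone(x):
    """All y with y ≤ x."""
    return {y for y in PARENT if dominates(x, y)}

def is_increasing(f):
    """f(x) dominates x for every x (f given as a dict from nodes to nodes)."""
    return all(dominates(f[x], x) for x in PARENT)

def is_monotone(f):
    """Whenever y dominates x, f(y) dominates f(x)."""
    return all(dominates(f[y], f[x]) for x in PARENT for y in upper_cone(x))

print(upper_cone("c"), lower_cone("a"))   # ['c', 'a', 'r'] and {'a', 'c', 'd'} (in some order)
      </Paragraph>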
      <Paragraph position="1"> Definition 1 A binary relation R ⊆ T² is called a command relation (CR for short) iff there exists a function f_R : T → T such that (1), (2) and (3) hold; R is called monotone if in addition it satisfies (4), and tight if it satisfies (5) in addition to (1) - (3). f_R is called the associated function of R.</Paragraph>
      <Paragraph position="2"> (1) Rx = ↓f_R(x) (2) x &lt; f_R(x) for all x &lt; r (3) f_R(r) = r (4) x ≤ y implies f_R(x) ≤ f_R(y) (5) x &lt; f_R(y) implies f_R(x) ≤ f_R(y).</Paragraph>
      <Paragraph position="3"> (1) expresses that f_R(x) represents R; (2) and (3) express that f_R must be strictly increasing. If (4) holds, f_R is monotone. A tight relation is monotone; for if x ≤ y and y &lt; r then y &lt; f_R(y) and so x &lt; f_R(y); whence f_R(x) ≤ f_R(y) by (5). For some reason [Barker and Pullum, 1990] do not count monotonicity as a defining property of CRs even though there is no known command relation that fails to be monotone. Given a set P ⊆ T we can define a function g_P by (†) g_P(x) = min{y | y ∈ P, y &gt; x}. We put min ∅ = r; thus g_P(r) = r. Let xPy iff y ≤ g_P(x); then g_P is the associated function of this relation, which is commonly referred to as P-command. We call P the basic set of g_P as well as of P-command.</Paragraph>
      <Paragraph position="4"> Here are some examples. With P the set of branching nodes, P-command is c-command; with P = T we have that P-command is IDC-command. When we take P to be the set of maximal projections we obtain M-command, and, finally, with P the set of bounding nodes, e.g. {NP, S}, the relation so defined becomes identical to Lasnik's KOMMAND. Lasnik's KOMMAND is identical to 1-node subjacency under the typical definition of subjacency.</Paragraph>
      <Paragraph position="5"> Relations that are of the form P-command for some P ⊆ T are called fair.</Paragraph>
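      <Paragraph> A small sketch of (†) and of the examples above, reusing the hypothetical parent-map tree from the previous sketch; the basic sets P are assumptions chosen purely for illustration.
PARENT = {"r": "r", "a": "r", "b": "r", "c": "a", "d": "a"}

def dominates(y, x):
    """y ≥ x in the tree ordering."""
    while x != y and PARENT[x] != x:
        x = PARENT[x]
    return x == y

def g(P, x):
    """g_P(x): the least node strictly above x that lies in P; the root if there is none."""
    y = x
    while PARENT[y] != y:
        y = PARENT[y]
        if y in P:
            return y
    return y                              # min of the empty set is taken to be the root

def p_command_domain(P, x):
    """The P-command domain of x: every node dominated by g_P(x)."""
    return {z for z in PARENT if dominates(g(P, x), z)}

# P = T gives IDC-command; P = the branching nodes gives c-command.
print(p_command_domain(set(PARENT), "c"))   # {'a', 'c', 'd'}: the IDC-command domain of c
print(p_command_domain({"r", "a"}, "d"))    # {'a', 'c', 'd'}: the c-command domain of d
      </Paragraph>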
      <Paragraph position="6"> Theorem 2 R is fair iff it is tight. There are exactly 2^{#int(T)} tight relations over T.</Paragraph>
      <Paragraph position="8"> Proof. (⇒) Let R be P-command for some P ⊆ T and suppose that x &lt; g_P(y). Then g_P(x) = min{z ∈ P | z &gt; x} ≤ g_P(y) since g_P(y) ∈ P. (⇐) Put P = {f_R(x) | x ∈ T}.</Paragraph>
      <Paragraph position="9"> We have to show (†). By (5), however, f_R(x) = min{f_R(z) | f_R(z) &gt; x}, which by the choice of P is just min{y | y ∈ P, y &gt; x}. For the second claim observe first that if P, Q differ only in exterior nodes then P-command and Q-command coincide. If, however, x ∈ P - Q is interior then y ≺ x for some y, and g_P(y) = x but g_Q(y) &gt; x. ∎ Tight relations have an important property: even when the structure of the tree is lost and we know only the P-command relation, we can recover g_P and &lt; to some extent. Notice namely that if Px ≠ T then g_P(x) is the unique y such that y ∈ Px but the P-domain of y is larger than the P-domain of x. We can then say exactly which elements are dominated by y: exactly the elements of the P-domain of x. Consequently, if we are given T, the root r and the IDC-command domains, &lt; can be recovered completely.</Paragraph>
      <Paragraph position="10"> This is of relevance to syntax because often the tree structures are not given directly but are recovered using domains.</Paragraph>
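      <Paragraph> The recovery remark can also be made concrete. The sketch below is an illustration under the same hypothetical encoding, not the paper's construction: given the root and the IDC-command domains D[x], it rebuilds the parent map, using the fact that for D[x] ≠ T the mother of x is the unique y in D[x] with a strictly larger domain.
def recover_parents(D, r):
    nodes = set(D)
    parent = {r: r}
    for x in nodes - {r}:
        if D[x] == nodes:                 # x sits immediately below the root
            parent[x] = r
        else:                             # the unique y in D[x] whose domain is larger
            parent[x] = next(y for y in D[x] if len(D[y]) > len(D[x]))
    return parent

# IDC-command domains for the tree with root r, daughters a and b, and c, d below a.
D = {"r": {"r", "a", "b", "c", "d"},
     "a": {"r", "a", "b", "c", "d"},
     "b": {"r", "a", "b", "c", "d"},
     "c": {"a", "c", "d"},
     "d": {"a", "c", "d"}}
print(recover_parents(D, "r"))   # {'r': 'r', 'a': 'r', 'b': 'r', 'c': 'a', 'd': 'a'}
      </Paragraph>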
    </Section>
    <Section position="2" start_page="241" end_page="241" type="sub_section">
      <SectionTitle>
2.2 Lattice Structure
</SectionTitle>
      <Paragraph position="0"> Let f, g be increasing functions; then define (f ⊔ g)(x) = max{f(x), g(x)}, (f ⊓ g)(x) = min{f(x), g(x)} and (f ∘ g)(x) = f(g(x)).</Paragraph>
      <Paragraph position="2"> Since f(x), g(x) ≥ x, that is, f(x), g(x) ∈ ↑x, and since ↑x is linear, the maximum and minimum are always defined. Clearly, with f and g increasing, f ⊔ g, f ⊓ g and f ∘ g are also increasing. Furthermore, if f and g are strictly increasing, the composite functions are strictly increasing as well.</Paragraph>
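      <Paragraph> A sketch of these three operations on associated functions over the hypothetical parent-map tree used above (functions are dicts from nodes to nodes); since the upper cone of every node is linearly ordered, depth below the root serves as the comparison key. Lemma 3 below can then be checked pointwise.
PARENT = {"r": "r", "a": "r", "b": "r", "c": "a", "d": "a"}

def depth(x):
    d = 0
    while PARENT[x] != x:
        x, d = PARENT[x], d + 1
    return d

def join(f, g):
    """(f ⊔ g)(x) = max{f(x), g(x)}: the higher, i.e. less deep, of the two values."""
    return {x: min((f[x], g[x]), key=depth) for x in PARENT}

def meet(f, g):
    """(f ⊓ g)(x) = min{f(x), g(x)}: the lower, i.e. deeper, of the two values."""
    return {x: max((f[x], g[x]), key=depth) for x in PARENT}

def compose(f, g):
    """(f ∘ g)(x) = f(g(x))."""
    return {x: f[g[x]] for x in PARENT}
      </Paragraph>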
      <Paragraph position="3"> Lemma 3 f_{R∪S} = f_R ⊔ f_S. f_{R∩S} = f_R ⊓ f_S.</Paragraph>
      <Paragraph position="4"> Proof. y ≤ f_{R∪S}(x) iff x(R ∪ S)y iff either xRy or xSy iff either y ≤ f_R(x) or y ≤ f_S(x) iff y ≤ max{f_R(x), f_S(x)}. Analogously for intersection. ∎ Theorem 4 For any given tree T the command relations over T form a distributive lattice Cr(T) under ∩ and ∪, which contains the lattice Mon(T) of monotone CRs as a sublattice.</Paragraph>
      <Paragraph position="5"> Proof. By the above lemma, the CRs over T are closed under intersection and union. Distributivity follows automatically, since lattices isomorphic to lattices of sets with intersection and union as operations are always distributive. The second claim follows from the fact that if f_R, f_S are both monotone, so are f_R ⊔ f_S and f_R ⊓ f_S. We prove one of these claims. Assume x ≤ y. Then f_R(x) ≤ f_R(y) and f_S(x) ≤ f_S(y), hence f_R(x) ≤ max{f_R(y), f_S(y)} as well as f_S(x) ≤ max{f_R(y), f_S(y)}. So max{f_R(x), f_S(x)} ≤ max{f_R(y), f_S(y)} and therefore f_{R∪S}(x) ≤ f_{R∪S}(y), by definition. ∎ Proposition 5 g_{P∪Q} = g_P ⊓ g_Q. Hence tight relations over a tree are closed under intersection. They are generally not closed under union.</Paragraph>
      <Paragraph position="6"> Proof. Let P, Q ⊆ T be two sets and consider the relations P-command and Q-command based upon them. Then the intersection of these relations is derived from the union P ∪ Q of the basic sets. Namely, g_{P∪Q}(x) = min{y | y ∈ P ∪ Q, y &gt; x} = min{g_P(x), g_Q(x)} = (g_P ⊓ g_Q)(x).</Paragraph>
      <Paragraph position="8"> To see that tight relations are not necessarily closed under union, take the union of NP-command and S-command. If it were tight, the nodes of the form g(x) for some x would define the set on which this relation must be based. But this set is exactly the set of bounding nodes, which defines Lasnik's kommand. The latter, however, is the intersection, not the union, of these relations. ∎ The consequences of this theorem are the following. The tight relations form a sub-semilattice of the lattice of command relations; this semi-lattice is isomorphic to (2^{int(T)}, ∪). Although the natural join of tight relations is not necessarily tight, it is possible to define a join in the semi-lattice. This operation is completely determined by the meet-semilattice structure, because this structure determines the partial order of the elements, which in turn defines the join. In order to distinguish this join from the ordinary one we write it as P * Q. The corresponding basic set from which this relation is generated is the set P ∩ Q; this is the only choice, because the semilattice (2^{int(T)}, ∪) allows only one extension to a lattice, namely (2^{int(T)}, ∪, ∩). The notation for associated functions is the same as for the relations: if g_P and g_Q are associated functions, then g_P * g_Q = g_{P∩Q} denotes the associated function of the (tight) join.</Paragraph>
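      <Paragraph> A concrete illustration of Proposition 5 and of the tight join, again with the hypothetical five-node tree and two illustrative basic sets: intersecting the relations corresponds to uniting the basic sets, while the tight join is based on their intersection.
PARENT = {"r": "r", "a": "r", "b": "r", "c": "a", "d": "a"}

def g(P, x):
    """Associated function of P-command."""
    y = x
    while PARENT[y] != y:
        y = PARENT[y]
        if y in P:
            return y
    return y

P, Q = {"a"}, {"r"}
x = "c"
print(g(P, x), g(Q, x))              # a r
print(g(P.union(Q), x))              # a: the meet (g_P ⊓ g_Q)(x), as in Proposition 5
print(g(P.intersection(Q), x))       # r: the tight join (g_P * g_Q)(x), based on P ∩ Q
      </Paragraph>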
    </Section>
    <Section position="3" start_page="241" end_page="242" type="sub_section">
      <SectionTitle>
2.3 Composition
</SectionTitle>
      <Paragraph position="0"> For monotone relations there is more structure. Consider the definition of the relational product R ∘ S = {(x, z) | (∃y)(xRySz)}. Then f_{R∘S} = f_S ∘ f_R (with converse ordering!). For a proof consider the largest z such that x(R ∘ S)z.</Paragraph>
      <Paragraph position="1"> Then there exists a y such that xRySz. Now let ŷ be the largest y such that xRy. Then not only xRŷ but also ŷSz, since S is monotone. By choice of ŷ, ŷ = f_R(x). By choice of z, z = f_S(ŷ), since f_S(ŷ) &gt; z would contradict the maximality of z. In total, z = (f_S ∘ f_R)(x), which is what had to be proved.</Paragraph>
      <Paragraph position="2"> From the theory of binary relations it is known that ∘ distributes over ∪, that is, that we have R ∘ (S ∪ T) = (R ∘ S) ∪ (R ∘ T) as well as (S ∪ T) ∘ R = (S ∘ R) ∪ (T ∘ R). Lemma 6 Let R, S and T be monotone CRs. Then R ∘ (S ∩ T) = (R ∘ S) ∩ (R ∘ T) and (S ∩ T) ∘ R = (S ∘ R) ∩ (T ∘ R).</Paragraph>
      <Paragraph position="4"> Proof. Let x(R ∘ (S ∩ T))z, that is, xRy(S ∩ T)z, that is, xRySz and xRyTz for some y. Then, by definition, x(R ∘ S)z and x(R ∘ T)z and so x((R ∘ S) ∩ (R ∘ T))z. Conversely, if the latter is true then x(R ∘ S)z and x(R ∘ T)z and so there are y_1, y_2 with xRy_1Sz and xRy_2Tz. With y = max{y_1, y_2} we have xRy(S ∩ T)z since S, T are monotone. Thus x(R ∘ (S ∩ T))z. Now for the second claim. Assume x((S ∩ T) ∘ R)z, that is, x(S ∩ T)yRz for some y. Then xSy, xTy and yRz, which means x(S ∘ R)z and x(T ∘ R)z and so x((S ∘ R) ∩ (T ∘ R))z. Conversely, if the latter holds then x(S ∘ R)z and x(T ∘ R)z and so there exist y_1, y_2 with xSy_1Rz and xTy_2Rz. Put y = min{y_1, y_2}. Then xSy, xTy, hence x(S ∩ T)y.</Paragraph>
      <Paragraph position="5"> Moreover, yRz, from which x((S ∩ T) ∘ R)z. ∎ Definition 7 A distributoid is a structure (D, ∩, ∪, ∘) such that (1) (D, ∩, ∪) is a distributive lattice, (2) ∘ is an associative operation and (3) ∘ distributes both over ∩ and ∪.</Paragraph>
      <Paragraph position="6"> Theorem 8 The monotone CRs over a given tree form a distributoid, denoted by Dis(T). ∎</Paragraph>
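      <Paragraph> The distribution laws of Lemma 6, and hence part of the distributoid structure of Theorem 8, can be checked by brute force on a small example. The sketch below is an illustration over the hypothetical five-node tree, with three tight (hence monotone) command relations represented as sets of pairs.
from itertools import product

PARENT = {"r": "r", "a": "r", "b": "r", "c": "a", "d": "a"}
NODES = set(PARENT)

def dominates(y, x):
    while x != y and PARENT[x] != x:
        x = PARENT[x]
    return x == y

def g(P, x):
    y = x
    while PARENT[y] != y:
        y = PARENT[y]
        if y in P:
            return y
    return y

def command(P):
    """The command relation based on P, as a set of pairs (x, y)."""
    return {(x, y) for x, y in product(NODES, NODES) if dominates(g(P, x), y)}

def comp(R, S):
    """Relational product R ∘ S."""
    return {(x, z) for x, z in product(NODES, NODES)
            if any((x, y) in R and (y, z) in S for y in NODES)}

R, S, T = command({"a"}), command({"r"}), command({"a", "r"})
assert comp(R, S.intersection(T)) == comp(R, S).intersection(comp(R, T))
assert comp(S.intersection(T), R) == comp(S, R).intersection(comp(T, R))
print("both halves of Lemma 6 hold for these relations")
      </Paragraph>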
    </Section>
    <Section position="4" start_page="242" end_page="243" type="sub_section">
      <SectionTitle>
2.4 Normal Forms
</SectionTitle>
      <Paragraph position="0"> The fact that distributoids satisfy so many distributive laws means that for composite CRs there are quite simple normal forms. Namely, if ℜ is a CR composed from the CRs R_1, ..., R_n by means of ∩, ∪ and ∘, then we can reproduce ℜ in the following simple form. Call ℜ a chain if it is composed from the R_i using only ∘. Then ℜ is identical to an intersection of unions of chains, and it is identical to a union of intersections of chains. Namely, by (3), both ∩ and ∪ can be moved outside the scope of ∘. Moreover, ∩ can be moved outside the scope of ∪, and ∪ can be moved outside the scope of ∩.</Paragraph>
      <Paragraph position="1"> Theorem 9 (Normal Forms) For every ℜ = ℜ(R_1, ..., R_n) there exist chains over R_1, ..., R_n such that ℜ is an intersection of unions of these chains, and chains such that ℜ is a union of intersections of these chains.</Paragraph>
      <Paragraph position="3"> From the linguistic point of view, tight relations play a key role because they are defined as a kind of topological closure of nodes with respect to the topology induced by the various categories. (However, this analogy is not perfect, because the topological closure is an idempotent operation while the domain closure yields larger and larger sets, eventually being the whole tree.) It is therefore reasonable to assume that all kinds of linguistic CRs be defined using tight relations as primitives. Indeed, [Koster, 1986] argues for quite specific choices of fundamental relations, which will be discussed below. It is worthwhile to ask how much can be defined from tight relations. This proves to yield quite unexpected answers. Namely, it turns out that union can be eliminated in the presence of intersection and composition. We prove this first for the simplest case.</Paragraph>
      <Paragraph position="4"> Lemma 10 Let g_P, g_Q be the associated functions of tight relations. Then g_P ⊔ g_Q = (g_P ∘ g_Q) ⊓ (g_Q ∘ g_P) ⊓ (g_P * g_Q). Proof. First of all, since g_P, g_Q ≤ g_P ∘ g_Q, g_Q ∘ g_P, g_P * g_Q, we have g_P ⊔ g_Q ≤ (g_P ∘ g_Q) ⊓ (g_Q ∘ g_P) ⊓ (g_P * g_Q). The converse inequation needs to be established. There are three cases for a node x.</Paragraph>
      <Paragraph position="6"> (i) g_P(x) = g_Q(x). Then the next P-node above x is identical to the next Q-node above x and so is identical to the next P ∩ Q-node above x. (ii) g_P(x) &lt; g_Q(x). Then with y = g_P(x) we also have g_Q(y) = g_Q(x), by tightness. Hence (g_P ⊔ g_Q)(x) = (g_Q ∘ g_P)(x). (iii) g_P(x) &gt; g_Q(x).</Paragraph>
      <Paragraph position="7"> Then, as in (ii), (g_P ⊔ g_Q)(x) = (g_P ∘ g_Q)(x). ∎</Paragraph>
      <Paragraph position="8"> The next case is the union of two chains of tight relations. Let g = g_m ∘ g_{m-1} ∘ ... ∘ g_1 and h = h_n ∘ h_{n-1} ∘ ... ∘ h_1 be two associated functions of such chains. Then define a splice of g and h to be any chain k = k_ℓ ∘ k_{ℓ-1} ∘ ... ∘ k_1 such that ℓ = m + n, each k_i is g_j or h_j for some j, each g_i and h_j occurs exactly once, and the order of the g_i as well as the order of the h_j in the splice is as in their original chains. So the situation is comparable with shuffling two decks of cards into each other. A weak splice is obtained from a splice by replacing some number of g_i ∘ h_j and h_j ∘ g_i by g_i * h_j, the least tight relation containing both g_i and h_j. In a weak splice the shuffling is not perfect, in the sense that some pairs of cards may be glued to each other. If g = g_2 ∘ g_1 and h = h_2 ∘ h_1 then the following are all splices of g and h: g_2 ∘ g_1 ∘ h_2 ∘ h_1, g_2 ∘ h_2 ∘ g_1 ∘ h_1, g_2 ∘ h_2 ∘ h_1 ∘ g_1, ... The following are weak splices (in addition to the splices, which are also weak splices): g_2 ∘ (g_1 * h_2) ∘ h_1, ...</Paragraph>
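      <Paragraph> The card-shuffling picture can be made concrete. The following sketch (purely illustrative; chains are tuples of symbolic factor names, written leftmost factor first, and only weak splices with a single glued pair are generated) enumerates splices and such weak splices.
def splices(g, h):
    """All interleavings of g and h that preserve the order within each chain."""
    if not g:
        return [h]
    if not h:
        return [g]
    return ([(g[0],) + s for s in splices(g[1:], h)] +
            [(h[0],) + s for s in splices(g, h[1:])])

def weak_splices_one_glue(g, h):
    """The splices, plus every variant in which one adjacent g/h pair is glued by *."""
    out = set(splices(g, h))
    for s in splices(g, h):
        for i in range(len(s) - 1):
            a, b = s[i], s[i + 1]
            if a[0] != b[0]:                      # one factor from each chain
                out.add(s[:i] + ("*".join(sorted((a, b))),) + s[i + 2:])
    return out

G = ("g2", "g1")
H = ("h2", "h1")
print(sorted(splices(G, H)))                      # the six splices of g2∘g1 and h2∘h1
print(len(weak_splices_one_glue(G, H)))           # splices plus the singly glued weak splices
      </Paragraph>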
      <Paragraph position="10"> Lemma 11 Let g, h be two chains of tight relations (or their associated functions). Let wk(g, h) be the set of weak splices of g and h. Then g ⊔ h = ⊓(s | s ∈ wk(g, h)). Proof. As before, it is not difficult to show that g ⊔ h ≤ ⊓(s | s ∈ wk(g, h)), because g, h ≤ s for each weak splice s. So it is enough to show that the left-hand side is equal to one of the weak splices in any tree at any given node. Consider therefore a tree T and a node x ∈ T. We define a weak splice s such that s(x) = max{g(x), h(x)}. To this end we define the following nodes: x_0 = x, y_0 = x,</Paragraph>
      <Paragraph position="12"> x_{i+1} = g_{i+1}(x_i), y_{i+1} = h_{i+1}(y_i), and so on. The x_i and the y_i each form an increasing sequence. We can also assume that both sequences are strictly increasing, because otherwise there would be an i such that x_i = r or y_i = r. Then (g ⊔ h)(x) = r and so for any weak splice s, s(x) = r as well. So all the x_i can be assumed distinct, and all the y_i as well. Now we define the z_i as follows.</Paragraph>
      <Paragraph position="14"> z_1 = min{x_1, ..., x_m, y_1, ..., y_n} and z_{i+1} = min({x_1, ..., x_m, y_1, ..., y_n} - {z_1, ..., z_i}). Thus the sequence of the z_i is obtained by fusing the two sequences along the order given by the upper cone ↑x. Finally, the weak splice can be defined.</Paragraph>
      <Paragraph position="15"> We begin with s_1. If x_1 = y_1, s_1 = g_1 * h_1; if x_1 &lt; y_1, s_1 = g_1; and if x_1 &gt; y_1 then s_1 = h_1. Generally, for z_{i+1} there are three cases. First, z_{i+1} = x_j = y_k for some j, k. Then s_{i+1} = g_j * h_k. Else z_{i+1} = x_j for some j, but z_{i+1} ≠ y_k for all k. Then s_{i+1} = g_j. Or else z_{i+1} = y_k for some k but z_{i+1} ≠ x_j for all j; then s_{i+1} = h_k. It is straightforward to show that s as just defined is a weak splice, that z_{i+1} = s_{i+1}(z_i) and hence that s(x) = max{g(x), h(x)}. ∎ The tight relations generate a subdistributoid of Dis(T), members of which we call tight generable.</Paragraph>
      <Paragraph position="16"> Theorem 12 Each tight generable command relation is an intersection of chains of tight relations.</Paragraph>
    </Section>
  </Section>
  <Section position="4" start_page="243" end_page="245" type="metho">
    <SectionTitle>
3 Introducing Boolean Labels
</SectionTitle>
    <Paragraph position="0"/>
    <Section position="1" start_page="243" end_page="243" type="sub_section">
      <SectionTitle>
3.1 Boolean Grammars
</SectionTitle>
      <Paragraph position="0"> We now provide the means to define CRs uniformly over trees. The trees are assumed to be labelled.</Paragraph>
      <Paragraph position="1"> For mathematical convenience the labels are drawn from a boolean algebra B = (L, 0, 1, −, ∩, ∪). A labelling is a function ℓ : T → L. ℓ is called full if ℓ(x) is an atom of B or 0 for every x. If either ℓ(x) = a = 0 or 0 &lt; ℓ(x) ≤ a we say that x is of category a. Labelled trees are generated by boolean grammars. Since syntax abstracts away from actual words to word classes, each named by its own syntactic label, we may with impunity refrain from discriminating between the terminal labels. This allows us to give all of them the unique value 0, which is now the only terminal, the non-terminals being all elements of L - {0}. A boolean grammar is defined as a triple G = (B, Σ, R) where R is a finite subset of (L - {0}) × L⁺ and Σ ∈ L - {0}. G generates</Paragraph>
      <Paragraph position="3"> labelled trees in the following way: a labelled tree is generated by G iff its root is of category Σ and for every node x with daughters y_1, ..., y_n there is, under an appropriate order of the indices, a rule a → b_1, ..., b_n in R such that x is of category a and y_i is of category b_i for all i. Boolean grammars are a mild step away from context-free grammars. Namely, if a → b_1 ... b_n is a boolean rule, we may consider it as an abbreviation of the set of rules a* → b_1* ... b_n* where a* is an atom of B below a and b_i* is an atom of B below b_i for each i. Likewise, the start symbol abbreviates a set of start symbols Σ*, which by familiar tricks can be replaced by a single one, denoted by R, which is added artificially. In this way we can translate G into a cfg G* over the set of atoms of B plus 0 and the new start symbol R, which generates the same fully labelled trees (ignoring the deviant start symbol). It is known that there is an effective procedure to eliminate from a cfg labels that never occur in a finite tree generated by the grammar (see e.g. [Harrison, 1978]). This procedure can easily be adapted to boolean grammars. A boolean grammar without such superfluous symbols is called normal.</Paragraph>
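      <Paragraph> A sketch of the expansion into atomic rules, under the simplifying assumption that boolean labels are encoded directly as frozensets of atoms (so one label lies below another iff it is a subset of it); the atoms and rules are hypothetical.
from itertools import product

ATOMS = {"NP", "VP", "V", "N"}
RULES = [
    (frozenset({"VP"}), (frozenset({"V"}), frozenset({"NP"}))),
    (frozenset({"NP", "VP"}), (frozenset({"N"}),)),           # a disjunctive mother label
]

def atomic_rules(rules):
    """Expand each boolean rule into the set of cfg rules over atoms that it abbreviates."""
    out = set()
    for mother, daughters in rules:
        for choice in product(mother, *daughters):
            out.add((choice[0], choice[1:]))
    return out

for lhs, rhs in sorted(atomic_rules(RULES)):
    print(lhs, "→", " ".join(rhs))
      </Paragraph>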
    </Section>
    <Section position="2" start_page="243" end_page="244" type="sub_section">
      <SectionTitle>
3.2 Domain Specification
</SectionTitle>
      <Paragraph position="0"> Each boolean label a defines the relation of a-command on a fully labelled tree via the set of nodes of category a. This is the classical scenario: the label S defines S-command, the label NP ∪ CP defines Lasnik's Kommand, and so forth. We denote the particular relation induced on (T, ℓ) by σ_T(a).</Paragraph>
      <Paragraph position="1"> From this basic set of tight CRs we allow more complex CRs to be defined using the operations. To do this we first define a constructor language that contains a constant a for each a ∈ L and the binary symbols ∧, ∨ and ∘. (Although we also use *, we will treat it as an abbreviation; also, this operation is defined only for tight relations.) Since we assume the equations of distributoids, the symbols a generate a distributoid with ∧, ∨, ∘, namely the so-called free distributoid. The map σ_T can be extended to a homomorphism from this distributoid into Dis(T).</Paragraph>
      <Paragraph position="3"> By definition, the image of a under σ_T is tight generable. Hence σ_T maps all nearness terms into tight generable relations. With NP ∪ CP being 1-node subjacency (for English) we find that (NP ∪ CP) ∘ (NP ∪ CP) is 2-node subjacency. Using a more complex definition it is possible to define 0- and 1-subjacency in the barriers system, on the condition that there are no double segments of a category. If we consider the power of subsystems of this language, e.g. the relations definable using only ∧, and so on, the following picture emerges. (Diagram of the inclusions between the classes of relations definable from the subsystems of {∘, ∧, ∨} not reproduced.)</Paragraph>
      <Paragraph position="5"> This follows mainly from Theorem 12, because the map σ_T is by definition into the distributoid of tight generable CRs. Moreover, ∧ alone does not create new CRs, because of Prop. 5. Each of the inclusions is proper, as is not hard to see. So ∨ does not add definitional strength in the presence of ∘ and ∧; although things may be more perspicuously phrased using ∨, it is in principle eliminable. By requiring CRs to be intersections of chains we would therefore not express a real restriction at all.</Paragraph>
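      <Paragraph> A sketch of how σ_T can be computed on a labelled branch. Terms of the constructor language are nested tuples over label constants (encoded as frozensets of atoms, as above) and the connectives; each term is interpreted through its associated function. The branch, the labels and the category names are all hypothetical.
PARENT = {"x0": "NP1", "NP1": "IP", "IP": "CP", "CP": "NP2", "NP2": "root", "root": "root"}
LABEL  = {"x0": "V", "NP1": "NP", "IP": "IP", "CP": "CP", "NP2": "NP", "root": "IP"}

def depth(x):
    d = 0
    while PARENT[x] != x:
        x, d = PARENT[x], d + 1
    return d

def g(P, x):                                     # associated function of P-command
    y = x
    while PARENT[y] != y:
        y = PARENT[y]
        if y in P:
            return y
    return y

def f(term, x):
    """Associated function of the relation denoted by a term, computed recursively."""
    if isinstance(term, frozenset):              # a label constant: command by its category
        return g({n for n in PARENT if LABEL[n] in term}, x)
    op, s, t = term
    if op == "o":                                # composition: first s, then t, going up
        return f(t, f(s, x))
    if op == "and":                              # intersection of relations: pointwise min
        return max(f(s, x), f(t, x), key=depth)
    if op == "or":                               # union of relations: pointwise max
        return min(f(s, x), f(t, x), key=depth)
    raise ValueError(op)

subjacency1 = frozenset({"NP", "CP"})            # the label NP ∪ CP: 1-node subjacency
subjacency2 = ("o", subjacency1, subjacency1)    # (NP ∪ CP) ∘ (NP ∪ CP): 2-node subjacency
print(f(subjacency1, "x0"), f(subjacency2, "x0"))   # NP1 CP
      </Paragraph>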
    </Section>
    <Section position="3" start_page="244" end_page="245" type="sub_section">
      <SectionTitle>
3.3 The Equational Theory
</SectionTitle>
      <Paragraph position="0"> Given a boolean grammar G, a tree T and two domains δ, ε constructed from the labels of G, we write T ⊨ δ = ε if σ_T(δ) = σ_T(ε).</Paragraph>
      <Paragraph position="2"> The set Eq(G) of equations δ = ε such that T ⊨ δ = ε for every finite tree T generated by G is called the equational theory of G. To determine the equational theory of a grammar we proceed through a series of reductions. G admits the same finite trees as does its normal reduct G^n. So we might as well assume from the start that G is normal. Second, domains are insensitive to the branching nature of rules. We can replace with impunity any rule ρ = a → b_1 ... b_n by the set of rules ρ^u = {a → b_i | i ≤ n}. We can do this for all rules of the grammar. The grammar G^u = (B, Σ, R^u) where R^u = {ρ^u | ρ ∈ R} is called the unary reduct of G. It has the same equational theory as G, since the trees it generates are exactly the branches of trees generated by G. Next we reduce the unary grammar to an ordinary cfg G^{u*} in the way described above, with an artificially added start symbol R. This grammar is completely isomorphic to a transition network, alias a directed graph, with single source R and single sink 0. This network is realized over the set of atoms of B plus R and 0. There are only finitely many such networks over a given B: to be exact, at most 2^{(n+2)²} (!), where n is the number of atoms of B.</Paragraph>
      <Paragraph position="3"> Finally, it does no harm if we add some transitions from R and transitions to 0. First, if we do so, the equational theory must be included in the theory of G, since we allow more structures to be generated.</Paragraph>
      <Paragraph position="4"> But it cannot really be smaller: we are in any case interested in all substructures ↑x for nodes x, so adding transitions to 0 has no effect. Moreover, adding transitions from R can only give more equations, because the trees generated by this new transition system are branches where some lower and some upper cone is cut off. Thus, rather than taking the grammar G^{u*} we can take a grammar with some more rules, namely all transitions R → A, A → 0 for an atom A, plus R → 0. In all, the roles of source and sink are completely emptied, and we might as well forget about them. What we keep to distinguish grammars is the directed graph on the atoms of B induced by the unary reduct of G. Let us denote this graph by Gph(G). We have seen that if two grammars G, H have the same graph, their equational theories are the same. The converse also holds. To see this, take an atom A and let A° be the disjunction of all atoms B such that B → A is a transition in the graph (or, equivalently, in the unary reduct) of G.</Paragraph>
      <Paragraph position="5"> Then A ∘ A° = A ∘ 1 ∈ Eq(G). However, if A° ≰ C then A ∘ C = A ∘ 1 ∉ Eq(G). If G and H have different graphs, then there must be an A such that A° computed in G differs from A° computed in H; write these as A°_G and A°_H. That is, either A°_G ≰ A°_H or A°_H ≰ A°_G.</Paragraph>
      <Paragraph position="6"> Consequently, either A ∘ A°_G = A ∘ 1 ∉ Eq(H) or A ∘ A°_H = A ∘ 1 ∉ Eq(G). Theorem 13 Eq(G) = Eq(H) iff Gph(G) = Gph(H). Hence it is decidable for any pair G, H of boolean grammars over the same labels whether or not Eq(G) = Eq(H). ∎</Paragraph>
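      <Paragraph> A sketch of the comparison behind Theorem 13, with boolean labels again encoded as frozensets of atoms and two hypothetical grammars: compute the directed graph on atoms induced by the unary reduct and compare.
def graph(rules):
    """Gph(G): the edges between atoms induced by the unary reduct of the given rules."""
    edges = set()
    for mother, daughters in rules:
        for d in daughters:                      # unary reduct: one daughter at a time
            for a in mother:
                for b in d:
                    edges.add((a, b))
    return edges

G_RULES = [(frozenset({"VP"}), (frozenset({"V"}), frozenset({"NP"})))]
H_RULES = [(frozenset({"VP"}), (frozenset({"V"}),)),
           (frozenset({"VP"}), (frozenset({"NP"}),))]

# Same graph, hence, by Theorem 13, the same equational theory.
print(graph(G_RULES) == graph(H_RULES))          # True
      </Paragraph>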
      <Paragraph position="9"> The question is now how we can decide whether a given domain equation holds in a grammar. We know by the reductions that we can assume this grammar to be unary. Now take an equation δ = ε. Suppose this equation is not in the theory and we have a countermodel. This countermodel is a non-branching labelled tree T and a node x such that σ_T(δ)x ≠ σ_T(ε)x. Let Sf(δ) denote the set of subformulas of δ and Sf(ε) the set of subformulas of ε. Put S = {f_η(x) | η ∈ Sf(δ) ∪ Sf(ε)}. S is certainly finite and its cardinality is bounded by the sum of the cardinalities of Sf(δ) and Sf(ε). Now let y, z be two points from S such that y &lt; z and for all u such that y &lt; u &lt; z, u ∉ S. Let u_1 and u_2 be two points such that y &lt; u_1 &lt; u_2 &lt; z and such that u_1 and u_2 have the same label. We construct a new labelled tree U by dropping all nodes from u_1 up until the node immediately below u_2. The following holds of the new model: (i) it is a tree generated by G and (ii) σ_U(δ)x ≠ σ_U(ε)x. Namely, if w ≺ u_1 then ℓ(u_1) → ℓ(w) is a transition of G, hence ℓ(u_2) → ℓ(w) is a transition of G as well, because ℓ(u_1) = ℓ(u_2); and so (i) is proved. For (ii) it is enough to prove that for all η ∈ Sf(δ) ∪ Sf(ε) the value f_η(x) in the new model is the same as the value f_η(x) in the old model.</Paragraph>
      <Paragraph position="10"> (Identification is possible, because these points have not been dropped.) This is done by induction on the structure of η. Suppose then that η = ζ ∧ θ, and that f_ζ(x) and f_θ(x) have the same values in both models; then so does f_η(x) = min{f_ζ(x), f_θ(x)}.</Paragraph>
      <Paragraph position="12"> And similarly for η = ζ ∨ θ. By the normal form theorem we can assume δ to be a disjunction of conjunctions of chains, so by the previous reductions it remains to treat the case where η is a chain. Hence let η = ζ ∘ a. We assume that f_ζ(x) has the same value y in both models. Let z := f_η(x). Then if y &lt; r, y &lt; z, and else y = z. By construction, z is the first node above y to be of category a, and z ∈ S, by which z is not dropped. In the reduced model, z is again the first node of category a above y, and so f_η(x) = g_a(y) = z there as well, which is what had to be shown.</Paragraph>
      <Paragraph position="13"> Assume now that we have a tree of minimal size generated by G in which δ = ε does not hold. Then if y, z ∈ S are such that y &lt; z but for no u ∈ S, y &lt; u &lt; z, then in between y and z all nodes have different labels. Thus in between y and z sit no more points than there are atoms in B. Let this number be n; then our model has size at most n · #S. Now if we want to decide whether or not δ = ε is in Eq(G), all we have to do is first generate all possible branches of trees of length at most n × (#Sf(δ) + #Sf(ε)) + 2 and check the equation on them. If it holds everywhere, then indeed δ = ε is valid in all trees, because otherwise we would have found a countermodel of at most this size.</Paragraph>
      <Paragraph position="14"> Theorem 14 It is decidable whether or not δ = ε ∈ Eq(G). ∎ These theorems tell us that there is nothing dangerous in using domains in grammar as concerns the question whether the predictions made by this theory can effectively be computed; that is, as long as one sticks to the given format of domain constructions, it is decidable whether or not a given grammatical theory makes a certain prediction about domains.</Paragraph>
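      <Paragraph> The bounded search behind Theorem 14 can be sketched as follows; everything here (the transition graph, the atoms, the chosen length bound) is an illustrative assumption, and the bound from the text would be plugged in for max_len. Branches are lists of atomic labels, root first, and nodes are addressed by their position.
EDGES = {("CP", "IP"), ("IP", "NP"), ("IP", "VP"), ("VP", "NP"), ("NP", "CP")}
ATOMS = {a for e in EDGES for a in e}

def branches(max_len):
    """All label sequences of length up to max_len that follow the transition graph."""
    out = [[a] for a in ATOMS]
    frontier = out[:]
    for _ in range(max_len - 1):
        frontier = [b + [d] for b in frontier for (m, d) in EDGES if m == b[-1]]
        out.extend(frontier)
    return out

def g_index(P, branch, i):
    """Nearest position strictly above i whose label lies in P; 0 (the root) if none."""
    for j in range(i - 1, -1, -1):
        if branch[j] in P:
            return j
    return 0

def f(term, branch, i):
    if isinstance(term, frozenset):
        return g_index(term, branch, i)
    op, s, t = term
    if op == "o":
        return f(t, branch, f(s, branch, i))
    if op == "and":
        return max(f(s, branch, i), f(t, branch, i))     # lower node = larger index
    return min(f(s, branch, i), f(t, branch, i))         # "or": higher node = smaller index

def holds(delta, eps, max_len):
    """Check the equation delta = eps at every node of every branch up to the bound."""
    return all(f(delta, b, i) == f(eps, b, i)
               for b in branches(max_len) for i in range(len(b)))

NP, CP = frozenset({"NP"}), frozenset({"CP"})
print(holds(("or", NP, CP), ("or", CP, NP), 6))   # True: the two terms denote the same relation
print(holds(NP, CP, 6))                           # False: refuted on some branch
      </Paragraph>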
    </Section>
  </Section>
</Paper>