<?xml version="1.0" standalone="yes"?>
<Paper uid="W98-0505">
  <Title>Types of syntagmatic grammatical relations and their representation</Title>
  <Section position="2" start_page="0" end_page="41" type="metho">
    <SectionTitle>
2 'Constituency or dependency or constituency and dependency?' A brief revisit
</SectionTitle>
    <Paragraph position="0"> constituency and dependency?' A brief revisit It was shown already quite early in the discussion of constituency vs. dependency that dependency representations and constituency representations are at least weaidy equivalent (Hays, 1964; Gaifman, 1965; Robinson, 1970). However, the discussion has come up again and again bringing forward a number of arguments for and against dependency-only/ constituency-only and for and against hybrid approaches. null Dependency-only approaches. Dependency-only approaches (Tesniere, 1959) maintain that it is sufficient to account for the relation between words for a syntactic description to be adequate, the word being the only syntactic unit acknowledged. Fig-</Paragraph>
    <Paragraph position="2"> She left the possum on the deck  ure 1 shows an example of a syntactic structure resulting from a dependency analysis. The early arguments put forward against the dependency-only approach in the areas of linear sequencing (e.g., (Baumgiirtner, 1970)), features and categorization of higher nodes, and headless constructions could be largely dismissed. Linear order was considered a problem for dependency grammars at a time in the development of grammar theory when in constituency-based grammars sequence was reflected ill tile surface-syntactic tree.</Paragraph>
    <Paragraph position="3"> With removing linear ordering from tree representations and formulating sequencing rules separately, this was no longer considered a problem for dependency grammars (cL (Matthews, 1981))..,Moreover, higher nodes as domains for rule application have been shown not to be necessary because they can equally well be formulated on words, e.g., gapping rules can be formulated on verbs (Hudson, 1989).</Paragraph>
    <Paragraph position="4"> Furthermore. headless constructions can be circumvented, if the notion of category is broadened so that there will be no headless constructions; see e.g., (Hudson, 1980. 194-195).</Paragraph>
    <Paragraph position="5"> Characteristic of current dependency approaches like Meaning-Text Models (XtTM:s; (Mel'~uk. 1988)) or Word Grammar (Hudson, 1984) is the notion of lexicalization: the descriptive burden is in the lexicon, which carries information that acts as constraint on syntactic structure. In particular, the notion of valence is often combined with that of dependency by associating valence with heads, whose properties thus become major constraining factors on syntactic structure.</Paragraph>
    <Paragraph position="6"> Constituency-only approaches. At the other extreme is tile constituency-only position arguing for heads not being necessary for syntactic description, if constituency relations are accounted for. For an example of a traditional constituency structure see Figure 2.</Paragraph>
    <Paragraph position="7"> Strong arguments for the constituency-only position are brought forward for instance in (Zwicky,  1985). Zwicky mainly discusses five candidates for tile concept of head: the subcategorizand, the semantic ar~ment, the morphosvntactic locus, the determinant of concord, and the constituent determining government. These notions have to be included in any grammar model, if it is to interface with semantics, the lexicon, and morphology. However, it should not be necessary to introduce a separate category 'head', unless it can be sho~a that the head-like notions can be generalized int O one category that one could then call 'head'.</Paragraph>
    <Paragraph position="8"> Analyzing six syntactic constructions (Det + N, V + NP, Aux + VP, P + NP, NP + VP, and Comp + S), Zwicky shows that the x~arious head-like notions represent different, actually competing, analyses of syntactic structure. There is identity only between the semantic functor, which he has not listed as a head candidate, the subcategorizand and the governor. Also, the three additional head-like notions that are considered--two of which are often quoted as providing operational criteria for headship~ the distributional equivalent and the obligatory element, the other one representing the head concept of ruler used in dependency grammar--are completely new concepts that do not harmonize with the other five.</Paragraph>
    <Paragraph position="9"> In conclusion, a head is not only superfluous, but it would be a completely different additional category whose use for a grammar model is doubtful.</Paragraph>
    <Paragraph position="10"> Hybrid constituency/ dependency approaches. In a reply to (Zwicky, 1985), (Hudson, 1987) arrives at the opposite conclusion. He argues for a different analysis of Zwicky's sample constructions which reveals that 'head' can be considered a uni~'ing category of most of the head-like notions brought forth by Zwick): As all additional head-like notion Hudson puts forward the semantic functor--rather than the semantic argument--because it is the semantic functor, ill his view, that must be taken as 'semantically characterizing' (Hudson, 1987, 115). The semantic argument is thus taken away from the list of candidate heads; and also the determinant of concord is removed because there is no dependency im-olved in concord, as Hudson maintains. On the basis of these a priori alterations, Hudson argues that if all the remaining head-like notions were either identical with the semantic functor or not applying, then one could claim that most of the head-like notions are tile same category; and that therefore, a generalizing super-category 'head' could be established that embraces them all.</Paragraph>
    <Paragraph position="11"> The critical points in (Zwicky, 1985) are removed by Hudson's analysis with no contradictions remaining and he concludes that 'head' is a grammatical category on a par with grammatical functions but more general, allowing generalizations that can oth- null erwise not be made (cf. (Hudson, 1987, 131)).</Paragraph>
    <Paragraph position="12"> What (Zwicky. 1985) does not realize with his starting point and analysis results is that the convergence of semantic functor, subcategorizand, and governor can already be of adrantage. Creating the super-category of head for these converging notions can provide a general category which can actually be used as an anchor for both valence (subcategorization) and government, as well as for semantic role assignment, thus providing a straightforward way&amp;quot; of interfacing semantics and syntax.</Paragraph>
    <Paragraph position="13"> The identity of semantic functor, subcategorizand and governor is actually what underlies the notion of head both in MT.XI's as proposed by Meaning Text Theory (.XITT) (.Mel'~uk. 1988). a dependency nmdel.</Paragraph>
    <Paragraph position="14"> and I-IPSG (Pollard and Sag, 1987), a hybrid model.</Paragraph>
    <Paragraph position="15"> Ill MT.M'S. governn~ent patterns are associated with lexemes that are considered heads in the syntactic zone of the lexicon, covering subcategorization and case government, and act as constraint on syntactic structure. Similarly, in HPSG, SUBCAT lists are associated with lexemes that are heads and the sub-categorization principle takes care of the 'projection' of that information in a phrasal unit.</Paragraph>
    <Paragraph position="16"> The concepts of head and dependency had actually been taken up already in early Transformational Grammar, e.g., by (Hays, 1964; Robinson, 1970; Anderson, 1971), and incorporated in the deep structure representation. Most clearly, however, a concept of head received a special status in X-bar syntax (Chomsk3, 1970&amp;quot; Jackendoff, 1977), which has become the phrase structure model underlying man)&amp;quot; current grammar aproaches. For example, in Government and Binding (GB) theory, X-bar theory is a subtheory on a par with Binding Theory, Theta Theory', etc., in LFG, c-structure representations are based on X-bar. and also HPSG'S syntactic structure representations conform to the X-bar scheme. For a sample X-bar structure see Figure 3. Similar to dependency grammars, X-bar goes together with a strong notion of lexicalization, where syntactic constraints are primarily associated with lexemes or lexical classes and projected to syntactic structure.</Paragraph>
    <Paragraph position="17"> Subcategorization and government are surely two essential aspects of a grammatical description on the syntagmatic plane. There are other aspects to cover, however; two other kinds of syntagrnatic pattenting that need to be considered in an exhaustive treatment of syntagrnatic relations are agreement and word order. Also, what has not been considered in the Zwicky-Hudson debate are complex syntactic units, such as for example coordinate st1~tctures. These potentially present problems for hierarchical representations such as dependency and constituency, as we will see below in Section 3.</Paragraph>
    <Paragraph position="18"> Coordinate structures are in fact a notorious prol)lem for both dependency and constituency approaches, and there are numerous proposals of how to treat them. Word order, in particular word order x~riation attributed to information distribution is another notorious problem. Agreement, while being a well-understood phenomenon, can be a problem for a dependency-only analysis.</Paragraph>
    <Paragraph position="19"> Looking at the ~ariety of treatments suggested in these areas, it seems that constituency is hard pressed to accommodate information structure, that the representation of coordinate structure is problematic for both constituency and dependency, and that agreement cannot be described as involving a dependency relation in the strict sense (see Section 3 below). In the next section these observations are illustrated discussing coordinate structure in Word Grammar (WG) (Hudson, 1984), information structure in Combinatory Categorial Grammar (CCG) (Steedman, 1991) and agreement in Head-Driven Phrase Structure Grammar (HPSG) (Pollard and Sag, 1994).</Paragraph>
    <Paragraph position="21"> She left the possum on the deck</Paragraph>
  </Section>
  <Section position="3" start_page="41" end_page="42" type="metho">
    <SectionTitle>
3 Limits of dependency and constituency: Coordinate structure, information structure and agreement
</SectionTitle>
    <Paragraph position="0"> constituency: Coordinate structure, information structure and agreement In this section I illustrate tile problematic nature of hierarchical representations, such as constituency and dependency, for the representation of coordinate structure, infornmtion structure and syntactic agreement. More particularly. I discuss * why coordinate structure is a problem for a dependency granunar like WG (Section 3.1), * why CCG. as an example of constituency-only approaches, works quite well with information structure (Section 3.2), and  * why HPSG, a hybrid model, works well for agreement (Section 3.3).</Paragraph>
    <Paragraph position="1"> 3.1 Coordinate structure: the limit of dependency in WG Word grammar (Hudson, 1984; Fraser and Hudson, 1992) strives to account for all grammatical relations  by head-dependent relations. However, there is one type of construction where Hudson concedes the necessity of constituency: this is the coordinate structure, including incomplete conjuncts 1 as in gapping, reduced conjunct and right-node raising constructions. null While the problem for the majority of constituency-based approaches is how to accommodate the conjunction in the phrase structure representation and how to deal with phrasally incomplete conjuncts, in a dependency grammar the IA conjunct is a component part of a coordinate structure (Hudson, 1984).</Paragraph>
    <Paragraph position="2">  problem is that there is no unit acknowledged with which a coordinate structure can be referred to, since conjuncts (and, but, etc) cannot be considered the heads (or dependents) of these constructions. For coordinate structures, constituency has to be introduced to wc, so that bracketings such as I like ((red apples) and (green plums)) become possible (el. (Hudson, 1984, 218)). Hudson thus has to single out the representation of coordinate structure from the rest of the representational apparatus: Dependency is not a possible kind of representation.</Paragraph>
    <Section position="1" start_page="42" end_page="42" type="sub_section">
      <SectionTitle>
3.2 Information structure in CCG
</SectionTitle>
      <Paragraph position="0"> Information structuring is a problem for traditional constituent- approaches because units of information structure often do not coincide with the units established by phrase structure. For example, a traditional phrase structure for Fred ate the beans would reflect tile following bracketing: (Fred) (ate the beans), which coincides only with one possible information structuring, where Fred is the Given element, but not with an information structure where Fred ate is the Given element (of. (Steedman, 1991, 274-275)). The definition of 'Given' used here is that of (Halliday, 1985). It is that element in clause structure that represents the cotextually or contextually known information. 2 See Figure 4 illustrating these two kinds of information structuring.</Paragraph>
      <Paragraph position="1"> Fled ate the beans.</Paragraph>
      <Paragraph position="2">  (1) What about the beans? Who ate them? (Fred) (ate tile beans) (2) What about Fred? What did he eat?  Fred ate the beans As (Steedman, 1991) points out, proposals for intonation structure, which is the reflex of information distribution in spoken mode, that try to deal with this divergence either come up with very complex derivations of intonational structure from a surface syntactic structure or they stipulate two autonomous levels of representation. These have to communicate, however, and the representation is 2Another term that has been used for Given is Topic -however, the notion of Topic is often a conflation of Given and Theme, which in Halliday's view is distinct from Given: While Given represents that part of an utterance that is presented as known, Theme is that part which is taken to be the point of departure of a message, whether that is given or new information. In English, Theme is said to occupy the first position in the clause; Gi~en can be coexistent with Theme, but can also be part of Rheme, or cover more constituents thau Theme.</Paragraph>
      <Paragraph position="3"> thus considerably complicated (cf. (Steedman, 1991, 261)). In Categorial Grammar, constituency groupings other than the ones of traditional PS-markers are possible - including those that are coexistent with information structuring, as shown in the proposal of (Steedman. 1991) using Combinatory Categorial Grammar (CCG). In CCG a constituent grouping of Fred ate the beans as (Fred ate) (the beans) is thus possible, opening up the possibility of a unified treatment of information structure and syntactic structure.</Paragraph>
      <Paragraph position="4"> Traditional constituency inhibits a fornmlation of information structuring, in which information structure, and consequently intonation structure, and syntactic structure are isomorphic, because it subscribes to one particular kind of constituent grouping. null</Paragraph>
    </Section>
    <Section position="2" start_page="42" end_page="42" type="sub_section">
      <SectionTitle>
3.3 Syntactic agreement in HPSG
</SectionTitle>
      <Paragraph position="0"> Syntactic agreemeut can be described as involving the sharing of grammatical features across some of tile component parts of a syntactic unit. While for syntactic agreement tile domains of agreement axe often coexistent with head-dependent domains, it is not necessarily the case that tile head is the determinant of concord, nor is it true that it is one set of features that is 'shared' across all the component parts. Thus. heads cmmot simply be equated with determinants of concord.</Paragraph>
      <Paragraph position="1"> However, as just pointed out, the domain of agreement is often coexistent with head-dependent groupings. and agreement can be described based on head-dependent structures. One such proposal for the German nonfinal group is made in (Pollard and Sag, 1994).</Paragraph>
      <Paragraph position="2"> In terms of agreement relations, tile German nominal group (NG) can be briefly characterized as follows: German nouns carr.v grammatical gender, number and case. Adjectives are said to carry these features, too. and exhibit three inflectional patterns: weak, strong and mixed. The choice of inflectional class depends on whether the nominal group contains a definite determiner, a nonspecific or zero deternfiner, or an indefinite deternfiner. Determiners are roughly either definite or indefinite, and they reflect gender, number and case as well. Agreement is therefore not attributable to the head noun as tile source of agreement constraints, but rather, there are several determinants of agreement that affect different grammatical features.</Paragraph>
      <Paragraph position="3"> In (Pollard and Sag, 1994), agreement in the German NG is described in the following way.</Paragraph>
      <Paragraph position="4"> Case agreement is accounted for by feature (or structure) sharing between a head (the noun, of which CASE is an attribute) and its dependents.</Paragraph>
      <Paragraph position="5"> Here, the determinant of concord is coexistent with the head. This is specified by structure sharing of tile CASE attribute between head and dependents, e.g., between the head noun and the determiner: \[HEADno~n noun\[CASE #i\]</Paragraph>
    <Paragraph position="0"> Adjectives are described as having GENDER and NUMBER attributes in the CONTENTIINDEX slot and they are structure-shared with the index of the noun that the adjective modifies. This is accounted for by the general scheme for head-adjunct structures, where the adjunct's MOD ~alue is shared with the head's SYNSEM x-alue. Furthermore, adjectives, or more precisely adjectix~l forms, are described as imposing restrictions on the kind of determiner they can combine with, e.g., forms belonging to the weak inflectional class restrict the determiners they combine with to be of type strong whereas adjectix~l forms of the strong class require the determiner in the nominal group to be weak or absent. The sign representing kluge M~dchen (smart girl)</Paragraph>
    <Paragraph position="2"> (cf. (Pollard and Sag, 1994, 87)) can thus only combine with the determiner das (definite determiner), but not with ein (indefinite determiner).</Paragraph>
    <Paragraph position="3"> This description of agreement in the German nominal group acknowledges several determinants of concord, which conforms to the linguistic observations made about the phenomenon. The determinant of concord for case is the head noun, for gender and number it is also the head noun; and agreement between adjective and determiner (the relation between selection of type of determiner and type of adjectival form) is interpreted as the type of inflectional class of the adjective selecting the type of determiner, i.e., the adjective is taken as determinant of concord.</Paragraph>
    </Section>
  </Section>
  <Section position="5" start_page="42" end_page="45" type="metho">
    <SectionTitle>
4 A diversified view of syntagmatic relations: SFG
</SectionTitle>
    <Paragraph position="0"> relations: SFG Ill the preceding section I have tried to illustrate that some kinds of syntagrnatic patternings are hard to fit into dependency and constituency representations. The approaches that do work - like CCG for information structure, and HPSG for agreement, make use of all mltraditional notion of phrase structure and a rather flexible notion of dependency, respectively.</Paragraph>
    <Paragraph position="1"> Abstracting away from the particular representational means that have been discussed here, constituency and dependency, I now want to point to another way of looking at syntagmatic relations that is not a priori committed to a strict notion of dependency or a traditional notion of phrase structure and is therefore unlikely to encounter the problems dicussed ill the preceding section. This is Systemic</Paragraph>
    <Paragraph position="3"> Functional Grammar (SFG; (Hallida), 1973; Halliday, 1985)) which is known in computational linguistics foremostl.v by application ill Natural Language Generation (NLG) (e.g., (Matthiessen and Bateman, 1991; Fawcett and Tucker, 1989; Bateman et al., 1991; Teich and Bateman, 1994)). The representational aspect of SFC, that is most prominent is the system network. System networks are descriptions of paradignlatic ga'ammatical relations intended as declarative statements of grammatical features and the coocurrence constraints between them. System networks are like the type hierarchies used in HPS(J ill this regard (cf. also (Bateman, 1991: Bateman et al., 1992; Henschel, 1995; Teich, in press)): the grammatical types in SI:G, however, are functionally rather than surface-syntactically motivated. Also. constraints on syntactic structure are tied to grammatical t.vpes, so that one could speak of a 'grammaticized&amp;quot; grammar~as opposed to lexicalized grammars. Ill SFG, it is not lexical, but grammarital classes that exhibit constraints on syntactic structure.</Paragraph>
    <Paragraph position="4"> Because syntactic structure is one of the less prominent topics in NLG and because SF6 is primarily a classification-based approach to gramnmr.</Paragraph>
    <Paragraph position="5"> SFG's representation of syntagmatic structure is less known.</Paragraph>
    <Paragraph position="6"> SFG maintains that there are four different kinds of syntagmatic patterning (Halliday, 1979).</Paragraph>
    <Paragraph position="7"> Prosodic structure. Agreement is a syntagmatic phenomenon that is ptvsodic in nature, in the sense that a particular realizational effect spreads over more than one constituent, similar to prosodic features that are strung throughout an intonational unit (see Figure 5 displaying Subject-Finite agreelnellt). null  we find ill clauses: informational prominence, which shows ill the distribution of Given and New, where in spoken mode. the intonation focus, which falls into the New part of the utterance, marks the informational pronfinence by carrying the major pitch change. Another point of prominence is thematic prominence: the structuring of a clause in theme and rheme (see Figure 6).</Paragraph>
    <Paragraph position="8"> Interdependency structure. Coordinate structures belong to a class of structure called in- null paratactic structures and are opposed to hypotactic, i.e., subordinate, structures. While a dependency representation can handle the latter because there is an identifiable head element, there is no head element in paratactic structures: its elements are rather mutually dependent. The term interdependency is used to cover both hypotactically and paratactically related syntactic units, s For an example see Figure 7.</Paragraph>
    <Paragraph position="9">  Ill SFG, the notation for the elements of paratactic structures is 1, 2, etc. and for those of hypotactic stuctures it is ta, 8, ~&amp;quot; etc. Paratactic structures are said not to have heads, whereas in hypotactic structures, n is considered the head. The additional labeling (Extender, Extension, Enhanced, Enhancement) marks the semantic-rhetorical relation between the dements of the structure: In (1) in Figure 7 the 1 element is said to be extended (by the 2 element); ill (2), tile a element is said to be enhanced by the '3 element.</Paragraph>
    <Paragraph position="10"> Constituency structure. The fourth kind of syntagmatic patterning SF6 finds is one that has unique elements such as Subject, Object, Predicate or Actor, Goal, Process. 4 Here, constituency is considered an appropriate means of representation. The categorial values (S, NP, PP etc) are simply attributes associated with these functional constituents. See Figure 8 for an example.</Paragraph>
    <Paragraph position="11">  A unified view: the function structure. A representation of syntagmatic structure in SFG comprises all of these aspects. The distinction of syntagmatic patterning into these four aspects stems from the conception of fimction in SFG. The functional nmtivation of categories in the system network, i.e., tile granunatical type hierarchy, is fourfold: paradigmatic gramnmtical types are exper/entially, logically, interpersonally or textually motivated. Each of these describes the grammar of a language from a different angle and goes together with a particular mode of expression in syntagmatic structure. The experiential aspect of syntagmatic structuring is elemental, reflecting part-whole relations--tiffs aspect can be suitably represented by constituency: the logical aspect can be represented by a special kind of dependency structure, the interdependency structure, which represents part-part relations; the interpersonal and tile textual aspects, however, prosodic and periodic structure, are difficult to press into these schemes because they can cut across constituency boundaries or nmy contradict constituency groupings. Therefore, if all of these aspects of syntagmatic patterning are to be uniformly described in one representation, the colastituency representation part should be as little committed to a particular kind of grouping as possible. so as to avoid conflicts with other groupings as required for instance by information distribution. In fact, the kind of constituency SFG subscribes to is a nmltiple-branching structure, where the nodes are functionally annotated, reflecting a minimal bracketing strategy (see below).</Paragraph>
    <Paragraph position="12"> A syntagmatic representation at clause level of Fred ate the beans, for example, where in terms of infornmtion structure Fred is Given would look as follows: '~  ate the beans Sin a feature structure notation (here: including information about categorial and lexical realization) this is: \[ac$or: II\[NP \[lex: Frcd\]\], subject: 81, theme: goal: 12\[NP \[lex: bean~\], object: $2, process: $3\[V \[lex: ate\]\], finite: #3, theme: &lt;#2,#3&gt;, hey: &lt;#2,#3&gt;\] Ill, And for the interpretation with Fred ate as Given, 'Given' can be conflated with both Actor and Process (Subject and Finite):</Paragraph>
    <Section position="1" start_page="43" end_page="45" type="sub_section">
      <SectionTitle>
Actor Process Goal
Theme Rheme
Given New
Subject Finite Object
</SectionTitle>
      <Paragraph position="0"> Fred ate the beans Here. a constituent is codescribed from several perspectives, taking into account the different aspects of syntagmatic patterning sketched above, each creating a separate &amp;quot;layer&amp;quot; in the representation. or in other words, each coming with a particular set of attributes, like Theme and Rheme/Given anti New for the textual mode, Actor, Process, Goal for the experiential mode and Subject, Finite, Object for the interpersonal modefi A function structure like this implies a very fiat constituency, where the constituent boundaries do not necessarily match one-to-one. Thus, both of the interpretations of information distribution of Fred ate the beans are compatible with the rest of the structure.</Paragraph>
      <Paragraph position="1"> The representation of coordinate structure in SFG benefits similarly from the minimal bracketing strategy: The coordinating conjunction itself is an immediate constituent of a coordinate structure, e.g., for red apples and green plums, the function structure implies a bracketing as (red apples) (and)  Ted apples and green plums Here, none of the elements has to be attributed head status and the conjunction itself is a constituent. null In terms of agreement interpreted as prosodic structure, the HPSG representation of agreement in the German nominal group by structure sharing is actually a possible realization of this view. However, with the current representational means employed in computational implementations of SFC, such as the NIGEL grammar in KPML (Bateman, 1997), such a formulation is not readily possible. This has two reasons, one being of a theoretical nature, the other one being a matter of computational representation. The theoretical problem is that features in the grammarital system network have to be unique and feature sharing among constituents is thus not allowed. As a consequence, there is no mechanism in the ~There are no logical attributes here, because logical structuring only pertains to complex syntactic units, such as paratactic and hypotactic structures.</Paragraph>
      <Paragraph position="2">  KPML implementation for feature sharing. In unification and constraint-based feature structure representations of NIGEL (Kasper, 1987; Henschel, 1994: Henschel, 1995) , feature sharing is possible, but still the feature uniqueness postulate would have to be relaxed in order to nmke use of this mechanism.</Paragraph>
  </Section>
</Paper>