<?xml version="1.0" standalone="yes"?> <Paper uid="E95-1006"> <Title>A Specification Language for Lexical Functional Grammars</Title> <Section position="7" start_page="42" end_page="43" type="concl"> <SectionTitle> 6 Conclusion </SectionTitle> <Paragraph position="0"> The discussion so far should have given the reader some idea of how to specify LFG grammars using L. To conclude, we would like to discuss =c definitions. This topic bears on an important general issue: how are the 'dynamic' (or 'generative', or 'procedural') aspects of grammar to be reconciled with the 'static' (or 'declarative') model-theoretic world view?</Paragraph> <Paragraph position="1"> The point is this. Although the LFG equations discussed so far were defining equations, LFG also allows so-called constraining equations (written =c). Kaplan and Bresnan explain the difference as follows. Defining equations allow a feature-value pair to be inserted into an f-structure provided no conflicting information is present. That is, they add a feature-value pair to any consistent f-structure. In contrast, constraining equations are intended to constrain the value of an already existing feature-value pair. The essential difference is that constraining equations require that the feature under consideration already has a value, whereas defining equations apply independently of the feature value instantiation level.</Paragraph> <Paragraph position="2"> In short, constraining equations are essentially a global check on completed structures which requires the presence of certain feature values. They have an eminently procedural character, and there is no obvious way to handle this idea in the present setup. The bulk of LFG involves stating constraints about a single model, and L is well equipped for this task, but constraining equations involve looking at the structure of other possible parse trees. (In this respect they are reminiscent of the feature specification defaults of GPSG.)
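To make the contrast concrete, here is a minimal sketch (ours, not the paper's; the representation and all names are illustrative) treating flat f-structures as Python dicts: a defining equation adds a feature-value pair when no conflicting value is present, while a constraining equation merely tests a completed structure and never adds anything.

```python
# Illustrative sketch: defining vs constraining equations over flat
# f-structures, modelled here simply as Python dicts.

def apply_defining(fstruct, feature, value):
    """Defining equation: insert feature=value unless it conflicts
    with information already in the f-structure."""
    existing = fstruct.get(feature)
    if existing is None:
        fstruct[feature] = value       # add the feature-value pair
        return True
    return existing == value           # consistent only if values agree

def check_constraining(fstruct, feature, value):
    """Constraining equation (=c): the feature must ALREADY carry
    this value in the completed structure; nothing is added."""
    return fstruct.get(feature) == value

f = {}
apply_defining(f, "NUM", "sg")               # defines NUM
print(check_constraining(f, "NUM", "sg"))    # True: value is present
print(check_constraining(f, "CASE", "acc"))  # False: CASE was never defined
```

The asymmetry in the last two lines is the point made above: on an f-structure lacking CASE, a defining equation for CASE would succeed by adding the pair, whereas the constraining equation fails because the value it checks for does not already exist.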
The approach of the present paper has been driven by the view that (a) models capture the essence of LFG ontology, and (b) the task of the linguist is to explain, in terms of the relations that exist within a single model, what grammatical structure is. Most of the discussion in Kaplan and Bresnan (1982) is conducted in such terms. However, constraining equations broaden the scope of the permitted discourse: basically, they allow implicit appeal to possible derivational structure. In short, in common with most of the grammatical formalisms with which we are familiar, LFG seems to have a dynamic residue that resists a purely declarative analysis. What should be done? We see three possible responses. Firstly, we note that the model-theoretic approach can almost certainly be extended to cover constraining equations. The move involved is analogous to the way first-order logic (a so-called 'extensional' logic) can be extended to cope with intensional notions such as belief and necessity. The basic idea -- it's the key idea underlying first-order Kripke semantics -- is to move from dealing with a single model to dealing with a collection of models linked by an accessibility relation. Just as quantification over possible states of affairs yields analyses of intensional phenomena, so quantification over related models could provide a 'denotational semantics' for =c. Preliminary work suggests that the required structures have formal similarities to the structures used in preferential semantics for default and non-monotonic reasoning. This first response seems to be a very promising line of work: the requisite tools are there, and the approach would tackle a full-blooded version of LFG head on. The drawback is the complexity it introduces into an (up till now) quite simple story. Is such additional complexity really needed? A second response is to admit that there is a dynamic residue, but to deal with it in overtly computational terms. 
In particular, it may be possible to augment our approach with an explicit operational semantics, perhaps the evolving algebra approach adopted by Moss and Johnson (1994). Their approach is attractive because it permits a computational treatment of dynamism that abstracts from low-level algorithmic details.</Paragraph> <Paragraph position="3"> In short, the second strategy is a 'divide and conquer' strategy: treat structural issues using model-theoretic tools, and procedural issues with (revealing) computational tools. It's worth remarking that this second response is not incompatible with the first; it is common to provide programming languages with both a denotational and an operational semantics.</Paragraph> <Paragraph position="4"> The third strategy is both simpler and more speculative. While it certainly seems to be the case that LFG (and other 'declarative' formalisms) have procedural residues, it is far from clear that these residues are necessary. One of the most striking features of LFG (and indeed, GPSG) is the way that purely structural (that is, model-theoretic) argumentation dominates. Perhaps the procedural aspects are there more or less by accident? After all, both LFG and GPSG drew on (and developed) a heterogeneous collection of traditional grammar specification tools, such as context-free rules, equations, and features. It could be the case that such procedural residues as =c are simply an artifact of using the wrong tools for talking about models. If this is the case, it might be highly misguided to attempt to capture =c using a logical specification language. Better, perhaps, would be to draw on what is good in LFG and to explore the logical options that arise naturally when the model-theoretic view is taken as primary. 
Needless to say, the most important task that faces this third response is to get on with the business of writing grammars; that, and nothing else, is the acid test.</Paragraph> <Paragraph position="5"> It is perhaps worth adding that at present the authors simply do not know what the best response is. If nothing else, the present work has made very clear to us that the interplay of static and dynamic ideas in generative grammar is a delicate and complex matter which only further work can resolve.</Paragraph> </Section> </Paper>