<?xml version="1.0" standalone="yes"?> <Paper uid="A97-1013"> <Title>Developing a hybrid NP parser</Title> <Section position="3" start_page="81" end_page="81" type="intro"> <SectionTitle> 2 The Relaxation Labelling Algorithm </SectionTitle> <Paragraph position="0"> Since we are dealing with a set of constraints and want to find a solution which optimally satisfies them, we can use a standard Constraint Satisfaction algorithm to solve the problem.</Paragraph> <Paragraph position="1"> Constraint Satisfaction Problems are naturally modelled as Consistent Labelling Problems (Larrosa and Meseguer, 1995). An algorithm that solves CLPs is Relaxation Labelling.</Paragraph> <Paragraph position="2"> It has been applied to part-of-speech tagging (Padró, 1996), showing that it can yield results as good as those of an HMM tagger when using the same information. In addition, it can deal with any kind of constraints, so the model can be improved by adding any other constraints available, whether statistical, hand-written or automatically extracted (Màrquez and Rodríguez, 1995; Samuelsson et al., 1996).</Paragraph> <Paragraph position="3"> Relaxation labelling is a generic name for a family of iterative algorithms which perform function optimisation based on local information. See (Torras, 1989) for a summary.</Paragraph> <Paragraph position="4"> Given a set of variables, a set of possible labels for each variable, and a set of compatibility constraints between those labels, the algorithm finds a combination of weights for the labels that maximises &quot;global consistency&quot; (see below).</Paragraph> <Paragraph position="5"> Let V = {v_1, v_2, ..., v_n} be a set of variables.</Paragraph> <Paragraph position="6"> Let t_i = {t^i_1, t^i_2, ..., t^i_{m_i}} be the set of possible labels for variable v_i.</Paragraph> <Paragraph position="7"> Let CS be a set of constraints between the labels of the variables. Each constraint C \in CS states a &quot;compatibility value&quot; C_r for a combination of variable-label pairs. Any number of variables may be involved in a constraint.</Paragraph> <Paragraph position="8"> The aim of the algorithm is to find a weighted labelling(1) such that &quot;global consistency&quot; is maximised. Maximising &quot;global consistency&quot; is defined as maximising \sum_j p^i_j \times S_{ij}, \forall v_i, where p^i_j is the weight for label j in variable v_i and S_{ij} is the support received by that variable-label pair. The support for a variable-label pair expresses how compatible that pair is with the labels of neighbouring variables, according to the constraint set.</Paragraph> <Paragraph position="9"> (1) A weighted labelling is a weight assignment for each label of each variable such that the weights for the labels of the same variable add up to one.</Paragraph> <Paragraph position="10"> The support is defined as the sum of the influence of every constraint on a label:</Paragraph> <Paragraph position="11"> S_{ij} = \sum_{r \in R_{ij}} Inf(r) </Paragraph> <Paragraph position="12"> where: R_{ij} is the set of constraints on label j for variable i, i.e. the constraints formed by any combination of variable-label pairs that includes the pair (v_i, t_j), and Inf(r) = C_r \times p^{r_1}_{k_1}(m) \times ... \times p^{r_d}_{k_d}(m) is the product of the current weights(2) of the labels appearing in the constraint except (v_i, t_j) (representing how applicable the constraint is in the current context), multiplied by the constraint compatibility value C_r (stating how compatible the pair is with the context).</Paragraph>
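The support computation described above can be made concrete with a short sketch. This is an illustrative Python rendering, not the authors' implementation: the constraint representation (a pair of a compatibility value C_r and a list of variable-label tuples) and the function names influence and support are assumptions introduced for the example.

    # Assumed representation (not from the paper): each constraint is a pair
    # (compatibility, pairs), where `pairs` lists (variable, label) tuples,
    # and weights[v][t] holds the current weight of label t for variable v.

    def influence(constraint, weights, v, t):
        """Inf(r): the compatibility value C_r multiplied by the current
        weights of the other variable-label pairs in the constraint."""
        compatibility, pairs = constraint
        value = compatibility
        for (vk, tk) in pairs:
            if (vk, tk) != (v, t):      # skip the target pair (v_i, t_j)
                value *= weights[vk][tk]
        return value

    def support(constraints, weights, v, t):
        """S_ij: sum of the influences of all constraints mentioning (v, t)."""
        return sum(influence(c, weights, v, t)
                   for c in constraints if (v, t) in c[1])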
<Paragraph position="13"> Briefly, what the algorithm does is: 1. Start with a random weight assignment.</Paragraph> <Paragraph position="14"> 2. Compute the support value for each label of each variable (how compatible it is with the current weights for the labels of the other variables). 3. Increase the weights of the labels that are more compatible with the context (support greater than 0) and decrease those of the less compatible labels (support less than 0)(3), using the updating function:</Paragraph> <Paragraph position="15"> p^i_j(m+1) = \frac{p^i_j(m) \times (1 + S_{ij})}{\sum_{k=1}^{m_i} p^i_k(m) \times (1 + S_{ik})} </Paragraph> <Paragraph position="16"> 4. If a stopping/convergence criterion(4) is satisfied, stop; otherwise go to step 2.</Paragraph> </Section> </Paper>
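Putting the four steps together, the sketch below is one plausible rendering of the whole iteration, reusing the support function from the earlier sketch. The random initialisation, the iteration cap max_iters, and the convergence threshold eps are illustrative choices rather than details from the paper; the update rule is the standard relaxation formula given above, which assumes supports are scaled into [-1, 1] so that 1 + S stays non-negative.

    import random

    def relaxation_labelling(variables, labels, constraints,
                             max_iters=100, eps=1e-4):
        """Sketch of the iterative algorithm: random start, then repeated
        support computation and weight updates until the weights settle."""
        # 1. Start with a random (normalised) weight assignment.
        weights = {}
        for v in variables:
            raw = [random.random() for _ in labels[v]]
            total = sum(raw)
            weights[v] = {t: r / total for t, r in zip(labels[v], raw)}

        for _ in range(max_iters):
            new_weights = {}
            change = 0.0
            for v in variables:
                # 2. Compute the support of every label of the variable.
                s = {t: support(constraints, weights, v, t) for t in labels[v]}
                # 3. Increase the weights of well-supported labels and
                #    decrease the rest (assumes supports lie in [-1, 1]).
                norm = sum(weights[v][t] * (1.0 + s[t]) for t in labels[v])
                new_weights[v] = {t: weights[v][t] * (1.0 + s[t]) / norm
                                  for t in labels[v]}
                change += sum(abs(new_weights[v][t] - weights[v][t])
                              for t in labels[v])
            weights = new_weights
            # 4. Stop when a convergence criterion is met, else iterate again.
            if change < eps:
                break
        return weights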