<?xml version="1.0" standalone="yes"?>
<Paper uid="P97-1054">
  <Title>Co-evolution of Language and of the Language Acquisition Device</Title>
  <Section position="6" start_page="423" end_page="425" type="evalu">
    <SectionTitle>
4 Experimental Results
</SectionTitle>
    <Paragraph position="0"/>
    <Section position="1" start_page="423" end_page="423" type="sub_section">
      <SectionTitle>
4.1 Effectiveness of Learning Procedures
</SectionTitle>
      <Paragraph position="0"> Two learning procedures were predefined - a default learner and an unset learner. These LAgts were initialized with p-settings consistent with a minimal inherited CGUG consisting of application with NP and S atomic categories. All the remaining p-settings were genuine parameters for both learners. The unset learner was initialized with all unset, whilst the default learner had default settings for the parameters gendir and subjdir and argorder which specify a minimal SVO right-branching grammar, as well as default (off) settings for comp and perm which determine the availability of Composition and Permutation, respectively. The unset learner represents a 'pure' principles-and-parameters learner. The default learner is modelled on Bickerton's bioprogram learner.</Paragraph>
      <Paragraph position="1"> Each learner was tested against an adult LAgt initialized to generate one of seven full languages in the set which are close to an attested language; namely, &amp;quot;English&amp;quot; (SVO, predominantly right-branching), &amp;quot;Welsh&amp;quot; (SVOvl, mixed order), &amp;quot;Malagasy&amp;quot; (VOS, right-branching), &amp;quot;Tagalog&amp;quot; (VSO, right-branching), &amp;quot;Japanese&amp;quot; (SOV, left-branching), &amp;quot;German&amp;quot; (SOVv2, predominantly right-branching), &amp;quot;Hixkaryana&amp;quot; (OVS, mixed order), and an unattested full OSV language with left-branching syntax. In these tests, a single learner interacted with a single adult. After every ten interactions, in which the adult randomly generated a sentence type and the learner attempted to parse and learn from it, the state of the learner's p-settings was examined to determine whether the learner had converged on the same grammar as the adult. Table 1 shows the number of such interaction cycles (i.e. the number of input sentences to within ten) required by each type of learner to converge on each of the eight languages. These figures are each calculated from 100 trials to a 1% error rate; they suggest that, in general, the default learner is more effective than the unset learner. However, for the OVS language (OVS languages represent 1.24% of the world's languages, Tomlin, 1986), and for the unattested OSV language, the default (SVO) learner is less effective.</Paragraph>
      <Paragraph position="2"> So, there are at least two learning procedures in the space defined by the model which can converge with some presentation orders on some of the grammars in this set. Stronger conclusions require either exhaustive experimentation or theoretical analysis of the model of the type undertaken by Gibson and Wexler (1994) and Niyogi and Berwick (1995).</Paragraph>
    </Section>
    <Section position="2" start_page="423" end_page="424" type="sub_section">
      <SectionTitle>
4.2 Evolution of Learning Procedures
</SectionTitle>
      <Paragraph position="0"> In order to test the preference for default versus unset parameters under different conditions, the five parameters which define the difference between the two learning procedures were tracked through an-other series of 50 cycle runs initialized with either 16 default learning adult speakers and 16 unset learning adult speakers, with or without memory-limitations during learning and parsing, speaking one of the eight languages described above. Each condition was run ten times. In the memory limited runs, default parameters came to dominate some but not all populations. In a few runs all unset parameters disappeared altogether. In all runs with populations initialized to speak &amp;quot;English&amp;quot; (SVO) or &amp;quot;Malagasy&amp;quot; (VOS) the preference for default settings was 100%.</Paragraph>
      <Paragraph position="1"> In 8 runs with &amp;quot;Tagalog&amp;quot; (VSO) the same preference emerged, in one there was a preference for unset parameters and in the other no clear preference. However, for the remaining five languages there was no strong preference.</Paragraph>
      <Paragraph position="2"> The results for the runs without memory limitations are different, with an increased preference for unset parameters across all languages but no clear 100% preference for any individual language. Table 2 shows the pattern of preferences which emerged across 160 runs and how this was affected by the presence or absence of memory limitations.</Paragraph>
      <Paragraph position="3"> To test whether it was memory limitations during learning or during parsing which were affecting the results, another series of runs for &amp;quot;English&amp;quot; was performed with either memory limitations during learning but not parsing enabled, or vice versa. Memory limitations during learning are creating the bulk of the preference for a default learner, though there appears to be an additive effect. In seven of the ten runs with memory limitations only in learning, a clear preference for default learners emerged. In five of the runs with memory limitations only in parsing there appeared to be a slight preference for defaults emerging. Default learners may have a fitness advantage when the number of interactions required to learn successfully is greater because they will tend to converge faster, at least to a subset language. This will tend to increase their fitness over unset learners who do not speak any language until further into the  learning period.</Paragraph>
      <Paragraph position="4"> The precise linguistic environment of adaptation determines the initial values of default parameters which evolve. For example, in the runs initialized with 16 unset learning &amp;quot;Malagasy&amp;quot; VOS adults and 16 default (SVO) learning VOS adults, the learning procedure which dominated the population was a variant VOS default learner in which the value for subjdir was reversed to reflect the position of the subject in this language. In some of these runs, the entire population evolved a default subjdir 'right' setting, though some LAgts always retained unset settings for the other two ordering parameters, gendir and argo, as is illustrated in Figure 11. This suggests that if the human language faculty has evolved to be a right-branching SVO default learner, then the environment of linguistic adaptation must have contained a dominant language fully compatible with this (minimal) grammar.</Paragraph>
    </Section>
    <Section position="3" start_page="424" end_page="425" type="sub_section">
      <SectionTitle>
4.3 Emergence of Language and Learners
</SectionTitle>
      <Paragraph position="0"> To explore the emergence and persistence of structured language, and consequently the emergence of effective learners, (pseudo) random initialization was used. A series of simulation runs of 500 cycles were performed with random initialization of 32 LAgts' p-settings for any combination of p-setting values, with a probability of 0.25 that a setting would be an absolute principle, and 0.75 a parameter with unbiased allocation for default or unset parameters and for values of all settings. All LAgts were initialized to be age 1 with a critical period of 3 interaction cycles of 2000 random interactions for learning, a maximum age of 10, and the ability to reproduce by crossover (0.9 probability) and mutation (0.01 probability) from 4-10. In around 5% of the runs, language(s) emerged and persisted to the end of the run.</Paragraph>
      <Paragraph position="1"> Languages with close to optimal WML scores typically came to dominate the population quite rapidly. However, sometimes sub-optimal languages were initially selected and occasionally these persisted despite the later appearance of a more optimal language, but with few speakers. Typically, a minimal subset language dominated - although full and intermediate languages did appear briefly, they did not survive against less expressive subset languages with a lower mean WML. Figure 12 is a typical plot of the emergence (and extinction) of languages in one of these runs. In this run, around 10 of the initial population converged on a minimal OVS language and 3 others on a VOS language. The latter is more optimal with respect to WML and both are of equal expressivity so, as expected, the VOS language acquired more speakers over the next few cycles. A few speakers also converged on VOS-N, a more expressive but higher WML extension of VSO-N-GWP-COMP. However, neither this nor the OVS language survived beyond cycle 14. Instead a VSO language emerged at cycle 10, which has the same minimal expressivity of the VOS language but a lower WML (by virtue of placing the subject before the object) and this language dominated rapidly and eclipsed all others by cycle 40.</Paragraph>
      <Paragraph position="2"> In all these runs, the population settled on sub-set languages of low expressivity, whilst the percentage of absolute principles and default parameters increased relative to that of unset parameters (mean % change from beginning to end of runs: +4.7, +1.5 and -6.2, respectively). So a second identical set of ten was undertaken, except that the initial population now contained two SOV-V2 &amp;quot;German&amp;quot; speaking unset learner LAgts. In seven of these runs, the population fixed on a full SOV-V2 language, in two on the intermediate subset language SOV-V2-N, and in one on the minimal subset language SOV-V2-N-GWP-COMP. These runs suggest that if a full language defines the environment of adaptation then a population of randomly initialized LAgts is more likely to converge on a (related) full language. Thus, although the simulation does not model the development of expressivity well, it does appear that it can model the emergence of effective learning procedures for (some) full languages. The pattern of language emergence and extinction followed that of the previous series of runs: lower mean WML languages were selected from those that emerged during the run. However, often the initial optimal SVO-V2 itself was lost before enough LAgts evolved capable of learning this language. In these runs, changes in the percentages of absolute, default or unset p-settings in the population show a marked difference:  the mean number of absolute principles declined by 6.1% and unset parameters by 17.8%, so the number of default parameters rose by 23.9% on average between the beginning and end of the 10 runs. This may reflect the more complex linguistic environment in which (incorrect) absolute settings are more likely to handicap, rather than simply be irrelevant to, the performance of the LAgt.</Paragraph>
    </Section>
  </Section>
class="xml-element"></Paper>
Download Original XML