
<?xml version="1.0" standalone="yes"?>
<Paper uid="P02-1026">
  <Title>Entropy Rate Constancy in Text</Title>
  <Section position="3" start_page="2" end_page="2" type="relat">
    <SectionTitle>
2 Related Work
</SectionTitle>
    <Paragraph position="0"> There has been work in the speech community inspired by this constancy rate principle. In speech, distortion of the audio signal is an extra source of uncertainty, and the principle can be applied in the following way: a given word may be common in one speech context but rare in another. To keep the entropy rate constant over time, a speaker must take more time (i.e., pronounce more carefully) in the less common situations. Aylett (1999) shows that this is indeed the case.</Paragraph>
    <Paragraph position="1"> It has also been suggested that the principle of constant entropy rate agrees with biological evidence of how human language processing has evolved (Plotkin and Nowak, 2000).</Paragraph>
    <Paragraph position="2"> Kontoyiannis (1996) also reports results on 5 consecutive blocks of characters from the works of Jane Austen which are in agreement with our principle and, in particular, with its corollary as derived in the following section.</Paragraph>
    <Paragraph position="3"> It may seem like an arbitrary choice, but a word is a natural unit of length; after all, when one is asked to give the length of an essay, one typically gives the number of words as a measure.</Paragraph>
    <Paragraph position="4"> Strictly speaking, we want the cross-entropy between all words in sentence number n and the true model of English to be the same for all n.</Paragraph>
    <Paragraph position="5"> Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics (ACL), Philadelphia, July 2002, pp. 199-206.</Paragraph>
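The cross-entropy constancy claim above can be sketched more formally. As a hedged illustration (the notation $s_n$, $p$, and $c$ is assumed here, not taken from the paper): writing $s_n$ for the $n$-th sentence and $p$ for the true model of English,

```latex
% Illustrative sketch; symbols s_n, p, c are assumed, not from the original.
% Per-word cross-entropy of sentence s_n under the true model p:
\hat{H}(s_n) = -\frac{1}{|s_n|} \sum_{w \in s_n} \log_2 p\bigl(w \mid \text{context}\bigr)
% The constancy claim: this quantity is (approximately) one constant for all n:
\hat{H}(s_n) \approx c \quad \text{for all } n
```

That is, later sentences are neither systematically more nor less predictable per word than earlier ones.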
  </Section>
</Paper>