Title :
Zipf's law and entropy (Corresp.)
Date :
September 1974
Abstract :
The estimate of the entropy of a language obtained by assuming that the word probabilities follow Zipf's law is discussed briefly. Previous numerical results [3] on the vocabulary size implied by Zipf's law and on the entropy per word are corrected. The vocabulary size should be 12 366 words (not 8727 words) and the entropy per word 9.27 bits (not 11.82 bits).
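A minimal sketch of the calculation the abstract describes, assuming the form of Zipf's law Shannon used, p_n = 0.1/n for the n-th most frequent word (an assumption; the correspondence's exact setup may differ):

    import math

    # Assumed form of Zipf's law (Shannon's): the n-th most frequent
    # word has probability p_n = 0.1 / n.

    # Vocabulary size: the largest N for which the word probabilities
    # still sum to at most 1.
    N = 0
    total = 0.0
    while total + 0.1 / (N + 1) <= 1.0:
        N += 1
        total += 0.1 / N
    print(N)  # about 12 366 under this assumption

    # Entropy per word, in bits, of the resulting finite distribution.
    H = -sum(0.1 / n * math.log2(0.1 / n) for n in range(1, N + 1))
    print(round(H, 2))

Under the p_n = 0.1/n assumption the loop reproduces the corrected vocabulary size quoted in the abstract.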
Keywords :
Entropy functions; Languages; Entropy; Frequency estimation; Humans; Natural languages; Vocabulary
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.1974.1055269