DocumentCode :
922173
Title :
Zipf's law and entropy (Corresp.)
Author :
Yavuz, D.
Volume :
20
Issue :
5
fYear :
1974
fDate :
9/1/1974
Firstpage :
650
Lastpage :
650
Abstract :
The estimate of the entropy of a language by assuming that the word probabilities follow Zipf's law is discussed briefly. Previous numerical results [3] on the vocabulary size implied by Zipf's law and the entropy per word are corrected. The vocabulary size should be 12 366 words (not 8727 words) and the entropy per word 9.27 bits (not 11.82 bits).
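The calculation the abstract refers to can be sketched numerically. This is a minimal illustration, assuming Shannon's usual form of Zipf's law for English, p_i = 0.1/i with i the word's frequency rank; the constant 0.1 and the normalization criterion are assumptions for illustration, not details taken from the paper itself:

```python
import math

# Assumed model: Zipf's law with Shannon's constant, p_i = A / i.
A = 0.1
V = 12366  # corrected vocabulary size reported in the abstract

# The vocabulary size is fixed by requiring the rank probabilities to sum to 1;
# at V = 12366 the cumulative probability is very close to 1.
total = sum(A / i for i in range(1, V + 1))
print(f"cumulative probability at rank {V}: {total:.6f}")

# Entropy per word under the same model, in bits.
H = -sum((A / i) * math.log2(A / i) for i in range(1, V + 1))
print(f"entropy per word: {H:.2f} bits")
```

Truncating the distribution at the rank where the probabilities sum to unity is what ties the vocabulary size and the per-word entropy together in this kind of estimate.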
Keywords :
Entropy functions; Languages; Entropy; Frequency estimation; Humans; Natural languages; Vocabulary;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.1974.1055269
Filename :
1055269