• DocumentCode
    38210
  • Title
    How Suboptimal Is the Shannon Code?
  • Author
    Narimani, Hamed; Khosravifard, Mohammadali; Gulliver, T. Aaron
  • Author_Institution
    Dept. of Electr. & Comput. Eng., Isfahan Univ. of Technol., Isfahan, Iran
  • Volume
    59
  • Issue
    1
  • fYear
    2013
  • fDate
    Jan. 2013
  • Firstpage
    458
  • Lastpage
    471
  • Abstract
    In order to determine how suboptimal the Shannon code is, its performance should be compared with that of the optimal code, i.e., the corresponding Huffman code. It is well known that in the worst case the redundancy of both the Shannon and Huffman codes can be arbitrarily close to 1. Beyond this worst-case viewpoint, very little is known. In this paper, we compare the performance of these codes from an average point of view. The redundancy is treated as a random variable on the set of all sources with n symbols, and its average is evaluated. It is shown that the average redundancy of the Shannon code is very close to 0.5 bits, whereas the average redundancy of the Huffman code is less than (1 + ln n)/n + 0.086 bits. It is also proven that the variance of the redundancy of the Shannon code tends to zero as n increases. Therefore, for sources with alphabet size n, the redundancy of the Shannon code is approximately 0.5 bits with probability approaching 1 as n → ∞.
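    The two quantities compared in the abstract can be checked empirically. The sketch below is an illustration only, not the paper's method: all function names are mine, and sampling sources uniformly from the probability simplex (via normalized exponentials) is an assumption about what "random source" means. It assigns each symbol the Shannon code length ceil(log2(1/p_i)), builds Huffman code lengths by merging the two smallest weights, and compares the redundancies (expected codeword length minus entropy).

    ```python
    import heapq
    import math
    import random

    def random_source(n):
        # Assumption: draw a source uniformly from the n-symbol probability
        # simplex by normalizing i.i.d. exponential weights.
        w = [random.expovariate(1.0) for _ in range(n)]
        s = sum(w)
        return [x / s for x in w]

    def entropy(p):
        return -sum(pi * math.log2(pi) for pi in p)

    def shannon_redundancy(p):
        # Shannon code length for symbol i: ceil(log2(1 / p_i)).
        lengths = [math.ceil(-math.log2(pi)) for pi in p]
        return sum(pi * li for pi, li in zip(p, lengths)) - entropy(p)

    def huffman_redundancy(p):
        # Huffman code lengths: repeatedly merge the two lightest subtrees,
        # adding one bit of depth to every symbol inside them.
        lengths = [0] * len(p)
        heap = [(pi, [i]) for i, pi in enumerate(p)]
        heapq.heapify(heap)
        while len(heap) > 1:
            w1, s1 = heapq.heappop(heap)
            w2, s2 = heapq.heappop(heap)
            for i in s1 + s2:
                lengths[i] += 1
            heapq.heappush(heap, (w1 + w2, s1 + s2))
        return sum(pi * li for pi, li in zip(p, lengths)) - entropy(p)

    if __name__ == "__main__":
        random.seed(1)
        n, trials = 64, 200
        sources = [random_source(n) for _ in range(trials)]
        avg_s = sum(shannon_redundancy(p) for p in sources) / trials
        avg_h = sum(huffman_redundancy(p) for p in sources) / trials
        # Per the abstract, avg_s should be near 0.5 bits for large n,
        # while avg_h should be far smaller.
        print(f"avg Shannon redundancy: {avg_s:.3f} bits")
        print(f"avg Huffman redundancy: {avg_h:.3f} bits")
    ```

    For any fixed source, the Shannon redundancy lies in [0, 1) since each length exceeds -log2(p_i) by less than one bit, and the Huffman redundancy can never exceed it, since the Huffman code is optimal.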
  • Keywords
    Huffman codes; information theory; Huffman code; Shannon code; optimal code; Channel coding; Entropy; Probability density function; Random variables; Redundancy; Upper bound; Average redundancy
  • fLanguage
    English
  • Journal_Title
    Information Theory, IEEE Transactions on
  • Publisher
    ieee
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2012.2216975
  • Filename
    6293889