• DocumentCode
    929274
  • Title
    Variations on a Theme by Huffman
  • Author
    Gallager, Robert G.
  • Volume
    24
  • Issue
    6
  • fYear
    1978
  • fDate
    11/1/1978
  • Firstpage
    668
  • Lastpage
    674
  • Abstract
    In honor of the twenty-fifth anniversary of Huffman coding, four new results about Huffman codes are presented. The first result shows that a binary prefix condition code is a Huffman code iff the intermediate and terminal nodes in the code tree can be listed by nonincreasing probability so that each node in the list is adjacent to its sibling. The second result upper bounds the redundancy (expected length minus entropy) of a binary Huffman code by $P_1 + \log_2[2(\log_2 e)/e] = P_1 + 0.086$, where $P_1$ is the probability of the most likely source letter. The third result shows that one can always leave a codeword of length two unused and still have a redundancy of at most one. The fourth result is a simple algorithm for adapting a Huffman code to slowly varying estimates of the source probabilities. In essence, one maintains a running count of uses of each node in the code tree and lists the nodes in order of these counts. Whenever the occurrence of a message increases a node count above the count of the next node in the list, the nodes, with their attached subtrees, are interchanged. (An illustrative sketch of the redundancy bound follows this record.)
  • Keywords
    Adaptive coding; Huffman codes; Communication system control; Control systems; Costs; Entropy; Frequency; Huffman coding; Information theory; Source coding; Statistics; Upper bound;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.1978.1055959
  • Filename
    1055959
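
The following is a minimal sketch, not taken from the paper, illustrating the second result quoted in the abstract: it builds a binary Huffman code for an assumed example distribution and checks that the redundancy (expected length minus entropy) stays below P_1 + 0.086, where P_1 is the largest source probability. The distribution and the helper function huffman_lengths are assumptions for illustration only.

    import heapq
    import math

    def huffman_lengths(probs):
        """Return binary Huffman codeword lengths for a list of probabilities."""
        # Heap entries: (subtree probability, tie-breaker, leaf indices in that subtree).
        heap = [(p, i, [i]) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        lengths = [0] * len(probs)
        tie = len(probs)
        while len(heap) > 1:
            p1, _, leaves1 = heapq.heappop(heap)
            p2, _, leaves2 = heapq.heappop(heap)
            for leaf in leaves1 + leaves2:
                lengths[leaf] += 1  # each merge adds one bit to every leaf below it
            heapq.heappush(heap, (p1 + p2, tie, leaves1 + leaves2))
            tie += 1
        return lengths

    probs = [0.4, 0.2, 0.2, 0.1, 0.1]  # assumed example distribution
    lengths = huffman_lengths(probs)
    expected_length = sum(p * n for p, n in zip(probs, lengths))
    entropy = -sum(p * math.log2(p) for p in probs)
    redundancy = expected_length - entropy
    bound = max(probs) + math.log2(2 * math.log2(math.e) / math.e)  # P_1 + 0.086...
    print(f"redundancy {redundancy:.4f} <= bound {bound:.4f}")

On this example the redundancy comes out to about 0.078 bits against a bound of about 0.486 bits; the constant log2[2(log2 e)/e] evaluates to roughly 0.086, matching the figure quoted in the abstract.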