Title :
A note on the competitive optimality of the Huffman code
Author_Institution :
Dept. of Electr. Eng.-Syst., Tel-Aviv Univ., Israel
Date :
3/1/1992
Abstract :
A bound on the probability that the length of any source code falls below the self-information by gamma bits is easily obtained using a Chebyshev-type argument. From this bound, one can establish the competitive optimality of the self-information and of the Shannon-Fano code (up to one bit). In general, however, the Huffman code cannot be analyzed with this technique. Nevertheless, in the present work the competitive optimality (up to one bit) of the Huffman code for general sources is established using a different technique.
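For reference, a minimal sketch of the kind of Chebyshev/Markov-type argument the abstract alludes to, stated for any uniquely decodable code with lengths l(x) satisfying the Kraft inequality; the notation and the exact constant are ours, not necessarily the paper's:

% Sketch (our notation): for a uniquely decodable code with lengths l(x),
% the probability of beating the self-information -log_2 p(X) by gamma bits
% decays exponentially in gamma.
\begin{align*}
\Pr\bigl[\, l(X) \le -\log_2 p(X) - \gamma \,\bigr]
  &= \sum_{x \,:\, p(x) \le 2^{-l(x)-\gamma}} p(x) \\
  &\le \sum_{x} 2^{-l(x)-\gamma}
   = 2^{-\gamma} \sum_{x} 2^{-l(x)}
   \le 2^{-\gamma},
\end{align*}
% where the last step uses the Kraft inequality \sum_x 2^{-l(x)} \le 1.

Because Huffman codeword lengths need not track the self-information symbol by symbol, this bound alone does not settle the Huffman case, which is why the paper resorts to a different technique.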
Keywords :
codes; Chebyshev-type argument; Huffman code; Shannon-Fano code; competitive optimality; probability; self information; source code; Channel capacity; Decoding; Entropy; Lattices; Quantization; Random variables; Signal processing; Speech processing; Upper bound;
Journal_Title :
IEEE Transactions on Information Theory