Title :
Asymptotic convergence of dual-tree entropy codes
Author :
Freeman, George H.
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of Waterloo, Waterloo, Ont., Canada
Abstract :
Entropy coding introduces a probabilistic matching at the interface between the discrete-time, finite-alphabet source model sequence and the discrete-time, finite-alphabet channel information sequence. Source entropy codes, such as common-string parsers, alter the source statistics to match the given channel. Channel entropy codes, such as Huffman codes, alter the channel statistics to match the given source. The two are given a joint, dual description using parse and code trees that describe the source and channel word sets. Divergence and variational distance are defined between the source word distribution and the desired channel word distribution. Any scheme for extending the size of the source word set is shown to converge if the divergence divided by the expected source-word length goes to zero. Huffman channel coding implies an upper bound on the divergence in terms of the variational distance, which leads to a heuristic algorithm for extending the source word set while keeping the divergence low. Examples show a significant complexity reduction at performance equivalent to Tunstall source coding or Huffman channel coding alone.
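As a hedged illustration of the convergence criterion in the abstract (a minimal sketch under assumed inputs, not the paper's dual-tree algorithm): when a parse word set is Huffman coded, the code tree implies a dyadic word distribution q with q_w = 2^(-len_w), and the per-source-symbol redundancy is the divergence D(p||q) divided by the expected source-word length. The function names, the example source, and the parse set below are assumptions for illustration.

```python
import heapq
import math
from collections import Counter

def word_probs(words, p_sym):
    """Probability of each parse word under a memoryless source model."""
    return {w: math.prod(p_sym[c] for c in w) for w in words}

def huffman_lengths(probs):
    """Binary Huffman codeword length for each word in `probs`."""
    heap = [(q, [w]) for w, q in probs.items()]
    heapq.heapify(heap)
    lengths = Counter()
    while len(heap) > 1:
        q1, ws1 = heapq.heappop(heap)
        q2, ws2 = heapq.heappop(heap)
        for w in ws1 + ws2:
            lengths[w] += 1  # each merge adds one bit to every word in the merged subtree
        heapq.heappush(heap, (q1 + q2, ws1 + ws2))
    return lengths

def per_symbol_redundancy(words, p_sym):
    """Divergence D(p||q) per expected source-word length, in bits/symbol.

    q is the dyadic distribution implied by the Huffman code tree,
    q_w = 2**(-len_w), so D(p||q) = sum_w p_w * (len_w + log2 p_w).
    """
    p = word_probs(words, p_sym)
    ell = huffman_lengths(p)
    div = sum(pw * (ell[w] + math.log2(pw)) for w, pw in p.items())
    mean_len = sum(pw * len(w) for w, pw in p.items())
    return div / mean_len

# Hypothetical example: binary memoryless source with a complete
# Tunstall-style parse set over {a, b}.
p_sym = {'a': 0.8, 'b': 0.2}
words = ['aaa', 'aab', 'ab', 'b']
print(per_symbol_redundancy(words, p_sym))  # ~0.006 bits/symbol
```

Extending the parse set (growing the parse tree at high-probability leaves) should drive this ratio toward zero, which is the convergence condition the abstract describes.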
Keywords :
codes; convergence; entropy; trees (mathematics); Huffman channel coding; Tunstall source coding; asymptotic convergence; common-string parsers; complexity reduction; divergence; dual-tree entropy codes; heuristic algorithm; parse and code trees; performance; variational distance; Channel capacity; Channel coding; Convergence; Decoding; Entropy coding; Heuristic algorithms; Source coding; Statistical distributions; Statistics; Upper bound
Conference_Title :
Data Compression Conference, 1991. DCC '91.
Conference_Location :
Snowbird, UT
Print_ISBN :
0-8186-9202-2
DOI :
10.1109/DCC.1991.213360