Title :
Rates of convergence in the source coding theorem, in empirical quantizer design, and in universal lossy source coding
Author :
Linder, Tamas ; Lugosi, Gabor ; Zeger, Kenneth
Author_Institution :
Dept. of Telecommun., Tech. Univ. Budapest, Hungary
Date :
11/1/1994 12:00:00 AM
Abstract :
Rate of convergence results are established for vector quantization. Convergence rates are given for an increasing vector dimension and/or an increasing training set size. In particular, the following results are shown for memoryless real-valued sources with bounded support at transmission rate R. (1) If a vector quantizer of fixed dimension k is designed to minimize the empirical mean-square error (MSE) with respect to m training vectors, then its MSE for the true source converges in expectation and almost surely to the minimum possible MSE as O(√(log m/m)). (2) The MSE of an optimal k-dimensional vector quantizer for the true source converges, as the dimension grows, to the distortion-rate function D(R) as O(√(log k/k)). (3) There exists a fixed-rate universal lossy source coding scheme whose per-letter MSE on n real-valued source samples converges in expectation and almost surely to the distortion-rate function D(R) as O(√(log log n/log n)). (4) Consider a training set of n real-valued source samples blocked into vectors of dimension k, and a k-dimensional vector quantizer designed to minimize the empirical MSE with respect to the m=[n/k] training vectors. Then the per-letter MSE of this quantizer for the true source converges in expectation and almost surely to the distortion-rate function D(R) as O(√(log log n/log n)), if one chooses k=[(1/R)(1-ε)log n] for any ε∈(0,1).
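The empirical design procedure behind result (1) can be sketched in code. This is an illustrative example, not from the paper: it uses Lloyd's algorithm (a standard local minimizer of the empirical MSE) on a Uniform[0,1] source, with the dimension k, rate R, and training-set size m chosen arbitrarily for the demonstration. A rate-R quantizer of dimension k has 2^(kR) codevectors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative settings (assumed, not from the paper): a bounded memoryless
# source Uniform[0,1], dimension k, rate R bits per sample, m training vectors.
k, R, m = 1, 2, 2000
N = 2 ** (k * R)                      # codebook size at transmission rate R
train = rng.uniform(0.0, 1.0, (m, k))

# Initialize the codebook with random training vectors, then iterate Lloyd
# steps, each of which cannot increase the empirical MSE.
codebook = train[rng.choice(m, N, replace=False)]
for _ in range(50):
    # Nearest-neighbor partition of the training set.
    d = ((train[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    idx = d.argmin(axis=1)
    # Centroid update; keep the old codevector if a cell is empty.
    for j in range(N):
        if (idx == j).any():
            codebook[j] = train[idx == j].mean(axis=0)

# The per-letter MSE on fresh source data estimates the true-source MSE,
# the quantity that result (1) shows converges as O(sqrt(log m / m)).
test = rng.uniform(0.0, 1.0, (10000, k))
d = ((test[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
mse = d.min(axis=1).mean() / k
print(mse)
```

With these settings the held-out MSE lands close to the optimal 4-level quantizer's distortion for this source, far below the source variance of 1/12; the gap to the optimum is the quantity bounded in result (1).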
Keywords :
convergence of numerical methods; source coding; vector quantisation; MSE; convergence rates; distortion-rate function; empirical quantizer design; mean-square error; memoryless real-valued sources; source coding theorem; training set size; training vectors; transmission rate; universal lossy source coding; vector dimension; vector quantization; Algorithm design and analysis; Convergence; Distortion measurement; Information theory; Loss measurement; Mathematics; Propagation losses; Quantization; Source coding; Training data;
Journal_Title :
Information Theory, IEEE Transactions on