Title :
Improved minimax bounds on the test and training distortion of empirically designed vector quantizers
Author_Institution :
Computer and Automation Research Institute, Hungarian Academy of Sciences, Budapest
Abstract :
Earlier results have shown that the minimax expected (test) distortion redundancy of empirical vector quantizers with three or more levels, designed from n independent and identically distributed (i.i.d.) data points, is at least Omega(1/√n) for the class of distributions on a bounded set. In this correspondence, a much simpler construction and proof for this result are given, with much better constants. Similar bounds hold for the training distortion of the empirically optimal vector quantizer with three or more levels. These rates, however, do not hold for a one-level quantizer. Here the case of two-level quantizers is clarified, showing that it already shares the behavior of the general case. Since the minimax bounds are proved using a construction involving discrete distributions, one might suspect that for the class of distributions with uniformly bounded continuous densities the expected distortion redundancy decreases as o(1/√n) uniformly. It is shown that this is not so: the lower bound on the expected test distortion remains true for these subclasses.
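The following minimal sketch (not from the paper) illustrates the quantities the abstract discusses: a k-level quantizer is designed empirically from n i.i.d. training points via Lloyd's algorithm, and its training and test distortions are then estimated. The source (uniform on the unit square), the sample sizes, and the function names are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def lloyd(train, k, iters=100, seed=0):
    """Empirically design a k-level quantizer (codebook) with Lloyd's algorithm."""
    rng = np.random.default_rng(seed)
    codebook = train[rng.choice(len(train), size=k, replace=False)]
    for _ in range(iters):
        # Nearest-codeword assignment under squared-error distortion.
        d = ((train[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        # Centroid update; empty cells keep their previous codeword.
        for j in range(k):
            cell = train[labels == j]
            if len(cell):
                codebook[j] = cell.mean(0)
    return codebook

def distortion(points, codebook):
    """Average squared-error distortion of the quantizer on the given points."""
    d = ((points[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    return d.min(1).mean()

rng = np.random.default_rng(1)
n, k = 1000, 3                       # three or more levels, as in the paper
train = rng.uniform(size=(n, 2))     # i.i.d. source on a bounded set (assumed uniform)
test = rng.uniform(size=(100000, 2)) # fresh sample approximating the test distortion

cb = lloyd(train, k)
print("training distortion:", distortion(train, cb))
print("test distortion:    ", distortion(test, cb))
# The paper's lower bound says: no design method can make the gap between the
# test distortion and the optimal distortion shrink faster than order 1/sqrt(n),
# uniformly over all sources on a bounded set.
```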
Keywords :
minimax techniques; statistical analysis; vector quantisation; clustering method; discrete distribution; empirical vector quantization; independent and identically distributed data; minimax expected distortion; training distortion; uniformly bounded continuous density; Algorithm design and analysis; Artificial neural networks; Convergence; Information theory; Minimax techniques; Neural networks; Pattern recognition; Testing; Vector quantization; Clustering methods; distortion; empirical design; lower bounds; minimax control; redundancy; training; vector quantization;
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2005.856980