Title :
The distortion of vector quantizers trained on n vectors decreases to the optimum as Op(1/n)
Author_Institution :
Xerox Palo Alto Res. Center, CA, USA
Date :
27 Jun-1 Jul 1994
Abstract :
Recently it has been experimentally observed that the expected squared error of a fixed-rate vector quantizer trained on n vectors decreases to the expected squared error of the optimal fixed-rate vector quantizer as roughly O(1/n). We confirm this observation theoretically, using Pollard's (1982) result that the codewords of a fixed-rate vector quantizer trained on n independent vectors converge to the codewords of the unique optimal vector quantizer according to a central limit theorem.
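The O(1/n) behavior described in the abstract can be checked numerically. Below is a minimal sketch (not from the paper): a scalar Gaussian source, a fixed-rate quantizer trained with Lloyd's algorithm on n samples, and distortion measured on a large held-out set. All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def lloyd(train, k, iters=50):
    """Train a k-codeword fixed-rate scalar quantizer with Lloyd's algorithm."""
    # initialize codewords at quantiles of the training data
    cw = np.quantile(train, (np.arange(k) + 0.5) / k)
    for _ in range(iters):
        # nearest-codeword assignment, then centroid update
        idx = np.argmin(np.abs(train[:, None] - cw[None, :]), axis=1)
        for j in range(k):
            pts = train[idx == j]
            if pts.size:
                cw[j] = pts.mean()
    return np.sort(cw)

def distortion(x, cw):
    """Empirical expected squared error of quantizing x with codewords cw."""
    return np.min((x[:, None] - cw[None, :]) ** 2, axis=1).mean()

rng = np.random.default_rng(0)
test = rng.standard_normal(200_000)  # large held-out evaluation set
k = 4
# distortion of quantizers trained on increasing numbers of vectors
d = {n: distortion(test, lloyd(rng.standard_normal(n), k))
     for n in (100, 10_000)}
```

Under the paper's result, the excess of the trained quantizer's distortion over the optimum should shrink roughly in proportion to 1/n, so the quantizer trained on 10,000 samples should measurably outperform the one trained on 100.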
Keywords :
coding errors; convergence of numerical methods; error statistics; rate distortion theory; vector quantisation; VQ; central limit theorem; codewords; convergence; distortion; optimal fixed-rate vector quantizer; squared error; trained vector quantizers; covariance matrix; machine intelligence; nearest neighbor searches; random variables; Taylor series
Conference_Title :
Proceedings of the 1994 IEEE International Symposium on Information Theory
Conference_Location :
Trondheim, Norway
Print_ISBN :
0-7803-2015-8
DOI :
10.1109/ISIT.1994.395072