DocumentCode :
2624914
Title :
The distortion of vector quantizers trained on n vectors decreases to the optimum as Op(1/n)
Author :
Chou, Philip A.
Author_Institution :
Xerox Palo Alto Res. Center, CA, USA
fYear :
1994
fDate :
27 Jun-1 Jul 1994
Firstpage :
457
Abstract :
Recently it has been experimentally observed that the expected squared error of a fixed-rate vector quantizer trained on n vectors decreases to the expected squared error of the optimal fixed-rate vector quantizer as roughly O(1/n). We confirm this observation theoretically, using Pollard's (1982) result that the codewords of a fixed-rate vector quantizer trained on n independent vectors converge to the codewords of the unique optimal vector quantizer according to a central limit theorem.
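The step from the central limit theorem for the trained codewords to the O(1/n) distortion rate can be sketched as follows. This is an illustrative outline only, not a reproduction of the paper's proof; the symbols c_n, c*, D, Sigma, and H are notation introduced here for the sketch.
\[
  \sqrt{n}\,\bigl(c_n - c^\ast\bigr) \xrightarrow{\;d\;} \mathcal{N}(0, \Sigma),
\]
where \(c_n\) is the codebook trained on \(n\) independent vectors and \(c^\ast\) is the codebook of the unique optimal fixed-rate quantizer. Because \(c^\ast\) minimizes the expected squared-error distortion \(D\), the gradient of \(D\) vanishes at \(c^\ast\), so a second-order Taylor expansion gives
\[
  D(c_n) - D(c^\ast) \approx \tfrac{1}{2}\,(c_n - c^\ast)^{\top} H\,(c_n - c^\ast),
\]
with \(H\) the Hessian of \(D\) at \(c^\ast\). Since \(c_n - c^\ast = O_p(1/\sqrt{n})\) by the central limit theorem, the quadratic form is \(O_p(1/n)\), matching the observed decay of the excess distortion of trained quantizers.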
Keywords :
coding errors; convergence of numerical methods; error statistics; rate distortion theory; vector quantisation; VQ; central limit theorem; codewords; convergence; distortion; optimal fixed-rate vector quantizer; squared error; trained vector quantizers; Covariance matrix; Machine intelligence; Nearest neighbor searches; Notice of Violation; Random variables; Taylor series;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 1994 IEEE International Symposium on Information Theory
Conference_Location :
Trondheim
Print_ISBN :
0-7803-2015-8
Type :
conf
DOI :
10.1109/ISIT.1994.395072
Filename :
395072