DocumentCode :
1943577
Title :
Information Theoretic Vector Quantization with Fixed Point Updates
Author :
Rao, Sudhir ; Han, Seungju ; Principe, José
Author_Institution :
Florida Univ., Gainesville
fYear :
2007
fDate :
12-17 Aug. 2007
Firstpage :
1020
Lastpage :
1024
Abstract :
In this paper, we revisit the information theoretic vector quantization (ITVQ) algorithm introduced in (T. Lehn-Schioler et al., 2005) and make it practical. We derive a fixed-point update rule that minimizes the Cauchy-Schwarz (CS) pdf divergence between the set of codewords and the actual data. In doing so, we overcome two severe deficiencies of the previous gradient-based method, namely the number of parameters to be optimized and the slow convergence rate, thus making the algorithm more efficient and useful for compression.
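The abstract's idea can be sketched in code. The following is an illustrative reconstruction, not the authors' exact update: it sets the gradient of the CS divergence between Gaussian Parzen estimates of the data pdf and the codebook pdf to zero, then rearranges the stationarity condition into a fixed-point iteration in which each codeword is pulled toward kernel-weighted data (attraction) and pushed away from the other codewords (repulsion). All function and parameter names are assumptions, and the kernel-width handling is simplified.

```python
import numpy as np

def pairwise_sq_dists(A, B):
    # All pairwise squared Euclidean distances between rows of A and rows of B.
    return ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)

def itvq_fixed_point(X, n_codewords=4, sigma=0.5, n_iters=100, seed=0):
    """Fixed-point vector quantization driven by the Cauchy-Schwarz
    divergence between Parzen (Gaussian-kernel) estimates of the data
    pdf and the codebook pdf.  Sketch only: the stationarity condition
    grad D_CS = 0 is rearranged so that only the strictly positive
    attraction weights land in the denominator, and the doubling of
    kernel width that the exact Parzen convolution implies is ignored.
    """
    rng = np.random.default_rng(seed)
    # Initialize the codebook with randomly chosen data points.
    W = X[rng.choice(len(X), n_codewords, replace=False)].copy()
    N, K = len(X), n_codewords
    for _ in range(n_iters):
        Gx = np.exp(-pairwise_sq_dists(W, X) / (2 * sigma**2))  # K x N attraction kernels
        Gw = np.exp(-pairwise_sq_dists(W, W) / (2 * sigma**2))  # K x K repulsion kernels
        # Information potentials entering the CS divergence.
        V_cross = Gx.sum() / (N * K)   # between codebook and data pdfs
        V_code = Gw.sum() / (K * K)    # of the codebook pdf alone
        a = 1.0 / (N * V_cross)        # weight of the attraction term
        b = 1.0 / (K * V_code)         # weight of the repulsion term
        # Kernel-weighted pull toward the data.
        attract = a * (Gx @ X)
        # Displacement away from the other codewords: b * sum_k G_ik (w_k - w_i).
        repel = b * (Gw @ W - Gw.sum(axis=1, keepdims=True) * W)
        # Fixed-point step; the small epsilon guards a codeword that
        # drifts far from all data (near-zero attraction row sum).
        W = (attract - repel) / (a * Gx.sum(axis=1, keepdims=True) + 1e-12)
    return W
```

With zero repulsion this reduces to a mean-shift-like weighted average of the data, which is why a single update per codeword replaces the many per-sample gradient steps the abstract criticizes; the repulsion term keeps codewords from collapsing onto the same mode.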
Keywords :
convergence; gradient methods; higher order statistics; optimisation; vector quantisation; data compression; fixed point update rule; gradient based method; information theoretic vector quantization; Annealing; Convergence; Entropy; Gaussian processes; Kernel; Neural networks; Neurons; Optimization methods; Self organizing feature maps; Vector quantization;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 2007. IJCNN 2007. International Joint Conference on
Conference_Location :
Orlando, FL
ISSN :
1098-7576
Print_ISBN :
978-1-4244-1379-9
Electronic_ISBN :
1098-7576
Type :
conf
DOI :
10.1109/IJCNN.2007.4371098
Filename :
4371098