Title :
Vector-quantization by density matching in the minimum Kullback-Leibler divergence sense
Author :
Hegde, Anant ; Erdogmus, Deniz ; Lehn-Schiøler, T. ; Rao, Yadunandana N. ; Principe, Jose C.
Author_Institution :
Dept. of Electr. & Comput. Eng., Florida Univ., Gainesville, FL, USA
Abstract :
Representation of a large set of high-dimensional data is a fundamental problem in many applications such as communications and biomedical systems. The problem has been tackled by encoding the data with a compact set of code vectors called processing elements. In this study, we propose a vector quantization technique that encodes the information in the data using concepts derived from information theoretic learning. The algorithm minimizes a cost function based on the Kullback-Leibler divergence to match the distribution of the processing elements to the distribution of the data. The performance of the algorithm is demonstrated on synthetic data as well as on an edge image of a face, and comparisons are provided with existing algorithms such as LBG and SOM.
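The abstract states the idea (match the density of the processing elements to the density of the data by minimizing a Kullback-Leibler cost) without the implementation details. The following is a minimal sketch of that idea, not the authors' implementation: it assumes the cost D(p_data || p_PE) with both densities estimated by Gaussian Parzen windows, and minimizes the code-vector-dependent part (the cross-entropy term) by gradient descent. The function names (gaussian_kernel, kl_vq), the bandwidth sigma, the learning rate eta, and the iteration count are illustrative assumptions, not taken from the paper.

```python
# Sketch: density-matching vector quantization via a Parzen-window estimate of
# D(p_data || p_PE). Since the data entropy does not depend on the code vectors,
# only the cross-entropy term -E_data[log p_PE(x)] is minimized here.
import numpy as np

def gaussian_kernel(diff, sigma):
    """Isotropic Gaussian kernel evaluated at pairwise differences (…, d)."""
    d = diff.shape[-1]
    sq = np.sum(diff ** 2, axis=-1)
    norm = (2.0 * np.pi * sigma ** 2) ** (d / 2.0)
    return np.exp(-sq / (2.0 * sigma ** 2)) / norm

def kl_vq(data, n_codes=16, sigma=0.1, eta=0.5, n_iters=200, seed=0):
    """Place n_codes processing elements so their Parzen density matches the data density."""
    rng = np.random.default_rng(seed)
    # Initialize the processing elements on randomly chosen data points.
    codes = data[rng.choice(len(data), n_codes, replace=False)].copy()
    for _ in range(n_iters):
        diff = data[:, None, :] - codes[None, :, :]       # (N, M, d) pairwise differences
        k = gaussian_kernel(diff, sigma)                  # (N, M) kernel values
        q = k.mean(axis=1, keepdims=True) + 1e-12         # Parzen density of the PEs at each data point
        resp = k / q                                      # responsibility of each PE for each data point
        # Gradient of the cross-entropy term; each PE is pulled toward the data it explains.
        grad = -(resp[:, :, None] * diff).mean(axis=0) / (len(codes) * sigma ** 2)
        codes -= eta * grad
    return codes

if __name__ == "__main__":
    # Toy example: two Gaussian clusters in 2-D.
    rng = np.random.default_rng(1)
    data = np.vstack([rng.normal(0.0, 0.15, (200, 2)),
                      rng.normal(1.0, 0.15, (200, 2))])
    print(kl_vq(data, n_codes=8))
```

With a fixed kernel bandwidth this behaves like fitting the means of an equal-weight Gaussian mixture to the data, which is one plausible reading of "matching the distribution of the processing elements with the distribution of the data"; the paper's actual cost, kernel choice, and update rule may differ.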
Keywords :
information theory; learning (artificial intelligence); self-organising feature maps; vector quantisation; cost function; density matching; high-dimensional data; information theoretic learning; minimum Kullback-Leibler divergence sense; processing elements; vector quantization technique; Biomedical computing; Biomedical engineering; Biomedical signal processing; Cost function; Data engineering; Encoding; Entropy; Kernel; Signal processing algorithms; Vector quantization;
Conference_Title :
Proceedings of the 2004 IEEE International Joint Conference on Neural Networks
Print_ISBN :
0-7803-8359-1
DOI :
10.1109/IJCNN.2004.1379879