Title :
Dual Weight Learning Vector Quantization
Author :
Lv, Chuanfeng ; An, Xing ; Liu, ZhiWen ; Zhao, Qiangfu
Author_Institution :
Dept. of Electronic Engineering, Beijing Institute of Technology, Beijing
Abstract :
A new learning vector quantization (LVQ) approach, termed dual weight learning vector quantization (DWLVQ), is presented in this paper. The basic idea is to attach an additional weight, namely an importance vector, to each reference vector, indicating how much each feature contributes to classification. The importance vectors are adapted according to the fitness of the corresponding reference vectors over the training iterations. As training progresses, the dual weights (reference vector and importance vector) are adjusted simultaneously and mutually, which ultimately improves the recognition rate. Machine learning databases from the UCI repository are used to verify the performance of the proposed approach. The experimental results show that DWLVQ yields superior performance in terms of recognition rate, computational complexity, and stability compared with existing methods, including LVQ, generalized LVQ (GLVQ), relevance LVQ (RLVQ), and generalized relevance LVQ (GRLVQ).
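The abstract does not give the exact DWLVQ update rules, but the general scheme it describes (an LVQ1-style prototype update combined with a per-prototype importance vector that reweights the distance and is adapted from classification feedback) can be sketched as follows. This is an illustrative sketch in the spirit of RLVQ-style relevance learning, not the authors' precise method; the learning rates `eta` and `eta_lam` and the normalization step are assumptions.

```python
import numpy as np

def weighted_dist(x, w, lam):
    # Squared Euclidean distance, with each feature scaled by its
    # importance weight lam[j]
    return float(np.sum(lam * (x - w) ** 2))

def train_step(x, y, protos, labels, lams, eta=0.05, eta_lam=0.01):
    """One training step on sample (x, y).

    protos: list of reference vectors, labels: their class labels,
    lams:   per-prototype importance vectors (the "dual" weights).
    Illustrative only -- the paper's exact DWLVQ rule is not given
    in the abstract.
    """
    dists = [weighted_dist(x, w, lam) for w, lam in zip(protos, lams)]
    k = int(np.argmin(dists))                 # winning prototype
    sign = 1.0 if labels[k] == y else -1.0    # correct vs. wrong win
    diff = x - protos[k]
    # LVQ1-style update: attract the winner if correct, repel if wrong
    protos[k] += sign * eta * diff
    # RLVQ-style importance update: on a correct win, features with a
    # large mismatch lose importance; on a wrong win, they gain it
    lams[k] -= sign * eta_lam * np.abs(diff)
    lams[k] = np.clip(lams[k], 0.0, None)
    s = lams[k].sum()
    if s > 0:
        lams[k] /= s                          # keep weights normalized
    return k
```

Both weight sets are updated in the same pass, so the importance vectors reshape the distance metric that the next prototype update sees, matching the "simultaneous and mutual" adjustment the abstract describes.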
Keywords :
computational complexity; learning (artificial intelligence); signal classification; vector quantisation; dual weight learning vector quantization; generalized relevance LVQ; training iteration; machine learning databases; recognition rate; reference vector; Computer science; Databases; Machine learning; Neural networks; Neurons; Pattern recognition; Stability; Statistics; Vector quantization;
Conference_Titel :
9th International Conference on Signal Processing (ICSP 2008)
Conference_Location :
Beijing
Print_ISBN :
978-1-4244-2178-7
Electronic_ISBN :
978-1-4244-2179-4
DOI :
10.1109/ICOSP.2008.4697470