DocumentCode :
2203383
Title :
VQ assistance for training perceptron networks
Author :
Porter, William A. ; Abou-Ali, Abdel-Latief
Author_Institution :
Dept. of Electr. & Comput. Eng., Alabama Univ., Huntsville, AL, USA
fYear :
1996
fDate :
11-14 Apr 1996
Firstpage :
352
Lastpage :
358
Abstract :
Vector quantization algorithms are used to find finite sets of exemplars that represent a data set to within an a priori error tolerance. Such representation is essential in codebook-based data compression and transmission. We first develop modifications of the basic algorithm and then explore the use of vector quantization as a tool to speed the training of perceptron networks. We show that vector quantization provides an efficient initialization for the backprop algorithm. We also explore the use of vector quantization to decompose large-scale computational problems into more tractable parts. Classification problems are considered.
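The abstract describes two ideas: building a codebook of exemplars that covers the data to within a given tolerance, and using those exemplars to initialize a perceptron network before backprop training. The sketch below illustrates that general approach under stated assumptions; the function names (vq_codebook, init_perceptron_from_codebook), the greedy exemplar-growing rule, and the bias heuristic are hypothetical illustrations, not the authors' exact algorithm.

```python
import numpy as np

def vq_codebook(data, tol, max_exemplars=100, rng=None):
    """Grow a codebook until every data vector lies within `tol`
    of its nearest exemplar (a simple greedy VQ sketch)."""
    rng = np.random.default_rng(rng)
    codebook = [data[rng.integers(len(data))]]
    for _ in range(max_exemplars):
        # Distance from each point to its nearest exemplar.
        dists = np.linalg.norm(data[:, None] - np.array(codebook)[None], axis=2)
        nearest = dists.min(axis=1)
        worst = int(np.argmax(nearest))
        if nearest[worst] <= tol:
            break
        # Add the worst-covered point as a new exemplar.
        codebook.append(data[worst])
    return np.array(codebook)

def init_perceptron_from_codebook(codebook, n_outputs, scale=0.1, rng=None):
    """Seed a one-hidden-layer perceptron network with the VQ exemplars:
    one hidden unit per exemplar, small random output weights.
    Backprop then refines these weights (assumed training loop not shown)."""
    rng = np.random.default_rng(rng)
    W_hidden = codebook.copy()                      # hidden weight vectors = exemplars
    b_hidden = -0.5 * np.sum(codebook**2, axis=1)   # heuristic bias centering each unit
    W_out = scale * rng.standard_normal((n_outputs, len(codebook)))
    b_out = np.zeros(n_outputs)
    return W_hidden, b_hidden, W_out, b_out

# Usage on a toy two-class set: quantize, then hand the initialization to backprop.
X = np.vstack([np.random.randn(50, 2) + [2, 2],
               np.random.randn(50, 2) - [2, 2]])
codebook = vq_codebook(X, tol=1.5)
W_h, b_h, W_o, b_o = init_perceptron_from_codebook(codebook, n_outputs=2)
```

The same codebook can also partition the data by nearest exemplar, which is one way to decompose a large classification problem into smaller subproblems as the abstract suggests.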
Keywords :
backpropagation; pattern classification; perceptrons; vector quantisation; backprop algorithm; codebook based data compression; error tolerance; exemplars; finite sets; large scale computational problem; perceptron networks; training; transmission; vector quantization algorithms; Clustering algorithms; Computer errors; Data compression; Function approximation; Iterative algorithms; Kernel; Large-scale systems; Nearest neighbor searches; Neural networks; Vector quantization;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the IEEE Southeastcon '96: Bringing Together Education, Science and Technology
Conference_Location :
Tampa, FL
Print_ISBN :
0-7803-3088-9
Type :
conf
DOI :
10.1109/SECON.1996.510089
Filename :
510089