Title :
Combination of radial basis function neural networks with optimized learning vector quantization
Author_Institution :
Inst. for Parallel & Distributed High Performance Syst., Stuttgart Univ., Germany
Abstract :
Randomly initialized radial basis function neural networks are compared with networks whose centers are obtained by vector quantization. It is shown that the error rate of small networks can be decreased by about 28%. To match the performance of a randomly initialized network, a network with trained centers needs only half as many hidden neurons, which may be important for time-critical applications. The time needed to train and initialize the smaller network is comparable to the time needed to initialize the larger network.
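Illustrative_Sketch :
A minimal sketch of the comparison described in the abstract: an RBF network whose hidden-layer centers are training samples picked at random versus one whose centers are codebook vectors obtained by vector quantization. It uses scikit-learn's KMeans as a stand-in codebook learner rather than the paper's optimized LVQ, and the data set, network size, and width parameter (load_digits, n_hidden=30, width=25.0) are assumptions for illustration, not the authors' setup.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_digits
from sklearn.linear_model import RidgeClassifier
from sklearn.model_selection import train_test_split

def rbf_features(X, centers, width):
    # Gaussian RBF design matrix: one column per hidden neuron (center).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

X, y = load_digits(return_X_y=True)      # stand-in data set (assumption)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
n_hidden, width = 30, 25.0               # small network; width is an arbitrary smoothing parameter

# Random initialization: centers are training samples drawn at random.
rng = np.random.default_rng(0)
random_centers = X_tr[rng.choice(len(X_tr), size=n_hidden, replace=False)]

# Vector-quantized initialization: centers are codebook vectors
# (k-means here, standing in for the paper's optimized LVQ).
vq_centers = KMeans(n_clusters=n_hidden, n_init=10, random_state=0).fit(X_tr).cluster_centers_

for name, centers in [("random centers", random_centers), ("VQ centers", vq_centers)]:
    # Only the linear output layer is trained; the RBF hidden layer stays fixed.
    out = RidgeClassifier().fit(rbf_features(X_tr, centers, width), y_tr)
    acc = out.score(rbf_features(X_te, centers, width), y_te)
    print(f"{name:15s} test accuracy = {acc:.3f}")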
Keywords :
learning (artificial intelligence); neural nets; vector quantisation; hidden neurons; initialization; optimized learning; radial basis function neural networks; vector quantization; Art; Backpropagation; Error analysis; Handwriting recognition; Neural networks; Neurons; Pattern recognition; Radial basis function networks; Smoothing methods; Vector quantization;
Conference_Title :
IEEE International Conference on Neural Networks, 1993
Conference_Location :
San Francisco, CA
Print_ISBN :
0-7803-0999-5
DOI :
10.1109/ICNN.1993.298837