Title :
Clustering-based algorithms for single-hidden-layer sigmoid perceptron
Author_Institution :
Control Eng. Lab., Helsinki Univ. of Technol., Espoo, Finland
Date :
5/1/2003 12:00:00 AM
Abstract :
Gradient-descent-type supervised learning is the most commonly used algorithm for designing the standard sigmoid perceptron (SP). However, it is computationally expensive (slow) and suffers from the local-minima problem. Moody and Darken (1989) proposed an input-clustering-based hierarchical algorithm for fast learning in networks of locally tuned neurons, in the context of radial basis function networks. We propose and analyze input-clustering (IC) and input-output-clustering (IOC) based algorithms for fast learning in networks of globally tuned neurons, in the context of the SP. It is shown that "localizing" the input-layer weights of the SP by the IC and the IOC minimizes an upper bound on the SP output error. The proposed algorithms could also be used to initialize the SP weights for conventional gradient-descent learning. Simulation results show that the SPs designed by the IC and the IOC yield performance comparable to that of their radial basis function network counterparts.
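The abstract describes placing ("localizing") the hidden-layer weights of a single-hidden-layer sigmoid perceptron using cluster centers of the training inputs, in the spirit of Moody and Darken's clustering step for RBF networks. The sketch below illustrates that general idea only: plain k-means on the inputs, with each cluster center used to set one hidden unit's weight vector and bias. The function names, the weight/bias rule (`W = scale * centers`, bias chosen so each sigmoid is near 0.5 at its center), and all parameters are illustrative assumptions, not the paper's actual IC/IOC algorithms.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means on the rows of X; returns the k cluster centers."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # assign each point to its nearest center
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned points
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return centers

def ic_init_sigmoid_perceptron(X, k, scale=1.0, seed=0):
    """Input-clustering (IC) style initialization (illustrative rule, not
    the paper's): point each hidden unit's weight vector at a k-means
    center of the inputs, and pick the bias so the sigmoid's argument is
    zero at that center (activation 0.5 there)."""
    centers = kmeans(X, k, seed=seed)
    W = scale * centers                              # one weight row per hidden unit
    b = -(W * centers).sum(axis=1)                   # zero pre-activation at the center
    return W, b

def hidden_activations(X, W, b):
    """Globally tuned sigmoid hidden layer: sigma(X W^T + b)."""
    return 1.0 / (1.0 + np.exp(-(X @ W.T + b)))
```

These clustered weights could then serve either as the final hidden layer (training only the output layer, as in the hierarchical RBF scheme) or as an initialization for conventional gradient-descent learning, as the abstract suggests.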
Keywords :
feedforward neural nets; learning (artificial intelligence); multilayer perceptrons; pattern clustering; performance evaluation; radial basis function networks; clustering-based algorithms; gradient-descent learning; gradient-descent type supervised learning; hierarchical algorithm; input-output clustering; local-minima problem; locally tuned neurons; performance; radial basis function networks; simulation; single-hidden-layer sigmoid perceptron; upper bound; Algorithm design and analysis; Clustering algorithms; Electrical engineering; Feedforward systems; Neurons; Neuroscience; Radial basis function networks; Supervised learning; Upper bound; Vectors;
Journal_Title :
IEEE Transactions on Neural Networks
DOI :
10.1109/TNN.2003.813532