Title :
Back to single-layer learning principles
Author_Institution :
Daimler-Benz AG, Ulm-Boefingen, Germany
Abstract :
Summary form only given, as follows. Simple single-layer learning principles such as the perceptron rule have been proven to be of limited computational power. However, combining two such principles, the perceptron rule and a simple competitive learning rule, lifts many of these limitations. Moreover, the combination preserves the positive properties of simple learning rules, such as fast and reliable convergence and good generalization. Computational experiments on a difficult, highly nonlinear classification task have confirmed these hypotheses: a neural network based on these two principles is superior to a classical multilayer backpropagation network in misclassification rate, learning speed, and reliability of convergence. In addition, its generalization capability is substantially better owing to the smoothness enforced by the linearity of the single-layer perceptron.
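Note: the abstract does not specify how the two rules are combined. The sketch below is one plausible reading, assuming a competitive layer that partitions the input space into prototype regions, with a separate single-layer perceptron trained per region; all names, parameters, and the overall structure are illustrative, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

def train(X, y, n_prototypes=4, lr_proto=0.05, lr_perc=0.1, epochs=50):
    """Hypothetical combination of a competitive learning rule and the perceptron rule."""
    n, d = X.shape
    prototypes = X[rng.choice(n, n_prototypes, replace=False)].copy()
    W = np.zeros((n_prototypes, d))   # one linear perceptron per prototype region
    b = np.zeros(n_prototypes)
    for _ in range(epochs):
        for i in rng.permutation(n):
            x, t = X[i], y[i]         # t in {-1, +1}
            # competitive learning rule: move the winning prototype toward x
            k = np.argmin(np.linalg.norm(prototypes - x, axis=1))
            prototypes[k] += lr_proto * (x - prototypes[k])
            # perceptron rule applied to the winner's linear unit
            pred = 1.0 if W[k] @ x + b[k] >= 0 else -1.0
            if pred != t:
                W[k] += lr_perc * t * x
                b[k] += lr_perc * t
    return prototypes, W, b

def predict(X, prototypes, W, b):
    """Classify each sample with the perceptron attached to its nearest prototype."""
    k = np.argmin(np.linalg.norm(prototypes[None, :, :] - X[:, None, :], axis=2), axis=1)
    return np.sign(np.einsum('ij,ij->i', W[k], X) + b[k])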
Keywords :
learning systems; neural nets; pattern recognition; competitive learning rule; convergence; learning speed; misclassification rates; nonlinear classification task; perceptron rule; single-layer learning principles; Backpropagation; Computer networks; Linearity; Multi-layer neural network; Neural networks
Conference_Titel :
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
DOI :
10.1109/IJCNN.1991.155546