Title :
A competitive learning of three-layer neural networks
Author :
Park, Sung-Kee; Kim, Ji H.
Author_Institution :
Dept. of Electr. Eng., Tennessee Technol. Univ., Cookeville, TN
Abstract :
Summary form only given, as follows. A competitive learning algorithm called geometrical expansion learning (GEL) was proposed to train a three-layer neural network for an arbitrary function in discrete space. The most significant difference between GEL and backpropagation learning (BPL) is that GEL always guarantees convergence, whereas BPL is not known to converge. Moreover, GEL automatically determines the required number of neurons in the hidden layer, which varies with the given training patterns. In addition, the learning speed of GEL is much faster than that of BPL.
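Example (editor's illustration) :
The summary does not describe GEL's actual update rules, so the sketch below is only a hedged stand-in illustrating the two properties the abstract claims: training that always terminates, and a hidden-layer size determined by the training patterns rather than fixed in advance. It builds a three-layer threshold network over binary patterns by adding hidden units until every positive pattern is covered; the function names (`train_constructive`, `predict`) and the covering rule are assumptions of this illustration, not the authors' algorithm.

```python
# Hedged sketch of a constructive scheme with the properties the abstract
# attributes to GEL (guaranteed termination, data-driven hidden-layer size).
# This is NOT the GEL algorithm itself, whose details the summary omits.
import numpy as np

def train_constructive(X, y):
    """Build a three-layer threshold network for patterns X in {0,1}^n with
    binary targets y. Hidden units are added until every positive pattern is
    covered, so training always terminates, and the number of hidden units
    depends on the given training patterns."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=int)
    hidden = []                                   # list of (weights, threshold)
    uncovered = set(np.flatnonzero(y == 1))
    negatives = X[y == 0]
    while uncovered:
        seed = X[next(iter(uncovered))]
        # The new unit fires iff the Hamming distance to `seed` is smaller
        # than the distance from `seed` to the nearest negative pattern.
        if len(negatives):
            margin = np.abs(negatives - seed).sum(axis=1).min()
        else:
            margin = X.shape[1] + 1
        w = 2.0 * seed - 1.0                      # +1 where seed bit is 1, -1 where 0
        theta = seed.sum() - margin + 0.5
        hidden.append((w, theta))
        # Remove every positive pattern this unit now covers.
        uncovered -= {i for i in uncovered if X[i] @ w > theta}
    return hidden

def predict(hidden, x):
    """Output unit is a simple OR over the hidden units."""
    x = np.asarray(x, dtype=float)
    return int(any(x @ w > theta for w, theta in hidden))

# Usage: XOR on two binary inputs needs two hidden units.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]
net = train_constructive(X, y)
assert all(predict(net, x) == t for x, t in zip(X, y))
print("hidden units:", len(net))
```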
Keywords :
convergence; learning systems; neural nets; backpropagation learning; competitive learning; geometrical expansion learning; hidden layer; learning speed; three-layer neural networks; training patterns; Backpropagation algorithms; Computer networks; Convergence; Neural networks; Neurons; Power line communications; Space technology
Conference_Title :
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
DOI :
10.1109/IJCNN.1991.155585