DocumentCode :
1818051
Title :
Generalization ability of artificial neural network using Fahlman and Lebiere's learning algorithm
Author :
Hamamoto, Masanori ; Kamruzzaman, Joarder ; Kumagai, Yukio
Author_Institution :
Dept. of Comput. & Syst. Eng., Muroran Inst. of Tech., Hokkaido, Japan
Volume :
1
fYear :
1992
fDate :
7-11 Jun 1992
Firstpage :
613
Abstract :
The authors investigate the generalization ability of networks generated by S.E. Fahlman and C. Lebiere's (FL) (1990) learning algorithm, whose distinctive feature is the ability to build an arbitrary network topology. They trained the same three-layer network with both the FL algorithm and the backpropagation (BP) algorithm; a comparison of recognition abilities shows that the FL network performs much better than the BP network. The FL network performs better because its hidden units take only saturated values, so the hidden layer acts as a filter for noise. A two-layer network performs excellently if the training set is trainable by a two-layer network and if a pattern is recognized by detecting the maximum-valued output. Since the FL algorithm begins with a minimal two-layer network, which performs best under the stated condition, a designer can construct either a two-layer or a multilayer network according to which one best fits a particular application. It is concluded that in all these respects the FL algorithm is preferable to the BP algorithm.
Keywords :
backpropagation; learning (artificial intelligence); neural nets; FL algorithm; artificial neural network; backpropagation; learning algorithm; three-layer network; Algorithm design and analysis; Artificial neural networks; Buildings; Multi-layer neural network; Network topology; Stochastic processes;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1992
Conference_Location :
Baltimore, MD
Print_ISBN :
0-7803-0559-0
Type :
conf
DOI :
10.1109/IJCNN.1992.287144
Filename :
287144