DocumentCode :
1748925
Title :
Effects of initialization on structure formation and generalization of neural networks
Author :
Shiratsuchi, Hiroshi ; Gotanda, Hiromu ; Inoue, Katuhiro ; Kumamaru, Kousuke
Author_Institution :
Fac. of Eng., Ryukyus Univ., Okinawa, Japan
Volume :
4
fYear :
2001
fDate :
2001
Firstpage :
2644
Abstract :
In this paper, we propose an initialization method for multilayer neural networks (NNs) employing structure learning with forgetting. The proposed initialization consists of two steps: the weights of the hidden units are initialized so that their hyperplanes pass through the center of the input pattern set, and the weights of the output units are initialized to zero. Several simulations were performed to study how the initialization affects the structure-forming process of the NN. The simulation results confirmed that the initialization yields a better network structure and higher generalization ability.
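For illustration, the sketch below shows one way the two-step initialization described in the abstract could be realized in NumPy: each hidden unit's bias is chosen so that its hyperplane passes through the centroid of the input patterns, output-layer weights start at zero, and training applies a constant "forgetting" decay that prunes small weights. The layer sizes, weight scale, and decay constant are illustrative assumptions, not the authors' exact settings.

import numpy as np

def init_network(X, n_hidden, n_output, scale=0.5, rng=None):
    """Two-step initialization sketch (assumed single hidden layer).

    Hidden-unit weights are drawn randomly and their biases are set so that
    each hyperplane w.x + b = 0 passes through the centroid of the input
    patterns X; output-unit weights and biases are set to zero.
    """
    rng = np.random.default_rng(rng)
    n_input = X.shape[1]
    center = X.mean(axis=0)                    # center of the input pattern set

    W_hidden = rng.normal(0.0, scale, (n_hidden, n_input))
    b_hidden = -W_hidden @ center              # forces w.x + b = 0 at the centroid

    W_output = np.zeros((n_output, n_hidden))  # output weights start at zero
    b_output = np.zeros(n_output)
    return W_hidden, b_hidden, W_output, b_output

def forgetting_update(W, grad, lr=0.1, eps=1e-4):
    """One gradient step with a constant decay term in the style of
    structure learning with forgetting: the eps * sign(W) term steadily
    drives unneeded weights toward zero. lr and eps are assumed values."""
    return W - lr * grad - eps * np.sign(W)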
Keywords :
generalisation (artificial intelligence); learning (artificial intelligence); multilayer perceptrons; forgetting; generalization; hyperplanes; initialization; multilayer neural networks; structure formation; structure learning; Cause effect analysis; Computer science; Convergence; Modeling; Multi-layer neural network; Neural networks; Nonhomogeneous media; Systems engineering and theory; Training data; Visualization;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IJCNN '01: Proceedings of the International Joint Conference on Neural Networks, 2001
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-7044-9
Type :
conf
DOI :
10.1109/IJCNN.2001.938787
Filename :
938787