Title :
The effect of initial weights on premature saturation in back-propagation learning
Author :
Lee, Youngjik ; Oh, Sang-Hoon ; Kim, Myung Won
Author_Institution :
Electron. & Telecommun. Res. Inst., Daejeon, South Korea
Abstract :
The critical drawback of the back-propagation learning algorithm is its slow error convergence. A major reason for this is premature saturation, a phenomenon in which the error of a neural network stays almost constant for some period during learning. It is known to be caused by an inappropriate set of initial weights. The probability of incorrect saturation of output nodes at the first epoch of learning is derived as a function of the range of the initial weights, the number of nodes in each layer, and the maximum slope of the sigmoidal activation function. The result is verified by Monte Carlo simulation.
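To illustrate the kind of Monte Carlo check the abstract describes, the following minimal Python sketch estimates how often the output nodes of a randomly initialized two-layer sigmoid network start out saturated, as a function of the initial weight range. The layer sizes, saturation threshold, sigmoid slope, and weight ranges below are illustrative assumptions, not values taken from the paper.

    # Monte Carlo sketch (assumed setup, not the paper's code): fraction of output
    # nodes whose activation lies in a saturation region at the first epoch,
    # for weights drawn uniformly from [-w0, w0].
    import numpy as np

    def saturation_rate(n_in=30, n_hidden=20, n_out=10, w0=1.0, slope=1.0,
                        threshold=0.9, n_trials=1000, seed=None):
        """Fraction of output activations outside (1 - threshold, threshold)."""
        rng = np.random.default_rng(seed)
        saturated, total = 0, 0
        for _ in range(n_trials):
            x = rng.uniform(0.0, 1.0, size=n_in)               # random input pattern
            W1 = rng.uniform(-w0, w0, size=(n_hidden, n_in))   # hidden-layer weights
            W2 = rng.uniform(-w0, w0, size=(n_out, n_hidden))  # output-layer weights
            h = 1.0 / (1.0 + np.exp(-slope * (W1 @ x)))        # sigmoid hidden activations
            y = 1.0 / (1.0 + np.exp(-slope * (W2 @ h)))        # sigmoid output activations
            saturated += np.sum((y > threshold) | (y < 1.0 - threshold))
            total += n_out
        return saturated / total

    if __name__ == "__main__":
        for w0 in (0.1, 0.5, 1.0, 2.0, 4.0):
            print(f"w0 = {w0:4.1f}  saturation rate ~ {saturation_rate(w0=w0, seed=0):.3f}")

Under these assumptions, widening the initial weight range (or increasing the sigmoid slope or the number of incoming nodes) drives more output activations toward 0 or 1 at initialization, which is the trend the paper's derived probability is meant to capture.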
Keywords :
Monte Carlo methods; convergence; errors; learning systems; neural nets; probability; Monte Carlo simulation; backpropagation learning algorithm; error convergence; initial weights; maximum slope; nodes; premature saturation; sigmoidal activation function; Algorithm design and analysis; Cities and towns; Convergence; Intelligent networks; Multi-layer neural network; Multilayer perceptrons; Neural networks; Pattern classification; Pattern recognition; Random number generation
Conference_Title :
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
DOI :
10.1109/IJCNN.1991.155275