Title :
Gain elimination from backpropagation neural networks
Author :
Thimm, G. ; Fiesler, E. ; Moerland, P.
Author_Institution :
IDIAP, Martigny, Switzerland
Abstract :
It is shown that the gain of the sigmoidal activation function, as used in backpropagation neural networks, can be eliminated, since there exists a well-defined relationship between the gain, the learning rate, and the initial weights. Similarly, the learning rate can be eliminated by adjusting the gain and the initial weights. This relationship is proven, extended to several variants of the backpropagation learning rule, and applied to hardware implementations of neural networks.
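Note :
A minimal worked sketch of the kind of equivalence the abstract refers to; the precise scaling below is an assumption reconstructed from the standard derivation for a uniform gain, not quoted from this record. With a sigmoid of gain \gamma, defined as f_\gamma(\mathrm{net}) = f(\gamma \, \mathrm{net}), the forward pass satisfies

f_\gamma(\mathbf{w} \cdot \mathbf{x}) = f_1(\gamma \mathbf{w} \cdot \mathbf{x}),

so scaling the weights by \gamma reproduces the gain. Requiring the scaled weights to remain scaled after a gradient step makes the learning rate absorb a factor \gamma^{2}, giving the assumed equivalence

(\text{gain } \gamma,\ \text{rate } \eta,\ \text{weights } \mathbf{w}_0) \;\equiv\; (\text{gain } 1,\ \text{rate } \gamma^{2}\eta,\ \text{weights } \gamma \mathbf{w}_0).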
Keywords :
backpropagation; feedforward neural nets; transfer functions; gain elimination; gain weight; learning rate; neural networks; sigmoidal activation function; Backpropagation; Backpropagation algorithms; Electronic mail; Multi-layer neural network; Network topology; Neural network hardware; Neural networks; Nonlinear optics; Optical computing; Optical fiber networks; Optical propagation
Conference_Title :
Proceedings of the IEEE International Conference on Neural Networks, 1995
Conference_Location :
Perth, WA
Print_ISBN :
0-7803-2768-3
DOI :
10.1109/ICNN.1995.488126