DocumentCode :
2629065
Title :
Backpropagation based on the logarithmic error function and elimination of local minima
Author :
Matsuoka, Kiyotoshi ; Yi, Jianqiang
Author_Institution :
Dept. of Control Eng., Kyushu Inst. of Technol., Kitakyushu, Japan
fYear :
1991
fDate :
18-21 Nov 1991
Firstpage :
1117
Abstract :
It has previously been pointed out that, in backpropagation learning of neural networks, using a logarithmic error function instead of the familiar quadratic error function yields remarkable reductions in learning times. In the present work, it is shown theoretically and experimentally that learning based on the logarithmic error function has the effect of reducing the density of local minima. It is proved mathematically that, in a particular sense, the logarithmic error function provides a lower (at most equal) density of local minima in any network. The logarithmic error function also alleviates the problem of getting stuck in local minima.
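The abstract contrasts the familiar quadratic error with a logarithmic error function. A minimal sketch of the two error functions and their gradients for sigmoid outputs is given below; the cross-entropy form used here is a common choice for the "logarithmic error function" in this line of work, and whether it matches the paper's exact definition is an assumption.

```python
import numpy as np

def quadratic_error(y, t):
    # Familiar quadratic (sum-of-squares) error: E = 0.5 * sum((y - t)^2)
    return 0.5 * np.sum((y - t) ** 2)

def logarithmic_error(y, t, eps=1e-12):
    # Logarithmic (cross-entropy-style) error for outputs y in (0, 1):
    # E = -sum(t*log(y) + (1 - t)*log(1 - y))
    y = np.clip(y, eps, 1 - eps)  # guard against log(0)
    return -np.sum(t * np.log(y) + (1 - t) * np.log(1 - y))

# For a sigmoid output y = 1 / (1 + exp(-x)), the gradients w.r.t. the
# pre-activation x differ in a way relevant to learning speed:
#   quadratic:    dE/dx = (y - t) * y * (1 - y)   -- vanishes as y saturates
#   logarithmic:  dE/dx = (y - t)                 -- stays proportional to error
```

The flat sigmoid-saturation factor `y * (1 - y)` in the quadratic gradient is one intuition for the slow learning and spurious flat regions that the logarithmic error avoids.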
Keywords :
error analysis; learning systems; neural nets; backpropagation learning; local minima; logarithmic error function; neural networks; Acceleration; Backpropagation; Computer networks; Control engineering; Error correction; Functional programming; Large Hadron Collider; Learning systems; Neural networks; Packaging;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
1991 IEEE International Joint Conference on Neural Networks (IJCNN)
Print_ISBN :
0-7803-0227-3
Type :
conf
DOI :
10.1109/IJCNN.1991.170546
Filename :
170546