Title :
Improved kick out learning algorithm with delta-bar-delta-bar rule
Author :
Ochiai, Keihiro ; Usui, Shiro
Author_Institution :
Dept. of Inf. & Comput. Sci., Toyohashi Univ. of Technol., Japan
Abstract :
A new adaptive rule, called the delta-bar-delta-bar rule, is proposed. It improves robustness with respect to the setting of the increment and decrement factors of the learning rate in an accelerated backpropagation algorithm. The rule is introduced into the kick out algorithm and is shown to be effective in extracting the best performance from that algorithm. With the delta-bar-delta-bar rule, the rate of convergence of the kick out algorithm is substantially improved, even when the learning rates are not optimally set.
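For orientation, the sketch below shows the classic delta-bar-delta rule (Jacobs, 1988) that the paper's delta-bar-delta-bar rule builds on: each weight keeps its own learning rate, which grows additively when the current gradient agrees in sign with an exponential average of past gradients and shrinks multiplicatively when they disagree. This is not the paper's rule itself, and the parameter values (kappa, phi, theta) are illustrative assumptions only.

```python
import numpy as np

def delta_bar_delta_step(w, grad, lr, delta_bar,
                         kappa=0.01, phi=0.5, theta=0.7):
    """One per-weight learning-rate update of the classic
    delta-bar-delta rule (Jacobs, 1988) -- a hedged sketch,
    not the delta-bar-delta-bar rule of Ochiai & Usui.

    w         : weight vector
    grad      : current gradient of the error w.r.t. w
    lr        : per-weight learning rates
    delta_bar : exponential average of past gradients
    kappa, phi, theta : illustrative constants (assumed values)
    """
    agree = grad * delta_bar > 0          # signs agree
    disagree = grad * delta_bar < 0       # signs disagree
    lr = lr + kappa * agree               # additive increase
    lr = lr * np.where(disagree, phi, 1.0)  # multiplicative decrease
    delta_bar = (1.0 - theta) * grad + theta * delta_bar
    w = w - lr * grad                     # gradient-descent step
    return w, lr, delta_bar
```

The paper's contribution, as the abstract states, is to make the choice of the increment and decrement factors (kappa and phi above) more robust, so that the kick out algorithm converges quickly even when these factors are not tuned optimally.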
Keywords :
backpropagation; convergence; neural nets; accelerated backpropagation algorithm; adaptive rule; best performance; decrement factors; delta-bar-delta-bar rule; increment factors; kick out learning algorithm; learning rate; rate of convergence; robustness; Acceleration; Backpropagation algorithms; Convergence; Jacobian matrices; Robustness;
Conference_Title :
IEEE International Conference on Neural Networks, 1993
Conference_Location :
San Francisco, CA
Print_ISBN :
0-7803-0999-5
DOI :
10.1109/ICNN.1993.298568