DocumentCode :
744680
Title :
Globally convergent algorithms with local learning rates
Author :
Magoulas, George D. ; Plagianakos, Vassilis P. ; Vrahatis, Michael N.
Author_Institution :
Dept. of Inf. Syst. & Comput., Brunel Univ., London, UK
Volume :
13
Issue :
3
fYear :
2002
fDate :
5/1/2002
Firstpage :
774
Lastpage :
779
Abstract :
A novel generalized theoretical result is presented that underpins the development of globally convergent first-order batch training algorithms that employ local learning rates. This result allows us to equip algorithms of this class with a strategy for adapting the overall direction of search to a descent one. In this way, a decrease of the batch-error measure at each training iteration is ensured, and convergence of the sequence of weight iterates to a local minimizer of the batch error function is obtained from remote initial weights. The effectiveness of the theoretical result is illustrated in three application examples by comparing two well-known training algorithms with local learning rates to their globally convergent modifications.
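To make the strategy concrete, the following is a minimal sketch (not the authors' exact method or convergence conditions) of a Silva-Almeida-style batch training loop with one learning rate per weight, safeguarded so that each iteration follows a descent direction and strictly decreases the batch error. The toy error function, the adaptation factors, and the Armijo-type backtracking constants are all illustrative assumptions.

import numpy as np

def batch_error_and_grad(w):
    """Toy stand-in for a network's batch error E(w) and its gradient
    (a hypothetical quadratic; the paper trains real networks)."""
    A = np.diag([1.0, 25.0])
    return 0.5 * w @ A @ w, A @ w

def silva_almeida_gc(w, iters=200, eta0=0.05, up=1.2, down=0.5, c=1e-4):
    """Sketch: local learning-rate adaptation in the Silva-Almeida style,
    modified so every iteration takes a descent direction and decreases
    the batch error -- the flavor of safeguard the paper proposes."""
    eta = np.full_like(w, eta0)          # one learning rate per weight
    E, g = batch_error_and_grad(w)
    g_prev = g.copy()
    for _ in range(iters):
        # Grow eta_i while the partial derivative keeps its sign,
        # shrink it on a sign change (local rate adaptation).
        eta = np.where(g * g_prev > 0, eta * up, eta * down)
        d = -eta * g                     # overall search direction
        # With positive per-weight rates, g.d < 0 holds automatically;
        # this check guards variants (e.g., Quickprop-like steps) that
        # can point uphill, falling back to steepest descent.
        if np.dot(g, d) >= 0:
            d = -g
        # Backtrack along d until the batch error decreases enough
        # (an Armijo-type condition), which drives global convergence.
        t = 1.0
        E_new, g_new = batch_error_and_grad(w + t * d)
        while E_new > E + c * t * np.dot(g, d) and t > 1e-12:
            t *= 0.5
            E_new, g_new = batch_error_and_grad(w + t * d)
        w, E, g_prev, g = w + t * d, E_new, g, g_new
    return w

print(silva_almeida_gc(np.array([3.0, -2.0])))  # converges near the origin

The same safeguard wraps around any first-order update with local rates: compute the adapted step, verify it is a descent direction, then enforce sufficient error decrease before accepting it.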
Keywords :
backpropagation; neural nets; search problems; Qprop; Quickprop; Silva-Almeida method; backpropagation networks; batch error function; batch training; batch-error measure; endoscopy; generalized theoretical result; globally convergent algorithms; globally convergent first-order batch training algorithms; globally convergent modifications; gradient descent; local learning rate adaptation; local learning rates; local minimizer; remote initial weights; training algorithms; training iteration; Algorithm design and analysis; Artificial intelligence; Convergence; Design methodology; Endoscopes; Error correction; Heuristic algorithms; Information systems; Mathematics
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2002.1000148
Filename :
1000148