Title :
Parallel tangent methods with variable stepsize
Author :
Petalas, Y.G. ; Vrahatis, M.N.
Author_Institution :
Dept. of Math., Patras Univ., Greece
Abstract :
The most widely used algorithm for training multilayer feedforward neural networks is backpropagation (BP), an iterative gradient descent algorithm. Since its appearance, various modifications of conventional BP have been proposed to improve its efficiency. One such algorithm, which uses an adaptive learning rate, is backpropagation with variable stepsize (BPVS). Parallel tangent methods are used in global optimization to improve simple gradient descent by occasionally taking, as the search direction, the difference between the current point and the point two steps earlier, instead of the gradient. In this study, we investigate the combination of the BPVS method with the parallel tangent approach for neural network training, and we present experimental results on well-known test problems to evaluate the efficiency of the method.
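To make the idea in the abstract concrete, below is a minimal Python sketch of gradient descent with periodic parallel tangent (partan) steps and an adaptive stepsize. The record does not reproduce the authors' exact BPVS stepsize rule, so the backtracking/doubling rule, the Rosenbrock test function, and the names (partan_descent, partan_every) are illustrative assumptions, not the paper's method.

import numpy as np

def rosenbrock(x):
    # Classic two-dimensional test function with minimum at (1, 1).
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    # Analytic gradient of the Rosenbrock function.
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

def partan_descent(f, grad, x0, lr=1e-3, partan_every=3, iters=5000):
    # Gradient descent with a parallel tangent (partan) step every
    # `partan_every` iterations: the search direction then becomes the
    # difference between the current point and the point two steps
    # back, instead of the negative gradient.
    x_prev2 = x0.copy()  # iterate from two steps back
    x_prev1 = x0.copy()  # iterate from one step back
    x = x0.copy()
    for k in range(1, iters + 1):
        if k % partan_every == 0 and not np.allclose(x, x_prev2):
            d = x - x_prev2          # parallel tangent direction
        else:
            d = -grad(x)             # plain gradient direction
        # Variable stepsize: backtrack until the step decreases f,
        # then start the next iteration from a doubled stepsize.
        # (A stand-in for the BPVS rule, which this record omits.)
        step = lr
        while f(x + step * d) >= f(x) and step > 1e-12:
            step *= 0.5
        x_prev2, x_prev1 = x_prev1, x
        x = x + step * d
        lr = step * 2.0
    return x

x_star = partan_descent(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]))
print(x_star)  # approaches the minimizer (1, 1)

The occasional partan direction acts as a cheap acceleration along the narrow valley that successive gradient steps tend to zig-zag across.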
Keywords :
backpropagation; feedforward neural nets; gradient methods; optimisation; search problems; BP variable stepsize; adaptive learning; global optimization; iterative gradient descent algorithm; multilayer feedforward neural networks; neural network training; parallel tangent methods; Artificial intelligence; Artificial neural networks; Backpropagation algorithms; Electronic mail; Equations; Error correction; Feedforward neural networks; Iterative algorithms; Mathematics; Neural networks;
Conference_Titel :
Proceedings of the 2004 IEEE International Joint Conference on Neural Networks
Print_ISBN :
0-7803-8359-1
DOI :
10.1109/IJCNN.2004.1380081