DocumentCode :
1587971
Title :
A New Method to Improve the Gradient Based Search Direction to Enhance the Computational Efficiency of Back Propagation Based Neural Network Algorithms
Author :
Nawi, N.M. ; Ransing, R.S. ; Ransing, M.R.
Author_Institution :
Fac. of Inf. Technol. & Multimedia, Univ. Tun Hussein Onn Malaysia, Batu Pahat
fYear :
2008
Firstpage :
546
Lastpage :
552
Abstract :
Neural-network techniques, particularly back-propagation algorithms, have been widely used as a tool for discovering a mapping function between a known set of input and output examples. Neural networks learn from the known example set by adjusting their internal parameters, referred to as weights, using an optimisation procedure based on the 'least square fit' principle. The optimisation procedure normally requires thousands of iterations to converge to an acceptable solution, so improving the computational efficiency of neural-network algorithms is an active area of research. The existing literature has shown that varying the gain parameter improves the learning efficiency of the gradient-descent method, and previous researchers attributed this improvement to an increased learning rate. This research found that the gain variation has no influence on the learning rate; rather, it influences the search direction. A novel technique that integrates the adaptive-learning-rate method with this improved search direction is presented for improving the computational efficiency of neural networks. Results show that the modification significantly enhances the computational efficiency of the training process.
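The following is a minimal illustrative sketch, not the authors' exact formulation: a single sigmoid unit trained by gradient descent in which a gain parameter scales the activation slope. Because the gain multiplies the local gradient, changing it alters the relative weighting of the weight update (the search direction), while a separate adaptive learning rate controls only the step size. The data, update rules, and adaptation constants below are assumptions chosen for demonstration.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(net, gain):
    # Sigmoid activation with an explicit gain (slope) parameter.
    return 1.0 / (1.0 + np.exp(-gain * net))

# Toy data: a noisy linear-threshold mapping (hypothetical example).
X = rng.normal(size=(200, 3))
t = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)

w = rng.normal(scale=0.1, size=3)
gain = 1.0          # adaptive gain parameter
lr = 0.5            # adaptive learning rate
prev_sse = np.inf

for epoch in range(200):
    net = X @ w
    y = sigmoid(net, gain)
    err = t - y
    sse = float(err @ err)

    # The gain enters the search direction through the sigmoid slope:
    # d(y)/d(net) = gain * y * (1 - y).
    delta = err * gain * y * (1.0 - y)
    grad_w = X.T @ delta            # weight-space search direction
    grad_gain = delta @ net / gain  # gradient with respect to the gain

    w += lr * grad_w
    gain += 0.01 * grad_gain        # small illustrative step on the gain

    # Simple adaptive learning rate: grow on improvement, shrink otherwise.
    lr = lr * 1.05 if sse < prev_sse else lr * 0.7
    prev_sse = sse

print("final SSE:", prev_sse)

In this sketch the learning rate rescales the whole update uniformly, whereas the gain reweights each component through the activation slope, which is one way to see why gain variation affects the search direction rather than the step size.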
Keywords :
backpropagation; gradient methods; iterative methods; neural nets; optimisation; search problems; adaptive-learning rate method; back propagation based neural network algorithm; computational efficiency enhancement; gradient-descent based search direction; iterative method; least square fit principle; mapping function discovery; optimisation procedure; Approximation algorithms; Asia; Backpropagation algorithms; Computational efficiency; Computational modeling; Equations; Error correction; Information technology; Neural networks; Optimization methods; Adaptive gain variation; back propagation; gradient based method; optimisation; training efficiency;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Modeling & Simulation, 2008. AICMS 08. Second Asia International Conference on
Conference_Location :
Kuala Lumpur
Print_ISBN :
978-0-7695-3136-6
Electronic_ISBN :
978-0-7695-3136-6
Type :
conf
DOI :
10.1109/AMS.2008.70
Filename :
4530534