Title :
A simple procedure in back-propagation training
Author :
Yu, Chien-Cheng ; Liu, Bin-Da
Author_Institution :
Dept. of Electr. Eng., Cheng Kung Univ., Tainan, Taiwan
Abstract :
The standard back-propagation (BP) algorithm for multilayer feedforward neural networks is essentially a gradient-descent method; as such, it suffers from local minima and slow convergence. This paper presents a simple method, based on the BP algorithm, that employs an adaptive learning rate and a momentum factor to reduce the training time. Simulation results indicate a superior convergence speed compared to other competing methods.
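To illustrate the kind of scheme the abstract describes, the following is a minimal sketch of gradient-descent BP training with a momentum term and a heuristic adaptive learning rate. The network size, the XOR task, the "bold driver"-style adaptation rule, and the constants are illustrative assumptions, not the authors' actual procedure.

# Minimal sketch: back-propagation with momentum and a heuristic
# adaptive learning rate. The adaptation rule below (grow the rate
# when the error falls, shrink it when the error rises) is an
# assumed example, not the method proposed in the paper.
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-4-1 network trained on XOR as a toy supervised-learning task.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))
dW1_prev = np.zeros_like(W1)
dW2_prev = np.zeros_like(W2)

eta, alpha = 0.5, 0.9          # learning rate and momentum factor (assumed values)
prev_err = np.inf

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(5000):
    # Forward pass
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    err = 0.5 * np.sum((T - Y) ** 2)

    # Heuristic learning-rate adaptation (assumed rule)
    eta = eta * 1.05 if err < prev_err else eta * 0.7
    prev_err = err

    # Backward pass: gradients of the sum-of-squares error
    delta2 = (Y - T) * Y * (1 - Y)
    delta1 = (delta2 @ W2.T) * H * (1 - H)

    # Gradient-descent update with momentum
    dW2 = -eta * (H.T @ delta2) + alpha * dW2_prev
    dW1 = -eta * (X.T @ delta1) + alpha * dW1_prev
    W2 += dW2
    W1 += dW1
    dW2_prev, dW1_prev = dW2, dW1

print("final sum-of-squares error:", err)

Running this sketch drives the XOR error close to zero in a few thousand epochs; the momentum term smooths the weight updates, while the adaptive rate roughly mimics the training-time reduction the paper targets.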
Keywords :
backpropagation; convergence; feedforward neural nets; gradient methods; multilayer perceptrons; BP algorithm; MFNN; adaptive learning rate; backpropagation training; convergence speed; gradient-descent method; local minima; momentum factor; multilayer feedforward neural networks; slow convergence; training time reduction; Acceleration; Convergence; Error correction; Feedforward neural networks; Feedforward systems; Jacobian matrices; Multi-layer neural network; Neural networks; Supervised learning
Conference_Titel :
Proceedings of the 2001 International Conferences on Info-tech and Info-net (ICII 2001), Beijing
Conference_Location :
Beijing
Print_ISBN :
0-7803-7010-4
DOI :
10.1109/ICII.2001.983111