DocumentCode :
1842493
Title :
A novel neural learning algorithm for multilayer perceptrons
Author :
Luh, Peter B. ; Zhang, Li
Author_Institution :
Dept. of Electr. & Syst. Eng., Connecticut Univ., Storrs, CT, USA
Volume :
3
fYear :
1999
fDate :
1999
Firstpage :
1696
Abstract :
Multilayer perceptron networks have been used to perform a variety of forecasting tasks, and backpropagation is one of the most widely used training methods. It is a gradient method that can get stuck in local minima and suffers from slow convergence. This paper presents a novel learning algorithm based on the multiplier method. Testing results show that the new method has better convergence performance and generalization capability than backpropagation.
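The abstract names the multiplier method but gives no algorithmic detail. The following is a generic sketch of the multiplier (augmented Lagrangian) idea on a toy constrained problem, not the authors' training algorithm: an inner gradient descent minimizes the augmented Lagrangian, and an outer step updates the multiplier by the constraint violation.

```python
# Generic multiplier-method sketch (assumption: NOT the paper's algorithm).
# Toy problem: minimize ||w||^2 subject to w0 + w1 = 1.
# Augmented Lagrangian: ||w||^2 + lam*c(w) + (rho/2)*c(w)^2, c(w) = w0+w1-1.

def augmented_lagrangian(outer=20, inner=200, lr=0.05, rho=10.0):
    w = [0.0, 0.0]   # parameters (stand-in for network weights)
    lam = 0.0        # Lagrange multiplier
    for _ in range(outer):
        for _ in range(inner):
            c = w[0] + w[1] - 1.0                      # constraint violation
            # gradient of the augmented Lagrangian w.r.t. each w_i
            g = [2.0 * wi + lam + rho * c for wi in w]
            w = [wi - lr * gi for wi, gi in zip(w, g)]
        lam += rho * (w[0] + w[1] - 1.0)               # multiplier update
    return w, lam

w, lam = augmented_lagrangian()
# converges toward w = [0.5, 0.5], lam = -1 (the KKT solution)
```

The outer multiplier update is what distinguishes the method from a plain penalty approach: the penalty weight `rho` can stay moderate while `lam` absorbs the constraint force, which is the usual argument for better-conditioned convergence.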
Keywords :
convergence; learning (artificial intelligence); multilayer perceptrons; backpropagation; generalization; gradient method; local minima; multiplier method; neural learning algorithm; training methods; Convergence; Cost function; Gradient methods; Joining processes; Lagrangian functions; Load forecasting; Multilayer perceptrons; Neurons; Systems engineering and theory; Testing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN '99), 1999
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-5529-6
Type :
conf
DOI :
10.1109/IJCNN.1999.832630
Filename :
832630