Title :
A novel neural learning algorithm for multilayer perceptrons
Author :
Luh, Peter B. ; Zhang, Li
Author_Institution :
Dept. of Electr. & Syst. Eng., Connecticut Univ., Storrs, CT, USA
Abstract :
Multilayer perceptron networks have been used for a variety of forecasting tasks, and back-propagation is one of the most widely used training methods. Back-propagation is a gradient method, however, and can become stuck in local minima and converge slowly. This paper presents a novel learning algorithm based on the multiplier method. Testing results show that the new method achieves better convergence performance and generalization capability than back-propagation.
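The abstract names the multiplier method (augmented Lagrangian) as the alternative to plain gradient descent. Below is a minimal, hedged sketch of that general technique on a toy constrained problem (minimize x² + y² subject to x + y = 1); it illustrates the alternating inner-minimization / multiplier-update structure only, not the authors' specific MLP training algorithm, and all names and constants are illustrative assumptions.

```python
# Sketch of the method of multipliers (augmented Lagrangian) on a toy
# problem: minimize x^2 + y^2 subject to x + y = 1.  Illustrative only;
# NOT the paper's exact algorithm for multilayer perceptrons.

def multiplier_method(rho=10.0, lr=0.01, outer=20, inner=200):
    x = y = lam = 0.0
    for _ in range(outer):
        # Inner loop: gradient descent on the augmented Lagrangian
        #   L(x, y; lam) = x^2 + y^2 + lam*h + (rho/2)*h^2,  h = x + y - 1
        for _ in range(inner):
            h = x + y - 1.0                  # constraint violation
            gx = 2.0 * x + lam + rho * h     # dL/dx
            gy = 2.0 * y + lam + rho * h     # dL/dy
            x -= lr * gx
            y -= lr * gy
        # Outer loop: first-order multiplier update
        lam += rho * (x + y - 1.0)
    return x, y, lam

x, y, lam = multiplier_method()
print(round(x, 3), round(y, 3), round(lam, 3))  # x, y approach 0.5
```

The appeal of this scheme, echoed in the abstract's claims, is that the penalty term reshapes the cost surface while the multiplier update steers iterates toward feasibility, which can improve convergence over a pure gradient method.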
Keywords :
convergence; learning (artificial intelligence); multilayer perceptrons; backpropagation; generalization; gradient method; local minima; multiplier method; neural learning algorithm; slow convergence; training methods; Convergence; Cost function; Gradient methods; Joining processes; Lagrangian functions; Load forecasting; Multilayer perceptrons; Neurons; Systems engineering and theory; Testing;
Conference_Title :
International Joint Conference on Neural Networks (IJCNN '99), 1999
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-5529-6
DOI :
10.1109/IJCNN.1999.832630