Title :
Conjugate gradient learning algorithms for multilayer perceptrons
Author :
Goryn, D. ; Kaveh, M.
Author_Institution :
Dept. of Electr. Eng., Minnesota Univ., Minneapolis, MN, USA
Abstract :
Learning complex tasks in a multilayer perceptron is a nonlinear optimization problem that is often very difficult and painstakingly slow. The use of conjugate gradient methods to speed up convergence rates is proposed. These methods require only a moderate increase in storage and computational complexity compared to the commonly used backpropagation algorithm. The algorithm used is a modified conjugate gradient method with inexact line searches, which reduces the number of function evaluations needed in the line-search part of the algorithm. Simulation results showing the improved convergence rate compared to the backpropagation algorithm are presented.
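The following is a minimal sketch, not the authors' implementation, of the general approach the abstract describes: training a small one-hidden-layer perceptron by conjugate gradient with an inexact line search. The specific choices here (Polak-Ribiere update, Armijo backtracking, a 2-3-1 network on an XOR-style toy task, sum-of-squares error) are illustrative assumptions, not taken from the paper.

# Sketch: conjugate gradient training of a small MLP with an inexact
# (Armijo backtracking) line search. NumPy only. All design choices are
# illustrative assumptions, not the authors' algorithm.
import numpy as np

rng = np.random.default_rng(0)

# XOR-style toy task (assumed for illustration)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])

n_in, n_hid, n_out = 2, 3, 1
w = rng.normal(scale=0.5, size=n_in * n_hid + n_hid + n_hid * n_out + n_out)

def unpack(w):
    # Split the flat parameter vector into layer weights and biases
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = w[i:i + n_out]
    return W1, b1, W2, b2

def loss_and_grad(w):
    # Forward pass, sum-of-squares error, and backpropagated gradient
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    y = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    e = y - T
    loss = 0.5 * np.sum(e ** 2)
    dy = e * y * (1.0 - y)
    dW2 = h.T @ dy
    db2 = dy.sum(axis=0)
    dh = (dy @ W2.T) * (1.0 - h ** 2)
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0)
    return loss, np.concatenate([dW1.ravel(), db1, dW2.ravel(), db2])

def armijo(w, d, loss, grad, alpha=1.0, c=1e-4, shrink=0.5):
    # Inexact line search: accept the first step length satisfying the
    # Armijo sufficient-decrease condition, halving alpha otherwise
    slope = grad @ d
    while loss_and_grad(w + alpha * d)[0] > loss + c * alpha * slope:
        alpha *= shrink
        if alpha < 1e-10:
            break
    return alpha

loss, grad = loss_and_grad(w)
d = -grad                               # first direction: steepest descent
for it in range(200):
    alpha = armijo(w, d, loss, grad)
    w = w + alpha * d
    new_loss, new_grad = loss_and_grad(w)
    # Polak-Ribiere beta, restarted (clipped to 0) when negative
    beta = max(0.0, new_grad @ (new_grad - grad) / (grad @ grad))
    d = -new_grad + beta * d
    loss, grad = new_loss, new_grad

print("final sum-of-squares error:", loss)

Because the Armijo condition accepts any step giving sufficient decrease, each iteration typically needs only a few function evaluations rather than an exact one-dimensional minimization, which is the cost saving the abstract attributes to inexact line searches.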
Keywords :
computational complexity; function evaluation; learning systems; neural nets; optimisation; complex tasks; computational complexity; conjugate gradient methods; convergence rates; function evaluations; inexact line searches; learning algorithms; multilayer perceptrons; nonlinear optimization problem; Artificial neural networks; Backpropagation algorithms; Computational complexity; Computational modeling; Convergence; Feedforward systems; Gradient methods; Multilayer perceptrons; Neurons; Speech processing;
Conference_Title :
Proceedings of the 32nd Midwest Symposium on Circuits and Systems, 1989
Conference_Location :
Champaign, IL
DOI :
10.1109/MWSCAS.1989.101960