Title :
Exploring and comparing the best “direct methods” for the efficient training of MLP-networks
Author :
Di Martino, M. ; Fanelli, S. ; Protasi, M.
Author_Institution :
Dipartimento di Matematica, Rome Univ., Italy
Date :
1 November 1996
Abstract :
It is well known that the main difficulties of backpropagation-based algorithms are their susceptibility to local minima and their slow adaptation to the training patterns. In this paper, we present a class of algorithms that overcomes these difficulties by using “direct” numerical methods to compute the weight matrices. In particular, we investigate the performance of the FBFBK-LSB (least-squares backpropagation) algorithms, introduced by Bärmann and Biegler-König (1993), and of the iterative conjugate gradient singular-value decomposition (ICGSVD) algorithm, introduced by the authors. Numerical results on several benchmark problems show that our ICGSVD algorithm achieves greater reliability and/or efficiency.
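The “direct” approach described above amounts to computing a layer's weight matrix as the solution of a linear least-squares problem instead of by gradient descent. Below is a minimal NumPy sketch of an SVD-based least-squares solve in that spirit; the single fixed tanh hidden layer, the data, and the helper name solve_layer_weights are all hypothetical illustrations of the general pseudoinverse idea, not the paper's actual FBFBK-LSB or ICGSVD iterations.

    import numpy as np

    def solve_layer_weights(A, T):
        # Solve min_W ||A @ W - T||_F via the SVD pseudoinverse:
        # W = V S^+ U^T T, the kind of "direct" linear solve
        # referred to in the abstract (illustrative only).
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        tol = max(A.shape) * np.finfo(s.dtype).eps * s[0]  # drop tiny singular values
        s_inv = np.where(s > tol, 1.0 / s, 0.0)
        return Vt.T @ (s_inv[:, None] * (U.T @ T))

    # Hypothetical usage on a one-hidden-layer MLP with tanh units:
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 4))             # inputs
    Y = rng.standard_normal((200, 1))             # targets (linear output layer)
    H = np.tanh(X @ rng.standard_normal((4, 8)))  # fixed random hidden layer
    Hb = np.hstack([H, np.ones((200, 1))])        # append a bias column
    W_out = solve_layer_weights(Hb, Y)            # output weights in one direct solve
    print("residual:", np.linalg.norm(Hb @ W_out - Y))

Truncating singular values below the tolerance keeps the solve stable when the activation matrix is ill-conditioned, which is the usual motivation for preferring an SVD-based solve over forming the normal equations.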
Keywords :
backpropagation; conjugate gradient methods; least squares approximations; multilayer perceptrons; singular-value decomposition; direct numerical methods; feedforward neural networks; iterative conjugate gradient SVD; least-squares backpropagation; local minima; Backpropagation algorithms; Equations; Iterative algorithms; Iterative methods; Matrix decomposition; Neural networks; Neurons; Performance analysis
Journal_Title :
IEEE Transactions on Neural Networks