DocumentCode :
1509947
Title :
Exploring and comparing the best “direct methods” for the efficient training of MLP-networks
Author :
Di Martino, M. ; Fanelli, S. ; Protasi, M.
Author_Institution :
Dipartimento di Matematica, Rome Univ., Italy
Volume :
7
Issue :
6
fYear :
1996
fDate :
11/1/1996
Firstpage :
1497
Lastpage :
1502
Abstract :
It is well known that the main difficulties of algorithms based on backpropagation are susceptibility to local minima and slow adaptation to the patterns during training. In this paper, we present a class of algorithms that overcomes these difficulties by utilizing “direct” numerical methods to compute the weight matrices. In particular, we investigate the performance of the FBFBK least-squares backpropagation (LSB) algorithms, introduced by Bärmann and Biegler-König (1993), and of the iterative conjugate gradient singular-value decomposition (ICGSVD) algorithm, introduced by the authors. Numerical results on several benchmark problems show the superior reliability and/or efficiency of our ICGSVD algorithm.
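Below is a minimal illustrative sketch (Python/NumPy; not the authors' published algorithm) of the core “direct method” that LSB- and ICGSVD-style training build on: posing one layer's weight matrix as the solution of a linear least-squares problem and solving it through an SVD-based pseudoinverse rather than by gradient descent. The function name solve_layer_weights, the rcond truncation threshold, and the toy data are assumptions for illustration only.

import numpy as np

def solve_layer_weights(A, T, rcond=1e-10):
    # Solve min_W ||A W - T||_F directly via the SVD pseudoinverse:
    # W = V diag(1/s) U^T T, truncating small singular values for
    # numerical stability (the role the SVD plays in ICGSVD-style methods).
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.where(s > rcond * s.max(), 1.0 / s, 0.0)
    return Vt.T @ (s_inv[:, None] * (U.T @ T))

# Toy usage: recover a known linear layer from its input/output patterns.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))      # input activations (patterns x units)
T = A @ rng.standard_normal((5, 3))    # desired layer outputs
W = solve_layer_weights(A, T)
print(np.max(np.abs(A @ W - T)))       # ~0: the least-squares fit is exact here

In the full training schemes the paper discusses, a linear solve of this kind is applied layer by layer to targets propagated back through the network; the sketch shows only the single-layer least-squares step.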
Keywords :
backpropagation; conjugate gradient methods; least squares approximations; multilayer perceptrons; singular value decomposition; direct numerical methods; feedforward neural networks; iterative conjugate gradient SVD; least-squares backpropagation; local minima; equations; iterative algorithms; iterative methods; matrix decomposition; neural networks; neurons; performance analysis
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.548177
Filename :
548177