DocumentCode :
850350
Title :
Fast learning algorithms for neural networks
Author :
Karayiannis, Nicolaos B. ; Venetsanopoulos, Anastasios N.
Author_Institution :
Dept. of Electr. Eng., Houston Univ., TX, USA
Volume :
39
Issue :
7
fYear :
1992
fDate :
7/1/1992
Firstpage :
453
Lastpage :
474
Abstract :
A generalized criterion for the training of feedforward neural networks is proposed. Depending on the optimization strategy used, this criterion leads to a variety of fast learning algorithms for single-layered as well as multilayered neural networks. The simplest algorithm devised on the basis of this generalized criterion is the fast delta rule algorithm, proposed for the training of single-layered neural networks. Applying a similar optimization strategy to multilayered neural networks, in conjunction with the proposed generalized criterion, yields the fast backpropagation algorithm. Another set of fast algorithms with better convergence properties is derived on the basis of the same strategy that recently provided a family of Efficient LEarning Algorithms for Neural NEtworks (ELEANNE). Several experiments verify that the fast algorithms developed train neural networks faster than the corresponding learning algorithms in the literature.
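For context, the sketch below shows the classic delta rule for a single-layered sigmoid network, the baseline that the abstract's "fast delta rule" refines. This is standard background only, not the paper's generalized criterion or its fast algorithms; the function name, learning rate, and sigmoid activation are illustrative assumptions.

```python
# Minimal sketch of the standard delta rule for a single-layer sigmoid network.
# Illustrative background only; the paper's fast delta rule is derived from a
# generalized criterion that is not reproduced here.
import numpy as np

def delta_rule_train(X, T, lr=0.1, epochs=100, seed=0):
    """Train a single-layer sigmoid network with the standard delta rule.

    X : (n_samples, n_inputs) input patterns
    T : (n_samples, n_outputs) target patterns in [0, 1]
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.1, size=(X.shape[1], T.shape[1]))  # weight matrix
    b = np.zeros(T.shape[1])                                  # bias vector
    for _ in range(epochs):
        for x, t in zip(X, T):
            y = 1.0 / (1.0 + np.exp(-(x @ W + b)))   # sigmoid output
            err = t - y                              # output error
            delta = err * y * (1.0 - y)              # delta term (error * sigmoid derivative)
            W += lr * np.outer(x, delta)             # gradient-descent weight update
            b += lr * delta                          # bias update
    return W, b
```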
Keywords :
feedforward neural nets; learning (artificial intelligence); ELEANNE; backpropagation algorithm; delta rule algorithm; feedforward neural networks; learning algorithms; multilayered neural networks; optimization strategy; single-layered neural networks; Computer networks; Control systems; Convergence; Feedforward neural networks; Feedforward systems; Image recognition; Multi-layer neural network; Neural networks; Signal processing algorithms; Speech;
fLanguage :
English
Journal_Title :
IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing
Publisher :
ieee
ISSN :
1057-7130
Type :
jour
DOI :
10.1109/82.160170
Filename :
160170