DocumentCode :
384262
Title :
Step acceleration based training algorithm for feedforward neural networks
Author :
Li, Yanlai ; Wang, Kuanquan ; Zhang, David
Author_Institution :
Dept. of Comput. Sci. & Eng., Harbin Inst. of Technol., China
Volume :
2
fYear :
2002
fDate :
2002
Firstpage :
84
Abstract :
This paper presents a very fast step acceleration based training algorithm (SATA) for training multilayer feedforward neural networks. The most notable virtue of this algorithm is that it does not need to calculate the gradient of the target function; in each iteration, computation is concentrated only on the part of the network that has changed. The proposed algorithm is simple, flexible, and feasible, and converges quickly. Simulations comparing it with other methods, including conventional backpropagation (BP), conjugate gradient, and weight-extrapolation-based BP, confirm its superiority in convergence speed and required computation time.
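The abstract describes SATA only at a high level, so the following Python sketch merely illustrates the general idea of a gradient-free, step-acceleration style weight search on a toy feedforward network. The network size, per-weight step sizes, and the accept/expand/shrink rule are illustrative assumptions, not the authors' exact procedure, and the full forward pass is recomputed here for simplicity rather than restricting computation to the varied part as the paper describes.

```python
# Hedged sketch of a gradient-free, step-acceleration style weight search.
# All constants (step factors 1.5 / 0.5, network shape, tolerance) are
# assumptions for illustration, not the SATA algorithm as published.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn XOR with a 2-2-1 network.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def forward(w, X):
    # Unpack a flat weight vector into layer weights and biases.
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8].reshape(2, 1); b2 = w[8:9]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def sse(w):
    # Sum-of-squared-errors target function; no gradient is ever computed.
    return float(np.sum((forward(w, X) - y) ** 2))

w = rng.normal(scale=0.5, size=9)   # flattened weights and biases
step = np.full(9, 0.2)              # per-weight trial step
best = sse(w)

for it in range(2000):
    for i in range(w.size):         # perturb one weight at a time
        for sign in (+1.0, -1.0):
            trial = w.copy()
            trial[i] += sign * step[i]
            e = sse(trial)
            if e < best:            # success: accept and accelerate the step
                w, best = trial, e
                step[i] *= 1.5
                break
        else:
            step[i] *= 0.5          # both directions failed: shrink the step
    if best < 1e-3:
        break

print(f"iterations={it + 1}, SSE={best:.5f}")
```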
Keywords :
computational complexity; convergence; feedforward neural nets; learning (artificial intelligence); SATA; backpropagation; multilayer feedforward neural network; neural network training; step acceleration based training algorithm; Acceleration; Artificial neural networks; Biometrics; Computer networks; Computer science; Extrapolation; Feedforward neural networks; Multi-layer neural network; Neural networks;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 16th International Conference on Pattern Recognition, 2002
ISSN :
1051-4651
Print_ISBN :
0-7695-1695-X
Type :
conf
DOI :
10.1109/ICPR.2002.1048243
Filename :
1048243