DocumentCode :
1527439
Title :
A new error function at hidden layers for fast training of multilayer perceptrons
Author :
Oh, Sang-Hoon ; Lee, Soo-Young
Author_Institution :
Dept. of Electr. Eng., Korea Adv. Inst. of Sci. & Technol., Taejon, South Korea
Volume :
10
Issue :
4
fYear :
1999
fDate :
7/1/1999
Firstpage :
960
Lastpage :
964
Abstract :
This paper proposes a new error function at hidden layers to speed up the training of multilayer perceptrons (MLPs). With this new hidden error function, the layer-by-layer (LBL) algorithm approximately converges to the error backpropagation algorithm with optimum learning rates. In particular, the optimum learning rate for a hidden weight vector appears approximately as the product of two optimum factors, one for minimizing the new hidden error function and the other for assigning hidden targets. The effectiveness of the proposed error function was demonstrated on handwritten digit recognition and isolated-word recognition tasks. Very fast learning convergence was obtained for MLPs without the stalling problem experienced in conventional LBL algorithms.
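The abstract refers to layer-by-layer training of an MLP in which each layer's weights are updated with their own learning rate. The sketch below is only a rough illustration of that general setting: it trains a small MLP with ordinary backpropagation deltas and separate, hand-chosen learning rates for the hidden and output layers. It does not reproduce the paper's hidden error function or its optimum-learning-rate derivation; the toy XOR task and all names (W1, W2, eta_hidden, eta_output) are illustrative assumptions.

# Minimal sketch: MLP trained with backpropagation deltas and
# per-layer learning rates (illustrative only; not the paper's method).
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2-bit XOR, a standard sanity check for MLP training.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# One hidden layer of 4 sigmoid units, one sigmoid output unit.
W1 = rng.normal(scale=0.5, size=(2, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))

eta_hidden, eta_output = 0.5, 0.5   # separate learning rate per layer

for epoch in range(5000):
    # Forward pass.
    H = sigmoid(X @ W1)          # hidden activations
    Y = sigmoid(H @ W2)          # network outputs

    # Output-layer delta for a mean-squared-error criterion.
    delta_out = (Y - T) * Y * (1.0 - Y)

    # Hidden-layer delta: the error signal at the hidden layer that a
    # hidden error function is intended to capture more directly.
    delta_hid = (delta_out @ W2.T) * H * (1.0 - H)

    # Layer-wise weight updates, each with its own learning rate.
    W2 -= eta_output * H.T @ delta_out
    W1 -= eta_hidden * X.T @ delta_hid

print("final outputs:", sigmoid(sigmoid(X @ W1) @ W2).ravel())

In the paper's approach, the per-layer learning rates are not fixed constants as above but are derived (approximately) as optimum factors; the sketch only shows where such layer-specific rates enter the update equations.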
Keywords :
backpropagation; convergence; error analysis; handwritten character recognition; multilayer perceptrons; optimisation; speech recognition; convergence; error backpropagation; error function; handwritten digit recognition; isolated-word recognition; layer-by-layer algorithm; multilayer perceptrons; optimum learning rates; fast training; speech recognition; Adaptive signal processing; Bit error rate; Convolution; Delay estimation; Digital communication; Equalizers; Maximum likelihood estimation; Multilayer perceptrons; Radial basis function networks; Time-varying channels;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.774272
Filename :
774272