Title :
Solving flat-spot problem in back-propagation learning algorithm based on magnified error
Author :
Yang, Bo ; Wang, Ya-dong ; Su, Xiao-Hong ; Wang, Li-juan
Author_Institution :
Sch. of Comput. Sci. & Technol., Harbin Inst. of Technol., China
Abstract :
A new learning algorithm based on a magnified error is proposed to speed up the training of back-propagation neural networks and to improve their performance. The key to this algorithm lies in modifying the error term of the output layer, which magnifies the backward-propagated error signal, especially when the weight adjustment of the output layer is slow or even suppressed. The algorithm thereby escapes the influence of the "flat spot" problem and alleviates slow convergence. Consequently, the convergence rate is accelerated, and the network can meet the convergence criteria quickly with a simple structure. Experiments on the parity-3 problem and a soybean data classification problem show that this method offers faster learning and lower computational cost than improved algorithms such as the sigmoid-prime offset (SPO) technique and the scaled linear approximation of sigmoid (SLA) method.
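The abstract does not reproduce the paper's exact magnification function, so the sketch below only illustrates the flat-spot effect on the standard output-layer delta, the sigmoid-prime offset (SPO) baseline mentioned above, and a hypothetical magnified-error variant (the `gain` parameter and the magnification formula are illustrative assumptions, not the authors' method):

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def delta_standard(target, output):
        # Standard BP output-layer error term: (t - o) * f'(net).
        # When the unit saturates, f'(net) = o*(1 - o) ~ 0, so the delta
        # vanishes even if the error (t - o) is large -- the "flat spot".
        return (target - output) * output * (1.0 - output)

    def delta_spo(target, output, offset=0.1):
        # Sigmoid-prime offset (SPO): add a small constant to the derivative
        # so the delta never vanishes completely at saturated outputs.
        return (target - output) * (output * (1.0 - output) + offset)

    def delta_magnified(target, output, gain=0.5):
        # Hypothetical stand-in for a magnified error term: amplify the
        # backward-propagated error when the derivative is small but the
        # error itself is large. (The paper's actual formula is not shown here.)
        deriv = output * (1.0 - output)
        return (target - output) * (deriv + gain * abs(target - output))

    if __name__ == "__main__":
        t = 1.0
        o = sigmoid(-6.0)                    # unit saturated at the wrong extreme
        print(delta_standard(t, o))          # tiny delta despite a large error
        print(delta_spo(t, o))               # offset keeps the weight update alive
        print(delta_magnified(t, o))         # illustrative magnified-error update

All three deltas agree in sign; they differ only in how strongly a saturated output unit continues to drive weight updates, which is the behavior the proposed algorithm targets.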
Keywords :
approximation theory; backpropagation; convergence; error statistics; neural nets; pattern classification; backpropagation learning algorithm; backward propagated error signal; computational cost; convergence rate problem; flat spot problem; neural network structure; neural network training; parity-3 problem; scaled linear approximation; sigmoid prime offset technique; soybean data classification problem; Acceleration; Computational efficiency; Computer errors; Computer science; Convergence; Linear approximation; Machine learning; Machine learning algorithms; Multi-layer neural network; Neural networks;
Conference_Title :
Proceedings of the 2004 International Conference on Machine Learning and Cybernetics
Print_ISBN :
0-7803-8403-2
DOI :
10.1109/ICMLC.2004.1382065