Title :
A stochastic backpropagation algorithm for training neural networks
Author :
Chen, Y.Q. ; Yin, T. ; Babri, H.A.
Author_Institution :
Sch. of Electr. & Electron. Eng., Nanyang Technol. Inst., Singapore
Abstract :
The popularly used backpropagation (BP) algorithm for training multilayered neural networks is generally slow and prone to getting stuck in local minima. A novel method that improves the performance of BP by randomising the cost function is proposed. The method is effective in helping the BP algorithm escape from local minima, thereby improving convergence and generalisation. This is demonstrated on a non-convex pattern recognition problem.
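The abstract does not specify the form of the cost-function randomisation, so the following is only a minimal illustrative sketch: full-batch BP on a 2-layer sigmoid network where, as a hypothetical stand-in for the paper's scheme, the squared-error terms receive fresh random positive weights each epoch, perturbing the cost surface so the gradient direction varies between epochs. XOR is used here as a small non-convex toy problem; it is not necessarily the benchmark used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def init_weights(d, hidden):
    # Small random weights for a d -> hidden -> 1 network
    return rng.normal(0, 0.5, (d, hidden)), rng.normal(0, 0.5, (hidden, 1))

def mse(X, y, W1, W2):
    out = sigmoid(sigmoid(X @ W1) @ W2)
    return float(np.mean((out - y) ** 2))

def train_stochastic_bp(X, y, W1, W2, epochs=5000, lr=0.5):
    n = X.shape[0]
    for _ in range(epochs):
        # Hypothetical randomisation: random per-epoch weights c_i on each
        # error term, i.e. the epoch's cost is 0.5 * sum_i c_i * err_i^2.
        # The paper's exact randomisation scheme may differ.
        c = rng.uniform(0.5, 1.5, (n, 1))
        h = sigmoid(X @ W1)
        out = sigmoid(h @ W2)
        err = out - y
        # Backpropagate the gradient of the randomly weighted cost
        d_out = c * err * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * (h.T @ d_out)
        W1 -= lr * (X.T @ d_h)
    return W1, W2

# XOR as a toy non-convex problem
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, W2 = init_weights(2, 4)
before = mse(X, y, W1, W2)
W1, W2 = train_stochastic_bp(X, y, W1, W2)
after = mse(X, y, W1, W2)
print(f"MSE before: {before:.4f}, after: {after:.4f}")
```

Because the weighting is redrawn each epoch, no single local minimum of one epoch's cost is a stationary point of every epoch's cost, which is the intuition behind using randomisation to escape local minima.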
Keywords :
backpropagation; convergence of numerical methods; feedforward neural nets; pattern recognition; random processes; stochastic processes; convergence; experimental results; local minima; multilayered feedforward neural networks; neural network training; nonconvex pattern recognition; performance; randomised cost function; stochastic backpropagation algorithm; Backpropagation algorithms; Convergence; Cost function; Entropy; Feedforward neural networks; Jacobian matrices; Multi-layer neural network; Neural networks; Pattern recognition; Stochastic processes;
Conference_Titel :
Proceedings of the 1997 International Conference on Information, Communications and Signal Processing (ICICS)
Print_ISBN :
0-7803-3676-3
DOI :
10.1109/ICICS.1997.652068