Title :
Deterministic weight modification algorithm for efficient learning
Author :
Ng, S.C. ; Cheung, C.C. ; Leung, S.H.
Author_Institution :
Sch. of Sci. & Tech., Open Univ. of HK, Hong Kong, China
Abstract :
This paper presents a new approach, deterministic weight modification (DWM), that effectively speeds up convergence and improves the global convergence capability of the standard and modified back-propagation (BP) algorithms. The main idea of DWM is to reduce the system error by changing the weights of a multi-layered feed-forward neural network in a deterministic way. Simulation results show that DWM outperforms standard BP and other modified BP algorithms on a number of learning problems.
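Illustrative_Sketch :
The abstract does not spell out the DWM update rule, so the following Python sketch is only illustrative: it pairs standard BP on the XOR benchmark with a hypothetical deterministic step that perturbs each weight by a fixed amount and keeps the change only when it reduces the total system error. The network topology, the constants (learning rate, perturbation size, trial schedule), and all names are assumptions for illustration, not the authors' method.

    # Sketch: standard BP plus a hypothetical DWM-style deterministic step.
    # The exact DWM rule is not given in the abstract; here the "deterministic
    # modification" is a fixed-size trial perturbation of each weight, kept
    # only if it lowers the total system error. All constants are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)

    # XOR: a standard benchmark learning problem.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # A 2-2-1 feed-forward network (assumed topology).
    W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.zeros(2)
    W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.zeros(1)

    def forward():
        H = sigmoid(X @ W1 + b1)   # hidden-layer activations
        Y = sigmoid(H @ W2 + b2)   # network outputs
        return H, Y

    def system_error():
        _, Y = forward()
        return 0.5 * np.sum((T - Y) ** 2)  # sum-of-squares system error

    eta = 0.5  # BP learning rate (illustrative)
    for epoch in range(20000):
        H, Y = forward()
        # Standard BP (gradient-descent) weight update.
        dY = (Y - T) * Y * (1 - Y)
        dH = (dY @ W2.T) * H * (1 - H)
        W2 -= eta * H.T @ dY; b2 -= eta * dY.sum(axis=0)
        W1 -= eta * X.T @ dH; b1 -= eta * dH.sum(axis=0)

        # Hypothetical DWM-style step: every 500 epochs, try a fixed
        # deterministic perturbation of each weight and keep it only if
        # it reduces the system error; otherwise revert.
        if epoch % 500 == 0:
            base = system_error()
            for M in (W1, W2):
                for idx in np.ndindex(*M.shape):
                    for delta in (0.1, -0.1):  # fixed trial steps
                        M[idx] += delta
                        if system_error() < base:
                            base = system_error()
                            break              # keep the improving change
                        M[idx] -= delta        # revert a non-improving change

    print("final system error:", system_error())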
Keywords :
backpropagation; convergence; feedforward neural nets; gradient methods; multilayer perceptrons; convergence rate; deterministic weight modification algorithm; global convergence capability; modified backpropagation algorithms; multilayered feedforward neural network; system error reduction; Computational modeling; Convergence; Feedforward neural networks; Feedforward systems; Genetic algorithms; Multi-layer neural network; Neural networks; Neurons; Optimization methods; Simulated annealing
Conference_Title :
2004 IEEE International Joint Conference on Neural Networks (IJCNN 2004), Proceedings
Print_ISBN :
0-7803-8359-1
DOI :
10.1109/IJCNN.2004.1380076