DocumentCode :
2341475
Title :
The research on the relation of self-learning ratio and the convergence speed in BP networks
Author :
Zhaoying Zhou
fYear :
1994
fDate :
10-12 May 1994
Firstpage :
131
Abstract :
This paper examines the relation between the self-learning ratio (learning rate) and the convergence speed of BP networks. In theory, true gradient descent is obtained only as the self-learning ratio μ→0, in which case the computation converges to some local minimum. However, a μ that is too small slows convergence, while a μ that is too large may cause divergence. On the basis of mathematical analysis and computer simulations, the relation is given as follows: n = ln[ε/|W(0)-W*|] / ln(1-μa), where n is the number of iterations, μ is the self-learning ratio, W(0) is the initial weight, W* is the optimal weight, ε is the precision requirement, and a is the slope of the gradient-imitating straight line. A method for determining a better self-learning ratio is also proposed
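The iteration-count formula in the abstract can be illustrated with a minimal sketch, assuming a hypothetical one-dimensional quadratic error surface E(w) = (a/2)(w - W*)², for which gradient descent with rate μ shrinks the error |w - W*| by a factor (1-μa) per step; the variable names and toy surface below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative sketch (assumption): 1-D quadratic error surface E(w) = (a/2)*(w - w_star)**2.
# Gradient descent with rate mu gives w(n) - w_star = (1 - mu*a)**n * (w(0) - w_star),
# so the abstract's formula n = ln[eps/|W(0)-W*|] / ln(1 - mu*a) predicts the iterations
# needed to reach precision eps.

a = 2.0               # slope of the gradient-imitating straight line (local curvature)
w0, w_star = 5.0, 1.0 # initial weight and optimal weight
eps = 1e-3            # precision requirement

def predicted_iterations(mu):
    """Iteration count from n = ln(eps/|W(0)-W*|) / ln(1 - mu*a)."""
    return np.log(eps / abs(w0 - w_star)) / np.log(1.0 - mu * a)

def simulated_iterations(mu, max_iter=100_000):
    """Run plain gradient descent on E(w) and count steps until |w - W*| < eps."""
    w = w0
    for n in range(1, max_iter + 1):
        w -= mu * a * (w - w_star)   # gradient of E(w) is a*(w - w_star)
        if abs(w - w_star) < eps:
            return n
    return None                       # too slow or divergent (e.g. mu >= 2/a)

for mu in (0.05, 0.2, 0.45):
    print(f"mu={mu}: predicted ~{predicted_iterations(mu):.1f}, simulated {simulated_iterations(mu)}")
```

As the sketch shows, increasing μ toward 1/a speeds convergence, consistent with the abstract's point that too small a μ slows convergence while too large a μ risks divergence.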
Keywords :
backpropagation; convergence of numerical methods; digital simulation; iterative methods; neural nets; numerical analysis; backpropagation networks; computer simulation; convergence speed; gradient imitative straight line; mathematical analysis; real gradient descent; self-learning ratio; Artificial neural networks; Computer networks; Computer simulation; Convergence; Feedforward neural networks; Intelligent networks; Mathematical analysis; Motion control; Neural networks; Neurons;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Instrumentation and Measurement Technology Conference, 1994. IMTC/94. Conference Proceedings. 10th Anniversary. Advanced Technologies in I & M., 1994 IEEE
Conference_Location :
Hamamatsu
Print_ISBN :
0-7803-1880-3
Type :
conf
DOI :
10.1109/IMTC.1994.352107
Filename :
352107