DocumentCode :
3180457
Title :
Interactive gradient algorithm for radial basis function networks
Author :
Jianyu Li ; Siwei Luo ; Yingjian Qi ; Yaping Huang
Author_Institution :
Comput. Sci. Dept., Northern Jiaotong Univ., Beijing, China
Volume :
2
fYear :
2002
fDate :
26-30 Aug. 2002
Firstpage :
1187
Abstract :
In this paper the radial basis function neural network is divided into two parts: (1) the input and hidden layers, and (2) the output layer. The parameters of the two parts are trained through an interactive gradient learning algorithm. Experimental results on function approximation show that the algorithm not only avoids the slow convergence of the conventional gradient algorithm, but also reduces the nonlinear degree of the radial basis function network.
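To illustrate the idea described in the abstract, the following is a minimal sketch (not the paper's exact algorithm) of alternating gradient updates for an RBF network, assuming Gaussian basis functions and a squared-error loss. "Interactive" is read here as alternating one gradient step on the output-layer weights with one on the hidden-layer centers and widths; the paper's precise update schedule, basis function, and step sizes are not given in this record, so all of those choices below are assumptions.

import numpy as np

def gaussian_activations(X, centers, widths):
    # phi[i, j] = exp(-||x_i - c_j||^2 / (2 * s_j^2)); also return squared distances.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths ** 2)), d2

def train_interactive(X, y, n_hidden=10, lr=0.05, epochs=500, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_hidden, replace=False)].copy()
    widths = np.full(n_hidden, X.std() + 1e-8)
    w = rng.normal(scale=0.1, size=n_hidden)
    n = len(X)

    for _ in range(epochs):
        # Part (2): output layer -- one gradient step on the linear weights w.
        phi, _ = gaussian_activations(X, centers, widths)
        err = phi @ w - y
        w -= lr * phi.T @ err / n

        # Part (1): input/hidden layer -- one gradient step on centers and widths.
        phi, d2 = gaussian_activations(X, centers, widths)
        err = phi @ w - y
        g = err[:, None] * phi * w[None, :] / n          # dL/dphi, shape (n, H)
        diff = X[:, None, :] - centers[None, :, :]       # shape (n, H, D)
        centers -= lr * (g[:, :, None] * diff).sum(0) / widths[:, None] ** 2
        widths -= lr * (g * d2).sum(0) / widths ** 3

    return centers, widths, w

# Usage (hypothetical): approximate y = sin(x) on [0, 2*pi].
X = np.linspace(0, 2 * np.pi, 200)[:, None]
y = np.sin(X[:, 0])
c, s, w = train_interactive(X, y)
phi, _ = gaussian_activations(X, c, s)
print("training MSE:", np.mean((phi @ w - y) ** 2))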
Keywords :
function approximation; gradient methods; learning (artificial intelligence); radial basis function networks; function approximation; hidden layer; input layer; interactive gradient algorithm; neural network; nonlinear degree; output layer; parameter training; radial basis function networks; Approximation algorithms; Broadcasting; Equations; Function approximation; Iterative algorithms; Neural networks; Neurons; Pattern recognition; Radial basis function networks;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Signal Processing, 2002 6th International Conference on
Print_ISBN :
0-7803-7488-6
Type :
conf
DOI :
10.1109/ICOSP.2002.1180002
Filename :
1180002