DocumentCode :
2622740
Title :
On L2 convergence rates of radial basis function networks and kernel regression estimators
Author :
Krzyzak, Adam ; Xu, Lei ; Niemann, Heinrich
Author_Institution :
Dept. of Comput. Sci., Concordia Univ., Montreal, Que., Canada
fYear :
1994
fDate :
27 Jun-1 Jul 1994
Firstpage :
37
Abstract :
The paper generalises the L2 convergence rates for radial basis function (RBF) networks based on kernel regression estimates (KRE), obtained by optimising the empirical error with respect to the weight vector and the receptive field size. The centres of the radial functions are placed at points sampled with replacement from the learning sequence. Convergence for bounded outputs and the corresponding rate of convergence for the RBF net have been obtained for radial functions with noncompact support. New results are given for the L2 convergence rates of KRE and RBF nets in the case of unbounded outputs.
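The construction described in the abstract can be sketched in a minimal way: centres drawn with replacement from the learning sequence, a normalized Gaussian radial function (the kernel-regression form), and the empirical L2 error optimised over the weight vector for each candidate receptive field size. All names, the toy data, and the grid of sizes are illustrative assumptions, not the paper's actual experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy learning sequence: noisy samples of an unknown regression function
# (assumed data, for illustration only).
X = rng.uniform(-3, 3, size=200)
Y = np.sin(X) + 0.1 * rng.standard_normal(200)

def rbf_net_predict(x, centers, weights, h):
    """RBF net with Gaussian radial functions of receptive field size h,
    normalized as in a kernel regression estimator."""
    K = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * h**2))
    denom = np.maximum(K.sum(axis=1), 1e-12)  # guard against empty neighbourhoods
    return (K @ weights) / denom

# Centres sampled with replacement from the learning sequence.
n_centers = 30
centers = X[rng.integers(0, len(X), size=n_centers)]

# Optimise the empirical L2 error over the weight vector for each
# receptive field size h on a small grid (a crude stand-in for joint
# optimisation over weights and h).
best_err, best_w, best_h = np.inf, None, None
for h in (0.1, 0.3, 0.5, 1.0):
    K = np.exp(-((X[:, None] - centers[None, :]) ** 2) / (2 * h**2))
    P = K / np.maximum(K.sum(axis=1, keepdims=True), 1e-12)  # normalized design
    w, *_ = np.linalg.lstsq(P, Y, rcond=None)  # least-squares weight vector
    err = np.mean((P @ w - Y) ** 2)            # empirical L2 error
    if err < best_err:
        best_err, best_w, best_h = err, w, h
```

As the sample size grows, the empirical L2 error of such an estimator approaches the noise level; the paper's contribution is quantifying that rate, including for unbounded outputs.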
Keywords :
convergence; feedforward neural nets; multilayer perceptrons; recursive estimation; L2 convergence rates; RBF nets; bounded output convergence; error optimisation; kernel regression estimators; learning sequence; radial basis function networks; receptive field size; unbounded outputs; weight vector; Approximation error; Computer science; Convergence; Kernel; Radial basis function networks; Upper bound;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the 1994 IEEE International Symposium on Information Theory
Conference_Location :
Trondheim
Print_ISBN :
0-7803-2015-8
Type :
conf
DOI :
10.1109/ISIT.1994.394934
Filename :
394934