Title :
Some results on L1 convergence rate of RBF networks and kernel regression estimators
Author :
Krzyzak, Adam ; Xu, Lei
Author_Institution :
Dept. of Comput. Sci., Concordia Univ., Montreal, Que., Canada
Date :
27 Jun-2 Jul 1994
Abstract :
Rather than studying the L2 convergence rates of kernel regression estimators (KRE) and radial basis function (RBF) nets given in Xu-Krzyzak-Yuille (1992 & 1993), we study convergence properties of the mean integrated absolute error (MIAE) for KRE and RBF nets. It is shown that the MIAE of KRE and RBF nets converges to zero as the network size and the size of the training sequence tend to ∞, and that the convergence rate of the MIAE is bounded above by O(n^(-αs/((2+s)(2α+d)))) when approximating Lipschitz functions.
Keywords :
convergence of numerical methods; estimation theory; feedforward neural nets; learning (artificial intelligence); statistical analysis; L1 convergence rate; approximating Lipschitz functions; kernel regression estimators; mean integrated absolute error; radial basis function nets; training sequence; upper bound; Convergence; Estimation error; Kernel; Neural networks; Probability; Radial basis function networks; Symmetric matrices; Upper bound;
Conference_Titel :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
DOI :
10.1109/ICNN.1994.374356