DocumentCode :
3170322
Title :
On L1 convergence rate of RBF networks and kernel regression estimators with applications in classification
Author :
Krzyzak, A. ; Klasa, S. ; Xu, L.
Author_Institution :
Dept. of Comput. Sci., Concordia Univ., Montreal, Que., Canada
Volume :
2
fYear :
1994
fDate :
9-13 Oct 1994
Firstpage :
364
Abstract :
Studies the convergence properties of the mean integrated absolute error (MIAE) for kernel regression estimators (KRE) and radial basis function (RBF) nets. The authors show that the MIAE of KRE and RBF nets converges to zero as the network size and the length of the training sequence tend to infinity, and they give an upper bound on the convergence rate for approximating functions satisfying a Lipschitz condition of order α, 0 < α ⩽ 1. The results are then applied to nonparametric classification.
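A minimal Python sketch of the setting described in the abstract, assuming the Nadaraya-Watson form of a kernel regression estimator with a Gaussian kernel; the target function |x - 0.5| (Lipschitz of order α = 1), the bandwidth h = n^(-1/3), and the Monte Carlo grid are illustrative assumptions, not the authors' construction or bound.

import numpy as np

rng = np.random.default_rng(0)

def m_true(x):
    return np.abs(x - 0.5)            # Lipschitz target, alpha = 1 (illustrative)

def kernel(u):
    return np.exp(-0.5 * u**2)        # Gaussian kernel (illustrative choice)

def nw_estimate(x_query, x_train, y_train, h):
    # Nadaraya-Watson: weighted average of responses with weights K((x - X_i)/h)
    w = kernel((x_query[:, None] - x_train[None, :]) / h)
    return (w @ y_train) / np.maximum(w.sum(axis=1), 1e-12)

n = 2000                              # training-sequence size
x_train = rng.uniform(0.0, 1.0, n)
y_train = m_true(x_train) + 0.1 * rng.standard_normal(n)

x_grid = np.linspace(0.0, 1.0, 1000)  # grid for numerical integration
m_hat = nw_estimate(x_grid, x_train, y_train, h=n ** (-1.0 / 3.0))

# Empirical integrated absolute error on [0, 1]; averaging this quantity over
# repeated training samples would approximate the MIAE studied in the paper.
iae = np.trapz(np.abs(m_hat - m_true(x_grid)), x_grid)
print(f"empirical integrated absolute error: {iae:.4f}")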
Keywords :
feedforward neural nets; L1 convergence rate; Lipschitz condition; convergence properties; kernel regression estimators; mean integrated absolute error; nonparametric classification; radial basis function nets; training sequence; Application software; Approximation error; Computer errors; Computer science; Convergence; H infinity control; Intelligent networks; Kernel; Radial basis function networks; Upper bound;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the 12th IAPR International Conference on Pattern Recognition, 1994, Vol. 2 - Conference B: Computer Vision & Image Processing
Conference_Location :
Jerusalem
Print_ISBN :
0-8186-6270-0
Type :
conf
DOI :
10.1109/ICPR.1994.576937
Filename :
576937