DocumentCode :
3158924
Title :
Nonparametric classification using radial basis function nets and empirical risk minimization
Author :
Krzyzak, A. ; Linder, T. ; Lugosi, G.
Author_Institution :
Dept. of Comput. Sci., Concordia Univ., Montreal, Que., Canada
Volume :
2
fYear :
1994
fDate :
9-13 Oct 1994
Firstpage :
72
Abstract :
Convergence properties of radial basis function (RBF) networks are studied for a large class of basis functions. The universal approximation property of these nets is established, and the network parameters are learned through empirical risk minimization. The optimal nets are shown to be consistent in nonparametric classification. The tools used in the analysis include the Vapnik-Chervonenkis (VC) dimension and covering numbers.
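The abstract describes RBF networks whose parameters (centers, widths, weights) are all chosen by minimizing the empirical risk over a labelled sample. A minimal illustrative sketch of that idea, not the authors' exact construction: a Gaussian-basis RBF net is evaluated, its empirical L2 risk computed, and a crude random search stands in for exact empirical risk minimization (all data, the number of basis functions `k`, and the search procedure are hypothetical choices for illustration).

```python
import numpy as np

def rbf_net(x, centers, widths, weights):
    """Evaluate f(x) = sum_k w_k * exp(-||x - c_k||^2 / (2 b_k^2)) at points x (n, d)."""
    d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(-1)  # squared distances to centers
    phi = np.exp(-d2 / (2.0 * widths ** 2))                    # Gaussian basis functions
    return phi @ weights

def empirical_risk(params, x, y, k, d):
    """Empirical L2 risk of the net encoded by a flat parameter vector."""
    centers = params[: k * d].reshape(k, d)
    widths = np.abs(params[k * d : k * d + k]) + 1e-6          # keep widths positive
    weights = params[k * d + k :]
    return np.mean((rbf_net(x, centers, widths, weights) - y) ** 2)

# Toy two-class data: labels in {-1, +1}; classify by the sign of the net output.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 2))
y = np.where(x[:, 0] * x[:, 1] > 0, 1.0, -1.0)

k, d = 8, 2
best_params, best_risk = None, np.inf
# Random search as a stand-in for the exact minimization analyzed in the paper.
for _ in range(300):
    p = rng.normal(size=k * d + k + k)
    r = empirical_risk(p, x, y, k, d)
    if r < best_risk:
        best_params, best_risk = p, r

print(best_risk)  # empirical risk of the best net found
```

The consistency result in the paper concerns the net minimizing the empirical risk exactly, with the network size growing suitably with the sample size; the random search above is only a stand-in to make the objective concrete.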
Keywords :
feedforward neural nets; Vapnik-Chervonenkis dimension; convergence; covering numbers; empirical risk minimization; nonparametric classification; radial basis function nets; risk minimization; universal approximation; Computer science; Convergence; Estimation error; Kernel; Mathematics; Neural networks; Radial basis function networks; Random variables; Risk management;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 12th IAPR International Conference on Pattern Recognition, 1994, Vol. 2 - Conference B: Computer Vision & Image Processing
Conference_Location :
Jerusalem
Print_ISBN :
0-8186-6270-0
Type :
conf
DOI :
10.1109/ICPR.1994.576878
Filename :
576878