Title :
Radial basis function networks and nonparametric classification: complexity regularization and rates of convergence
Author :
Krzyzak, Adam ; Linder, Tamas
Author_Institution :
Dept. of Comput. Sci., Concordia Univ., Montreal, Que., Canada
Abstract :
The method of complexity regularization is applied to one-hidden-layer radial basis function networks to derive regression estimation bounds and convergence rates for classification. Bounds on the expected risk in terms of the training sample size are obtained for a large class of activation functions, namely functions of bounded variation. Rates of convergence to the optimal loss are also derived.
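To make the model class concrete, the following is a minimal sketch (not from the paper) of a one-hidden-layer radial basis function network, f(x) = Σ_k w_k φ(‖x − c_k‖ / s_k), using a Gaussian kernel as one example of a bounded-variation activation; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def rbf_network(x, centers, widths, weights):
    """Hypothetical one-hidden-layer RBF network (illustrative sketch).

    x: (d,) input; centers: (k, d); widths: (k,); weights: (k,)
    Returns the scalar output sum_k w_k * exp(-(||x - c_k|| / s_k)^2).
    """
    dists = np.linalg.norm(centers - x, axis=1)   # ||x - c_k|| for each center
    activations = np.exp(-(dists / widths) ** 2)  # Gaussian radial kernel
    return float(weights @ activations)           # weighted sum over hidden units
```

In the paper's setting, the centers, widths, and output weights are the parameters fit to training data, with complexity regularization penalizing richer network classes.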
Keywords :
convergence; feedforward neural nets; multilayer perceptrons; optimisation; pattern classification; statistical analysis; activation functions; bounded variation; complexity regularization; convergence rates; nonparametric classification; one-hidden-layer radial basis function networks; regression estimation bounds; training sample size; Artificial neural networks; Computer science; Convergence; Entropy; Estimation error; Probability distribution; Radial basis function networks; Random variables; Training data; Yield estimation;
Conference_Titel :
Proceedings of the 13th International Conference on Pattern Recognition (ICPR 1996)
Conference_Location :
Vienna
Print_ISBN :
0-8186-7282-X
DOI :
10.1109/ICPR.1996.547645