Title :
Radial basis function networks and complexity regularization in function learning
Author :
Krzyzak, Adam; Linder, Tamás
Author_Institution :
Dept. of Comput. Sci., Concordia Univ., Montreal, Que., Canada
Date :
1 March 1998
Abstract :
We apply the method of complexity regularization to derive estimation bounds for nonlinear function estimation using a single-hidden-layer radial basis function (RBF) network. Our approach differs from previous complexity-regularization schemes for neural-network function learning in that we operate with random covering numbers and l1 metric entropy, making it possible to consider much broader families of activation functions, namely functions of bounded variation. Some constraints previously imposed on the network parameters are also eliminated this way. The network is trained by means of complexity regularization involving empirical risk minimization. Bounds on the expected risk in terms of the sample size are obtained for a large class of loss functions. Rates of convergence to the optimal loss are also derived.
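A generic sketch of the complexity-regularization criterion the abstract refers to, under the usual formulation of the method (the paper's actual penalty, built from random covering numbers and l1 metric entropy, may differ in form; the squared loss shown is one member of the loss class considered):

f_n = \arg\min_{f \in \mathcal{F}_n} \left[ \frac{1}{n} \sum_{i=1}^{n} \bigl(f(X_i) - Y_i\bigr)^2 + \Delta_n(f) \right]

Here \mathcal{F}_n denotes a class of single-hidden-layer RBF networks of bounded size, (X_i, Y_i) are the training samples, and \Delta_n(f) is a complexity penalty that grows with the richness of the class containing f and vanishes as the sample size n increases; the estimation bounds then control the expected risk of f_n relative to the optimal loss.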
Keywords :
computational complexity; feedforward neural nets; function approximation; learning (artificial intelligence); minimisation; transfer functions; activation functions; complexity regularization; convergence rates; empirical risk minimization; estimation bounds; expected risk; function learning; l1 metric entropy; loss functions; nonlinear function estimation; random covering numbers; single hidden layer radial basis function network; Approximation error; Artificial neural networks; Convergence; Entropy; Estimation error; Intelligent networks; Neural networks; Probability distribution; Radial basis function networks; Risk management;
Journal_Title :
IEEE Transactions on Neural Networks