DocumentCode :
1797814
Title :
Learning rates of neural network estimators via the new FNNs operators
Author :
Yi Zhao ; Dansheng Yu
Author_Institution :
Sch. of Sci., Hangzhou Dianzi Univ., Hangzhou, China
fYear :
2014
fDate :
6-11 July 2014
Firstpage :
1359
Lastpage :
1365
Abstract :
In this paper, estimation of a regression function from independent and identically distributed samples is investigated. The regression estimators are defined by minimizing an empirical regularized least-squares risk over a class of functions realized by feedforward neural networks (FNNs). To derive the learning rates of these FNN regression estimators, new FNN operators are constructed via modified sigmoidal functions. The Vapnik-Chervonenkis dimension (VC dimension) of the class of FNN functions is also discussed. In addition, a direct approximation theorem for the neural network operators in L^2_{ρX}, where ρ is a Borel probability measure, is established.
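The estimator described in the abstract (minimizing an empirical regularized least-squares risk over a class of one-hidden-layer sigmoidal FNNs) can be sketched as follows. This is a minimal illustrative sketch, not the paper's construction: the target function sin(2πx), the network width, the regularization weight, and plain gradient descent are all assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# i.i.d. sample (x_i, y_i); the target regression function sin(2*pi*x)
# and the noise level are illustrative choices, not from the paper.
n = 200
x = rng.uniform(0.0, 1.0, size=(n, 1))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal((n, 1))

# One-hidden-layer sigmoidal FNN: f(x) = sum_j c_j * sigma(w_j * x + b_j).
m = 20                      # hidden width (hypothetical)
lam = 1e-4                  # regularization weight (hypothetical)
W = rng.standard_normal((1, m))
b = rng.standard_normal((1, m))
c = rng.standard_normal((m, 1))

def sigma(t):
    """Standard sigmoidal activation."""
    return 1.0 / (1.0 + np.exp(-t))

def forward(x):
    return sigma(x @ W + b) @ c

# Minimize the empirical regularized least-squares risk
#   (1/n) * sum_i (f(x_i) - y_i)^2 + lam * ||c||^2
# by plain gradient descent (an assumption; the paper analyzes the
# minimizer itself, not a particular optimization scheme).
lr = 0.5
for _ in range(2000):
    h = sigma(x @ W + b)            # n x m hidden activations
    err = h @ c - y                 # residuals of the current estimator
    grad_c = 2 * h.T @ err / n + 2 * lam * c
    dh = (err @ c.T) * h * (1 - h)  # backprop through the sigmoid
    grad_W = 2 * x.T @ dh / n
    grad_b = 2 * dh.sum(axis=0, keepdims=True) / n
    c -= lr * grad_c
    W -= lr * grad_W
    b -= lr * grad_b

mse = float(np.mean((forward(x) - y) ** 2))
```

After training, `mse` is the empirical squared risk of the fitted FNN estimator; the paper's results bound how fast the corresponding expected risk decays with the sample size.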
Keywords :
feedforward neural nets; learning (artificial intelligence); least squares approximations; neural nets; regression analysis; Borel probability measure; FNN operators; V-C dimension; Vapnik-Chervonenkis dimension; direct approximation theorem; empirical least-square regularized algorithm; feedforward neural networks; learning rates; modified sigmoidal functions; neural network estimators; neural network operators; regression function estimation; Approximation methods; Biological neural networks; Educational institutions; Estimation; Feeds; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks (IJCNN), 2014 International Joint Conference on
Conference_Location :
Beijing
Print_ISBN :
978-1-4799-6627-1
Type :
conf
DOI :
10.1109/IJCNN.2014.6889633
Filename :
6889633