DocumentCode :
2344003
Title :
Neural network approximation and estimation of functions
Author :
Cheang, Gerald H L
Author_Institution :
Dept. of Stat., Yale Univ., New Haven, CT, USA
fYear :
1994
fDate :
27-29 Oct 1994
Firstpage :
59
Abstract :
Approximation and estimation bounds were obtained by Barron (see Proc. of the 7th Yale Workshop on Adaptive and Learning Systems, 1992; IEEE Transactions on Information Theory, vol.39, pp.930-944, 1993; and Machine Learning, vol.14, pp.113-143, 1994) for function estimation by single hidden-layer neural nets. This paper highlights the extension of his results to the two hidden-layer case. The bounds derived for the two hidden-layer case depend on the numbers of nodes T1 and T2 in the two hidden layers, and also on the sample size N. Our bounds show that in some cases an exponentially large number of nodes, and hence of parameters, is not required.
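For reference, the sketch below (not from the paper) illustrates the two hidden-layer architecture whose bounds are discussed, with T1 and T2 denoting the hidden-layer widths; the sigmoidal activation and all weight shapes are assumptions chosen to mirror Barron's single hidden-layer setting, and the specific sizes are illustrative only.

import numpy as np

def sigmoid(z):
    # Standard logistic sigmoid, used here as the assumed node activation.
    return 1.0 / (1.0 + np.exp(-z))

def two_hidden_layer_net(x, W1, b1, W2, b2, w3, b3):
    """Evaluate a two hidden-layer net: T1 = W1.shape[0], T2 = W2.shape[0] nodes."""
    h1 = sigmoid(W1 @ x + b1)    # first hidden layer, T1 nodes
    h2 = sigmoid(W2 @ h1 + b2)   # second hidden layer, T2 nodes
    return w3 @ h2 + b3          # linear output unit

# Illustrative example: d = 5 inputs, T1 = 8 and T2 = 4 hidden nodes.
rng = np.random.default_rng(0)
d, T1, T2 = 5, 8, 4
params = (rng.normal(size=(T1, d)), rng.normal(size=T1),
          rng.normal(size=(T2, T1)), rng.normal(size=T2),
          rng.normal(size=T2), 0.0)
print(two_hidden_layer_net(rng.normal(size=d), *params))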
Keywords :
approximation theory; estimation theory; feedforward neural nets; function approximation; functional analysis; multilayer perceptrons; estimation bounds; function estimation; hidden-layer neural nets; neural network approximation; nodes; Approximation error; Ellipsoids; Neural networks; Probability distribution; Statistics; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 1994 IEEE-IMS Workshop on Information Theory and Statistics
Conference_Location :
Alexandria, VA
Print_ISBN :
0-7803-2761-6
Type :
conf
DOI :
10.1109/WITS.1994.513888
Filename :
513888