DocumentCode :
507958
Title :
How to Measure the Essential Approximation Capability of a FNN
Author :
Wang, Jianjun ; Zou, Bin ; Chen, Baili
Author_Institution :
Sch. of Math. & Stat., Southwest Univ., Chongqing, China
Volume :
2
fYear :
2009
fDate :
14-16 Aug. 2009
Firstpage :
394
Lastpage :
398
Abstract :
In this paper, we first review recent work on the approximation properties of feedforward neural networks (FNNs), summarizing the state-of-the-art results and explaining their impact and significance. For feedforward neural networks, the essential order of approximation is revealed. It is proven that for any continuous function defined on a compact set of R^d, there exist three-layer FNNs with a fixed number of hidden neurons that attain this essential order. Under certain assumptions on the FNNs, ideal upper- and lower-bound estimates on their approximation precision are provided. The obtained results not only characterize the intrinsic approximation property of the FNNs, but also uncover the implicit relationship between the precision (speed) of approximation and the number of hidden neurons.
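A sketch of the typical shape of such two-sided (essential-order) estimates, where the function class, constants, and exponent are illustrative assumptions rather than the paper's exact statement:
\[
C_1\, n^{-s/d} \;\le\; \sup_{f \in F_s}\; \inf_{N \in \mathcal{N}_n} \|f - N\|_{\infty} \;\le\; C_2\, n^{-s/d},
\]
where \(\mathcal{N}_n\) denotes three-layer FNNs with \(n\) hidden neurons, \(F_s\) is a class of functions of smoothness \(s\) on a compact subset of \(\mathbb{R}^d\), and \(C_1, C_2\) are constants independent of \(n\). Matching upper and lower bounds of this kind are what fix the rate as the essential order, making explicit how precision scales with the number of hidden neurons.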
Keywords :
approximation theory; feedforward neural nets; approximation properties; feedforward neural network; hidden neuron; Computer science; Convergence; Feedforward neural networks; Mathematics; Network topology; Neural networks; Neurons; Particle measurements; Statistics; Upper bound; Feedforward Neural networks; Neural Networks Research; essential approximation capability;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2009 Fifth International Conference on Natural Computation (ICNC '09)
Conference_Location :
Tianjin
Print_ISBN :
978-0-7695-3736-8
Type :
conf
DOI :
10.1109/ICNC.2009.421
Filename :
5364223