DocumentCode :
1082750
Title :
Notes on weighted norms and network approximation of functionals
Author :
Sandberg, Irwin W.
Author_Institution :
Dept. of Electr. & Comput. Eng., Texas Univ., Austin, TX, USA
Volume :
43
Issue :
7
fYear :
1996
fDate :
7/1/1996
Firstpage :
600
Lastpage :
601
Abstract :
Among the results in the literature on arbitrarily good approximation, which address more general types of "target" functionals, different network structures, other nonlinearities, and various measures of approximation error, is the proposition that any continuous real nonlinear functional on a compact subset of a real normed linear space can be approximated arbitrarily well by a single-hidden-layer neural network with a linear-functional input layer and exponential (or polynomial, sigmoidal, or radial basis function) nonlinearities.
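For illustration, a minimal sketch of such an approximant, assuming the standard single-hidden-layer form implied by the abstract (the specific notation F, K, \ell_k, g, c_k, b_k is introduced here and does not appear verbatim in this record):

F(x) \approx \sum_{k=1}^{N} c_k \, g\big(\ell_k(x) + b_k\big), \qquad x \in K,

where K is the compact subset of the real normed linear space, each \ell_k is a continuous linear functional (the "linear functional input layer"), g is the hidden-layer nonlinearity (exponential, polynomial, sigmoidal, or radial basis function), and c_k, b_k are real constants chosen so that the approximation error over K is below any prescribed tolerance.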
Keywords :
feedforward neural nets; functional equations; nonlinear equations; approximation errors; continuous real nonlinear functional; linear functional input layer; network approximation; network structures; polynomial nonlinearities; radial basis function; real normed linear space; sigmoidal nonlinearities; single-hidden-layer neural network; target functionals; weighted norms; Approximation error; Circuits; Extraterrestrial measurements; Integral equations; Neural networks; Polynomials;
fLanguage :
English
Journal_Title :
Circuits and Systems I: Fundamental Theory and Applications, IEEE Transactions on
Publisher :
IEEE
ISSN :
1057-7122
Type :
jour
DOI :
10.1109/81.508182
Filename :
508182