Title :
Notes on weighted norms and network approximation of functionals
Author :
Sandberg, Irwin W.
Author_Institution :
Dept. of Electr. & Comput. Eng., Texas Univ., Austin, TX, USA
Date :
7/1/1996 12:00:00 AM
Abstract :
The literature on arbitrarily good approximation contains results concerning more general types of "target" functionals, different network structures, other nonlinearities, and various measures of approximation error. Among them is the proposition that any continuous real nonlinear functional on a compact subset of a real normed linear space can be approximated arbitrarily well by a single-hidden-layer neural network with a linear-functional input layer and exponential (or polynomial, sigmoidal, or radial basis function) nonlinearities.
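The proposition stated in the abstract can be illustrated numerically. The sketch below (not the paper's construction, just an assumed minimal setup) approximates the continuous functional F(u) = ∫ u(t)² dt on a compact two-parameter family of functions, using a single hidden layer whose inputs are fixed random linear functionals of u and whose nonlinearity is a sigmoid; the output layer is fit by least squares. All names, the choice of compact set, and the random-feature training shortcut are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Compact input set: u(t) = a*sin(pi t) + b*cos(2 pi t) with |a|, |b| <= 1,
# represented by samples on a grid over [0, 1].
t = np.linspace(0.0, 1.0, 64)
dt = t[1] - t[0]

def make_u(a, b):
    return a * np.sin(np.pi * t) + b * np.cos(2 * np.pi * t)

def F(u):
    # Target functional: Riemann-sum approximation of the integral of u^2.
    return float(np.sum(u ** 2) * dt)

# Training inputs drawn from the compact parameter box.
params = rng.uniform(-1.0, 1.0, size=(400, 2))
U = np.stack([make_u(a, b) for a, b in params])          # shape (400, 64)
y = np.array([F(u) for u in U])

# Hidden layer: random linear functionals l_j(u) (inner products against the
# samples, scaled by dt so they behave like integrals), then a sigmoid.
n_hidden = 40
W = rng.normal(size=(64, n_hidden)) * dt * 8.0
bias = rng.normal(size=n_hidden)

def hidden(Umat):
    return 1.0 / (1.0 + np.exp(-(Umat @ W + bias)))      # sigmoidal nonlinearity

# Output layer fit by least squares (a random-feature shortcut, not training
# the linear functionals themselves).
H = np.column_stack([hidden(U), np.ones(len(U))])
c, *_ = np.linalg.lstsq(H, y, rcond=None)

# Evaluate on fresh inputs from the same compact set.
test_params = rng.uniform(-1.0, 1.0, size=(200, 2))
Ut = np.stack([make_u(a, b) for a, b in test_params])
pred = np.column_stack([hidden(Ut), np.ones(len(Ut))]) @ c
err = np.max(np.abs(pred - np.array([F(u) for u in Ut])))
print(f"max abs error on held-out inputs: {err:.4f}")
```

Because the network only ever sees u through the fixed linear functionals l_j, this matches the "linear functional input layer followed by sigmoidal nonlinearities" architecture the abstract describes; increasing the number of hidden units shrinks the uniform error over the compact set.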
Keywords :
feedforward neural nets; functional equations; nonlinear equations; approximation errors; continuous real nonlinear functional; linear functional input layer; network approximation; network structures; polynomial nonlinearities; radial basis function; real normed linear space; sigmoidal nonlinearities; single-hidden-layer neural network; target functionals; weighted norms; Approximation error; Circuits; Extraterrestrial measurements; Integral equations; Neural networks; Polynomials;
Journal_Title :
IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications