• DocumentCode
    288651
  • Title
    On the synthesis and complexity of feedforward networks
  • Author
    Sacha, Jaroslaw P.; Cios, Krzysztof J.
  • Author_Institution
    Toledo Univ., OH, USA
  • Volume
    4
  • fYear
    1994
  • fDate
    27 Jun-2 Jul 1994
  • Firstpage
    2185
  • Abstract
    A method for the synthesis of one-hidden-layer feedforward neural networks approximating functions of several variables to a desired degree of accuracy is presented. The method fully determines the network architecture, including the values of all weights. In the first step, the inverse Radon transform is used to decompose the problem of approximating a function of several variables into several problems of approximating a function of one variable. In the second step, each of the obtained one-dimensional functions is approximated by a sub-network. The sub-networks are then combined to construct the network approximating the original function. The upper bound of the final approximation error ε and the errors at each step are estimated. The complexity of the network, that is, the number of neurons in the hidden layer, is O(1/ε^n), where n is the dimension of the space.
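    The second step of the abstract (approximating each one-dimensional function by a sub-network with fully determined weights) can be illustrated with a minimal sketch. This is not the paper's construction — the paper derives the one-dimensional targets and all weights via the inverse Radon transform — but a standard piecewise-linear ReLU interpolant, which likewise fixes every weight in closed form from function values at chosen knots, with no training:

    ```python
    import numpy as np

    def relu(z):
        return np.maximum(z, 0.0)

    def build_subnetwork(f, a, b, m):
        """One-hidden-layer ReLU net interpolating a 1-D function f on [a, b].

        Knots are m + 1 equispaced points; hidden-unit coefficients are the
        slope changes of the piecewise-linear interpolant, so all weights are
        determined directly from f's values at the knots (no training)."""
        t = np.linspace(a, b, m + 1)
        y = f(t)
        slopes = np.diff(y) / np.diff(t)           # slope on each interval
        coeffs = np.diff(slopes, prepend=0.0)      # slope changes -> ReLU weights
        bias = y[0]

        def net(x):
            x = np.asarray(x, dtype=float)
            # hidden layer: one ReLU unit per interval start, linear output layer
            return bias + relu(x[..., None] - t[:-1]) @ coeffs

        return net

    # Usage: approximate sin on [0, pi] with 64 hidden units and check the error.
    net = build_subnetwork(np.sin, 0.0, np.pi, m=64)
    xs = np.linspace(0.0, np.pi, 1000)
    err = np.max(np.abs(net(xs) - np.sin(xs)))
    ```

    For a twice-differentiable target the interpolation error shrinks like O(h²) in the knot spacing h, so the hidden-layer width trades off directly against accuracy — the same kind of width-versus-ε accounting that yields the O(1/ε^n) bound once n projection directions must each be resolved.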
  • Keywords
    Radon transforms; approximation theory; computational complexity; feedforward neural nets; function approximation; functional analysis; approximation error; complexity; decomposition; hidden layer feedforward neural networks; inverse Radon transform; network architecture; upper bound; weights; Approximation error; Feedforward neural networks; Indium tin oxide; Integral equations; Multidimensional systems; Network synthesis; Neural networks; Neurons; Transfer functions; Upper bound
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
  • Conference_Location
    Orlando, FL
  • Print_ISBN
    0-7803-1901-X
  • Type
    conf
  • DOI
    10.1109/ICNN.1994.374555
  • Filename
    374555