  • DocumentCode
    288605
  • Title
    On the required size of multilayer networks for implementing real-valued functions
  • Author
    Atiya, Amir
  • Author_Institution
    Dept. of Comput. Eng., Cairo Univ., Egypt
  • Volume
    3
  • fYear
    1994
  • fDate
    27 Jun-2 Jul 1994
  • Firstpage
    1454
  • Abstract
    One of the important theoretical issues studied by neural network researchers is how large a network must be to realize an arbitrary set of training patterns. Baum (1988) considered the case of two-class classification problems where the input vectors are in general position; by general position the author means that no D+1 vectors lie on a (D-1)-dimensional hyperplane. He proved that ⌈M/D⌉ hidden nodes are both necessary and sufficient for implementing any arbitrary dichotomy, where M denotes the number of examples, D denotes the dimension of the pattern vectors, and ⌈x⌉ means the smallest integer ⩾x. Huang and Huang (1991) and Sartori and Antsaklis (1991) proved that when the general position condition does not hold, M-1 hidden nodes are sufficient for implementing analog mappings. In this paper the author considers analog mappings (real-valued input vectors and real-valued scalar outputs) and assumes the general position condition. It is proved that 2⌈M/D⌉ hidden nodes are sufficient for implementing arbitrary mappings.
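    The two bounds quoted in the abstract are straightforward to evaluate. A minimal sketch (the function names below are illustrative, not from the paper):

    ```python
    import math

    def baum_bound(M: int, D: int) -> int:
        """Baum (1988): ceil(M/D) hidden nodes are both necessary and
        sufficient to implement an arbitrary dichotomy of M examples in
        general position in D dimensions."""
        return math.ceil(M / D)

    def atiya_bound(M: int, D: int) -> int:
        """This paper: 2*ceil(M/D) hidden nodes suffice to implement an
        arbitrary real-valued mapping of M examples in general position."""
        return 2 * math.ceil(M / D)

    # Example: 100 training examples with 10-dimensional input vectors.
    print(baum_bound(100, 10))   # hidden nodes for a dichotomy
    print(atiya_bound(100, 10))  # hidden nodes for a real-valued mapping
    ```

    Note that both bounds grow linearly in the number of examples M and shrink as the input dimension D grows, since more dimensions give each hyperplane more points in general position to separate.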
  • Keywords
    learning (artificial intelligence); multilayer perceptrons; analog mappings; arbitrary mappings; general position condition; hidden nodes; multilayer networks; real-valued functions; real-valued input vectors; real-valued scalar outputs; two-class classification problems; Computer networks; Multi-layer neural network; Neural networks; Nonhomogeneous media; Transfer functions;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
  • Conference_Location
    Orlando, FL
  • Print_ISBN
    0-7803-1901-X
  • Type
    conf
  • DOI
    10.1109/ICNN.1994.374500
  • Filename
    374500