• DocumentCode
    328237
  • Title
    Some n-bit parity problems are solvable by feedforward networks with less than n hidden units
  • Author
    Setiono, Rudy; Hui, Lucas Chi Kwong
  • Author_Institution
    Dept. of Inf. Syst. & Comput. Sci., Nat. Univ. of Singapore, Singapore
  • Volume
    1
  • fYear
    1993
  • fDate
    25-29 Oct. 1993
  • Firstpage
    305
  • Abstract
    Starting with two hidden units, we train a simple single-hidden-layer feedforward neural network to solve the n-bit parity problem. If the network fails to correctly recognize all the input patterns, an additional hidden unit is added to the hidden layer and the network is retrained. This process is repeated until a network that correctly classifies all the input patterns has been constructed. Using a variant of the quasi-Newton method for training, we have been able to find networks with a single hidden layer containing fewer than n hidden units that solve the n-bit parity problem for some values of n. This demonstrates the power of combining the quasi-Newton method with the node-incremental approach.
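  The abstract describes a constructive, node-incremental training loop. Below is a minimal illustrative sketch of that loop in Python, assuming NumPy and SciPy are available; SciPy's generic BFGS routine stands in for the authors' quasi-Newton variant, and all function names (parity_patterns, train_parity, etc.) are hypothetical rather than taken from the paper.

      # Illustrative sketch only: grow the hidden layer of a single-hidden-layer
      # network until every n-bit parity pattern is classified correctly.
      import itertools
      import numpy as np
      from scipy.optimize import minimize

      def parity_patterns(n):
          X = np.array(list(itertools.product([0.0, 1.0], repeat=n)))
          y = X.sum(axis=1) % 2          # target is 1 for an odd number of ones
          return X, y

      def unpack(w, n, h):
          i = 0
          W1 = w[i:i + n * h].reshape(n, h); i += n * h
          b1 = w[i:i + h]; i += h
          W2 = w[i:i + h]; i += h
          b2 = w[i]
          return W1, b1, W2, b2

      def forward(w, X, h):
          W1, b1, W2, b2 = unpack(w, X.shape[1], h)
          hidden = np.tanh(X @ W1 + b1)                  # single hidden layer
          return 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))

      def loss(w, X, y, h):
          return np.mean((forward(w, X, h) - y) ** 2)    # squared error over all 2**n patterns

      def train_parity(n, max_hidden=None, restarts=5, seed=0):
          """Start with two hidden units; add one and retrain until all patterns are correct."""
          rng = np.random.default_rng(seed)
          X, y = parity_patterns(n)
          max_hidden = max_hidden or n
          for h in range(2, max_hidden + 1):             # begin with two hidden units
              for _ in range(restarts):
                  w0 = rng.normal(scale=0.5, size=n * h + h + h + 1)
                  res = minimize(loss, w0, args=(X, y, h), method="BFGS")
                  if np.array_equal(forward(res.x, X, h) > 0.5, y.astype(bool)):
                      return h, res.x                    # all patterns classified correctly
              # otherwise an additional hidden unit is added and the network is retrained
          return None, None

      if __name__ == "__main__":
          h, _ = train_parity(4)
          print("4-bit parity solved with", h, "hidden units")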
  • Keywords
    Newton method; feedforward neural nets; learning (artificial intelligence); optimisation; pattern recognition; feedforward neural network; hidden units; n-bit parity problems; node incremental approach; quasi-Newton methods; Computer science; Error correction; Feedforward neural networks; Feedforward systems; Information systems; Network topology; Neural networks; Pattern recognition; Transfer functions
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
  • Print_ISBN
    0-7803-1421-2
  • Type
    conf
  • DOI
    10.1109/IJCNN.1993.713918
  • Filename
    713918