• DocumentCode
    1748951
  • Title
    Training multilayer networks with discrete activation functions
  • Author
    Plagianakos, Vassilis P.; Magoulas, G.D.; Nousis, N.K.; Vrahatis, M.N.
  • Author_Institution
    Dept. of Math., Patras Univ., Greece
  • Volume
    4
  • fYear
    2001
  • fDate
    2001
  • Firstpage
    2805
  • Abstract
    Efficient training of multilayer networks with discrete activation functions is a subject of considerable ongoing research. The use of these networks greatly reduces the complexity of the hardware implementation, provides tolerance to noise, and improves the interpretation of the internal representations. Methods available in the literature mainly focus on two-state (binary) nodes and try to train these networks by approximating the gradient and appropriately modifying gradient descent. However, they exhibit slow convergence and a lower probability of success compared with networks with continuous activations. In this work, we propose an evolution-motivated approach, which is eminently suitable for networks with discrete output states, and compare its performance with that of four other methods.
  • Keywords
    computational complexity; convergence; gradient methods; learning (artificial intelligence); multilayer perceptrons; transfer functions; binary nodes; discrete activation functions; evolution-motivated approach; gradient approximation; gradient descent; hardware implementation complexity; internal representation interpretation; multilayer network training; noise tolerance; slow convergence; two-state nodes; Artificial intelligence; Computer networks; Electronic mail; Hardware; Information systems; Mathematics; Multi-layer neural network; Neural networks; Neurons; Nonhomogeneous media
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
  • Conference_Location
    Washington, DC
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7044-9
  • Type
    conf
  • DOI
    10.1109/IJCNN.2001.938819
  • Filename
    938819
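
The abstract contrasts gradient-approximation methods with an evolution-motivated approach for networks whose nodes have two-state (binary) activations. The paper's own algorithm is not reproduced in this record; as a purely illustrative sketch of the general idea, the following trains a small network with discrete step activations on XOR using a simple (1+1) evolution strategy (Gaussian mutation, greedy selection), which needs no gradient at all. The architecture, mutation scale, and iteration budget are assumptions for the example, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(x):
    # Two-state (binary) activation: outputs 0 or 1 (no usable gradient).
    return (x > 0).astype(float)

def forward(w, x):
    # Illustrative 2-2-1 network with discrete activations at every node.
    W1 = w[:6].reshape(2, 3)   # 2 hidden units, each sees 2 inputs + bias
    W2 = w[6:].reshape(1, 3)   # 1 output unit, sees 2 hidden values + bias
    h = step(W1 @ np.append(x, 1.0))
    return step(W2 @ np.append(h, 1.0))[0]

def error(w, X, y):
    # Sum of squared output errors over the training set.
    return sum((forward(w, x) - t) ** 2 for x, t in zip(X, y))

# XOR: a classic task that a single discrete unit cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

w = rng.normal(size=9)
best = error(w, X, y)
for _ in range(5000):
    cand = w + 0.5 * rng.normal(size=9)  # Gaussian mutation of all weights
    e = error(cand, X, y)
    if e <= best:                        # greedy (1+1) selection; "<=" allows
        w, best = cand, e                # neutral drift across error plateaus
    if best == 0:
        break
```

Accepting equal-error mutants matters here: with discrete activations the error surface is piecewise constant, so neutral moves let the search walk across plateaus that would stall a strictly improving rule.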