• DocumentCode
    3389322
  • Title
    Activation function manipulation for fault tolerant feedforward neural networks
  • Author
    Taniguchi, Yasuyuki ; Kamiura, Naotake ; Hata, Yutaka ; Matsui, Nobuyuki
  • Author_Institution
    Dept. of Comput. Eng., Himeji Inst. of Technol., Hyogo, Japan
  • fYear
    1999
  • fDate
    1999
  • Firstpage
    203
  • Lastpage
    208
  • Abstract
    We propose a learning algorithm that enhances the fault tolerance of feedforward neural networks (NNs for short) by manipulating the gradient of the sigmoid activation function of the neurons. For the output layer, we employ a function with a relatively gentle gradient. For the hidden layer, we steepen the gradient of the function after convergence. The experimental results show that our NNs are superior, in fault tolerance, learning cycles and learning time, to NNs trained with other algorithms that employ fault injection or the calculation of each weight's relevance to the output error. The gradient manipulation never spoils the generalization ability.
  • Keywords
    fault tolerant computing; feedforward neural nets; generalisation (artificial intelligence); learning (artificial intelligence); activation function manipulation; convergence; fault injection; fault tolerant feedforward neural networks; generalization ability; gradient manipulation; learning algorithm; output error; output layer; sigmoid activation function; Backpropagation algorithms; Computer networks; Convergence; Costs; Electronic mail; Fault tolerance; Feedforward neural networks; Hardware; Neural networks; Neurons;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Eighth Asian Test Symposium, 1999 (ATS '99). Proceedings.
  • Conference_Location
    Shanghai
  • ISSN
    1081-7735
  • Print_ISBN
    0-7695-0315-2
  • Type
    conf
  • DOI
    10.1109/ATS.1999.810751
  • Filename
    810751
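
    The abstract above turns on a single idea: the slope of the sigmoid is treated as a per-layer parameter, kept relatively gentle in the output layer and steepened in the hidden layer once training has converged. Below is a minimal sketch of that idea, not the authors' code; the gain names (beta_hidden, beta_output), the 2-3-1 XOR network, and all numeric values are assumptions chosen purely for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x, beta):
        """Sigmoid whose gradient (slope) is controlled by the gain beta."""
        return 1.0 / (1.0 + np.exp(-beta * x))

    # Toy XOR task (assumption; the paper's benchmarks are not reproduced here).
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    # 2-3-1 feedforward network.
    W1 = rng.normal(scale=1.0, size=(2, 3)); b1 = np.zeros(3)
    W2 = rng.normal(scale=1.0, size=(3, 1)); b2 = np.zeros(1)

    beta_hidden = 1.0   # ordinary slope in the hidden layer during training
    beta_output = 0.5   # relatively gentle slope in the output layer (per the abstract)
    lr = 0.5

    for epoch in range(20000):
        # Forward pass with per-layer gains.
        h = sigmoid(X @ W1 + b1, beta_hidden)
        y = sigmoid(h @ W2 + b2, beta_output)
        err = y - T
        # Backward pass: d/dx sigmoid(beta * x) = beta * f * (1 - f).
        d_out = err * beta_output * y * (1 - y)
        d_hid = (d_out @ W2.T) * beta_hidden * h * (1 - h)
        W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_hid;  b1 -= lr * d_hid.sum(axis=0)

    # After convergence the hidden-layer slope is steepened: a larger gain drives
    # hidden activations toward 0/1, so a small perturbation of a weight (a fault)
    # shifts the hidden outputs, and hence the network output, less.
    beta_hidden = 4.0  # illustrative value only
    h = sigmoid(X @ W1 + b1, beta_hidden)
    y = sigmoid(h @ W2 + b2, beta_output)
    print(np.round(y, 3))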