• DocumentCode
    353284
  • Title
    Evaluation function for fault tolerant multi-layer neural networks
  • Author
    Takase, Haruhiko ; Shinogi, Tsuyoshi ; Hayashi, Terumine ; Kita, Hidehiko
  • Author_Institution
    Dept. of Electr. & Electron. Eng., Mie Univ., Tsu, Japan
  • Volume
    3
  • fYear
    2000
  • fDate
    2000
  • Firstpage
    521
  • Abstract
    We propose a new learning algorithm that enhances the fault tolerance of multilayer neural networks (MLN). The method is based on the idea that strong weights make an MLN sensitive to faults, so the proposed algorithm aims to keep the weights as small as possible during training. Its evaluation function consists of not only the output error but also the square sum of the weights, so that minimizing it reduces both the output error and the weights. We discuss the value of the parameter that balances the effects of these two terms, and then apply the algorithm to pattern recognition problems. The results show that the degradation of the recognition ratio is reduced.
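    A minimal sketch of the evaluation function described above, assuming a standard sum-of-squares output error and writing the balance parameter as \lambda (the abstract does not name it):

        E = E_{\mathrm{out}} + \lambda \sum_{i} w_i^2

    where E_{\mathrm{out}} is the output error over the training set, w_i are the network weights, and \lambda > 0 trades off output accuracy against weight magnitude.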
  • Keywords
    character recognition; fault tolerance; learning (artificial intelligence); minimisation; multilayer perceptrons; MLN; evaluation function; fault tolerant multilayer neural networks; learning algorithm; output error minimization; pattern recognition; recognition ratio degradation; training; weight minimization; Artificial neural networks; Degradation; Equations; Fault tolerance; Multi-layer neural network; Neural networks; Output feedback; Pattern recognition;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000)
  • Conference_Location
    Como
  • ISSN
    1098-7576
  • Print_ISBN
    0-7695-0619-4
  • Type
    conf
  • DOI
    10.1109/IJCNN.2000.861361
  • Filename
    861361