  • DocumentCode
    801140
  • Title
    Sensitivity analysis of single hidden-layer neural networks with threshold functions
  • Author
    Oh, Sang-Hoon; Lee, Youngjik
  • Author_Institution
    Res. Dept., Electron. & Telecommun. Res. Inst., Daejeon, South Korea
  • Volume
    6
  • Issue
    4
  • fYear
    1995
  • fDate
    7/1/1995
  • Firstpage
    1005
  • Lastpage
    1007
  • Abstract
    An important consideration when applying neural networks to pattern recognition is the sensitivity to weight perturbation or to input errors. In this paper, we analyze the sensitivity of single hidden-layer networks with threshold functions. In the case of weight perturbation or input errors, the probability of inversion error for an output neuron is derived as a function of the trained weights, the input pattern, and the variance of weight perturbation or the bit error probability of the input pattern. The derived results are verified with a simulation of the Madaline recognizing handwritten digits. The results show that the sensitivity of trained networks differs markedly from that of networks with random weights.
  • Keywords
    character recognition; error statistics; neural nets; sensitivity analysis; Madaline; bit error probability; handwritten digit recognition; input errors; inversion error probability; sensitivity analysis; single hidden-layer neural networks; threshold functions; weight perturbation; Approximation methods; Degradation; Error probability; Handwriting recognition; Joining processes; Neural network hardware; Neural networks; Neurons; Pattern recognition; Sensitivity analysis;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Neural Networks
  • Publisher
    IEEE
  • ISSN
    1045-9227
  • Type
    jour
  • DOI
    10.1109/72.392264
  • Filename
    392264
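
The abstract above concerns the probability that an output neuron's binary decision is inverted when the trained weights are perturbed. The paper derives this probability analytically from the trained weights, the input pattern, and the perturbation variance; as a rough, non-authoritative companion, the sketch below instead estimates the same kind of quantity by Monte Carlo simulation, assuming bipolar threshold units and i.i.d. Gaussian weight noise. The layer sizes and the helper name `inversion_error_prob` are hypothetical and not taken from the paper.

```python
import numpy as np

def step(x):
    # Hard-threshold (bipolar) activation: +1 if x >= 0, else -1.
    return np.where(x >= 0, 1.0, -1.0)

def forward(x, W_h, W_o):
    # Single hidden-layer network with threshold units (Madaline-style).
    h = step(W_h @ x)
    return step(W_o @ h)

def inversion_error_prob(x, W_h, W_o, sigma, n_trials=10000, rng=None):
    """Monte Carlo estimate of the probability that each output neuron's
    binary decision flips when every weight receives an i.i.d. Gaussian
    perturbation with standard deviation `sigma` (an assumed noise model,
    not the paper's closed-form expression)."""
    rng = np.random.default_rng() if rng is None else rng
    y_ref = forward(x, W_h, W_o)          # unperturbed output decisions
    flips = np.zeros_like(y_ref)
    for _ in range(n_trials):
        y = forward(x,
                    W_h + sigma * rng.standard_normal(W_h.shape),
                    W_o + sigma * rng.standard_normal(W_o.shape))
        flips += (y != y_ref)             # count inversions per output neuron
    return flips / n_trials

# Toy usage with random weights (hypothetical sizes, for illustration only).
rng = np.random.default_rng(0)
x = step(rng.standard_normal(16))         # bipolar input pattern
W_h = rng.standard_normal((8, 16))        # hidden-layer weights
W_o = rng.standard_normal((3, 8))         # output-layer weights
print(inversion_error_prob(x, W_h, W_o, sigma=0.1, rng=rng))
```

With random weights as above, such a simulation only illustrates the mechanics; per the abstract, trained weights yield a sensitivity profile quite different from that of random weights.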