  • DocumentCode
    303387
  • Title
    Deep feedforward networks: application to pattern recognition
  • Author
    Babri, Haroon A.; Tong, Yin
  • Author_Institution
    Sch. of Electr. & Electron. Eng., Nanyang Technol. Inst., Singapore
  • Volume
    3
  • fYear
    1996
  • fDate
    3-6 Jun 1996
  • Firstpage
    1422
  • Abstract
    Issues pertaining to learning in deep feedforward neural networks are addressed. The sensitivity of the network output is derived as a function of weight perturbations in different hidden layers. For a given hidden layer l, the output sensitivity varies inversely with the current activation levels of the neurons in the previous layer l-1 and with the magnitude of the connection weights between layers l and l-1. Since learning involves modifying the weights, relatively small connection weights (typical during the initial learning phase of the backpropagation algorithm) or small neuron activation levels can increase the sensitivity of the network and make learning unstable. This problem is aggravated further as the depth of the network increases. A weight initialization strategy and a modified activation (sigmoid) function are proposed to alleviate these problems. With this scheme, deep networks trained with the error backpropagation learning rule show substantial improvement in error-curve (trajectory) control and convergence speed when applied to pattern recognition problems.
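    The two remedies named in the abstract can be sketched as follows. The abstract does not give the paper's exact formulas, so the forms below (a sigmoid with a small added linear slope term so its derivative never vanishes, and fan-in-scaled uniform weight initialization) are illustrative assumptions, not the authors' method:

    ```python
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def mod_sigmoid(x, a=0.05):
        # Sigmoid plus a small linear term so the slope never falls below `a`;
        # an illustrative stand-in for the paper's modified activation.
        return sigmoid(x) + a * x

    def init_weights(fan_in, fan_out, rng):
        # Fan-in-scaled uniform initialization keeps pre-activation variance
        # roughly constant with depth, so neurons start in the sigmoid's
        # responsive region; the paper's exact strategy may differ.
        bound = 1.0 / np.sqrt(fan_in)
        return rng.uniform(-bound, bound, size=(fan_out, fan_in))

    # Activations passed through a 6-layer net remain in a usable range
    # when the weights are scaled by fan-in.
    rng = np.random.default_rng(0)
    a = rng.standard_normal(16)
    for _ in range(6):
        W = init_weights(16, 16, rng)
        a = mod_sigmoid(W @ a)
    print(np.round(a.std(), 3))
    ```

    The point of the linear term is that the gradient signal cannot be annihilated by a saturated sigmoid, which addresses the stability issue the abstract attributes to small activations and small weights in deep networks.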
  • Keywords
    backpropagation; convergence; feedforward neural nets; pattern recognition; perturbation techniques; sensitivity analysis; deep feedforward networks; error backpropagation; hidden layers; learning; neuron activation; output sensitivity; sigmoid function; weight perturbation; Artificial neural networks; Backpropagation algorithms; Computer networks; Convergence; Error correction; Feedforward neural networks; Multi-layer neural network; Neural networks; Neurons; Pattern recognition
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    IEEE International Conference on Neural Networks, 1996
  • Conference_Location
    Washington, DC
  • Print_ISBN
    0-7803-3210-5
  • Type
    conf
  • DOI
    10.1109/ICNN.1996.549108
  • Filename
    549108