• DocumentCode
    303204
  • Title
    An adaptive and fully sparse training approach for multilayer perceptrons
  • Author
    Wang, Fang; Zhang, Q.J.
  • Author_Institution
    Dept. of Electron., Carleton Univ., Ottawa, Ont., Canada
  • Volume
    1
  • fYear
    1996
  • fDate
    3-6 Jun 1996
  • Firstpage
    102
  • Abstract
    An adaptive and fully sparse backpropagation training approach is proposed in this paper. The technique speeds up training by combining a sparse optimization concept with neural network training. The sparse phenomenon due to neuron activation, which is inherent in neural networks, is exploited in both the feedforward and backpropagation phases. A new computational algorithm with sparse pattern reuse and refreshment has been developed, together with an adaptation procedure for a new set of parameters that regulate the sparse training process. The proposed training approach has been applied to speech recognition and circuit extraction problems and achieved significant speed-up of training.
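    The abstract's core idea, restricting feedforward and backpropagation work to the neurons that are actually active, and reusing that sparse pattern for several iterations before refreshing it, can be sketched as follows. This is not the authors' algorithm; the sigmoid network, the activation threshold, the refresh period, and all names are illustrative assumptions.

    ```python
    import numpy as np

    # Illustrative sketch (assumed details, not the paper's code): a one-hidden-
    # layer MLP where both the feedforward and backpropagation passes are
    # restricted to an "active set" of hidden neurons whose activations exceed
    # a threshold. The sparse pattern is reused for several iterations and then
    # refreshed, loosely mirroring "sparse pattern reuse and refreshment".
    rng = np.random.default_rng(0)

    n_in, n_hid, n_out = 4, 16, 2
    W1 = rng.normal(0.0, 1.0, (n_hid, n_in))
    W2 = rng.normal(0.0, 1.0, (n_out, n_hid))

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    x = rng.normal(0.0, 1.0, n_in)
    t = np.array([1.0, 0.0])        # target output
    lr = 0.1
    threshold = 0.05                # activation cutoff (assumed value)
    refresh_every = 5               # refresh the sparse pattern periodically
    active = np.arange(n_hid)       # start with every hidden neuron active

    for it in range(20):
        # Feedforward restricted to the active hidden neurons.
        h = np.zeros(n_hid)
        h[active] = sigmoid(W1[active] @ x)
        y = sigmoid(W2 @ h)

        # Backpropagation, also restricted to the active set.
        delta_out = (y - t) * y * (1.0 - y)
        delta_hid = (W2[:, active].T @ delta_out) * h[active] * (1.0 - h[active])

        W2[:, active] -= lr * np.outer(delta_out, h[active])
        W1[active] -= lr * np.outer(delta_hid, x)

        # Periodic refreshment: recompute the sparse pattern from the full
        # activation vector so neurons can enter or leave the active set.
        if (it + 1) % refresh_every == 0:
            h_full = sigmoid(W1 @ x)
            active = np.flatnonzero(h_full > threshold)
            if active.size == 0:    # guard: never let the active set go empty
                active = np.arange(n_hid)

    print("active hidden neurons:", active.size, "of", n_hid)
    ```

    The speed-up in the paper comes from skipping work for inactive neurons; here that skipping is explicit in the indexed matrix operations, while the adaptive regulation of the sparsity parameters is left out for brevity.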
  • Keywords
    adaptive systems; backpropagation; multilayer perceptrons; optimisation; speech recognition; adaptive learning; circuit extraction; neural networks; neuron activation; sparse backpropagation; sparse optimization; Backpropagation algorithms; Circuits; Feedforward neural networks; Gradient methods; Jacobian matrices; Multilayer perceptrons; Neural networks; Neurons; Optimization methods; Speech recognition
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    IEEE International Conference on Neural Networks, 1996
  • Conference_Location
    Washington, DC
  • Print_ISBN
    0-7803-3210-5
  • Type
    conf
  • DOI
    10.1109/ICNN.1996.548874
  • Filename
    548874