  • DocumentCode
    288307
  • Title
    Reducing the number of multiplies in backpropagation
  • Author
    Boonyanit, Kan; Peterson, Allen M.
  • Author_Institution
    LSI Logic Corp., Milpitas, CA, USA
  • Volume
    1
  • fYear
    1994
  • fDate
    27 Jun-2 Jul 1994
  • Firstpage
    28
  • Abstract
    Many algorithms have been proposed to speed up the learning time of backpropagation; however, most of them do not consider the amount of hardware required to implement them. Without a suitable hardware implementation, the real promise of neural network applications will be difficult to achieve. Since multiplication dominates the computation and is expensive in hardware, this paper proposes a method to reduce the number of multiplies in the backward path of the backpropagation algorithm by setting some neuron errors to zero. Convergence is proved via the general Robbins-Monro process, a stochastic approximation process. (A minimal code sketch of this idea follows the record below.)
  • Keywords
    approximation theory; backpropagation; convergence of numerical methods; neural nets; Robbins-Monro process; backward path; convergence theorem; learning time; multiply reduction; neural network; neuron errors; stochastic approximation; Backpropagation algorithms; Convergence; Data flow computing; Large scale integration; Logic; Neural network hardware; Neural networks; Neurons; Noise reduction; Stochastic processes
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
  • Conference_Location
    Orlando, FL
  • Print_ISBN
    0-7803-1901-X
  • Type
    conf
  • DOI
    10.1109/ICNN.1994.374133
  • Filename
    374133
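
The abstract describes the core idea: set small neuron error terms (deltas) in the backward pass to zero, so that the multiply-accumulates those errors would otherwise drive can be skipped. Below is a minimal Python/NumPy sketch of that idea under stated assumptions; it is not the authors' exact algorithm, and the threshold tau, the tiny 2-3-1 network, and all names (train_step, sigmoid) are illustrative.

    import numpy as np

    # Hypothetical illustration of the paper's idea, not the authors' code:
    # hidden-layer error terms below a threshold are zeroed, so the
    # weight-update multiplies for those neurons could be skipped.
    rng = np.random.default_rng(0)
    W1 = rng.normal(scale=0.5, size=(3, 2)); b1 = np.zeros(3)  # input -> hidden
    W2 = rng.normal(scale=0.5, size=(1, 3)); b2 = np.zeros(1)  # hidden -> output

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def train_step(x, t, tau=1e-2, lr=0.5):
        """One backprop step; hidden deltas with |delta| < tau are zeroed."""
        global W1, b1, W2, b2
        h = sigmoid(W1 @ x + b1)                        # forward pass
        y = sigmoid(W2 @ h + b2)
        delta_out = (y - t) * y * (1.0 - y)             # output error term
        delta_hid = (W2.T @ delta_out) * h * (1.0 - h)  # hidden error terms
        mask = np.abs(delta_hid) >= tau                 # keep only large errors
        delta_hid = np.where(mask, delta_hid, 0.0)      # zero the small ones
        W2 -= lr * np.outer(delta_out, h); b2 -= lr * delta_out
        W1 -= lr * np.outer(delta_hid, x); b1 -= lr * delta_hid
        return int(mask.sum())                          # deltas actually used

    # Usage example: train on XOR and count how many hidden deltas survive.
    data = [([0., 0.], [0.]), ([0., 1.], [1.]),
            ([1., 0.], [1.]), ([1., 1.], [0.])]
    for epoch in range(2000):
        kept = [train_step(np.array(x), np.array(t)) for x, t in data]

In this vectorized sketch the masked multiplies still execute; the savings the paper targets come from hardware or looped implementations that skip the multiply-accumulates for zeroed deltas entirely.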