• DocumentCode
    2213856
  • Title
    Stochastic Hebbian learning with binary synapses
  • Author
    Barrows, Geoffrey L.
  • Author_Institution
    Tactical Electron. Warfare Div., Naval Res. Lab., Washington, DC, USA
  • Volume
    1
  • fYear
    1998
  • fDate
    4-8 May 1998
  • Firstpage
    525
  • Abstract
    This paper explores a variant of Hebbian learning in which binary synapses are updated stochastically rather than deterministically. In this variant, a single potentiation or depression event sets a synapse weight to “one” or “zero”, respectively, with a fixed, finite probability, provided the synapse does not already hold that value. This learning rule is compared to the conventional Hebbian rule, in which a continuously valued synapse moves a fraction of the way towards 1.0 or 0.0. It is shown that, given a set of input-output pattern pairs, the expected value of a particular synapse is the same under both learning rules. Moreover, as the network size and the input activity levels increase, the signal-to-noise ratio of the dendritic sums approaches infinity. These stochastic binary synapses are presented as a viable mechanism for the VLSI implementation of Hebbian-based neural networks.
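    A minimal sketch (not from the paper) of the two update rules described in the abstract, in Python: the probability p = 0.3, the initial weights, and the random potentiation/depression sequence are illustrative assumptions. The snippet implements both rules and checks empirically that the mean of many binary synapses tracks the continuously valued weight, as the abstract claims.

        import numpy as np

        rng = np.random.default_rng(0)

        def stochastic_update(w, potentiate, p):
            # Binary synapse: jump to 1 (potentiation) or 0 (depression)
            # with probability p; a synapse already at the target is unchanged.
            target = 1.0 if potentiate else 0.0
            return target if (w != target and rng.random() < p) else w

        def continuous_update(w, potentiate, p):
            # Conventional rule: move a fraction p of the way towards 1.0 or 0.0.
            target = 1.0 if potentiate else 0.0
            return w + p * (target - w)

        # Monte Carlo check: drive one continuous synapse and many binary
        # synapses with the same potentiate/depress event sequence.
        p = 0.3
        n_trials = 100_000
        events = rng.random(20) < 0.5                       # True = potentiate
        w_cont = 0.5
        w_bin = (rng.random(n_trials) < 0.5).astype(float)  # E[w_bin] = 0.5

        for potentiate in events:
            w_cont = continuous_update(w_cont, potentiate, p)
            # Vectorised stochastic_update over all n_trials binary synapses;
            # "setting" a synapse already at the target value is a no-op.
            target = 1.0 if potentiate else 0.0
            flip = rng.random(n_trials) < p
            w_bin = np.where(flip, target, w_bin)

        print(f"continuous weight:  {w_cont:.4f}")
        print(f"mean binary weight: {w_bin.mean():.4f}")

    The agreement is expected: for the stochastic rule, E[w'] = p*target + (1 - p)*E[w] = E[w] + p*(target - E[w]), which is exactly the continuous update applied to the expectation.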
  • Keywords
    Hebbian learning; neural nets; pattern classification; probability; binary synapses; dendritic sums; depression event; neural networks; potentiation; signal noise ratio; stochastic Hebbian learning; Capacitors; Circuits; Computer networks; Dynamic range; Electronic warfare; Hebbian theory; Laboratories; Neural networks; Stochastic processes; Very large scale integration
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Title
    1998 IEEE International Joint Conference on Neural Networks Proceedings (IEEE World Congress on Computational Intelligence)
  • Conference_Location
    Anchorage, AK, USA
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-4859-1
  • Type
    conf
  • DOI
    10.1109/IJCNN.1998.682322
  • Filename
    682322