  • DocumentCode
    3601149
  • Title
    Memristor-Based Multilayer Neural Networks With Online Gradient Descent Training
  • Author
    Soudry, Daniel ; Di Castro, Dotan ; Gal, Asaf ; Kolodny, Avinoam ; Kvatinsky, Shahar
  • Author_Institution
    Dept. of Statistics, Center for Theor. Neurosci., Columbia Univ., New York, NY, USA
  • Volume
    26
  • Issue
    10
  • fYear
    2015
  • Firstpage
    2408
  • Lastpage
    2421
  • Abstract
    Learning in multilayer neural networks (MNNs) relies on continuous updating of large matrices of synaptic weights by local rules. Such locality can be exploited for massive parallelism when implementing MNNs in hardware. However, these update rules require a multiply-and-accumulate operation for each synaptic weight, which is challenging to implement compactly using CMOS. In this paper, a method for performing these update operations simultaneously (incremental outer products) using memristor-based arrays is proposed. The method is based on the fact that, for a sufficiently small increment, the conductivity of a memristor changes approximately in proportion to the magnitude of an applied voltage pulse multiplied by its duration. The proposed method uses a synaptic circuit composed of a small number of components per synapse: one memristor and two CMOS transistors. This circuit is expected to consume between 2% and 8% of the area and static power of previous CMOS-only hardware alternatives. Such a circuit can compactly implement hardware MNNs trainable by scalable algorithms based on online gradient descent (e.g., backpropagation). The utility and robustness of the proposed memristor-based circuit are demonstrated on standard supervised learning tasks.
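    A minimal numerical sketch (not the authors' circuit model) of the incremental outer-product update described in the abstract, assuming the linearized device behavior dG ≈ k · V · Δt; the constant k, the learning rate eta, and the signal names x and delta are illustrative assumptions:

    ```python
    import numpy as np

    # For a small update, a memristor's conductance change is approximated as
    #   dG ≈ k * V * dt   (pulse magnitude times pulse duration),
    # so encoding one gradient factor as a row voltage and the other as a
    # column pulse duration updates the whole array by an outer product at once.
    k = 1e-3                                 # assumed device constant (arbitrary units)
    eta = 0.1                                # assumed learning rate folded into pulse scaling
    rng = np.random.default_rng(0)

    G = rng.uniform(0.1, 1.0, size=(4, 3))   # memristor conductances = synaptic weights
    x = rng.standard_normal(4)               # layer input (sets pulse magnitudes, assumption)
    delta = rng.standard_normal(3)           # backpropagated error (sets pulse durations, assumption)

    # Each cell (i, j) sees voltage ~ x[i] for a duration ~ eta * delta[j], so
    # dG[i, j] ≈ k * x[i] * (eta * delta[j]) -- an incremental outer product,
    # i.e. one gradient-descent step applied to the entire array in parallel.
    dG = k * np.outer(x, eta * delta)
    G += dG
    ```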
  • Keywords
    backpropagation; memristors; neural nets; parallel processing; transistors; CMOS transistors; MNN; memristor-based arrays; memristor-based multilayer neural network; online gradient descent training; scalable algorithms; standard supervised learning task; synaptic circuit; synaptic weights; voltage pulse; Algorithm design and analysis; Backpropagation; Hardware; Memristors; Training; Transistors; Backpropagation; hardware; memristive systems; memristor; multilayer neural networks (MNNs); stochastic gradient descent; synapse
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Neural Networks and Learning Systems
  • Publisher
    IEEE
  • ISSN
    2162-237X
  • Type
    jour
  • DOI
    10.1109/TNNLS.2014.2383395
  • Filename
    7010034