• DocumentCode
    3455331
  • Title
    Low area overhead in-situ training approach for memristor-based classifier
  • Author
    Zamanidoost, Elham; Klachko, Michael; Strukov, Dmitri; Kataeva, Irina
  • Author_Institution
    Electr. & Comput. Eng. Dept., Univ. of California Santa Barbara, Santa Barbara, CA, USA
  • fYear
    2015
  • fDate
    8-10 July 2015
  • Firstpage
    139
  • Lastpage
    142
  • Abstract
    We propose a combination of "dropout" and "Manhattan Rule" training algorithms for memristive crossbar neural networks to reduce the circuit area overhead of in-situ training. Using an accurate phenomenological model of memristive devices, we show that this combination achieves a 0.7% misclassification rate on the MNIST benchmark, which is comparable to the best reported results. At the same time, the considered training approach reduces the size of the memory circuits required to store intermediate weight adjustments during training, which are the largest area overhead component, by as much as 40%, at the cost of 16% longer training time compared to the baseline crossbar-circuit-compatible "Manhattan Rule" training. Further reduction of the memory circuit area overhead is possible, but at the expense of inferior classification performance.
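    A minimal sketch of the training scheme described in the abstract, written in plain NumPy and assuming textbook definitions of sign-based ("Manhattan Rule") weight updates and dropout; the layer size, learning rate, and dropout probability below are illustrative choices, not values from the paper:

        import numpy as np

        rng = np.random.default_rng(0)

        def manhattan_dropout_step(W, x, target, lr=0.01, drop_prob=0.4):
            """One sign-only ("Manhattan Rule") weight update with input dropout.

            Because sign(0) == 0, weights attached to dropped units receive no
            adjustment, so fewer intermediate weight adjustments need to be
            stored during training.
            """
            mask = (rng.random(x.shape) > drop_prob).astype(float)  # dropout mask
            x_d = x * mask                                          # dropped inputs

            y = np.tanh(W @ x_d)        # forward pass through the crossbar layer
            err = target - y            # output error for a squared-error loss
            grad = np.outer(err, x_d)   # error-input outer product; its sign sets the update direction

            W += lr * np.sign(grad)     # fixed-magnitude update, compatible with crossbar pulses
            return W

        # Toy usage: 10 output classes, 784-dimensional input (MNIST-sized).
        W = rng.normal(scale=0.01, size=(10, 784))
        x = rng.random(784)
        t = np.zeros(10); t[3] = 1.0
        W = manhattan_dropout_step(W, x, t)

    The sign-only update is what makes the scheme crossbar-friendly (every selected device receives an identical fixed-amplitude pulse, only its polarity varies), and dropout shrinks the set of weights adjusted per step, which is where the reported memory-area savings come from.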
  • Keywords
    CMOS memory circuits; memristor circuits; memristors; neural nets; intermediate weight adjustments; low area overhead in-situ training approach; memory circuit area overhead; memristive crossbar neural networks; memristive devices; memristor-based classifier; Decision support systems; Economic indicators; Nanoscale devices; Crossbar; Dropout training; Manhattan Rule; Memristor; Multilayer Perceptron; Pattern Classification
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2015 IEEE/ACM International Symposium on Nanoscale Architectures (NANOARCH)
  • Conference_Location
    Boston, MA
  • Type
    conf
  • DOI
    10.1109/NANOARCH.2015.7180601
  • Filename
    7180601