Title :
Enabling back propagation training of memristor crossbar neuromorphic processors
Author :
Hasan, Ragib; Taha, Tarek M.
Author_Institution :
Univ. of Dayton, Dayton, OH, USA
Abstract :
Recent studies have shown that memristor-crossbar-based neuromorphic hardware enables high-performance implementations of neural networks at low power and in small chip area. This paper presents circuits to train a cascaded set of memristor crossbars representing a multi-layered neural network. The circuits implement back-propagation training and would enable on-chip training of memristor crossbars. On-chip training can be necessary to overcome the effects of device variability and of alternate current paths within crossbars used as neural networks. We model the memristor crossbars in SPICE in order to capture alternate current paths and the impact of wire resistance. Our design can be scaled to multiple neural layers and multiple output neurons. Through detailed SPICE simulations, we demonstrate the training of neural networks with up to three layers that evaluate non-linearly separable functions. To the best of our knowledge, this is the first study in the literature to examine back-propagation-based training of memristor crossbar circuits. The impact of this work would be to enable the design of highly energy-efficient and compact neuromorphic processing systems that can be trained to implement large deep networks (such as deep belief networks).
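For orientation, the sketch below is a minimal software analogue of the back-propagation training that the paper's circuits realize in analog hardware; it is not the authors' circuit-level method. It assumes each layer's weight matrix stands in for an ideal memristor crossbar (ignoring the wire resistance and alternate current paths the paper models in SPICE), and uses XOR as an example non-linearly separable function; the network size and learning rate are illustrative choices.

```python
# Minimal back-propagation sketch mirroring what cascaded crossbars compute.
# Assumptions: ideal crossbars (no sneak paths or wire resistance), sigmoid
# neurons, XOR as the non-linearly separable target function.
import numpy as np

rng = np.random.default_rng(0)

# XOR training set: inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Two "crossbar" weight matrices: 2 inputs -> 4 hidden neurons -> 1 output.
W1 = rng.uniform(-0.5, 0.5, (2, 4))
W2 = rng.uniform(-0.5, 0.5, (4, 1))

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

lr = 0.5
for epoch in range(20000):
    # Forward pass: each matrix-vector product is what a crossbar evaluates.
    h = sigmoid(X @ W1)
    y = sigmoid(h @ W2)

    # Backward pass: output-layer error, then error propagated to the hidden layer.
    delta_out = (y - T) * y * (1 - y)
    delta_hid = (delta_out @ W2.T) * h * (1 - h)

    # Weight updates; in hardware these correspond to memristor conductance updates.
    W2 -= lr * h.T @ delta_out
    W1 -= lr * X.T @ delta_hid

print(np.round(sigmoid(sigmoid(X @ W1) @ W2), 3))  # approaches [0, 1, 1, 0]
```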
Keywords :
SPICE; backpropagation; digital simulation; memristors; multiprocessing systems; neural nets; SPICE simulations; alternate current paths; compact neuromorphic processing systems; deep belief networks; energy efficient neuromorphic processing systems; large deep networks; memristor crossbar based neuromorphic hardware; memristor crossbar circuit back-propagation training; memristor crossbar neuromorphic processors; memristor crossbar on-chip training; multilayered neural network; multiple neural layers; multiple output neurons; nonlinearly separable functions; three layered neural networks; wire resistance; Biological neural networks; Inverters; Memristors; Neuromorphics; Neurons; Power demand; Training; Neural networks; memristor crossbars; neuromorphic architectures;
Conference_Title :
Neural Networks (IJCNN), 2014 International Joint Conference on
Conference_Location :
Beijing
Print_ISBN :
978-1-4799-6627-1
DOI :
10.1109/IJCNN.2014.6889893