DocumentCode :
3661458
Title :
Training neural hardware with noisy components
Author :
Fred Rothganger;Brian R. Evans;James B. Aimone;Erik P. DeBenedictis
Author_Institution :
Sandia National Laboratories, Albuquerque, New Mexico 87185, USA
fYear :
2015
fDate :
7/1/2015
Firstpage :
1
Lastpage :
8
Abstract :
Some next-generation computing devices may consist of resistive memory arranged as a crossbar. Currently, the dominant approach is to use crossbars as the weight matrix of a neural network and to train them with learning algorithms that require small incremental weight updates, such as gradient descent (for example, backpropagation). Using real-world measurements, we demonstrate that resistive memory devices are unlikely to support such learning methods. As an alternative, we offer a random search algorithm tailored to the measured characteristics of our devices.
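The paper's algorithm is not reproduced in this record; as a rough, hypothetical sketch of the random-search idea the abstract describes (discrete, noisy weight writes instead of small incremental gradient updates), the following minimal example is offered. The level count, write-noise model, perturbation scale, and toy regression task are all assumptions for illustration, not the authors' method.

```python
import numpy as np

# Hypothetical sketch: random-search training on a noisy, discretized
# "crossbar" weight matrix. All device parameters below are assumed.

rng = np.random.default_rng(0)

# Toy single-layer network: 4 inputs -> 2 outputs.
X = rng.normal(size=(32, 4))
true_W = rng.normal(size=(4, 2))
Y = X @ true_W

# Resistive devices are assumed to support only a few coarse conductance
# levels, and each write lands on a level with some error.
LEVELS = np.linspace(-1.0, 1.0, 16)   # assumed discrete weight levels
WRITE_NOISE = 0.05                    # assumed write-noise std dev

def write(W):
    """Snap weights to the nearest allowed level, then add write noise."""
    idx = np.abs(W[..., None] - LEVELS).argmin(axis=-1)
    return LEVELS[idx] + rng.normal(scale=WRITE_NOISE, size=W.shape)

def loss(W):
    return np.mean((X @ W - Y) ** 2)

# Random search: propose a random perturbation, keep it only if it helps.
W = write(np.zeros((4, 2)))
best = loss(W)
for step in range(2000):
    candidate = write(W + rng.normal(scale=0.2, size=W.shape))
    c_loss = loss(candidate)
    if c_loss < best:
        W, best = candidate, c_loss

print(f"final mse: {best:.4f}")
```

The accept-if-better loop avoids the tiny, precise weight increments that gradient descent assumes, which is the contrast the abstract draws for devices with coarse, noisy programming.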
Keywords :
"Resistance","Backpropagation","Hardware"
Publisher :
ieee
Conference_Titel :
Neural Networks (IJCNN), 2015 International Joint Conference on
Electronic_ISSN :
2161-4407
Type :
conf
DOI :
10.1109/IJCNN.2015.7280772
Filename :
7280772