DocumentCode :
3323057
Title :
A stochastic architecture for neural nets
Author :
Van den Bout, David E. ; Miller, T.K.
Author_Institution :
Dept. of Electr. & Comput. Eng., North Carolina State Univ., Raleigh, NC, USA
fYear :
1988
fDate :
24-27 July 1988
Firstpage :
481
Abstract :
A stochastic digital architecture is described for simulating the operation of Hopfield neural networks. This architecture provides reprogrammability (since synaptic weights are stored in digital shift registers), large dynamic range (by using either fixed- or floating-point weights), annealing (by coupling variable neuron gains with noise from stochastic arithmetic), high execution speeds (approximately N*10^8 connections per second), expandability (by cascading multiple chips to host large networks), and practicality (by building with very conservative MOS device technologies).
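The annealing mechanism summarized above, weighted sums formed with stochastic (bit-stream) arithmetic whose inherent noise is shaped by a variable neuron gain, can be sketched in software. The Python sketch below is illustrative only and is not the paper's circuit: the bipolar stream encoding, the tanh gain function, the stream length of 64, and the gain schedule are all assumptions made for the example.

```python
import numpy as np

def to_bipolar_stream(value, length, rng):
    """Encode a value in [-1, 1] as a bipolar stochastic bit stream:
    each bit is +1 with probability (1 + value) / 2, otherwise -1."""
    p = (1.0 + value) / 2.0
    return np.where(rng.random(length) < p, 1.0, -1.0)

def stochastic_hopfield_sweep(states, weights, gain, stream_len, rng):
    """One asynchronous update sweep of a Hopfield network using
    stochastic arithmetic. The product of two bipolar streams reduces to
    elementwise sign agreement (a digital XNOR), and the finite stream
    length supplies the noise that drives the annealing."""
    n = len(states)
    new_states = states.copy()
    for i in rng.permutation(n):
        acc = 0.0
        for j in range(n):
            if i == j:
                continue
            w_stream = to_bipolar_stream(weights[i, j], stream_len, rng)
            s_stream = to_bipolar_stream(new_states[j], stream_len, rng)
            acc += np.mean(w_stream * s_stream)  # stochastic multiply-accumulate
        # Variable neuron gain: low gain keeps the decision noisy ("hot");
        # raising the gain sharpens it toward a hard threshold ("cold").
        p_on = (1.0 + np.tanh(gain * acc)) / 2.0
        new_states[i] = 1.0 if rng.random() < p_on else -1.0
    return new_states

# Usage: store one pattern with a Hebbian rule, then anneal by raising the gain.
rng = np.random.default_rng(0)
n = 8
pattern = rng.choice([-1.0, 1.0], size=n)
weights = np.outer(pattern, pattern) / n
np.fill_diagonal(weights, 0.0)
states = rng.choice([-1.0, 1.0], size=n)
for gain in np.linspace(0.5, 8.0, 20):  # illustrative annealing schedule
    states = stochastic_hopfield_sweep(states, weights, gain, 64, rng)
```

In hardware the same effect comes for free: short stochastic bit streams are noisy estimates of the weighted sum, and lengthening the streams or raising the neuron gain plays the role of lowering the temperature.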
Keywords :
MOS integrated circuits; digital arithmetic; microprocessor chips; neural nets; parallel architectures; Hopfield neural networks; MOS; expandability; reprogrammability; simulated annealing; stochastic arithmetic; stochastic digital architecture; variable neuron gains
fLanguage :
English
Publisher :
ieee
Conference_Title :
IEEE International Conference on Neural Networks, 1988
Conference_Location :
San Diego, CA, USA
Type :
conf
DOI :
10.1109/ICNN.1988.23882
Filename :
23882