Title :
An all-analog expandable neural network LSI with on-chip backpropagation learning
Author :
Morie, Takashi ; Amemiya, Yoshihito
Author_Institution :
NTT LSI Labs., Atsugi, Japan
Date :
9/1/1994
Abstract :
This paper proposes an all-analog neural network LSI architecture and a new learning procedure called contrastive backpropagation learning. In analog neural LSIs with on-chip backpropagation learning, the offset errors that inevitably arise in the learning circuits seriously degrade learning performance. With the learning procedure proposed here, these offset errors are largely canceled and their effect on learning performance is minimized. The paper also describes a prototype LSI with 9 neurons and 81 synapses based on the proposed architecture; its fully analog, fully parallel design allows continuous neuron-state and continuous-time operation. An analog neural system built by combining such LSIs with feedback connections is therefore promising for implementing continuous-time models of recurrent networks with real-time learning.
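The abstract's key point is that offset errors introduced by the analog learning circuits can be canceled by a contrastive procedure. The paper itself gives the circuit-level details; the snippet below is only a minimal numerical sketch of the underlying idea, assuming that two measurements taken in contrasting phases share the same fixed offset, so their difference removes it. All variable names, the offset value, and the learning rate are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the authors' circuit-level procedure) of offset cancellation:
# if each analog measurement carries the same fixed offset, forming the weight
# update as the difference of two such measurements cancels that offset.
import numpy as np

rng = np.random.default_rng(0)

true_gradient = rng.normal(size=5)   # ideal backprop gradient for 5 synapses (hypothetical)
offset = 0.3 * np.ones(5)            # fixed offset added by the analog learning circuits (assumed)
eta = 0.1                            # learning rate (assumed)

# Ordinary on-chip update: a single measurement, so the offset enters the update directly.
measured_single = true_gradient + offset
update_single = -eta * measured_single

# Contrastive-style update: two measurements taken in contrasting phases share the
# same offset, so subtracting them leaves (approximately) only the gradient term.
phase_a = true_gradient + offset     # e.g., phase with the teaching signal applied
phase_b = np.zeros(5) + offset       # e.g., reference phase without it
update_contrastive = -eta * (phase_a - phase_b)

print("residual offset error, single-phase update: ",
      np.max(np.abs(update_single - (-eta * true_gradient))))
print("residual offset error, contrastive update:  ",
      np.max(np.abs(update_contrastive - (-eta * true_gradient))))
```

Running this shows the single-phase update retains an error proportional to the offset, while the contrastive difference drives that residual to essentially zero, which is the effect the abstract attributes to contrastive backpropagation learning.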
Keywords :
CMOS integrated circuits; analogue processing circuits; backpropagation; errors; feedback; large scale integration; linear integrated circuits; neural chips; parallel architectures; LSI architecture; all-analog expandable neural network; continuous neuron-state operation; continuous-time operation; contrastive backpropagation learning; feedback connections; fully parallel property; offset errors; on-chip backpropagation learning; real-time learning; recurrent networks; synapses; Backpropagation; Circuits; Degradation; Large scale integration; Network-on-a-chip; Neural networks; Neurofeedback; Neurons; Prototypes; Real time systems;
Journal_Title :
IEEE Journal of Solid-State Circuits