DocumentCode :
2699774
Title :
Effects of weight discretization on the back propagation learning method: algorithm design and hardware realization
Author :
Caviglia, Daniele D. ; Valle, Maurizio ; Bisio, Giacomo M.
fYear :
1990
fDate :
17-21 June 1990
Firstpage :
631
Abstract :
An architectural configuration for the back-propagation (BP) algorithm is illustrated. The circuit solutions for the basic blocks are presented, and the effect of weight discretization on the BP algorithm is analyzed. It is demonstrated through simulations that the BP algorithm can operate successfully with discretized weights. In particular, better performance is achieved with an exponential discretization, i.e., the weight strength varies exponentially with the controlling variable (voltage). The discretized voltage values differ by an amount large enough that the weights can be maintained with a refresh technique combined with a multilevel dynamic memory, which entails a particularly low wiring cost. A quasi-analog adaptive architecture, properly matched to the BP algorithm, is devised, and its CMOS circuit implementation is detailed. The mechanism controlling weight changes is simple enough to be reproduced locally at each synapse site, thus meeting one of the requirements for an efficient storage technology in analog VLSI.
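The following is a minimal sketch, not the authors' implementation, of the idea the abstract describes: running BP while the network only ever "sees" weights snapped to a signed exponential grid (mimicking a weight strength that varies exponentially with a controlling voltage). The grid constants (w_min, ratio, n_levels), the 2-2-1 XOR task, and the use of continuous master weights for update accumulation are all illustrative assumptions, not details from the paper.

import numpy as np

def exp_quantize(w, w_min=0.05, ratio=1.5, n_levels=14):
    """Snap each weight to the nearest level of a signed exponential grid
    {0} U {+/- w_min * ratio**k, k = 0..n_levels-1}. All constants are
    illustrative assumptions, not values taken from the paper."""
    levels = w_min * ratio ** np.arange(n_levels)
    idx = np.argmin(np.abs(np.abs(w)[..., None] - levels), axis=-1)
    q = np.sign(w) * levels[idx]
    return np.where(np.abs(w) < w_min / 2, 0.0, q)  # dead zone around zero

def train_xor(epochs=20000, lr=0.5, seed=1):
    """2-2-1 sigmoid network on XOR. Continuous 'master' weights receive the
    BP updates; the forward pass uses only their discretized versions."""
    rng = np.random.default_rng(seed)
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]])
    W1 = rng.normal(0, 1.0, (3, 2))   # rows: 2 inputs + bias
    W2 = rng.normal(0, 1.0, (3, 1))   # rows: 2 hidden units + bias
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    Xb = np.hstack([X, np.ones((4, 1))])
    for _ in range(epochs):
        Q1, Q2 = exp_quantize(W1), exp_quantize(W2)   # discretized weights
        h = sig(Xb @ Q1)
        hb = np.hstack([h, np.ones((4, 1))])
        out = sig(hb @ Q2)
        d2 = (out - y) * out * (1 - out)              # output-layer delta
        d1 = (d2 @ Q2[:2].T) * h * (1 - h)            # hidden-layer delta
        W2 -= lr * hb.T @ d2                          # update master weights
        W1 -= lr * Xb.T @ d1
    return out

if __name__ == "__main__":
    print(train_xor().round(2))   # outputs should move toward [0, 1, 1, 0]

Keeping continuous master weights and quantizing only on the forward pass is one common way to simulate discretized weights in software; in the hardware described by the paper, the discretized values themselves are what the multilevel dynamic memory stores and refreshes.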
Keywords :
CMOS integrated circuits; VLSI; learning systems; neural nets; parallel architectures; CMOS circuit implementation; VLSI; algorithm design; architectural configuration; hardware realization; multilevel dynamic memory; neural network; propagation learning method; refresh technique; weight discretization;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
Type :
conf
DOI :
10.1109/IJCNN.1990.137958
Filename :
5726915