Title :
Circuit implementation of trainable neural networks employing both supervised and unsupervised techniques
Author :
Hasler, Paul ; Akers, Lex
Author_Institution :
Center for Solid State Electron. Res., Arizona State Univ., Tempe, AZ, USA
Abstract :
An efficient hardware implementation of a training algorithm is presented, combining supervised and unsupervised techniques. In VLSI circuits, effects including random offsets and mismatch, system distortion, frequency response, and temperature deviations perturb the system outputs. Analysis of the generalized Hebbian algorithm shows that these small deviations produce only small perturbations of the output statistics and of the weight matrix. In addition, the conjugate-gradient optimization algorithm is formulated in continuous time. Both the generalized Hebbian algorithm and the optimization system can be implemented efficiently on a mesh of synapses. The low distortion and high bandwidth (>100 MHz) of the matrix operations, together with a high-performance analog memory, indicate high performance for the generalized Hebbian algorithm. Mathematical analysis shows that the error in the output statistics is less than 1%.
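For reference, a minimal discrete-time sketch of the generalized Hebbian algorithm (Sanger's rule) that the abstract refers to is given below; this is not the paper's circuit implementation, which operates in continuous time on an analog synapse mesh, and the function name, learning rate, and offset model are illustrative assumptions.

```python
import numpy as np

def gha_step(W, x, eta=1e-3, offset=0.0):
    """One generalized Hebbian (Sanger's rule) update.

    W      : (m, n) weight matrix; its rows converge toward the top-m
             principal components of the input covariance.
    x      : (n,) zero-mean input sample.
    eta    : learning rate (assumed value for illustration).
    offset : optional additive output perturbation, standing in for the
             small random offsets/mismatch the abstract analyzes.
    """
    y = W @ x + offset                          # (possibly perturbed) outputs
    # Hebbian term minus lower-triangular decorrelation term (Sanger's rule)
    W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

# Usage sketch: small output offsets yield only small changes in the
# learned weights, consistent with the perturbation argument in the abstract.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(2, 4))
for _ in range(5000):
    x = rng.multivariate_normal(np.zeros(4), np.diag([4.0, 2.0, 1.0, 0.5]))
    W = gha_step(W, x, offset=rng.normal(scale=0.01, size=2))
```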
Keywords :
Hebbian learning; VLSI; analogue storage; neural chips; VLSI circuits; analog memory; conjugate-gradient optimization algorithm; frequency response; generalized Hebbian algorithm; hardware implementation; matrix operations; output statistics; random offsets; supervised techniques; system distortion; temperature deviations; trainable neural networks; training algorithm; unsupervised techniques; weight matrix; Algorithm design and analysis; Analog memory; Bandwidth; Circuits; Frequency response; Hardware; Neural networks; Statistical analysis; Temperature; Very large scale integration;
Conference_Title :
Proceedings of the 1992 IEEE International Symposium on Circuits and Systems (ISCAS '92)
Conference_Location :
San Diego, CA
Print_ISBN :
0-7803-0593-0
DOI :
10.1109/ISCAS.1992.230199