DocumentCode
288476
Title
Linear quantization of Hebbian-type associative memories in interconnection implementations
Author
Chung, Pau-Choo ; Tsai, Ching-Tsorng ; Sun, Yung-Nien
Author_Institution
Dept. of Electr. Eng., Nat. Cheng Kung Univ., Tainan, Taiwan
Volume
2
fYear
1994
fDate
27 Jun-2 Jul 1994
Firstpage
1092
Abstract
The effects of linearly quantized Hebbian-type associative memories (HAMs) on storage capacity and hardware implementations are explored in this paper. Under linear quantization, the interconnection weights are quantized into a small number of evenly spaced levels. The analysis focuses mainly on situations where only a limited accuracy range can be achieved in hardware implementations. Simulation and theoretical results show that the number of quantization levels required is small compared with the number of possible interconnection values. Linear quantization of HAMs is therefore worthwhile in hardware implementations.
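For illustration, a minimal sketch of the idea described in the abstract, assuming bipolar stored patterns, the standard Hebbian outer-product rule, and synchronous sign-threshold recall; the function names, level spacing, and parameter choices below are assumptions for illustration, not details taken from the paper:

    import numpy as np

    def hebbian_weights(patterns):
        # Hebbian outer-product rule: W = sum over stored patterns of x x^T,
        # with self-connections (the diagonal) set to zero.
        # patterns: (M, N) array of bipolar (+1/-1) vectors.
        W = patterns.T @ patterns
        np.fill_diagonal(W, 0)
        return W

    def linear_quantize(W, levels):
        # Map each weight to the nearest of `levels` evenly spaced values
        # spanning [-w_max, +w_max]; an odd number of levels keeps zero as a level.
        w_max = np.abs(W).max()
        if w_max == 0:
            return W.copy()
        step = 2.0 * w_max / (levels - 1)
        return np.clip(np.round(W / step) * step, -w_max, w_max)

    def recall(W, probe, iterations=10):
        # Synchronous sign-threshold recall from a (possibly noisy) probe vector.
        s = probe.astype(float).copy()
        for _ in range(iterations):
            s = np.sign(W @ s)
            s[s == 0] = 1.0
        return s

    # Example: 5 stored patterns, 64 neurons, weights quantized to 7 levels.
    rng = np.random.default_rng(0)
    patterns = rng.choice([-1.0, 1.0], size=(5, 64))
    Wq = linear_quantize(hebbian_weights(patterns), levels=7)
    print(np.array_equal(recall(Wq, patterns[0]), patterns[0]))

With a light storage load relative to the network size, recall of a stored pattern typically still succeeds after the weights are reduced to a few levels, which is the kind of behavior the paper's results address.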
Keywords
Hebbian learning; content-addressable storage; neural chips; neural nets; quantisation (signal); HAMs; Hebbian-type associative memories; hardware implementations; interconnection implementations; interconnection weights; linear quantization; linear quantized Hebbian-type associative memories; quantization levels; simulation; storage capacity; Analog circuits; Associative memory; Digital circuits; Hardware; Hebbian theory; Integrated circuit interconnections; Neurons; Quantization; Very large scale integration;
fLanguage
English
Publisher
ieee
Conference_Titel
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location
Orlando, FL
Print_ISBN
0-7803-1901-X
Type
conf
DOI
10.1109/ICNN.1994.374335
Filename
374335