DocumentCode :
1564778
Title :
The Two-level Quantization Strategy of Quadratic Hebbian-type Associative Memories
Author :
Liaw, Chishyan ; Tsai, Ching-Tsorng ; Ko, Chao-Hu
Author_Institution :
Dept. of Comput. Sci. & Inf. Eng., Tunghai Univ., Taichung
Volume :
2
fYear :
2005
Firstpage :
899
Lastpage :
904
Abstract :
A two-level quantization strategy for quadratic Hebbian-type associative memories is proposed and its performance is analyzed. The strategy reduces the range of interconnection values and makes hardware implementation of Hebbian-type associative memories more feasible. The probabilities of direct convergence of the quantized networks are explored, and simulations are used to verify the proposed strategy. The results show that two-level quantized quadratic Hebbian-type associative memories have approximately the same convergence capability as the original networks when the original recall capacity is high. A comparison of first-order and second-order networks shows that the second-order networks outperform the first-order networks after either two-level or three-level quantization.
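To illustrate the idea described in the abstract, the following is a minimal sketch of a quadratic (second-order) Hebbian-type associative memory with two-level quantization of the interconnection weights. It is not the authors' exact formulation: the quantization rule (mapping every weight to +1 or -1 by its sign), the function names, and the recall schedule are assumptions made for illustration only.

```python
# Illustrative sketch (assumed details, not the paper's exact method):
# second-order Hebbian weights quantized to two levels (+1 / -1).
import numpy as np

def train_quadratic_hebbian(patterns):
    """Second-order Hebbian weights T[i, j, k] = sum_mu x_i x_j x_k over stored patterns."""
    return np.einsum('mi,mj,mk->ijk', patterns, patterns, patterns)

def quantize_two_level(T):
    """Two-level quantization: map each interconnection value to +1 or -1 (assumed sign rule)."""
    return np.where(T >= 0, 1.0, -1.0)

def recall(T, probe, steps=5):
    """Synchronous recall: x_i <- sign(sum_jk T[i, j, k] * x_j * x_k)."""
    x = probe.copy()
    for _ in range(steps):
        x = np.sign(np.einsum('ijk,j,k->i', T, x, x))
        x[x == 0] = 1  # break ties toward +1
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    patterns = rng.choice([-1.0, 1.0], size=(3, 16))  # 3 bipolar patterns, 16 neurons
    T = train_quadratic_hebbian(patterns)
    Tq = quantize_two_level(T)

    probe = patterns[0].copy()
    probe[:2] *= -1  # flip two bits as noise
    print("original weights recall ok:", np.array_equal(recall(T, probe), patterns[0]))
    print("quantized weights recall ok:", np.array_equal(recall(Tq, probe), patterns[0]))
```

Under these assumptions, the quantized network stores only two distinct interconnection values, which is what makes the hardware implementation simpler, while recall behavior can remain close to that of the unquantized network when the loading is light.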
Keywords :
Hebbian learning; content-addressable storage; convergent capability; quadratic Hebbian-type associative memories; two-level quantization strategy; Associative memory; Chaos; Computer science; Convergence; Hardware; Neural networks; Performance analysis; Probability; Quantization; Very large scale integration;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks and Brain, 2005. ICNN&B '05. International Conference on
Conference_Location :
Beijing
Print_ISBN :
0-7803-9422-4
Type :
conf
DOI :
10.1109/ICNNB.2005.1614766
Filename :
1614766