Title :
A delta rule algorithm using double hysteresis thresholds for recurrent associative memory
Author :
Nakayama, Kenji ; Nishimura, Katsuaki
Author_Institution :
Dept. of Electr. & Comput. Eng., Kanazawa Univ., Japan
Date :
27 Jun-2 Jul 1994
Abstract :
An associative memory using fixed and variable hysteresis thresholds in the learning and recalling processes, respectively, has been proposed by the authors. This model achieves a large memory capacity and very low noise sensitivity. However, the relation between the weight change Δw and the hysteresis threshold ±T has not been well analyzed. In this paper, a new learning algorithm based on a delta rule is proposed. In order to stabilize the learning process, a method using double hysteresis thresholds is introduced. Unit states are updated using ±T, while the error used for adjusting the weights is evaluated using ±(T+dT); this amounts to "over-correction". Stable and fast convergence can be obtained. The relations between η = dT/T, the convergence rate, and the noise sensitivity are discussed, leading to an optimum selection of η. Furthermore, the order in which the training data are presented is optimized by taking their correlation into account. In the recalling process, a threshold control method is further proposed in order to achieve fast recall from noisy patterns.
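The learning and recall rules described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the learning rate, number of epochs, zeroed self-connections, the exact delta-rule error against the enlarged threshold ±(T+dT), and the fixed-threshold recall loop (the paper's threshold control and variable recall thresholds are not modeled) are all assumptions made for illustration.

```python
import numpy as np

def train(patterns, T=1.0, eta=0.1, lr=0.01, epochs=100):
    """Delta-rule learning with double hysteresis thresholds (sketch).

    patterns : (P, N) array of bipolar (+1/-1) training patterns
    T        : hysteresis threshold used when updating unit states
    eta      : ratio dT/T; the error is evaluated against +/-(T + dT)
    """
    P, N = patterns.shape
    dT = eta * T
    W = np.zeros((N, N))
    for _ in range(epochs):
        for x in patterns:
            u = W @ x
            # "Over-correction": a unit is counted as erroneous unless its
            # net input clears the enlarged threshold T + dT on the side
            # matching its target state.
            err = (x * u) < (T + dT)
            target = x * (T + dT)
            delta = np.where(err, target - u, 0.0)   # delta-rule error
            W += lr * np.outer(delta, x)
        np.fill_diagonal(W, 0.0)   # no self-connections (assumption)
    return W

def recall(W, x0, T=1.0, max_iter=100):
    """Recall with a hysteresis threshold +/-T: a unit flips only when its
    net input crosses the threshold on the opposite side of its current
    state; inside the band [-T, +T] it keeps its previous state."""
    x = x0.copy()
    for _ in range(max_iter):
        u = W @ x
        new_x = np.where(u > T, 1, np.where(u < -T, -1, x))
        if np.array_equal(new_x, x):
            break
        x = new_x
    return x

# Example usage with hypothetical random bipolar patterns:
# X = np.sign(np.random.randn(5, 64)); X[X == 0] = 1
# W = train(X, T=1.0, eta=0.1)
# x_hat = recall(W, X[0], T=1.0)
```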
Keywords :
content-addressable storage; convergence; learning (artificial intelligence); recurrent neural nets; sensitivity; convergence; correlation; delta rule algorithm; double hysteresis thresholds; learning algorithm; noise sensitivity; recalling process; recurrent associative memory; weight change; Artificial neural networks; Associative memory; Convergence; Equations; Error correction; Hysteresis; Process control; Training data;
Conference_Title :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
DOI :
10.1109/ICNN.1994.374347