DocumentCode :
1528789
Title :
Improving Leung's bidirectional learning rule for associative memories
Author :
Lenze, Burkhard
Author_Institution :
Dept. of Comput. Sci., Univ. of Appl. Sci., Dortmund, Germany
Volume :
12
Issue :
5
fYear :
2001
fDate :
9/1/2001
Firstpage :
1222
Lastpage :
1226
Abstract :
Leung (1994) introduced a perceptron-like learning rule to enhance the recall performance of bidirectional associative memories (BAMs). He proved that his so-called bidirectional learning scheme always yields a solution within a finite number of learning iterations, provided that a solution exists. Unfortunately, in Leung's setting a solution exists only if the training set is strongly linearly separable by hyperplanes through the origin. We extend Leung's approach by considering conditionally strongly linearly separable sets, which allow separating hyperplanes that do not contain the origin. Moreover, we deal with BAMs that are generalized by so-called dilation and translation parameters, which enlarge their capacity while leaving their complexity almost unaffected. The whole approach leads to a generalized bidirectional learning rule that generates BAMs with dilation and translation performing perfectly on the training set whenever the latter satisfies the conditional strong linear separability assumption. Therefore, in the sense of Leung, we obtain an optimal learning strategy that contains Leung's initial idea as a special case.
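Note (illustrative sketch) :
To make the abstract concrete, the following is a minimal Python sketch of a perceptron-style bidirectional training loop in which bias ("translation") terms let the separating hyperplanes miss the origin and a scalar dilation factor rescales the net input. The function and parameter names (train_bam, dilation, tx, ty) are illustrative assumptions; the sketch does not reproduce the paper's exact generalized bidirectional learning rule or its convergence analysis.

import numpy as np

def sign(u):
    # bipolar threshold; ties resolved to +1
    return np.where(u >= 0, 1, -1)

def train_bam(X, Y, dilation=1.0, lr=1.0, max_epochs=100):
    # X: (p, n) bipolar input patterns, Y: (p, m) bipolar output patterns.
    # W: (m, n) weight matrix; ty, tx: translation (bias) vectors for the
    # forward (x -> y) and backward (y -> x) recall directions.
    p, n = X.shape
    _, m = Y.shape
    W = np.zeros((m, n))
    ty = np.zeros(m)
    tx = np.zeros(n)
    for _ in range(max_epochs):
        changed = False
        for x, y in zip(X, Y):
            # forward recall with dilation and translation
            y_hat = sign(dilation * (W @ x) + ty)
            err_y = y - y_hat                  # nonzero where forward recall fails
            if np.any(err_y):
                W += lr * np.outer(err_y, x)   # perceptron-style correction
                ty += lr * err_y
                changed = True
            # backward recall with dilation and translation
            x_hat = sign(dilation * (W.T @ y) + tx)
            err_x = x - x_hat                  # nonzero where backward recall fails
            if np.any(err_x):
                W += lr * np.outer(y, err_x)
                tx += lr * err_x
                changed = True
        if not changed:                        # perfect recall on the training set
            break
    return W, tx, ty

# Example use on a tiny bipolar training set (assumed data, for illustration only):
# X = np.array([[1, -1, 1], [-1, 1, 1]]); Y = np.array([[1, -1], [-1, 1]])
# W, tx, ty = train_bam(X, Y)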
Keywords :
content-addressable storage; learning (artificial intelligence); perceptrons; Leung learning rule; bidirectional associative memory; conditional strong linear separability; dilation; optimal learning strategy; perceptron; translation; Associative memory; Code standards; Computer science; Intelligent networks; Magnesium compounds; Neurons;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.950150
Filename :
950150