DocumentCode :
993140
Title :
Linear algebra approach to neural associative memories and noise performance of neural classifiers
Author :
Cherkassky, Vladimir ; Fassett, Karen ; Vassilas, Nikolaos
Author_Institution :
Dept. of Electr. Eng., Minnesota Univ., Minneapolis, MN, USA
Volume :
40
Issue :
12
fYear :
1991
fDate :
12/1/1991
Firstpage :
1429
Lastpage :
1435
Abstract :
The authors present an analytic evaluation of saturation and noise performance for a large class of associative memories based on matrix operations. The importance of using standard linear algebra techniques for evaluating noise performance of associative memories is emphasized. The authors present a detailed comparative analysis of the correlation matrix memory and the generalized inverse memory construction rules for auto-associative memory and neural classifiers. Analytic results for the noise performance of neural classifiers that can store several prototypes in one class are presented. The analysis indicates that for neural classifiers the simple correlation matrix memory provides better noise performance than the more complex generalized inverse memory.
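Illustration (not part of the original record): the two construction rules compared in the abstract, the correlation matrix (outer-product) memory and the generalized inverse (pseudoinverse) memory, can be sketched in a few lines of NumPy. The dimensions, noise level, and one-step recall procedure below are illustrative assumptions and do not reproduce the authors' analysis or experiments.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative parameters (assumptions, not taken from the paper)
    n, m = 64, 8          # vector dimension, number of stored prototypes
    flip_prob = 0.15      # probability of flipping each bit of a probe

    # Bipolar (+1/-1) prototype vectors stored as columns of X
    X = rng.choice([-1.0, 1.0], size=(n, m))

    # Correlation matrix memory (Hebbian / outer-product rule): W = X X^T
    W_corr = X @ X.T

    # Generalized inverse (pseudoinverse) memory: W = X X^+
    W_gi = X @ np.linalg.pinv(X)

    def recall(W, probe):
        """One-step auto-associative recall with sign thresholding."""
        return np.sign(W @ probe)

    def noisy(x):
        """Flip each component of a bipolar vector with probability flip_prob."""
        flips = rng.random(x.shape) < flip_prob
        return np.where(flips, -x, x)

    # Empirical comparison: fraction of perfectly recovered prototypes
    for name, W in [("correlation", W_corr), ("generalized inverse", W_gi)]:
        hits = sum(np.array_equal(recall(W, noisy(X[:, j])), X[:, j]) for j in range(m))
        print(f"{name:20s} memory: {hits}/{m} prototypes recovered")

Running such a sketch with varying noise levels and memory loadings is one simple empirical counterpart to the kind of analytic saturation and noise comparison the paper carries out.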
Keywords :
content-addressable storage; linear algebra; neural nets; performance evaluation; analytic evaluation; comparative analysis; correlation matrix memory; generalized inverse memory construction rules; linear algebra approach; neural associative memories; neural classifiers; noise performance; saturation; Associative memory; Degradation; Linear algebra; Neural networks; Performance analysis; Prototypes; Vectors;
fLanguage :
English
Journal_Title :
IEEE Transactions on Computers
Publisher :
IEEE
ISSN :
0018-9340
Type :
jour
DOI :
10.1109/12.106229
Filename :
106229