DocumentCode :
893171
Title :
On the number of memories that can be perfectly stored in a neural net with Hebb weights
Author :
Sussmann, H.J.
Author_Institution :
Dept. of Math., Rutgers Univ., New Brunswick, NJ, USA
Volume :
35
Issue :
1
fYear :
1989
fDate :
1/1/1989
Firstpage :
174
Lastpage :
178
Abstract :
Let {w_ij} be the weights of the connections of a neural network with n nodes, calculated from m data vectors v_1, ..., v_m in {1,-1}^n according to the Hebb rule. The author proves that if m is not too large relative to n and the v_k are random, then the w_ij constitute, with high probability, a perfect representation of the v_k, in the sense that the v_k are completely determined by the w_ij up to their sign. The conditions under which this is established turn out to be less restrictive than those under which it has been shown that the v_k can actually be recovered by letting the network evolve until equilibrium is attained. In the specific case where the entries of the v_k are independent and equal to 1 or -1 with probability 1/2, the condition on m is that m should not exceed n/(0.7 log n).
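As an illustration only (not taken from the paper), the following Python sketch builds the Hebb weight matrix w_ij = sum_k v_i^(k) v_j^(k) from random ±1 patterns, with m set to the n/(0.7 log n) bound quoted above, and counts how many stored patterns are fixed points of a single synchronous sign update; the choice of n and the fixed-point check are assumptions for demonstration, related to the stricter recovery criterion the abstract mentions rather than to the paper's representation result.

import numpy as np

def hebb_weights(patterns):
    # Hebb-rule weights w_ij = sum_k v_i^(k) v_j^(k), with zero diagonal.
    W = patterns.T @ patterns
    np.fill_diagonal(W, 0)
    return W

rng = np.random.default_rng(0)
n = 500                                        # number of nodes (illustrative choice)
m = int(n / (0.7 * np.log(n)))                 # bound on m quoted in the abstract
patterns = rng.choice([-1, 1], size=(m, n))    # m random vectors in {1,-1}^n

W = hebb_weights(patterns)

# Count patterns that are unchanged by one synchronous sign update of the net.
stable = sum(np.array_equal(np.sign(W @ v), v) for v in patterns)
print(f"n={n}, m={m}: {stable}/{m} patterns are one-step fixed points")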
Keywords :
content-addressable storage; information theory; neural nets; Hebb weights; associative memory; intelligent networks; mathematics; neurons
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.42187
Filename :
42187