DocumentCode
1301413
Title
Long-term attraction in higher order neural networks
Author
Burshtein, David
Author_Institution
Dept. of Electr. Eng., Tel Aviv Univ., Israel
Volume
9
Issue
1
fYear
1998
fDate
1/1/1998 12:00:00 AM
Firstpage
42
Lastpage
50
Abstract
Recent results on the memory storage capacity of higher order neural networks indicate a significant improvement compared to the limited capacity of the Hopfield model. However, such results have so far been obtained under the restriction that the network must converge in a single iteration. This paper presents an indirect convergence (long-term attraction) analysis of higher order neural networks. Our main result is that for any κ_d < d!2^(d-1)/(2d)! and 0 ≤ ρ < 1/2, a Hebbian higher order neural network of order d with n neurons can store a random set of κ_d·n^d/log n fundamental memories such that almost all memories have an attraction radius of size ρn. If κ_d < d!2^(d-1)/((2d)!(d+1)), then all memories possess this property simultaneously. This indicates that the lower bounds on the long-term attraction capacities are larger than the corresponding direct convergence capacities by a factor of 1/(1-2ρ)^(2d). In addition, we upper bound the convergence rate (the number of iterations required to converge); this bound is asymptotically independent of n. Similar results are obtained for zero diagonal higher order neural networks.
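The following is a minimal illustrative sketch (not the paper's exact construction, and ignoring zero-diagonal variants) of the kind of model the abstract refers to: an order-d Hebbian associative memory in which the local field of neuron i is h_i = Σ_μ ξ_i^μ (ξ^μ · s)^d, probed with a state corrupted within a radius ρn. All names (d, n, m, rho, recall) and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical sketch of an order-d Hebbian higher order associative memory.
# Field of neuron i: h_i = sum_mu xi_i^mu * (xi^mu . s)^d  (compact form of the
# order-d Hebbian outer-product rule). Parameters below are illustrative only.

rng = np.random.default_rng(0)

d = 2          # interaction order
n = 200        # number of neurons
m = 30         # number of stored random +/-1 patterns
patterns = rng.choice([-1, 1], size=(m, n))

def recall(probe, max_iters=50):
    """Iterate synchronous threshold updates until a fixed point (or max_iters)."""
    s = probe.copy()
    for _ in range(max_iters):
        overlaps = patterns @ s                  # xi^mu . s for each pattern mu
        fields = patterns.T @ (overlaps ** d)    # order-d Hebbian local fields
        s_new = np.where(fields >= 0, 1, -1)
        if np.array_equal(s_new, s):             # converged to a fixed point
            break
        s = s_new
    return s

# Probe with a corrupted copy of pattern 0: flip a fraction rho of its bits.
rho = 0.2
flip = rng.choice(n, size=int(rho * n), replace=False)
probe = patterns[0].copy()
probe[flip] *= -1

recovered = recall(probe)
print("bits recovered correctly:", int(np.sum(recovered == patterns[0])), "of", n)
```

In this toy setting the corrupted probe typically iterates back to the stored pattern within a few updates, loosely illustrating the long-term (multi-iteration) attraction behavior the abstract analyzes; the paper's capacity and convergence-rate bounds are not reproduced here.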
Keywords
Hopfield neural nets; associative processing; content-addressable storage; convergence; iterative methods; Hebbian neural network; Hopfield associative memory; convergence rate; higher order neural networks; indirect convergence; iterative method; long-term attraction; memory capacity; upper bound; Associative memory; Convergence; Error correction; Hebbian theory; Hopfield neural networks; Intelligent networks; Learning systems; Neural networks; Neurons; Upper bound;
fLanguage
English
Journal_Title
Neural Networks, IEEE Transactions on
Publisher
ieee
ISSN
1045-9227
Type
jour
DOI
10.1109/72.655028
Filename
655028
Link To Document