DocumentCode :
1336938
Title :
Neural networks, orientations of the hypercube, and algebraic threshold functions
Author :
Baldi, Pierre
Author_Institution :
California Inst. of Technol., Pasadena, CA, USA
Volume :
34
Issue :
3
fYear :
1988
fDate :
5/1/1988 12:00:00 AM
Firstpage :
523
Lastpage :
530
Abstract :
A class of possible generalizations of current neural network models is described using local improvement algorithms and orientations of graphs. A notion of dynamical capacity is defined and, by computing bounds on the number of algebraic threshold functions, it is proven that for neural networks of size n with an energy function of degree d, this capacity is O(n^(d+1)). Stable states are studied, and it is shown that for the same networks the storage capacity is O(n^(d+1)). In the case of random orientations, it is proven that the expected number of stable states is exponential. Applications to coding theory are indicated, and it is shown that usual codes can be embedded in neural networks, but only at high cost. Cycles and their storage are also examined.
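The stable states discussed in the abstract can be illustrated for the classical d = 2 (Hopfield) case: a state on the hypercube {-1, +1}^n is stable when no single-bit flip lowers the quadratic energy. The following is a minimal sketch, not the paper's construction; the random Gaussian weights and the small size n = 5 are illustrative assumptions.

```python
import itertools
import random

n = 5
random.seed(0)

# Symmetric weights with zero diagonal: the degree-2 (Hopfield) energy case.
W = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        W[i][j] = W[j][i] = random.gauss(0, 1)

def energy(s):
    # E(s) = -1/2 * sum_{i,j} W[i][j] * s[i] * s[j]
    return -0.5 * sum(W[i][j] * s[i] * s[j]
                      for i in range(n) for j in range(n))

def is_stable(s):
    # Stable iff flipping any single coordinate does not lower the energy,
    # i.e. s is a local minimum under asynchronous single-bit dynamics.
    e = energy(s)
    for i in range(n):
        t = list(s)
        t[i] = -t[i]
        if energy(t) < e:
            return False
    return True

# Enumerate all 2^n vertices of the hypercube and collect the stable ones.
stable = [s for s in itertools.product([-1, 1], repeat=n) if is_stable(s)]
print(len(stable))
```

Because the quadratic energy is invariant under a global sign flip, stable states come in +-s pairs, so the count printed is always even; enumerating all 2^n states is only feasible for small n.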
Keywords :
automata theory; encoding; neural nets; Hopfield model; algebraic threshold functions; coding theory; dynamical capacity; hypercube orientations; neural networks; neural-type automata; random orientations; stable states; storage capacity; Automata; Computer networks; Context modeling; Hardware; Helium; Hypercubes; Mathematics; Neural networks; Neurons; Physics computing;
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
ieee
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.6032
Filename :
6032