Title :
A generalized convergence theorem for neural networks
Author :
Bruck, Jehoshua ; Goodman, Joseph W.
Author_Institution :
Dept. of Electr. Eng., Stanford Univ., CA, USA
Date :
9/1/1988
Abstract :
A neural network model is presented in which each neuron performs a threshold logic function. The model always converges to a stable state when operating in a serial mode, and to a cycle of length at most 2 when operating in a fully parallel mode. This property is the basis for the model's potential applications, such as associative memory devices and combinatorial optimization. The two known convergence theorems (for the serial and fully parallel modes of operation) are reviewed, and a general convergence theorem is presented that unifies them. New relations between the neural network model and the problem of finding a minimum cut in a graph are obtained.
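To make the serial mode of operation concrete, here is a minimal sketch (not the authors' code) of a network of threshold neurons with a symmetric weight matrix, updated one neuron at a time until no update changes the state; the specific weights, thresholds, and the `serial_converge` helper are illustrative assumptions, but with a symmetric `W` this serial process reaches a stable state, as the abstract's first convergence theorem asserts.

```python
def sign(x):
    # Threshold logic function: +1 if the weighted input meets the
    # threshold, -1 otherwise (ties resolved to +1 for concreteness).
    return 1 if x >= 0 else -1

def serial_converge(W, theta, state):
    """Serial mode: update one neuron at a time until a full pass
    over all neurons changes nothing, i.e. a stable state is reached."""
    n = len(state)
    changed = True
    while changed:
        changed = False
        for i in range(n):
            new = sign(sum(W[i][j] * state[j] for j in range(n)) - theta[i])
            if new != state[i]:
                state[i] = new
                changed = True
    return state

# Illustrative 3-neuron example: symmetric weights, zero diagonal,
# zero thresholds (values chosen for the sketch, not from the paper).
W = [[0, 1, -1],
     [1, 0, 1],
     [-1, 1, 0]]
theta = [0, 0, 0]
stable = serial_converge(W, theta, [-1, 1, -1])
# In the resulting state, every neuron agrees with the sign of its
# weighted input, so no further serial update can change it.
```

Convergence in serial mode is usually argued via an energy function such as E(x) = -x^T W x / 2 + theta^T x, which cannot increase under any single-neuron update when W is symmetric; the fully parallel mode, by contrast, may settle into a cycle of length 2.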
Keywords :
combinatorial switching; convergence; neural nets; optimisation; associative memory devices; combinatorial optimisation; convergence theorems; fully parallel mode; neural networks; neuron; serial mode; stable state; threshold logic function; Associative memory; Computer networks; Convergence; Logic functions; Military computing; Multidimensional systems; Neural networks; Neurons; Performance evaluation; Symmetric matrices;
Journal_Title :
IEEE Transactions on Information Theory