DocumentCode :
1142261
Title :
Computational-complexity reduction for neural network algorithms
Author :
Guez, A. ; Kam, Moshe ; Eilbert, J.L.
Author_Institution :
Drexel Univ., Philadelphia, PA
Volume :
19
Issue :
2
fYear :
1989
Firstpage :
409
Lastpage :
414
Abstract :
An important class of neural models is described as a set of coupled nonlinear differential equations with state variables corresponding to the axon-hillock potentials of neurons. Through a nonlinear transformation, these models can be converted to an equivalent system of differential equations whose state variables correspond to firing rates. The firing-rate formulation has certain computational advantages over the potential formulation of the model. The computational and storage burdens per cycle in simulations are reduced, and the resulting equations become quasilinear in a large and significant subset of the state space. Moreover, the dynamic range of the state space is bounded, alleviating the numerical stability problems in network simulation. These advantages are demonstrated through an example, using the authors' model for the so-called neural solution to the traveling salesman problem proposed by J.J. Hopfield and D.W. Tank (1985).
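The transformation the abstract describes can be illustrated with a minimal sketch. The following is not the authors' exact formulation, only an assumed Hopfield-style model du/dt = -u + W g(u) + I with sigmoid g: substituting v = g(u) and using u = g⁻¹(v), g′(u) = v(1 - v) yields rate equations dv/dt = v(1 - v)(-g⁻¹(v) + W v + I), whose state stays bounded in (0, 1) and which never evaluate g itself. All names below (`step_potential`, `step_rate`, the weights) are illustrative.

```python
import numpy as np

def g(u):
    """Sigmoid activation mapping potentials to firing rates in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-u))

def step_potential(u, W, I, dt):
    """One Euler step of the potential (axon-hillock) formulation:
    du/dt = -u + W g(u) + I."""
    return u + dt * (-u + W @ g(u) + I)

def step_rate(v, W, I, dt):
    """One Euler step of the equivalent firing-rate formulation:
    dv/dt = g'(g^{-1}(v)) * (-g^{-1}(v) + W v + I),
    where g'(u) = v (1 - v) and g^{-1}(v) = log(v / (1 - v))."""
    u_inv = np.log(v / (1.0 - v))
    return v + dt * v * (1.0 - v) * (-u_inv + W @ v + I)

# Small symmetric network with zero diagonal (Hopfield-style assumptions).
rng = np.random.default_rng(0)
n = 4
W = rng.standard_normal((n, n))
W = (W + W.T) / 2.0
np.fill_diagonal(W, 0.0)
I = rng.standard_normal(n)

# Integrate both formulations from consistent initial conditions.
u = rng.standard_normal(n)
v = g(u)
dt = 1e-3
for _ in range(2000):
    u = step_potential(u, W, I, dt)
    v = step_rate(v, W, I, dt)

# The rate trajectory tracks g(u) and stays strictly inside (0, 1),
# the bounded dynamic range the abstract refers to.
print(np.max(np.abs(v - g(u))))
```

Note the bounding is automatic in the rate form: the factor v(1 - v) vanishes at the boundaries, so an Euler step cannot push the state outside (0, 1), which is one source of the claimed numerical-stability advantage.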
Keywords :
computational complexity; neural nets; nonlinear differential equations; state-space methods; axon hillock potential; firing rate; neural network; neurons; state space; Computational modeling; Computer networks; Couplings; Differential equations; Dynamic range; Nerve fibers; Neural networks; Neurons; Nonlinear equations; State-space methods;
fLanguage :
English
Journal_Title :
Systems, Man and Cybernetics, IEEE Transactions on
Publisher :
ieee
ISSN :
0018-9472
Type :
jour
DOI :
10.1109/21.31043
Filename :
31043