Title :
Computational-complexity reduction for neural network algorithms
Author :
Guez, A. ; Kam, Moshe ; Eilbert, J.L.
Author_Institution :
Drexel Univ., Philadelphia, PA
Abstract :
An important class of neural models is described by a set of coupled nonlinear differential equations whose state variables correspond to the axon hillock potentials of neurons. Through a nonlinear transformation, these models can be converted to an equivalent system of differential equations whose state variables correspond to firing rates. The firing-rate formulation has certain computational advantages over the potential formulation of the model. The computational and storage burdens per simulation cycle are reduced, and the resulting equations become quasilinear over a large and significant subset of the state space. Moreover, the dynamic range of the state space is bounded, which alleviates numerical stability problems in network simulation. These advantages are demonstrated through an example, using the authors' model for the so-called neural solution to the traveling salesman problem proposed by J.J. Hopfield and D.W. Tank (1985).
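To make the change of variables concrete, the Python sketch below integrates a generic Hopfield-Tank style network once in the potential variables u and once in the equivalent firing-rate variables V = g(u). It is only an illustration under assumed parameters (network size, coupling matrix T, input I_ext, time constant tau, sigmoid gain, and Euler step size are all invented for demonstration, not taken from the paper); it shows the bounded dynamic range of the rate-form state, while the paper's computational savings additionally rely on the quasilinear behavior of the rate equations over much of the state space.
```python
import numpy as np

# Illustrative sketch only: a generic Hopfield-Tank style network integrated in
# both state formulations.  The network size, coupling matrix T, input I_ext,
# time constant tau, sigmoid gain, and step size dt are assumed values chosen
# for demonstration; they are not the paper's model.

rng = np.random.default_rng(0)
n = 10
T = rng.normal(size=(n, n))
T = 0.5 * (T + T.T)                  # symmetric coupling weights
np.fill_diagonal(T, 0.0)
I_ext = rng.normal(size=n)           # constant external input
tau, gain, dt = 1.0, 2.0, 0.01

def g(u):
    """Sigmoid mapping axon hillock potential u to firing rate V in (0, 1)."""
    return 0.5 * (1.0 + np.tanh(gain * u))

def g_inv(V):
    """Inverse sigmoid, valid for V strictly inside (0, 1)."""
    return np.arctanh(2.0 * V - 1.0) / gain

def step_potential(u):
    """Euler step of the potential form: du/dt = -u/tau + T g(u) + I_ext."""
    return u + dt * (-u / tau + T @ g(u) + I_ext)

def step_rate(V):
    """Euler step of the equivalent firing-rate form.

    With V = g(u), the chain rule gives dV/dt = g'(u) du/dt, and for this
    sigmoid g'(u) = 2 * gain * V * (1 - V).  The coupling term T @ V is now
    linear in the state, and V itself stays in the bounded range (0, 1).
    """
    dV = 2.0 * gain * V * (1.0 - V) * (-g_inv(V) / tau + T @ V + I_ext)
    return np.clip(V + dt * dV, 1e-6, 1.0 - 1e-6)  # keep V in g_inv's domain

u = rng.normal(size=n)   # potential-form state (unbounded)
V = g(u)                 # rate-form state, started at the same point

for _ in range(2000):
    u = step_potential(u)
    V = step_rate(V)

print("rates from potential form:", np.round(g(u), 3))
print("rates from rate form     :", np.round(V, 3))
```
The two trajectories follow the same dynamics up to integration error; the point of the sketch is that the rate-form state never leaves (0, 1), which is the bounded dynamic range the abstract refers to.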
Keywords :
computational complexity; neural nets; nonlinear differential equations; state-space methods; axon hillock potential; firing rate; neural network; neurons; state space; Computational modeling; Computer networks; Couplings; Differential equations; Dynamic range; Nerve fibers; Neural networks; Neurons; Nonlinear equations; State-space methods;
Journal_Title :
Systems, Man and Cybernetics, IEEE Transactions on