Title :
A Novel Recurrent Neural Network with a Continuous Activation Function for Winner-Take-All
Author :
Qingshan Liu ; Yan Zhao
Author_Institution :
Sch. of Autom., Southeast Univ., Nanjing, China
Abstract :
In this paper, a novel recurrent neural network with a continuous activation function is proposed for solving the winner-take-all (WTA) problem. Compared with existing WTA networks, the proposed network has a continuous activation function and lower model complexity. The WTA problem is first converted equivalently into a linear programming problem; a recurrent neural network with a single state variable is then proposed to determine the largest input. Global convergence of the proposed neural network is proved using the Lyapunov method. Simulation results on a numerical example demonstrate the effectiveness and performance of the proposed WTA network.
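Illustrative_Sketch :
The abstract describes a single-state-variable recurrent network with a continuous activation that, via an equivalent linear-programming reformulation, drives a scalar state toward the largest input. The paper's equations are not reproduced in this record, so the Python sketch below is only a hedged illustration: the scalar dynamics eps*dy/dt = sum_i g(u_i - y) - 1 and the piecewise-linear activation g are assumptions chosen to exhibit the same qualitative behavior, not the authors' model.

# Minimal sketch (NumPy), assuming the scalar-state WTA dynamics stated above.
import numpy as np

def g(s, delta=0.1):
    # Continuous piecewise-linear activation: 0 for s <= 0, s/delta on (0, delta), 1 for s >= delta.
    return np.clip(s / delta, 0.0, 1.0)

def wta(u, eps=0.01, dt=1e-4, steps=20000, delta=0.1):
    # Forward-Euler simulation of eps * dy/dt = sum_i g(u_i - y) - 1.
    # At equilibrium y settles just below the largest input, so the output
    # x_i = g(u_i - y) is ~1 for the winner and ~0 for the others
    # (assuming the gap between the two largest inputs exceeds delta).
    y = float(np.min(u))              # start the scalar state below all inputs
    for _ in range(steps):
        x = g(u - y, delta)
        y += (dt / eps) * (x.sum() - 1.0)
    return g(u - y, delta), y

u = np.array([0.3, 0.9, 0.5, 0.1, 0.7])   # example inputs (hypothetical)
x, y = wta(u)
print("winner index:", int(np.argmax(x)), "outputs:", np.round(x, 3))

With the example inputs above, the state y settles just below the largest input (0.9), and only the winning output approaches 1; the remaining outputs stay at 0.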
Keywords :
Lyapunov methods; computational complexity; linear programming; recurrent neural nets; transfer functions; Lyapunov method; WTA networks; continuous activation function; global convergence; linear programming problem; model complexity; recurrent neural network; single state variable; winner-take-all; Biological neural networks; Convergence; Linear programming; Lyapunov methods; Recurrent neural networks; Simulation; Lyapunov function; Recurrent neural network; global convergence; winners-take-all;
Conference_Title :
2013 4th International Conference on Intelligent Systems Modelling & Simulation (ISMS)
Conference_Location :
Bangkok
Print_ISBN :
978-1-4673-5653-4
DOI :
10.1109/ISMS.2013.14