DocumentCode :
1953444
Title :
A Novel Recurrent Neural Network with a Continuous Activation Function for Winner-Take-All
Author :
Qingshan Liu ; Yan Zhao
Author_Institution :
Sch. of Autom., Southeast Univ., Nanjing, China
fYear :
2013
fDate :
29-31 Jan. 2013
Firstpage :
36
Lastpage :
39
Abstract :
In this paper, a novel recurrent neural network with a continuous activation function is proposed for solving the winner-take-all (WTA) problem. The WTA problem is first converted equivalently into a linear programming problem; a recurrent neural network with a single state variable is then proposed to identify the largest input. Compared with existing WTA networks, the proposed network has a continuous activation function and lower model complexity. Moreover, global convergence of the proposed neural network is proved using the Lyapunov method. Simulation results on a numerical example demonstrate the effectiveness and performance of the proposed WTA network.
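As a rough illustration of the single-state-variable idea summarized in the abstract, the Python sketch below simulates a generic WTA dynamics with a continuous (sigmoid) activation: a scalar state x is adjusted until the outputs u_i = sigmoid(beta*(v_i - x)) sum to one, so that only the largest input retains an output near 1. The function name wta_single_state and the parameters beta, eps, dt, and steps are illustrative assumptions; this is a minimal sketch of the general construction, not the exact network proposed in the paper.

import numpy as np

def wta_single_state(v, beta=50.0, eps=1e-3, dt=1e-4, steps=20000):
    # Hypothetical single-state-variable WTA dynamics (illustration only,
    # not the exact model proposed in the paper).
    # Outputs u_i = sigmoid(beta * (v_i - x)) use a continuous activation;
    # the scalar state x drifts until the outputs sum to 1, leaving the
    # largest input with output near 1 and all others near 0.
    v = np.asarray(v, dtype=float)
    x = v.mean()                                     # initial scalar state
    for _ in range(steps):
        u = 1.0 / (1.0 + np.exp(-beta * (v - x)))    # continuous activation
        x += (dt / eps) * (u.sum() - 1.0)            # drive sum(u) toward 1
    return u

# Example: the third input is the largest, so its output approaches 1.
print(np.round(wta_single_state([0.3, 0.8, 1.5, 0.9]), 3))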
Keywords :
Lyapunov methods; computational complexity; linear programming; recurrent neural nets; transfer functions; Lyapunov method; WTA networks; continuous activation function; global convergence; linear programming problem; model complexity; recurrent neural network; single state variable; winner-take-all; Biological neural networks; Convergence; Linear programming; Lyapunov methods; Recurrent neural networks; Simulation; Lyapunov function; Recurrent neural network; global convergence; winners-take-all;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Intelligent Systems Modelling & Simulation (ISMS), 2013 4th International Conference on
Conference_Location :
Bangkok
ISSN :
2166-0662
Print_ISBN :
978-1-4673-5653-4
Type :
conf
DOI :
10.1109/ISMS.2013.14
Filename :
6498231