Title :
Large Memory Capacity in Chaotic Artificial Neural Networks: A View of the Anti-Integrable Limit
Author :
Lin, Wei ; Chen, Guanrong
Author_Institution :
Key Lab. of Math. for Nonlinear Sci., Fudan Univ., Shanghai, China
Abstract :
In the literature, it has been reported that the chaotic artificial neural network model with sinusoidal activation functions possesses a large memory capacity as well as a remarkable ability to retrieve stored patterns, outperforming the conventional chaotic model with only monotonic activation functions such as sigmoidal functions. This paper, from the viewpoint of the anti-integrable limit, elucidates the mechanism behind the superiority of models with periodic activation functions, a class that includes sinusoidal functions. In particular, by virtue of the anti-integrable limit technique, this paper shows that any finite-dimensional neural network model with periodic activation functions and properly selected parameters exhibits far richer chaotic dynamics, which ultimately determine the model's memory capacity and pattern-retrieval ability. To some extent, this paper demonstrates, both mathematically and numerically, that an appropriate choice of activation functions and control scheme can lead to a large memory capacity and better pattern-retrieval ability in artificial neural network models.
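The contrast the abstract draws between periodic and monotonic activation functions can be illustrated with a minimal one-neuron sketch. This is not the paper's network model; the maps, the gain value, and the Lyapunov-exponent estimator below are illustrative assumptions chosen only to show that a periodic activation can sustain chaos where a saturating monotonic one settles to a fixed point.

```python
import math

GAIN = 4.0  # illustrative gain; not a parameter from the paper

# Toy one-neuron maps x -> f(x) (a sketch, not the paper's equations):
sinusoidal = lambda x: math.sin(GAIN * x)    # periodic activation
saturating = lambda x: math.tanh(GAIN * x)   # monotonic, sigmoid-like

d_sinusoidal = lambda x: GAIN * math.cos(GAIN * x)
d_saturating = lambda x: GAIN * (1.0 - math.tanh(GAIN * x) ** 2)

def lyapunov_estimate(f, df, x0, n=5000, burn=500):
    """Estimate the Lyapunov exponent of the 1-D map x -> f(x):
    the trajectory average of log|f'(x_t)|. A positive value
    indicates sensitive dependence (chaos); a negative value
    indicates convergence to a stable orbit."""
    x = x0
    for _ in range(burn):  # discard the transient
        x = f(x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(df(x)) + 1e-300)  # guard against log(0)
        x = f(x)
    return total / n

lam_sin = lyapunov_estimate(sinusoidal, d_sinusoidal, 0.3)
lam_tanh = lyapunov_estimate(saturating, d_saturating, 0.3)
print(f"periodic activation:  lambda ~= {lam_sin:.3f}")   # positive: chaotic
print(f"monotonic activation: lambda ~= {lam_tanh:.3f}")  # negative: converges
```

With this gain, the sinusoidal map wanders chaotically over the interval while the tanh map is quickly attracted to a fixed point, mirroring the abstract's claim that periodic activations yield more abundant chaotic dynamics.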
Keywords :
chaos; content-addressable storage; neural nets; anti-integrable limit technique; chaotic artificial neural networks; chaotic dynamics; finite-dimensional neural network; large memory capacity; memory capacity; monotonic activation functions; pattern-retrieval ability; periodic activation functions; sigmoidal functions; sinusoidal activation functions; artificial neural network; Action Potentials; Algorithms; Computer Simulation; Neural Networks (Computer); Neurons; Periodicity; Time Factors;
Journal_Title :
IEEE Transactions on Neural Networks
DOI :
10.1109/TNN.2009.2024148