Title :
How chaos boosts the encoding capacity of small recurrent neural networks : learning consideration
Author :
Molter, Colin ; Salihoglu, Utku ; Bersini, Hugues
Author_Institution :
Laboratory of Artificial Intelligence IRIDIA, Université Libre de Bruxelles, Brussels, Belgium
Abstract :
So far, recurrent networks adopting fixed-point dynamics have shown very poor encoding capacity. However, the same networks, when preferentially maintained in chaotic dynamics, can encode an enormous amount of information in their cyclic attractors, which boosts their encoding capacity. A previous paper described a simple way to encode such information by robustly associating each vector of an N-dimensional space with one "symbolic" cyclic attractor. Its main message was the monotonic increase of spontaneous chaotic regimes as a function of the number of attractors to learn. However, no algorithm was provided to adjust the connection weights so as to encode a given input set. For this purpose, this paper revisits the classical gradient-based BPTT learning algorithm. It shows that this algorithm gives poor results and, furthermore, that its use strongly dampens the "chaoticity" of the network, and hence its encoding capacity.
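The abstract invokes gradient-based BPTT for embedding cyclic attractors into a small recurrent network. The following is a rough, minimal sketch of that kind of setup, not the authors' algorithm: the network size, the tanh units, the quadratic trajectory loss, the period-4 target cycle, and all hyperparameters are assumptions made for illustration. The network is unrolled over one period and the gradient of the trajectory error is accumulated backwards through time:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8    # number of units (assumed)
T = 4    # period of the target cycle (assumed)

# Recurrent weights and one "symbolic" cycle of T near-binary patterns;
# targets are scaled to 0.9 so they stay inside tanh's output range.
W = rng.normal(scale=1.5 / np.sqrt(N), size=(N, N))
targets = 0.9 * np.sign(rng.normal(size=(T, N)))

def unroll(W, x0, steps):
    """Run the network forward: x_{t+1} = tanh(W x_t)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(np.tanh(W @ xs[-1]))
    return xs

eta = 0.02                 # learning rate (assumed)
x0 = 0.5 * targets[0]      # start near the first pattern of the cycle
for epoch in range(5000):
    xs = unroll(W, x0, T)
    # per-step error of the unrolled trajectory against the target cycle
    errs = [xs[t + 1] - targets[(t + 1) % T] for t in range(T)]
    # BPTT: walk the unrolled steps backwards, accumulating dL/dW
    dW = np.zeros_like(W)
    delta = np.zeros(N)            # gradient flowing back into x_{t+1}
    for t in reversed(range(T)):
        # add the direct error term, then pass through tanh's derivative
        delta = (delta + 2.0 * errs[t]) * (1.0 - xs[t + 1] ** 2)
        dW += np.outer(delta, xs[t])
        delta = W.T @ delta        # propagate the error one step back
    W -= eta * dW

print("final per-step error:", [float(np.abs(e).mean()) for e in errs])
```

Whether such training preserves or dampens the surrounding chaotic regime, which is the paper's actual concern, is not something this toy example addresses.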
Keywords :
backpropagation; chaos; gradient methods; recurrent neural nets; backpropagation through time; chaotic dynamics; cyclic attractors; gradient-based BPTT learning algorithm; recurrent neural networks; Artificial neural networks; Biological neural networks; Chaos; Chaotic communication; Convergence; Encoding; Laboratories; Learning; Recurrent neural networks; Switches
Conference_Title :
2004 IEEE International Joint Conference on Neural Networks (IJCNN 2004), Proceedings
Print_ISBN :
0-7803-8359-1
DOI :
10.1109/IJCNN.2004.1379874