Title :
Generalization of the maximum capacity of recurrent neural networks
Author :
Chen, Chang-Jiu ; Cheung, John Y.
Author_Institution :
Dept. of Comput. Sci., Oklahoma Univ., Norman, OK, USA
Abstract :
The authors have previously proposed a novel model that achieves the maximum capacity of one-layer recurrent neural networks by using an initiator, A, to construct the weight matrix and the threshold and to define an equation that produces all memorized vectors. In this paper, the authors generalize that model by lifting the restriction on A and present the new version of the model. In addition to explaining this new version, they provide further details about its properties. The authors also compare their model with the SOR method.
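The abstract does not spell out the initiator-based construction of the weight matrix and threshold, so the Python sketch below only illustrates the general setting it refers to: a one-layer recurrent network whose memorized vectors are fixed points of the update x <- sign(Wx - theta). The build_weights helper uses a standard outer-product (Hebbian) rule purely as a stand-in assumption; it is not the authors' construction from A.

import numpy as np

def build_weights(patterns):
    """Build a symmetric weight matrix from +/-1 patterns (Hebbian rule, zero diagonal).
    This is a generic stand-in; the paper's initiator-based construction is not given here."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, theta, x, steps=20):
    """Iterate the one-layer recurrent update x <- sign(W x - theta) until it settles."""
    for _ in range(steps):
        x_next = np.where(W @ x - theta >= 0, 1, -1)
        if np.array_equal(x_next, x):
            break
        x = x_next
    return x

# Two memorized vectors over 6 neurons (illustrative data only).
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1]])
W = build_weights(patterns)
theta = np.zeros(6)

# A probe with one corrupted bit is driven back to the first stored vector.
probe = np.array([1, -1, 1, -1, -1, -1])
print(recall(W, theta, probe))

In this toy example the stored vectors are fixed points of the recurrence and the noisy probe converges to the nearest one; the paper's contribution concerns how many such vectors can be stored and how W and theta are built from the initiator A.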
Keywords :
content-addressable storage; matrix algebra; recurrent neural nets; SOR method; initiator; maximum capacity; memorized vectors; recurrent neural networks; threshold; weight matrix; Computer science; Electronic mail; Equations; Neurofeedback; Recurrent neural networks; State feedback; Symmetric matrices; Testing;
Conference_Title :
Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
Print_ISBN :
0-7803-1421-2
DOI :
10.1109/IJCNN.1993.714247