DocumentCode :
1534455
Title :
Sparse Neural Networks With Large Learning Diversity
Author :
Gripon, Vincent ; Berrou, Claude
Author_Institution :
Electron. Dept., Telecom Bretagne (Inst. Telecom), Brest, France
Volume :
22
Issue :
7
fYear :
2011
fDate :
July 1, 2011
Firstpage :
1087
Lastpage :
1096
Abstract :
Coded recurrent neural networks with three levels of sparsity are introduced. The first level relates to the size of the messages, which are much smaller than the number of available neurons. The second is provided by a particular coding rule, acting as a local constraint on the neural activity. The third is a characteristic of the low final connection density of the network after the learning phase. Although the proposed network is very simple, being based on binary neurons and binary connections, it is able to learn a large number of messages and recall them, even in the presence of strong erasures. The performance of the network is assessed both as a classifier and as an associative memory.
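The mechanism summarized in the abstract can be sketched as follows. This is an illustrative toy implementation, not the authors' code: the cluster count, neuron count, and all function names are assumptions chosen for the example. A message is split into sub-messages, each selecting one binary neuron per cluster; learning stores the clique of binary connections among the selected neurons, and recall fills in erased symbols by a winner-take-all over connection counts.

```python
# Illustrative sketch (assumed parameters, not the paper's code) of a
# clique-based binary associative memory in the spirit of the abstract.
from itertools import combinations

CLUSTERS = 4   # number of clusters: a message has 4 symbols (assumption)
FANALS = 16    # binary neurons per cluster, i.e., symbol alphabet size

# Binary connections stored as a set of undirected edges between
# (cluster, neuron) pairs; presence of an edge means weight 1.
edges = set()

def learn(message):
    """Store a message as a clique over its one-neuron-per-cluster code."""
    nodes = [(c, s) for c, s in enumerate(message)]
    for a, b in combinations(nodes, 2):
        edges.add(frozenset((a, b)))

def recall(partial):
    """Recover erased symbols (None) by winner-take-all on edge counts."""
    known = [(c, s) for c, s in enumerate(partial) if s is not None]
    result = list(partial)
    for c, s in enumerate(partial):
        if s is None:
            # Pick the neuron in cluster c most connected to known neurons.
            scores = [sum(frozenset(((c, cand), k)) in edges for k in known)
                      for cand in range(FANALS)]
            result[c] = max(range(FANALS), key=lambda cand: scores[cand])
    return result

learn([3, 7, 1, 12])
print(recall([3, None, 1, None]))  # -> [3, 7, 1, 12]
```

The three sparsity levels appear directly: messages use far fewer neurons than are available, exactly one neuron fires per cluster (the local coding constraint), and the stored edge set stays sparse relative to all possible connections.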
Keywords :
content-addressable storage; encoding; learning (artificial intelligence); recurrent neural nets; associative memory; binary connections; binary neurons; coded recurrent neural networks; coding rule; learning phase; neural activity; sparse neural networks; Artificial neural networks; Associative memory; Encoding; Maximum likelihood decoding; Neurons; Parity check codes; Associative memory; capacity; classification; clique; diversity; error correcting code; learning machine; recurrent neural network; sparse coding; Computer Simulation; Humans; Learning; Mental Recall; Models, Neurological; Neural Networks (Computer); Neurons;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2011.2146789
Filename :
5784337