DocumentCode :
3112535
Title :
Multi-level error-resilient neural networks
Author :
Salavati, Amir Hesam ; Karbasi, Amin
Author_Institution :
Sch. of Comput. & Commun. Sci., Ecole Polytech. Fed. de Lausanne (EPFL), Lausanne, Switzerland
fYear :
2012
fDate :
1-6 July 2012
Firstpage :
1064
Lastpage :
1068
Abstract :
The problem of neural network association is to retrieve a previously memorized pattern from its noisy version using a network of neurons. An ideal neural network should simultaneously offer three things: a learning algorithm, a large pattern retrieval capacity, and resilience against noise. Prior works in this area usually improve one or two of these aspects at the cost of the third. Our work takes a step toward closing this gap. More specifically, we show that by enforcing natural constraints on the set of patterns to be learned, we can drastically improve the retrieval capacity of our neural network. Moreover, we devise a learning algorithm whose role is to learn those patterns satisfying the above-mentioned constraints. Finally, we show that our neural network can cope with a fair amount of noise.
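To make the retrieval idea concrete, the following is a minimal sketch only, not the authors' algorithm. It assumes, purely for illustration, that the "natural constraints" are linear: every memorized (multi-level, integer-valued) pattern x satisfies W @ x = 0 for a constraint matrix W, here constructed directly rather than learned. Recall then amounts to nudging a noisy pattern back onto the constraint set. All names (W, recall, null_basis) are hypothetical.

```python
import numpy as np

def recall(W, x_noisy, max_iters=50):
    """Greedily apply the single +/-1 adjustment that most reduces the
    constraint violation ||W @ x||, until all constraints are satisfied."""
    x = x_noisy.astype(float).copy()
    for _ in range(max_iters):
        best, best_err = x, float(np.linalg.norm(W @ x))
        if best_err < 1e-6:          # constraints satisfied: pattern recovered
            break
        for j in range(len(x)):      # candidate move: nudge entry j by +/-1
            for step in (-1.0, 1.0):
                cand = x.copy()
                cand[j] += step
                err = float(np.linalg.norm(W @ cand))
                if err < best_err:
                    best, best_err = cand, err
        if best is x:                # no single adjustment helps; give up
            break
        x = best
    return x

# Toy demo: integer-valued patterns spanned by null_basis, with W projected
# to be orthogonal to that subspace so W @ pattern = 0 for every pattern.
rng = np.random.default_rng(0)
null_basis = rng.integers(0, 4, size=(8, 3)).astype(float)
Q, _ = np.linalg.qr(null_basis)
W = rng.standard_normal((4, 8))
W = W - (W @ Q) @ Q.T                 # remove components along the pattern subspace

pattern = null_basis @ np.array([1.0, 2.0, 1.0])
noisy = pattern.copy()
noisy[2] += 1.0                       # corrupt one entry
print(np.allclose(recall(W, noisy), pattern))   # expected: True
```

The sketch only illustrates why constraining the pattern set helps: violated constraints provide an error signal that points back to the memorized pattern. The paper's actual learning rule and recall dynamics differ and should be consulted directly.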
Keywords :
learning (artificial intelligence); neural nets; pattern recognition; ideal neural network; large pattern retrieval capacity; learning algorithm; learning patterns; multilevel error-resilient neural networks; neural network association problem; noise resilience; previously memorized pattern retrieval; retrieval capacity improvement; Associative memory; Biological neural networks; Error analysis; Neurons; Noise; Vectors;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2012 IEEE International Symposium on Information Theory Proceedings (ISIT)
Conference_Location :
Cambridge, MA
ISSN :
2157-8095
Print_ISBN :
978-1-4673-2580-6
Electronic_ISBN :
2157-8095
Type :
conf
DOI :
10.1109/ISIT.2012.6283014
Filename :
6283014