DocumentCode :
290811
Title :
Application of coding theory to neural net capacity
Author :
Al-Mashouq, Khalid A.
Author_Institution :
King Saud Univ., Riyadh, Saudi Arabia
fYear :
1994
fDate :
27 Jun-1 Jul 1994
Firstpage :
221
Abstract :
There are several definitions of neural net capacity. One is the statistical pattern capacity, defined as the average number of random patterns, with random binary desired responses, that can be “recognized” by a neural net. Another is what we may call the worst-case capacity. Consider a multilayer net with M input nodes and k output nodes. The “storage” capacity of this net is defined as the maximum number of input patterns for which the network can produce all possible binary output k-tuples. We use information theory, in particular the Shannon capacity theorem, to relate neural net capacity to channel capacity. As a practical example, we demonstrate the effectiveness of error-correcting codes in mitigating the imperfections of neural nets.
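The abstract's practical example, protecting a neural net's binary outputs with an error-correcting code, can be sketched as follows. This is an illustrative assumption, not the paper's exact scheme: a Hamming (7,4) code encodes a 4-bit desired response into a 7-bit codeword, so that a single erroneous output bit (one imperfect output node) can be corrected by syndrome decoding.

```python
import numpy as np

# Hamming (7,4) code, systematic form over GF(2): codeword = [d1 d2 d3 d4 p1 p2 p3].
# (Illustrative choice of code; the paper does not specify this particular code.)
G = np.array([[1, 0, 0, 0, 1, 1, 0],   # generator matrix (4 x 7)
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],   # parity-check matrix (3 x 7), H @ c = 0 mod 2
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(message4):
    """Encode a 4-bit desired response into a 7-bit codeword."""
    return (np.array(message4) @ G) % 2

def decode(received7):
    """Correct at most one flipped bit, then return the 4-bit message."""
    r = np.array(received7).copy()
    syndrome = (H @ r) % 2
    if syndrome.any():
        # For a single-bit error, the syndrome equals the column of H
        # at the error position; flip that bit back.
        for pos in range(7):
            if np.array_equal(H[:, pos], syndrome):
                r[pos] ^= 1
                break
    return r[:4]  # systematic code: the message bits come first

# An imperfect net flips one bit of the output codeword; the decoder recovers it.
desired = [1, 0, 1, 1]
codeword = encode(desired)
noisy = codeword.copy()
noisy[2] ^= 1                 # simulate one erroneous output node
recovered = decode(noisy)
```

In this sketch the net is trained to produce codewords rather than raw k-tuples, trading k = 4 information bits for 7 output nodes in exchange for single-error correction, which is the coding-theoretic trade-off the abstract alludes to.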
Keywords :
channel capacity; coding theory; error correcting codes; feedforward neural nets; information theory; input nodes; multilayer nets; multilayer perceptrons; neural net capacity; output nodes; random binary desired responses; random patterns; Shannon capacity theorem; statistical pattern capacity; storage capacity; worst case capacity; Bandwidth; Communication channels; Error correction codes; Feedforward neural networks; Feedforward systems; Information theory; Multi-layer neural network; Neural networks; Nonhomogeneous media; Speech;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 1994 IEEE International Symposium on Information Theory
Conference_Location :
Trondheim, Norway
Print_ISBN :
0-7803-2015-8
Type :
conf
DOI :
10.1109/ISIT.1994.394747
Filename :
394747