Title :
Improved input representation for enhancement of neural network performance
Author :
Aldrich; Lee, Kahyun; Lee, Y.C.
Abstract :
Summary form only given. An important consideration in implementing an associative memory is the storage capacity of the network. For a Hopfield net, the memory capacity for uncorrelated patterns is approximately 0.25N/log N, where N is the number of neurons. In general, the capacity for information storage is proportional to the number of synapses. For fully connected networks the number of synapses scales as N^(m+1), where m is the order of the network, so higher-order networks have a much greater storage capacity than a Hopfield net with an equivalent number of neurons. Simulation results show that simply increasing the number of neurons does not always increase the storage capacity. It is also shown that a simple modification of the pattern vector to give it zero bias provides an even more significant increase in the performance of an associative memory network.
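As a rough illustration of the zero-bias idea described above, the following sketch (not the authors' implementation) stores biased +/-1 patterns in a standard Hebbian, Hopfield-style associative memory, once directly and once after subtracting each pattern's mean, and compares recall from a noisy cue. The network sizes, bias level, and noise rate are illustrative assumptions.

    # Minimal sketch, not the authors' code: Hebbian (Hopfield-style)
    # associative memory comparing raw biased patterns with a
    # zero-bias (mean-subtracted) representation of the same patterns.
    import numpy as np

    rng = np.random.default_rng(0)
    N, P = 200, 10          # neurons, stored patterns (illustrative sizes)

    def store(patterns):
        # Hebbian outer-product weights with zeroed self-connections.
        W = patterns.T @ patterns / patterns.shape[1]
        np.fill_diagonal(W, 0.0)
        return W

    def recall(W, cue, steps=50):
        # Synchronous sign updates until a fixed point or the step limit.
        s = cue.copy()
        for _ in range(steps):
            s_new = np.sign(W @ s)
            s_new[s_new == 0] = 1.0
            if np.array_equal(s_new, s):
                break
            s = s_new
        return s

    # Biased patterns: +1 with probability 0.8, so each has a nonzero mean.
    patterns = np.where(rng.random((P, N)) < 0.8, 1.0, -1.0)
    # Zero-bias representation: subtract each pattern's mean component.
    centered = patterns - patterns.mean(axis=1, keepdims=True)

    # Same noisy cue (10% of bits flipped from pattern 0) for both cases.
    cue = patterns[0] * np.where(rng.random(N) < 0.1, -1.0, 1.0)

    for label, pats in (("raw biased", patterns), ("zero-bias", centered)):
        final = recall(store(pats), cue)
        overlap = final @ patterns[0] / N   # 1.0 means perfect recall
        print(f"{label:10s} overlap with stored pattern: {overlap:+.2f}")

With these illustrative settings the zero-bias version typically recovers the stored pattern much more closely than the raw biased one, which is the qualitative effect the abstract reports; exact numbers depend on the chosen parameters.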
Keywords :
content-addressable storage; memory architecture; neural nets; Hopfield net; associative memory; associative memory network; enhancement; input representation; memory capacity; neural network performance; pattern vector; storage capacity; synapses; uncorrelated patterns; zero bias; Associative memories; Neural networks
Conference_Title :
International Joint Conference on Neural Networks (IJCNN), 1989
Conference_Location :
Washington, DC, USA
DOI :
10.1109/IJCNN.1989.118300