• DocumentCode
    296119
  • Title
    Relaxing backpropagation networks as associative memories
  • Author
    Peng, Yun ; Zhou, Zonglin ; McClenney, Erik
  • Author_Institution
    Dept. of Comput. Sci. & Electr. Eng., Maryland Univ., Baltimore, MD, USA
  • Volume
    4
  • fYear
    1995
  • fDate
    Nov/Dec 1995
  • Firstpage
    1777
  • Abstract
    One of the most attractive features of associative memories (AM) is their ability of associative recall, in particular recall from incomplete or noisy inputs. However, most existing neural network AM models suffer from limited storage capacity. In this paper, a new model for autoassociative memory based on a novel use of backpropagation (BP) networks is proposed. In this model, recording is done by establishing pattern auto-associations through BP learning, and recall is done by continuously feeding the output back to the input until the network relaxes into a stable state. The convergence of the recall process is analyzed, and the validity of the model is tested by computer experiments. The experimental results show that O(2^n) patterns of length n can be correctly stored, and that the recall quality with noisy inputs compares very favorably with that of conventional AM models.
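    The recording/recall scheme summarized in the abstract can be sketched as follows. This is a minimal illustrative toy, not the authors' implementation: the patterns, network sizes, learning rate, and sign-thresholded feedback loop are all assumptions chosen to make the sketch self-contained. Recording trains a one-hidden-layer BP network to map each stored pattern to itself; recall feeds the network's output back to its input until the state stops changing.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Two hypothetical stored bipolar patterns of length n = 8.
    P = np.array([[ 1, -1,  1, -1,  1, -1,  1, -1],
                  [ 1,  1, -1, -1,  1,  1, -1, -1]], dtype=float)
    n, H = P.shape[1], 16

    # One-hidden-layer network; recording = BP learning of auto-associations.
    W1 = rng.normal(0.0, 0.5, (H, n)); b1 = np.zeros(H)
    W2 = rng.normal(0.0, 0.5, (n, H)); b2 = np.zeros(n)

    def forward(x):
        h = np.tanh(W1 @ x + b1)
        return np.tanh(W2 @ h + b2), h

    lr = 0.05
    for _ in range(5000):                     # plain gradient-descent backprop
        for p in P:
            y, h = forward(p)
            dy = (y - p) * (1 - y**2)         # MSE gradient through output tanh
            dh = (W2.T @ dy) * (1 - h**2)     # backpropagated through hidden tanh
            W2 -= lr * np.outer(dy, h); b2 -= lr * dy
            W1 -= lr * np.outer(dh, p); b1 -= lr * dh

    def recall(x, max_steps=50):
        """Feed the output back to the input until the state is stable."""
        for _ in range(max_steps):
            y = np.sign(forward(x)[0])        # threshold back to bipolar values
            if np.array_equal(y, x):
                return x                      # relaxed into a stable state
            x = y
        return x
    ```

    With this toy setup, a stored pattern is a fixed point of the feedback loop, and an input with a flipped bit relaxes back to the nearest stored pattern, mirroring the noisy-recall behavior the abstract describes.
    
    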
  • Keywords
    backpropagation; content-addressable storage; convergence; neural nets; relaxation theory; associative memories; associative recall; autoassociative memory; continuous feedback; incomplete inputs; limited storage capacities; neural network; noisy inputs; recall process convergence; relaxing backpropagation networks; Active noise reduction; Associative memory; Backpropagation; Computer science; Convergence; Correlation; Feedforward neural networks; Neural networks; Symmetric matrices; Testing;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 1995 IEEE International Conference on Neural Networks
  • Conference_Location
    Perth, WA
  • Print_ISBN
    0-7803-2768-3
  • Type
    conf
  • DOI
    10.1109/ICNN.1995.488890
  • Filename
    488890