• DocumentCode
    3292974
  • Title
    A sequential adder using recurrent networks
  • Author
    Tsung, Fu-Sheng; Cottrell, Garrison W.
  • Author_Institution
    Dept. of Comput. Sci. & Eng., California Univ., San Diego, CA, USA
  • fYear
    1989
  • fDate
    0-0 1989
  • Firstpage
    133
  • Abstract
    D.E. Rumelhart et al.'s proposal (1986) of how symbolic processing is achieved in PDP (parallel distributed processing) networks is tested by training two types of recurrent networks to add two numbers of arbitrary length. A method of combining old and new training sets is developed that enables the network to learn and generalize with very large training sets. Through this model of addition, these networks demonstrated the capability to perform simple conditional branching, while loops, and sequences, mechanisms essential for a universal computer. Differences between the two types of recurrent networks are discussed, as are implications for human learning.
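    The bit-serial addition task described in the abstract can be sketched as a toy experiment. The sketch below is an illustrative assumption, not the paper's actual setup: a minimal Elman-style recurrent network (hidden size, learning rate, and sequence lengths chosen arbitrarily) is trained to emit the sum bit of two LSB-first binary operands at each timestep, with the hidden state free to learn the carry. Interleaving short "old" problems with longer "new" ones is loosely inspired by the abstract's old-plus-new training-set scheme.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Hypothetical minimal Elman-style RNN for bit-serial addition.
    H = 16                             # hidden units (illustrative choice)
    Wx = rng.normal(0, 0.5, (H, 2))    # input (two operand bits) -> hidden
    Wh = rng.normal(0, 0.5, (H, H))    # hidden -> hidden (can hold the carry)
    Wy = rng.normal(0, 0.5, (1, H))    # hidden -> sum-bit output
    lr = 0.5

    def sample(bits):
        """Random addition problem as LSB-first bit sequences."""
        a = int(rng.integers(0, 2 ** (bits - 1)))
        b = int(rng.integers(0, 2 ** (bits - 1)))
        s = a + b
        A = [(a >> t) & 1 for t in range(bits)]
        B = [(b >> t) & 1 for t in range(bits)]
        S = [(s >> t) & 1 for t in range(bits)]
        return A, B, S

    def step(A, B, S, train=True):
        """One forward pass (and, if train, one BPTT update). Returns MSE."""
        global Wx, Wh, Wy
        T = len(S)
        h = [np.zeros(H)]
        xs, ys = [], []
        for t in range(T):
            x = np.array([A[t], B[t]], dtype=float)
            h.append(sigmoid(Wx @ x + Wh @ h[-1]))
            ys.append(sigmoid(Wy @ h[-1])[0])
            xs.append(x)
        loss = sum((ys[t] - S[t]) ** 2 for t in range(T)) / T
        if train:  # backprop through time
            dWx = np.zeros_like(Wx); dWh = np.zeros_like(Wh); dWy = np.zeros_like(Wy)
            dh_next = np.zeros(H)
            for t in reversed(range(T)):
                dzy = 2 * (ys[t] - S[t]) / T * ys[t] * (1 - ys[t])
                dWy += dzy * h[t + 1][None, :]
                dh = dzy * Wy[0] + dh_next
                dz = dh * h[t + 1] * (1 - h[t + 1])
                dWx += np.outer(dz, xs[t])
                dWh += np.outer(dz, h[t])
                dh_next = Wh.T @ dz
            Wx -= lr * dWx; Wh -= lr * dWh; Wy -= lr * dWy
        return loss

    # Keep mixing short ("old") and longer ("new") problems instead of
    # discarding earlier lengths -- loosely inspired by the abstract's
    # combined old/new training-set scheme.
    before = np.mean([step(*sample(4), train=False) for _ in range(200)])
    for epoch in range(3000):
        bits = 4 if epoch % 2 == 0 else 8   # interleave old and new lengths
        step(*sample(bits))
    after = np.mean([step(*sample(4), train=False) for _ in range(200)])
    ```

    Because the operands arrive least-significant bit first, the carry needed at each step depends only on the past, which is what lets a purely recurrent hidden state suffice; the paper's actual architectures and training procedure are not reproduced here.
    
    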
  • Keywords
    adders; learning systems; neural nets; parallel architectures; combined subset training; conditional branching; learning; neural networks; parallel distributed processing; recurrent networks; sequential adder; symbolic processing; Adders; Learning systems; Neural networks; Parallel architectures
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    International Joint Conference on Neural Networks (IJCNN), 1989
  • Conference_Location
    Washington, DC, USA
  • Type
    conf
  • DOI
    10.1109/IJCNN.1989.118690
  • Filename
    118690