• DocumentCode
    3309533
  • Title
    Reducing computations in incremental learning for feedforward neural network with long-term memory
  • Author
    Kobayashi, M.; Zamani, Anuar; Ozawa, Seiichi; Abe, Shigeo
  • Author_Institution
    Graduate Sch. of Sci. & Technol., Kobe Univ., Japan
  • Volume
    3
  • fYear
    2001
  • fDate
    2001
  • Firstpage
    1989
  • Abstract
    When neural networks are trained incrementally, input-output relationships that were learned previously tend to be destroyed by the learning of new training data. This phenomenon is called “interference”. To suppress the interference, we have proposed an incremental learning system (called RAN-LTM), in which long-term memory (LTM) is introduced into a resource allocating network (RAN). Since RAN-LTM must be trained not only on new data but also on some LTM data to suppress the interference, large computations are required when many LTM data are retrieved. Therefore, it is important to design appropriate procedures for producing and retrieving LTM data in RAN-LTM. In this paper, these procedures in the previous version of RAN-LTM are improved. In simulations, the improved RAN-LTM is applied to the approximation of a one-dimensional function, and its approximation error and training speed are evaluated in comparison with RAN and the previous version of RAN-LTM.
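    The abstract describes the RAN-LTM scheme only at a high level. The following is a minimal sketch (in Python with NumPy, not the authors' implementation) of the general idea as stated there: a resource allocating network grows a hidden unit for sufficiently novel inputs, stores the corresponding input-output pair in long-term memory, and rehearses a small number of retrieved LTM pairs together with each new sample so that interference is suppressed while the extra training cost stays bounded. The class name, thresholds, Gaussian width, and the nearest-neighbour retrieval rule are all illustrative assumptions.

    import numpy as np

    class RanLtmSketch:
        """Toy RAN-LTM-style learner (illustrative only): an RBF network that
        allocates units for novel inputs, stores (input, output) pairs in
        long-term memory (LTM), and rehearses a few retrieved LTM pairs
        alongside every new training sample."""

        def __init__(self, novelty_dist=0.3, novelty_err=0.05, lr=0.05, width=0.2):
            self.centers = []          # RBF centers, one per allocated unit
            self.weights = []          # output weights, one per unit
            self.ltm = []              # long-term memory: (x, y) pairs
            self.novelty_dist = novelty_dist
            self.novelty_err = novelty_err
            self.lr = lr
            self.width = width

        def _phi(self, x):
            # Gaussian activations of all hidden units for input x.
            if not self.centers:
                return np.zeros(0)
            d = np.asarray(self.centers) - x
            return np.exp(-np.sum(d * d, axis=1) / (2.0 * self.width ** 2))

        def predict(self, x):
            phi = self._phi(x)
            return float(phi @ np.asarray(self.weights)) if self.weights else 0.0

        def _sgd_step(self, x, y):
            # One gradient step on the output weights for a single (x, y) pair.
            if not self.weights:
                return
            phi = self._phi(x)
            err = y - float(phi @ np.asarray(self.weights))
            self.weights = list(np.asarray(self.weights) + self.lr * err * phi)

        def train_one(self, x, y, n_retrieve=3):
            x = np.atleast_1d(np.asarray(x, dtype=float))
            err = y - self.predict(x)
            dist = (min(np.linalg.norm(c - x) for c in self.centers)
                    if self.centers else np.inf)
            if abs(err) > self.novelty_err and dist > self.novelty_dist:
                # Novel input: allocate a new unit and remember the pair in LTM.
                self.centers.append(x.copy())
                self.weights.append(err)
                self.ltm.append((x.copy(), y))
            else:
                # Otherwise train on the new sample plus a few nearby LTM pairs
                # (rehearsal); retrieving only n_retrieve items bounds the cost.
                self._sgd_step(x, y)
                nearest = sorted(self.ltm, key=lambda m: np.linalg.norm(m[0] - x))
                for xm, ym in nearest[:n_retrieve]:
                    self._sgd_step(xm, ym)

    # Usage: incrementally approximate a one-dimensional function.
    net = RanLtmSketch()
    for xv in np.random.uniform(0.0, np.pi, 200):
        net.train_one(xv, float(np.sin(xv)))
    print("units:", len(net.centers), " f(1.0) ~", round(net.predict(np.array([1.0])), 3))

    Retrieving only a few LTM pairs per new sample is what keeps the rehearsal cost small; the paper's contribution lies precisely in how LTM data are produced and retrieved to reduce that computation.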
  • Keywords
    feedforward neural nets; learning (artificial intelligence); RAN-LTM; approximation error; computations reduction; feedforward neural network; incremental learning; input-output relationships; long-term memory; resource allocating network; training speed; Buffer storage; Computer networks; Feedforward neural networks; Information retrieval; Intelligent networks; Interference suppression; Learning systems; Neural networks; Radio access networks; Training data
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
  • Conference_Location
    Washington, DC
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7044-9
  • Type
    conf
  • DOI
    10.1109/IJCNN.2001.938469
  • Filename
    938469