Title :
Mitigation of catastrophic forgetting in recurrent neural networks using a Fixed Expansion Layer
Author :
Coop, Robert ; Arel, Itamar
Author_Institution :
Min H. Kao Department of Electrical Engineering and Computer Science, University of Tennessee, Knoxville, TN, USA
Abstract :
Catastrophic forgetting (or catastrophic interference) in supervised learning systems is the drastic loss of previously stored information caused by the learning of new information. While substantial work has been published on addressing catastrophic forgetting in memoryless supervised learning systems (e.g., feedforward neural networks), the problem has received limited attention in the context of dynamic systems, particularly recurrent neural networks (RNNs). In this paper, we introduce a solution for mitigating catastrophic forgetting in RNNs that enhances the Fixed Expansion Layer (FEL) neural network, which exploits sparse coding of hidden-neuron activations. Simulation results on several non-stationary data sets clearly demonstrate the effectiveness of the proposed architecture.
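Illustrative_Sketch :
The abstract only names the mechanism (a fixed expansion layer producing sparse codes of the hidden-neuron activations), so what follows is a minimal sketch under stated assumptions, not the authors' implementation. It assumes an Elman-style RNN, fixed random expansion weights (W_fel), k-winners-take-all sparsification as a stand-in for the paper's sparse-coding scheme, and delta-rule training of the output weights only; all layer sizes, the sparsity level k_active, and the learning rate are illustrative placeholders.

import numpy as np

rng = np.random.default_rng(0)

# Assumed, illustrative dimensions (not taken from the paper).
n_in, n_hid, n_fel, n_out = 4, 16, 128, 2
k_active = 8  # FEL units allowed to fire per step (assumed sparsity level)

# Trainable parameters.
W_in  = rng.normal(0.0, 0.1, (n_hid, n_in))   # input -> hidden
W_rec = rng.normal(0.0, 0.1, (n_hid, n_hid))  # hidden -> hidden (recurrent)
W_out = np.zeros((n_out, n_fel))              # FEL code -> output

# Fixed expansion weights: drawn once, never updated.
W_fel = rng.normal(0.0, 1.0, (n_fel, n_hid))

def fel_code(h):
    """Sparse code of hidden state h: keep only the k largest FEL
    activations (k-winners-take-all stand-in for the paper's scheme)."""
    a = W_fel @ h
    winners = np.argsort(a)[-k_active:]
    s = np.zeros_like(a)
    s[winners] = np.maximum(a[winners], 0.0)
    return s

def forward(x_seq):
    """Run the recurrent network over a sequence of input vectors."""
    h = np.zeros(n_hid)
    outputs, codes = [], []
    for x in x_seq:
        h = np.tanh(W_in @ x + W_rec @ h)  # ordinary Elman-style update
        s = fel_code(h)                    # sparse expansion of the state
        outputs.append(W_out @ s)
        codes.append(s)
    return outputs, codes

def train_step(x_seq, y_seq, lr=0.05):
    """Delta-rule update of the output weights only (a simplification).
    Because each sparse code activates few FEL units, updates for new
    data overwrite little of what earlier data stored, which is the
    intended interference-mitigation mechanism."""
    global W_out
    outputs, codes = forward(x_seq)
    for y_hat, y, s in zip(outputs, y_seq, codes):
        W_out += lr * np.outer(y - y_hat, s)

# Toy usage: fit a short random sequence-to-sequence regression.
x_seq = [rng.normal(size=n_in) for _ in range(5)]
y_seq = [rng.normal(size=n_out) for _ in range(5)]
for _ in range(200):
    train_step(x_seq, y_seq)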
Keywords :
feedforward neural nets; learning (artificial intelligence); recurrent neural nets; FEL neural network; RNN; catastrophic forgetting mitigation; catastrophic interference; dynamic systems; feedforward neural networks; fixed expansion layer; hidden neuron activations; memoryless supervised learning systems; nonstationary data sets; recurrent neural networks; sparse coding; Biological neural networks; Context; Feedforward neural networks; Interference; Neurons; Recurrent neural networks; Training
Conference_Title :
The 2013 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Dallas, TX, USA
Print_ISBN :
978-1-4673-6128-6
DOI :
10.1109/IJCNN.2013.6707047