DocumentCode :
3206406
Title :
Mitigation of catastrophic interference in neural networks using a fixed expansion layer
Author :
Coop, Robert ; Arel, Itamar
Author_Institution :
Machine Intell. Lab., Univ. of Tennessee, Knoxville, TN, USA
fYear :
2012
fDate :
5-8 Aug. 2012
Firstpage :
726
Lastpage :
729
Abstract :
In this paper we present the fixed expansion layer (FEL) feedforward neural network designed for balancing plasticity and stability in the presence of non-stationary inputs. Catastrophic interference (or catastrophic forgetting) refers to the drastic loss of previously learned information when a neural network is trained on new or different information. The goal of the FEL network is to reduce the effect of catastrophic interference by augmenting a multilayer perceptron with a layer of sparse neurons with binary activations. We compare the FEL network's performance to that of other algorithms designed to combat the effects of catastrophic interference and demonstrate that the FEL network is able to retain information for significantly longer periods of time with substantially lower computational requirements.
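The core idea in the abstract — inserting a fixed, sparsely activated binary layer into a multilayer perceptron so that different inputs activate mostly disjoint sets of units — can be sketched as follows. The layer sizes, the fixed random projection, and the k-winners-take-all binarization below are illustrative assumptions, not the paper's exact construction:

```python
import random

def fel_forward(x, w_fel, k):
    # Sketch of a fixed expansion layer: project the input through
    # fixed (never-trained) random weights, then keep only the k
    # strongest units active, emitting a sparse binary code.
    pre = [sum(wi * xi for wi, xi in zip(row, x)) for row in w_fel]
    top_k = set(sorted(range(len(pre)), key=lambda i: pre[i])[-k:])
    return [1.0 if i in top_k else 0.0 for i in range(len(pre))]

random.seed(0)
n_in, n_exp, k = 8, 64, 6  # hypothetical layer sizes
w_fel = [[random.gauss(0, 1) for _ in range(n_in)] for _ in range(n_exp)]
x = [random.gauss(0, 1) for _ in range(n_in)]
h = fel_forward(x, w_fel, k)
# exactly k of the n_exp expansion units fire, and outputs are binary
```

Because the sparse binary code overlaps little between dissimilar inputs, weight updates driven by new data disturb few of the units that encode old data, which is the intuition behind mitigating catastrophic interference.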
Keywords :
multilayer perceptrons; binary activations; catastrophic forgetting; catastrophic interference; fixed expansion layer feedforward neural network; multilayer perceptron; non-stationary inputs; sparse neurons; Accuracy; Biological neural networks; Feedforward neural networks; Interference; Neurons; Training
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Circuits and Systems (MWSCAS), 2012 IEEE 55th International Midwest Symposium on
Conference_Location :
Boise, ID
ISSN :
1548-3746
Print_ISBN :
978-1-4673-2526-4
Electronic_ISBN :
1548-3746
Type :
conf
DOI :
10.1109/MWSCAS.2012.6292123
Filename :
6292123