DocumentCode :
3661107
Title :
Reduction of catastrophic forgetting with transfer learning and ternary output codes
Author :
Steven Gutstein;Ethan Stump
Author_Institution :
United States Army Research Lab, Adelphi, MD, USA
fYear :
2015
fDate :
7/1/2015 12:00:00 AM
Firstpage :
1
Lastpage :
8
Abstract :
Historically, neural nets have learned new things at the cost of forgetting what they already know. This problem is known as 'catastrophic forgetting'. Here, we examine how training a neural net in accordance with latently learned [1] output encodings drastically reduces catastrophic forgetting. Previous approaches to dealing with catastrophic forgetting have tended either to add extra samples to new training sets, to modify the training of hidden nodes, or to model the interaction between short-term and long-term memory. Our approach is unique in that it both uses transfer learning to mitigate catastrophic forgetting and focuses upon the output nodes of a neural network. This results in a technique that makes it easier, rather than harder, to learn new tasks while retaining existing knowledge; it is architecture independent and trivial to implement on any existing net. Additionally, we examine the use of ternary output codes. Binary codes assign a value to each output bit that may be thought of as either affirmative or negative. Ternary codes allow for the possibility that not every output bit has a meaningful response to every given input. By not forcing each output bit to train for a specific response for each new class, we hope to lessen catastrophic forgetting.
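(The abstract does not specify an implementation, but the "don't care" output bits of a ternary code can be illustrated with a masked loss. The sketch below is an illustrative assumption, not the authors' code: it uses a {-1, 0, +1} encoding in which 0 marks a bit that contributes no error signal, so that output node is not forced to train a specific response for the class.)

```python
import numpy as np

def ternary_code_loss(outputs, targets):
    """Masked squared-error loss for ternary target codes.

    targets take values in {-1, 0, +1}: +1/-1 are affirmative/negative
    bits, while 0 marks a "don't care" bit that is excluded from the
    error, so the corresponding output node receives no training
    pressure for this class.
    """
    mask = (targets != 0).astype(float)   # 1 where the bit is meaningful
    err = (outputs - targets) * mask      # zero out don't-care positions
    n = max(mask.sum(), 1.0)              # avoid division by zero
    return (err ** 2).sum() / n

# Example: a 6-bit ternary code for one class; bits 3 and 6 are "don't care",
# so only the remaining four bits contribute to the loss.
target = np.array([1.0, -1.0, 0.0, 1.0, -1.0, 0.0])
output = np.array([0.8, -0.6, 0.3, 0.9, -0.2, -0.7])
print(ternary_code_loss(output, target))
```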
Keywords :
Training
Publisher :
ieee
Conference_Titel :
2015 International Joint Conference on Neural Networks (IJCNN)
Electronic_ISBN :
2161-4407
Type :
conf
DOI :
10.1109/IJCNN.2015.7280416
Filename :
7280416