DocumentCode :
1509926
Title :
Incremental learning of complex temporal patterns
Author :
Wang, DeLiang ; Yuwono, Budi
Author_Institution :
Lab. for Artificial Intelligence Res., Ohio State Univ., Columbus, OH, USA
Volume :
7
Issue :
6
fYear :
1996
fDate :
1 November 1996
Firstpage :
1465
Lastpage :
1481
Abstract :
A neural model for temporal pattern generation is used and analyzed for training with multiple complex sequences in a sequential manner. The network exhibits some degree of interference when new sequences are acquired. It is proven that the model is capable of incrementally learning a finite number of complex sequences. The model is then evaluated with a large set of highly correlated sequences. While the number of intact sequences increases linearly with the number of previously acquired sequences, the amount of retraining due to interference appears to be independent of the size of existing memory. The model is extended to include a chunking network, which detects repeated subsequences between and within sequences. The chunking mechanism substantially reduces the amount of retraining in sequential training. Thus, the network investigated here constitutes an effective sequential memory. Various aspects of such a memory are discussed.
Keywords :
learning (artificial intelligence); neural nets; chunking network; complex temporal patterns; highly correlated sequences; incremental learning; multiple complex sequences; neural model; repeated subsequence detection; temporal pattern generation; Associative memory; Biological system modeling; Detectors; Encoding; Information science; Interference; Multilayer perceptrons; Natural languages; Pattern analysis; Speech
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.548174
Filename :
548174