DocumentCode :
2629091
Title :
Adaptive history compression for learning to divide and conquer
Author :
Schmidhuber, Jürgen
Author_Institution :
Dept. of Comput. Sci., Colorado Univ., Boulder, CO, USA
fYear :
1991
fDate :
18-21 Nov 1991
Firstpage :
1130
Abstract :
An attempt is made to determine how a system can learn to reduce the descriptions of event sequences without losing information. It is shown that the learning system ought to concentrate on unexpected inputs and ignore expected ones. This insight leads to the construction of neural systems that learn to "divide and conquer" by recursively decomposing sequences. The first system creates a self-organizing multilevel hierarchy of recurrent predictors. The second system involves only two recurrent networks: it tries to collapse a multilevel predictor hierarchy into a single recurrent net. Experiments show that the system can require less computation per time step and far fewer training sequences than conventional training algorithms for recurrent nets.
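The core idea in the abstract, keep only the inputs a predictor fails to anticipate, so that the sequence passed to the next level becomes shorter, can be sketched as follows. This is a minimal toy illustration using a frequency-based bigram predictor rather than the paper's recurrent networks; the function name and structure are hypothetical:

```python
from collections import defaultdict, Counter

def compress(sequence):
    """Toy predictive history compression: train a next-symbol
    predictor online and keep only the symbols it mispredicts
    (the 'unexpected' inputs). The returned list is the shorter
    description handed to the next level of the hierarchy."""
    counts = defaultdict(Counter)  # counts[prev][nxt] = observed frequency
    unexpected = []
    prev = None
    for sym in sequence:
        # Predict the most frequent successor of the previous symbol, if any.
        predicted = counts[prev].most_common(1)[0][0] if counts[prev] else None
        if predicted != sym:
            unexpected.append(sym)  # surprising input: must be kept
        counts[prev][sym] += 1      # online update of the predictor
        prev = sym
    return unexpected
```

On a regular sequence such as `"abababab"` only the first few symbols are surprising; once the predictor has seen each transition, the rest are expected and dropped, so the compressed description stays short no matter how long the sequence grows.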
Keywords :
learning systems; neural nets; self-adjusting systems; adaptive history compression learning; event sequence description; learning system; neural nets; recurrent networks; recurrent predictors; recursively composing sequences; self-organizing multilevel hierarchy; Computer science; History; Learning systems; Prediction algorithms; Recurrent neural networks;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1991 IEEE International Joint Conference on Neural Networks (IJCNN 1991)
Print_ISBN :
0-7803-0227-3
Type :
conf
DOI :
10.1109/IJCNN.1991.170548
Filename :
170548