DocumentCode :
423973
Title :
Recursive PCA and the structure of time series
Author :
Voegtlin, Thomas
Author_Institution :
Inst. for Theor. Biol., Humboldt Univ., Berlin, Germany
Volume :
3
fYear :
2004
fDate :
25-29 July 2004
Firstpage :
1893
Abstract :
A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated with its input sequence. The operation performed by the network is a generalization of principal component analysis (PCA) to time series, called recursive PCA. During learning, the weights of the network are adapted to the temporal statistics of its input, in a way that maximizes the information retained by the network. Sequences stored in the network may be retrieved in the reverse order of presentation, thus providing a straightforward implementation of a logical stack.
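The following is a minimal sketch of the idea described in the abstract: a linear network whose output feeds back as temporal context, trained with Oja's subspace rule. The state update z = [x; alpha*y] and all parameter names (alpha, eta, n_components) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_components = 4, 3   # input dimension, number of principal directions
alpha, eta = 0.5, 0.01      # context weighting and learning rate (assumed values)

# Weights map the concatenated (current input, previous output) to the output,
# so the output encodes both the input and its temporal context.
W = rng.normal(scale=0.1, size=(n_components, n_in + n_components))

y = np.zeros(n_components)               # recurrent state (temporal context)
for t in range(10000):
    x = rng.normal(size=n_in)            # stand-in for a real time-series sample
    z = np.concatenate([x, alpha * y])   # current input plus weighted context
    y = W @ z                            # linear network output
    # Oja's subspace rule: Hebbian term minus a decay that keeps W near-orthonormal
    W += eta * (np.outer(y, z) - np.outer(y, y) @ W)
```

Because each output is a linear function of the input and the previous output, unrolling the recurrence in reverse approximately recovers past inputs from the current state, which is the sense in which the network behaves like a stack.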
Keywords :
Hebbian learning; generalisation (artificial intelligence); principal component analysis; recurrent neural nets; recursive estimation; time series; Oja constrained Hebbian learning rule; generalization; principal components analysis; recurrent linear network; recursive PCA; statistics; temporal context representation; time series; Constraint theory; Electronic mail; Encoding; Neural networks; Neurons; Principal component analysis; Sequences; Statistics; Unsupervised learning; Vectors;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the 2004 IEEE International Joint Conference on Neural Networks
ISSN :
1098-7576
Print_ISBN :
0-7803-8359-1
Type :
conf
DOI :
10.1109/IJCNN.2004.1380899
Filename :
1380899