Title :
Recursive PCA and the structure of time series
Author :
Voegtlin, Thomas
Author_Institution :
Inst. for Theor. Biol., Humboldt Univ., Berlin, Germany
Abstract :
A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated with its input sequence. The operation performed by the network is a generalization of principal component analysis (PCA) to time series, called recursive PCA. During learning, the weights of the network are adapted to the temporal statistics of its input, in a way that maximizes the information retained by the network. Sequences stored in the network may be retrieved in the reverse order of presentation, thus providing a straightforward implementation of a logical stack.
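The mechanism the abstract describes can be illustrated with a minimal sketch: the network state at each step is a linear projection of the current input concatenated with the previous state (the temporal context), and the weights are updated with Oja's subspace rule. The dimensions, learning rate, and input distribution below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def oja_step(W, z, eta=0.01):
    """One step of Oja's constrained Hebbian (subspace) rule:
    dW = eta * (y z^T - y y^T W), which keeps the rows of W
    approximately orthonormal while extracting principal directions."""
    y = W @ z
    W += eta * (np.outer(y, z) - np.outer(y, y) @ W)
    return y

rng = np.random.default_rng(0)
n_in, n_state = 3, 2  # illustrative sizes, not from the paper

# Weights act on the joint vector [input; previous state].
W = rng.normal(scale=0.1, size=(n_state, n_in + n_state))

y = np.zeros(n_state)
for t in range(2000):
    x = rng.normal(size=n_in)          # placeholder input sequence
    z = np.concatenate([x, y])         # current input plus temporal context
    y = oja_step(W, z)

# After training, the rows of W are roughly orthonormal,
# so W @ W.T should be close to the identity matrix.
print(np.round(W @ W.T, 2))
```

Because the state is fed back into the next projection, the learned components capture temporal structure rather than only instantaneous correlations; this is the sense in which the rule generalizes PCA to time series.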
Keywords :
Hebbian learning; generalisation (artificial intelligence); principal component analysis; recurrent neural nets; recursive estimation; time series; Oja constrained Hebbian learning rule; recurrent linear network; recursive PCA; temporal context representation; Constraint theory; Electronic mail; Encoding; Neural networks; Neurons; Sequences; Statistics; Unsupervised learning; Vectors;
Conference_Titel :
Proceedings of the 2004 IEEE International Joint Conference on Neural Networks (IJCNN 2004)
Print_ISBN :
0-7803-8359-1
DOI :
10.1109/IJCNN.2004.1380899