DocumentCode :
2971063
Title :
Nonlinear generalizations of principal component learning algorithms
Author :
Karhunen, Juha ; Joutsensalo, Jyrki
Author_Institution :
Lab. of Comput. & Inf. Sci., Helsinki Univ. of Technol., Espoo, Finland
Volume :
3
fYear :
1993
fDate :
25-29 Oct. 1993
Firstpage :
2599
Abstract :
In this paper, we introduce and study nonlinear generalizations of several neural algorithms that learn the principal eigenvectors of the data covariance matrix. We first consider the robust versions that optimize a nonquadratic criterion under orthonormality constraints. As an important byproduct, Sanger's GHA and Oja's SGA algorithms for learning principal components are derived from a natural optimization problem. We also introduce a fully nonlinear generalization that has signal separation capabilities not possessed by standard principal component analysis learning algorithms.
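The abstract's core idea, taking Sanger's GHA update and inserting a nonlinearity on the neuron outputs, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the choice of `g = tanh` and the learning rate are assumptions, and with the identity nonlinearity the rule reduces to the standard linear GHA.

```python
import numpy as np

def nonlinear_gha_step(W, x, mu=0.01, g=np.tanh):
    """One stochastic update of a nonlinear Generalized Hebbian Algorithm.

    W  : (k, d) weight matrix; rows approximate principal directions.
    x  : (d,) zero-mean input sample.
    mu : learning rate (assumed value).
    g  : nonlinearity on the outputs; g = identity recovers Sanger's
         linear GHA, which learns the top-k principal eigenvectors.
    """
    y = g(W @ x)  # (possibly nonlinear) neuron outputs
    # The lower-triangular term deflates earlier components,
    # enforcing an approximate Gram-Schmidt orthogonalization.
    W += mu * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W
```

With `g` set to the identity and data drawn from a fixed covariance, the first row converges toward the principal eigenvector; a nonquadratic `g` instead yields the robust/signal-separating behavior the abstract refers to.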
Keywords :
covariance matrices; eigenvalues and eigenfunctions; generalisation (artificial intelligence); learning (artificial intelligence); neural nets; optimisation; Oja SGA algorithm; Sanger GHA algorithm; data covariance matrix; neural algorithms; neural networks; nonlinear generalizations; optimization; orthonormality constraints; principal component analysis; principal component learning; principal eigenvectors; Covariance matrix; Electronics packaging; Information science; Laboratories; Neurons; Principal component analysis; Robustness; Signal processing; Signal processing algorithms; Vectors;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
Print_ISBN :
0-7803-1421-2
Type :
conf
DOI :
10.1109/IJCNN.1993.714256
Filename :
714256