Title :
An adaptive learning algorithm for principal component analysis
Author :
Chen, Liang-Hwa ; Chang, Shyang
Author_Institution :
Dept. of Electr. Eng., Nat. Tsing Hua Univ., Hsinchu, Taiwan
Date :
9/1/1995
Abstract :
Principal component analysis (PCA) is one of the most general-purpose feature extraction methods, and a variety of learning algorithms for PCA have been proposed. Many conventional algorithms, however, will either diverge or converge very slowly if the learning rate parameters are not chosen properly. In this paper, an adaptive learning algorithm (ALA) for PCA is proposed. By adaptively selecting the learning rate parameters, we show that the m weight vectors in the ALA converge to the first m principal component vectors at almost the same rate. Compared with Sanger's generalized Hebbian algorithm (GHA), the ALA can quickly find the desired principal component vectors in cases where the GHA fails to do so. Finally, simulation results are included to illustrate the effectiveness of the ALA.
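For orientation, the sketch below contrasts a single Sanger GHA update with a variant that scales the learning rate per component. The per-component rule eta_i = beta / lambda_hat_i, where lambda_hat_i is a running estimate of the i-th output variance, is an illustrative assumption; the paper's exact ALA update is not reproduced in the abstract.

    # Minimal sketch: Sanger's GHA vs. a GHA step with per-component
    # adaptive learning rates.  The adaptive rule eta_i = beta / lam_hat_i
    # is an assumed stand-in for the paper's ALA, used only to illustrate
    # the idea of equalizing convergence rates across components.
    import numpy as np

    def gha_step(W, x, eta):
        """One Sanger GHA update: dW = eta * (y x^T - LT(y y^T) W)."""
        y = W @ x
        return W + eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

    def adaptive_step(W, x, lam_hat, beta=0.01, rho=0.99):
        """GHA update with per-component rate eta_i = beta / lam_hat_i (assumed rule)."""
        y = W @ x
        lam_hat = rho * lam_hat + (1.0 - rho) * y**2      # running output variances
        eta = beta / np.maximum(lam_hat, 1e-8)            # weaker components get larger rates
        dW = np.outer(y, x) - np.tril(np.outer(y, y)) @ W
        return W + eta[:, None] * dW, lam_hat

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        n, m = 5, 2
        C = np.diag([5.0, 2.0, 1.0, 0.5, 0.1])            # known covariance for the demo
        W = rng.normal(scale=0.1, size=(m, n))
        lam_hat = np.ones(m)
        for _ in range(20000):
            x = rng.multivariate_normal(np.zeros(n), C)
            W, lam_hat = adaptive_step(W, x, lam_hat)
        print(np.round(W, 2))  # rows should align with the leading eigenvectors of C

Because the rate for each weight vector is normalized by that component's estimated variance, the slower minor components are not starved by a rate tuned to the dominant eigenvalue, which is the kind of behavior the abstract attributes to the ALA.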
Keywords :
feature extraction; learning (artificial intelligence); neural nets; adaptive learning algorithm; feature extraction methods; generalized Hebbian algorithm; principal component analysis; principal component vectors; Covariance matrix; Data mining; Eigenvalues and eigenfunctions; Feature extraction; Hardware; Information processing; Neural networks; Pattern recognition; Principal component analysis; Very large scale integration;
Journal_Title :
IEEE Transactions on Neural Networks