DocumentCode :
1144542
Title :
An adaptive learning algorithm for principal component analysis
Author :
Chen, Liang-Hwa ; Chang, Shyang
Author_Institution :
Dept. of Electr. Eng., Nat. Tsing Hua Univ., Hsinchu, Taiwan
Volume :
6
Issue :
5
fYear :
1995
fDate :
9/1/1995 12:00:00 AM
Firstpage :
1255
Lastpage :
1263
Abstract :
Principal component analysis (PCA) is one of the most general-purpose feature extraction methods. A variety of learning algorithms for PCA have been proposed. Many conventional algorithms, however, will either diverge or converge very slowly if the learning rate parameters are not properly chosen. In this paper, an adaptive learning algorithm (ALA) for PCA is proposed. By adaptively selecting the learning rate parameters, we show that the m weight vectors in the ALA converge to the first m principal component vectors at almost the same rates. Compared with Sanger's generalized Hebbian algorithm (GHA), the ALA can quickly find the desired principal component vectors while the GHA fails to do so. Finally, simulation results are included to illustrate the effectiveness of the ALA.
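The abstract contrasts the ALA with Sanger's GHA, whose weight rows converge to the leading principal components. Below is a minimal NumPy sketch of the GHA update combined with a per-neuron adaptive learning rate driven by running eigenvalue estimates; the specific rate formula and the toy data are assumptions for illustration, not the paper's exact ALA rule.

```python
import numpy as np

# Toy demo: Sanger's generalized Hebbian algorithm (GHA) with a
# per-neuron adaptive learning rate. The inverse-eigenvalue-estimate
# rate is an assumption in the spirit of the ALA, not the paper's rule.
rng = np.random.default_rng(0)
n, m = 3, 2                           # input dim, number of components
stds = np.array([3.0, 1.0, 0.3])      # assumed axis-aligned toy covariance
W = 0.1 * rng.standard_normal((m, n)) # rows -> top-m principal directions
lam = np.ones(m)                      # running estimates of output variances

for t in range(5000):
    x = stds * rng.standard_normal(n)          # zero-mean sample
    y = W @ x                                  # (m,) neuron outputs
    lam = 0.999 * lam + 0.001 * y**2           # eigenvalue estimates
    eta = 0.05 / (lam + 1.0)                   # adaptive per-neuron rate
    # Sanger's rule: Hebbian term minus lower-triangular deflation term
    W += eta[:, None] * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

w1 = W[0] / np.linalg.norm(W[0])
# |w1[0]| should be close to 1: the first row has aligned with the
# dominant principal axis of the toy data.
```

Scaling each neuron's rate inversely with its estimated eigenvalue is what lets all m weight vectors converge at comparable speeds, since the raw Hebbian update is otherwise much stronger along high-variance directions.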
Keywords :
feature extraction; learning (artificial intelligence); neural nets; adaptive learning algorithm; feature extraction methods; generalized Hebbian algorithm; principal component analysis; principal component vectors; Covariance matrix; Data mining; Eigenvalues and eigenfunctions; Feature extraction; Hardware; Information processing; Neural networks; Pattern recognition; Principal component analysis; Very large scale integration;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.410369
Filename :
410369