DocumentCode :
1948219
Title :
Optimal Learning Rates for Some Principal Component Analysis Algorithms
Author :
Chiu, Shih-Yu ; Lan, Leu-Shing ; Hwang, Yu-Cheng
Year :
2007
Date :
12-17 Aug. 2007
Firstpage :
2223
Lastpage :
2226
Abstract :
Principal component analysis (PCA) has proven highly effective for extracting the most useful information from a given sequence of observations, and a large number of PCA methods can be found in the literature. In this work, we concentrate on deriving optimal learning rates for several well-known adaptive PCA algorithms. A detailed derivation procedure is presented that yields closed-form formulae for the optimal learning rates. These optimal learning rates are obtained as the solutions of quadratic or cubic equations that can be solved analytically, so no numerical procedures are needed. The key advantage of the optimal learning rate is that it provides a principled mechanism for automatically adjusting the learning step size.
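To illustrate the general idea, below is a minimal, hypothetical Python/NumPy sketch; it is not the paper's algorithms or closed-form formulae. It shows a single-component Oja-type PCA rule whose learning rate is chosen at every step as the analytically solvable root of a quadratic equation, here obtained from an exact line search on the Rayleigh quotient of a running covariance estimate. The update direction, the line-search criterion, and all names (optimal_step, C_hat, etc.) are assumptions made for illustration only.

# Illustrative sketch only: per-step learning rate obtained as the root of
# a quadratic equation (solved in closed form, no iterative procedure),
# applied to an Oja-type single-component PCA update.
import numpy as np

def optimal_step(w, d, C):
    """Step size maximizing the Rayleigh quotient of w + eta*d w.r.t. C.

    With R(eta) = (w+eta*d)^T C (w+eta*d) / ||w+eta*d||^2, setting
    dR/deta = 0 reduces to the quadratic
        (D*q - B*r) eta^2 + (D*p - A*r) eta + (B*p - A*q) = 0.
    """
    A, B, D = w @ C @ w, w @ C @ d, d @ C @ d
    p, q, r = w @ w, w @ d, d @ d
    a2, a1, a0 = D * q - B * r, D * p - A * r, B * p - A * q
    if abs(a2) < 1e-12:                       # degenerate: linear equation
        return -a0 / a1 if abs(a1) > 1e-12 else 0.0
    disc = np.sqrt(max(a1 * a1 - 4.0 * a2 * a0, 0.0))
    roots = [(-a1 + disc) / (2.0 * a2), (-a1 - disc) / (2.0 * a2)]

    def rayleigh(eta):
        v = w + eta * d
        return (v @ C @ v) / (v @ v)

    return max(roots, key=rayleigh)           # keep the maximizing root

# Usage: track the leading principal component of a data stream.
rng = np.random.default_rng(0)
true_C = np.array([[3.0, 1.0], [1.0, 1.0]])
L = np.linalg.cholesky(true_C)
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
C_hat = np.eye(2)                             # running covariance estimate
for t in range(1, 3001):
    x = L @ rng.standard_normal(2)
    C_hat += (np.outer(x, x) - C_hat) / t     # sample-mean covariance update
    y = w @ x
    d = y * (x - y * w)                       # Oja update direction
    w = w + optimal_step(w, d, C_hat) * d
    w /= np.linalg.norm(w)                    # keep unit norm
print("estimated PC1 (up to sign):", w)
print("true PC1                  :", np.linalg.eigh(true_C)[1][:, -1])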
Keywords :
learning (artificial intelligence); principal component analysis; cubic equations; optimal learning rates; principal component analysis algorithms; quadratic equations; Adaptive algorithm; Convergence; Data mining; Data visualization; Equations; Information theory; Lagrangian functions; Neural networks; Principal component analysis; Stochastic processes
Language :
English
Publisher :
IEEE
Conference_Titel :
2007 International Joint Conference on Neural Networks (IJCNN 2007)
Conference_Location :
Orlando, FL
ISSN :
1098-7576
Print_ISBN :
978-1-4244-1379-9
Electronic_ISBN :
1098-7576
Type :
conf
DOI :
10.1109/IJCNN.2007.4371303
Filename :
4371303