Title of article :
Global Convergence of a PCA Learning Algorithm with a Constant Learning Rate
Author/Authors :
Jian Cheng Lv, Author; Zhang Yi, Author
Issue Information :
Biweekly, serially numbered, 2006
Pages :
14
From page :
1425
To page :
1438
Abstract :
In most existing principal component analysis (PCA) learning algorithms, the learning rate is required to approach zero as the learning step increases. In many practical applications, however, constant learning rates must be used, owing to computational round-off limitations and tracking requirements. This paper proposes a PCA learning algorithm with a constant learning rate and proves, via the deterministic discrete time (DDT) method, that the algorithm is globally convergent. Simulations are carried out to illustrate the theory.
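The record does not give the paper's exact update rule, so the following is only an illustrative sketch of the general setting: a Hebbian-type PCA learning rule (here, the classic single-unit Oja rule, used as a stand-in) driven with a constant learning rate rather than a decaying one.

```python
import numpy as np

# Sketch only: the paper's specific algorithm is not reproduced here.
# This runs Oja's single-unit PCA rule with a CONSTANT learning rate eta,
# illustrating the setting the abstract describes.
rng = np.random.default_rng(0)

# Synthetic zero-mean data with a known principal direction.
C = np.array([[3.0, 1.0],
              [1.0, 1.0]])                     # population covariance
X = rng.multivariate_normal([0.0, 0.0], C, size=5000)

eta = 0.01                                     # constant learning rate
w = rng.normal(size=2)
w /= np.linalg.norm(w)

for x in X:
    y = w @ x
    w += eta * y * (x - y * w)                 # Oja's rule

# Compare the learned weight with the leading eigenvector
# of the sample covariance matrix.
evals, evecs = np.linalg.eigh(X.T @ X / len(X))
v = evecs[:, -1]                               # eigh sorts eigenvalues ascending
alignment = abs(w @ v) / np.linalg.norm(w)
print(alignment)
```

With a constant learning rate the weight vector does not settle exactly but hovers near the principal eigenvector, so the printed alignment stays close to 1; analyses such as the DDT method mentioned in the abstract study the deterministic counterpart of this stochastic iteration.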
Keywords :
Neural networks, Global convergence, Principal component analysis, Constant learning rate, Deterministic discrete time system
Journal title :
Computers and Mathematics with Applications
Serial Year :
2006
Record number :
920578