Title :
Global Convergence of GHA Learning Algorithm With Nonzero-Approaching Adaptive Learning Rates
Author :
Lv, Jian Cheng ; Yi, Zhang ; Tan, Kok Kiong
Author_Institution :
University of Electronic Science and Technology of China, Chengdu
Abstract :
The generalized Hebbian algorithm (GHA) is one of the most widely used principal component analysis (PCA) neural network (NN) learning algorithms. The learning rates of GHA play an important role in the convergence of the algorithm in practical applications. Traditionally, the learning rates of GHA are required to converge to zero so that its convergence can be analyzed by studying the corresponding deterministic continuous-time (DCT) equations. However, requiring the learning rates to approach zero is impractical in applications because of computational roundoff limitations and tracking requirements. In this paper, nonzero-approaching adaptive learning rates are proposed to overcome this problem. The proposed adaptive learning rates converge to positive constants, which not only speeds up the evolution of the algorithm considerably but also guarantees the global convergence of GHA. The convergence is studied in detail by analyzing the corresponding deterministic discrete-time (DDT) equations. Extensive simulations are carried out to illustrate the theory.
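As a rough illustration of the setting discussed in the abstract, the sketch below implements a standard GHA update together with a learning rate that decays toward a positive constant instead of toward zero. The schedule eta_schedule is a hypothetical example chosen only to illustrate the nonzero-approaching property; it is not the adaptive learning rate proposed in the paper.

import numpy as np

def gha_step(W, x, eta):
    # One GHA update: W <- W + eta * (y x^T - LT[y y^T] W), with y = W x,
    # where LT[.] keeps the lower-triangular part.
    y = W @ x
    lt = np.tril(np.outer(y, y))
    return W + eta * (np.outer(y, x) - lt @ W)

def eta_schedule(k, eta0=0.5, eta_inf=0.05):
    # Illustrative nonzero-approaching schedule (assumption, not the paper's
    # adaptive rule): eta(k) decreases but converges to eta_inf > 0.
    return eta_inf + (eta0 - eta_inf) / (k + 1)

rng = np.random.default_rng(0)
C = np.diag([5.0, 3.0, 1.0, 0.5])                      # covariance with a clear principal subspace
X = rng.multivariate_normal(np.zeros(4), C, size=5000)  # synthetic zero-mean data

W = 0.1 * rng.standard_normal((2, 4))                  # extract the top-2 principal directions
for k, x in enumerate(X):
    W = gha_step(W, x, eta_schedule(k))

print(np.round(W, 2))  # rows approximate the leading eigenvectors of C

Because the learning rate stays bounded away from zero, the weights keep adapting at a nonvanishing speed, which is the practical motivation (tracking and roundoff) that the paper addresses by proving global convergence under such rates.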
Keywords :
Hebbian learning; continuous time systems; discrete time systems; principal component analysis (PCA); computational roundoff limitations; deterministic continuous-time (DCT) equations; deterministic discrete-time (DDT) equations; generalized Hebbian algorithm (GHA); global convergence; nonzero-approaching adaptive learning rates; neural networks (NNs);
Journal_Title :
IEEE Transactions on Neural Networks
DOI :
10.1109/TNN.2007.895824