DocumentCode :
1089787
Title :
A symmetric linear neural network that learns principal components and their variances
Author :
Peper, Ferdinand ; Noda, Hideki
Author_Institution :
Commun. Res. Lab., Japanese Minist. of Posts & Telecommun., Kobe, Japan
Volume :
7
Issue :
4
fYear :
1996
fDate :
7/1/1996
Firstpage :
1042
Lastpage :
1047
Abstract :
This paper proposes a linear neural network for principal component analysis whose weight vector lengths converge to the variances of the principal components in the input data. The neural network breaks the symmetry in its learning process through the differences in weight vector lengths and, unlike other linear neural networks described in the literature, does not need to assume any asymmetries in its structure to extract the principal components. We prove the asymptotic stability of a stationary solution of the network's learning equation. Simulations show that the set of weight vectors converges to this solution. A comparison of convergence speeds shows that in the simulations the proposed neural network is about as fast as Sanger's generalized Hebbian algorithm (GHA) network, the weighted subspace rule network of Oja et al., and Xu's LMSER network (weighted linear version).
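Illustration (not from the paper): the abstract compares the proposed network against single-layer Hebbian PCA rules such as Sanger's GHA. The Python sketch below implements GHA, one of those named baselines, to show the kind of learning the comparison covers; the paper's own symmetric rule is not reproduced here, and the function name, learning rate, and epoch count are illustrative assumptions.

    import numpy as np

    def gha_pca(X, n_components, lr=0.01, n_epochs=50, seed=0):
        # Sanger's generalized Hebbian algorithm (GHA), a baseline named
        # in the abstract. X should be zero-mean (one sample per row).
        rng = np.random.default_rng(seed)
        n_features = X.shape[1]
        W = rng.normal(scale=0.1, size=(n_components, n_features))
        for _ in range(n_epochs):
            for x in X:
                y = W @ x  # component outputs
                # Hebbian term minus lower-triangular decorrelation term
                W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
        return W  # rows approximate the leading principal directions

With zero-mean data and a small learning rate, the rows of the returned matrix converge approximately to unit-length principal directions; by contrast, the network proposed in the paper is designed so that the weight vector lengths additionally converge to the variances of the corresponding principal components.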
Keywords :
asymptotic stability; learning (artificial intelligence); neural nets; statistical analysis; asymptotic stability; convergence speeds; principal component analysis; symmetric linear neural network; weight vector lengths; Artificial neural networks; Asymptotic stability; Data mining; Decorrelation; Equations; Neural networks; Neurons; Principal component analysis; Telecommunication computing; Vectors;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.508948
Filename :
508948