DocumentCode :
1090026
Title :
Comments on "Principal component extraction using recursive least squares learning"
Author :
Yongfeng Miao
Author_Institution :
Dept. of Electr. & Electron. Eng., Melbourne Univ., Parkville, Vic., Australia
Volume :
7
Issue :
4
fYear :
1996
fDate :
7/1/1996 12:00:00 AM
Firstpage :
1052
Abstract :
In the above paper (Bannour and Azimi-Sadjadi, 1995), we point out and correct flaws in the proofs of the orthonormal property of the optimal weight vectors of a two-layer linear auto-associative network used for sequentially extracting the principal components of a stationary vector stochastic process.
Keywords :
covariance matrices; learning (artificial intelligence); multilayer perceptrons; optimal weight vectors; orthonormal property; principal component extraction; recursive least squares learning; stationary vector stochastic process; two-layer linear auto-associative network; Covariance matrix; Eigenvalues and eigenfunctions; Least squares methods; Multilayer perceptrons; Neural networks; Neurons; Resonance light scattering; Stochastic processes; Vectors;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
ieee
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.508950
Filename :
508950