DocumentCode :
1112571
Title :
Adaptive Principal component EXtraction (APEX) and applications
Author :
Kung, S.Y. ; Diamantaras, K.I. ; Taur, J.S.
Author_Institution :
Dept. of Electr. Eng., Princeton Univ., NJ, USA
Volume :
42
Issue :
5
fYear :
1994
fDate :
5/1/1994
Firstpage :
1202
Lastpage :
1217
Abstract :
The authors describe a neural network model (APEX) for multiple principal component extraction. All the synaptic weights of the model are trained with the normalized Hebbian learning rule. The network structure features a hierarchical set of lateral connections among the output units which serve the purpose of weight orthogonalization. This structure also allows the size of the model to grow or shrink without the need to retrain the old units. The exponential convergence of the network is formally proved, and significant performance improvement over previous methods is obtained. By establishing an important connection with the recursive least squares algorithm, the authors provide the optimal value of the learning step-size parameter, which leads to a significant improvement in convergence speed. This contrasts with previous neural PCA models, which lack such numerical advantages. The APEX algorithm is also parallelizable, allowing the concurrent extraction of multiple principal components. Furthermore, APEX is shown to be applicable to the constrained PCA problem, where the signal variance is maximized under external orthogonality constraints. The authors then study various principal component analysis (PCA) applications that might benefit from the adaptive solution offered by APEX. In particular, they discuss applications in spectral estimation, signal detection, and image compression and filtering, while other application domains are also briefly outlined.
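The following is a minimal sketch, not the paper's exact formulation, of an APEX-style update: each output neuron combines an Oja-type normalized Hebbian update of its feedforward weights with an anti-Hebbian update of the hierarchical lateral weights that decorrelate it from earlier outputs. The function name apex_step, the fixed step size beta, and the synthetic data are illustrative assumptions; the paper instead derives an RLS-related optimal step size.

    import numpy as np

    def apex_step(x, W, C, beta=0.01):
        """One APEX-style update for m output neurons on an input x of dimension n.

        W    : (m, n) feedforward weights, one row per output neuron
        C    : (m, m) strictly lower-triangular lateral weights
        beta : learning step size (a small constant here for illustration;
               the paper relates the optimal choice to recursive least squares)
        """
        m = W.shape[0]
        # Hierarchical outputs: each neuron is laterally inhibited by earlier ones
        y = np.zeros(m)
        for j in range(m):
            y[j] = W[j] @ x - C[j, :j] @ y[:j]
        for j in range(m):
            # Normalized (Oja-type) Hebbian update of feedforward weights
            W[j] += beta * (y[j] * x - y[j] ** 2 * W[j])
            # Anti-Hebbian update of lateral weights; they decay toward zero
            # as the outputs decorrelate (weight orthogonalization)
            C[j, :j] += beta * (y[j] * y[:j] - y[j] ** 2 * C[j, :j])
        return y, W, C

    # Usage sketch on synthetic correlated, zero-mean data (hypothetical example):
    rng = np.random.default_rng(0)
    A = rng.normal(size=(5, 5))          # mixing matrix inducing correlations
    W = 0.1 * rng.normal(size=(3, 5))    # extract 3 components from 5-dim input
    C = np.zeros((3, 3))
    for _ in range(20000):
        x = A @ rng.normal(size=5)
        y, W, C = apex_step(x, W, C)
    # Rows of W approximate the leading eigenvectors of the input covariance.

Because each additional neuron only adds new feedforward and lateral weights while leaving earlier units untouched, the model can grow or shrink without retraining the old units, as the abstract notes.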
Keywords :
Hebbian learning; data compression; filtering and prediction theory; image processing; neural nets; signal detection; signal processing; spectral analysis; APEX; convergence speed; adaptive principal component extraction; exponential convergence; filtering; image compression; learning step-size parameter; multiple principal component extraction; network structure; neural network model; normalized Hebbian learning rule; recursive least squares algorithm; signal detection; spectral estimation; synaptic weights; weight orthogonalization; Autocorrelation; Biological system modeling; Convergence; Hebbian theory; Image coding; Least squares methods; Neural networks; Neurons; Principal component analysis; Signal detection;
fLanguage :
English
Journal_Title :
IEEE Transactions on Signal Processing
Publisher :
ieee
ISSN :
1053-587X
Type :
jour
DOI :
10.1109/78.295198
Filename :
295198