Title :
An alternative proof of convergence for Kung-Diamantaras APEX algorithm
Author :
Chen, H. ; Liu, R.
Author_Institution :
Dept. of Electr. Eng., Notre Dame Univ., IN, USA
Date :
30 Sep-1 Oct 1991
Abstract :
The problem of adaptive principal components extraction (APEX) has gained much interest. In 1990, a new neuro-computation algorithm for this purpose was proposed by S. Y. Kung and K. I. Diamantaras (see ICASSP 90, p.861-4, vol.2, 1990). An alternative proof is presented to illustrate that the K-D algorithm is in fact richer than has been proved before. The proof shows that the neural network converges and that the principal components can be extracted, without assuming that some of the projections of the synaptic weight vectors have diminished to zero. In addition, the authors show that the K-D algorithm converges exponentially.
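The abstract does not reproduce the update rules; the sketch below is only an illustration of the commonly cited APEX-style scheme, in which each new neuron is trained with a Hebbian rule on its feedforward weights and an anti-Hebbian rule on lateral weights to the previously extracted neurons. The function name, parameters, and learning-rate choice are assumptions, not the authors' implementation.

```python
import numpy as np

def apex_extract(X, n_components=2, beta=0.01, n_epochs=50, seed=0):
    """Sketch of an APEX-style sequential principal-component extraction.

    X : (n_samples, n_features) zero-mean data matrix.
    Components are extracted one at a time; each new neuron receives
    lateral inhibition from the neurons trained before it.
    """
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    W = np.zeros((n_components, n_features))      # feedforward weights (one row per neuron)

    for m in range(n_components):
        w = rng.standard_normal(n_features)
        w /= np.linalg.norm(w)
        c = np.zeros(m)                           # lateral (anti-Hebbian) weights to earlier neurons
        for _ in range(n_epochs):
            for x in X:
                y_prev = W[:m] @ x                # outputs of already-trained neurons
                y = w @ x - c @ y_prev            # output with lateral inhibition
                w += beta * (y * x - y**2 * w)    # Hebbian update with self-normalizing decay
                c += beta * (y * y_prev - y**2 * c)  # anti-Hebbian decorrelation update
        W[m] = w / np.linalg.norm(w)
    return W

# Example usage on synthetic zero-mean data:
# X = np.random.default_rng(1).standard_normal((500, 5)) @ np.diag([3.0, 2.0, 1.0, 0.5, 0.1])
# W = apex_extract(X - X.mean(axis=0), n_components=2)
```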
Keywords :
convergence; neural nets; signal processing; Kung-Diamantaras APEX algorithm; adaptive principal components extraction; neural network; neuro-computation algorithm; synaptic weight vectors; Computer networks; Covariance matrix; Eigenvalues and eigenfunctions; Joining processes; Neural networks; Principal component analysis; Signal processing algorithms; Vectors;
Conference_Titel :
Neural Networks for Signal Processing [1991], Proceedings of the 1991 IEEE Workshop
Conference_Location :
Princeton, NJ
Print_ISBN :
0-7803-0118-8
DOI :
10.1109/NNSP.1991.239537