• DocumentCode
    284749
  • Title
    An adaptive approach for optimal data reduction using recursive least squares learning method
  • Author
    Bannour, S.; Azimi-Sadjadi, M.R.
  • Author_Institution
    Dept. of Electr. Eng., Colorado State Univ., Fort Collins, CO, USA
  • Volume
    2
  • fYear
    1992
  • fDate
    23-26 Mar 1992
  • Firstpage
    297
  • Abstract
    An approach is introduced for the recursive computation of the principal components of a vector stochastic process. The neurons of a single-layer perceptron are sequentially trained using a recursive least squares (RLS)-type algorithm to extract the principal components of the input process. Convergence of the weights of the nth neuron to the nth principal component is proved, given that the previous (n-1) training steps have determined the first (n-1) principal components. Simulation results are given to show the accuracy and speed of this algorithm in comparison with previous methods.
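    The sequential scheme the abstract describes can be sketched as follows: train one neuron at a time with an RLS-style adaptive gain, then deflate the data by removing the learned component before training the next neuron. This is a minimal illustrative sketch, not the paper's exact algorithm; the function name `rls_pca`, the scalar power estimate `d`, and the Oja-style weight update are all assumptions made for the example.

    ```python
    import numpy as np

    def rls_pca(X, n_components, forget=1.0, passes=5):
        """Sequential principal-component extraction with an RLS-style gain.

        Each "neuron" (weight vector w) is trained on the deflated data; the
        learning rate is the inverse of an accumulated output-power estimate,
        which plays the role of the RLS gain. Illustrative sketch only.
        """
        n_samples, dim = X.shape
        rng = np.random.default_rng(0)
        components = []
        residual = X.copy()
        for _ in range(n_components):
            w = rng.standard_normal(dim)
            w /= np.linalg.norm(w)
            d = 1.0                               # regularized initial power estimate
            for _ in range(passes):
                for x in residual:
                    y = w @ x                     # neuron output
                    d = forget * d + y * y        # accumulate output power (RLS-style)
                    w += (y / d) * (x - y * w)    # Oja-type update with gain 1/d
                w /= np.linalg.norm(w)            # renormalize once per pass
            components.append(w)
            # deflate: remove the learned component before the next neuron
            residual = residual - np.outer(residual @ w, w)
        return np.array(components)
    ```

    With a decaying effective step size of roughly 1/t, each weight vector settles toward the leading eigenvector of the (deflated) data covariance, mirroring the neuron-by-neuron convergence claim in the abstract.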
  • Keywords
    convergence; learning (artificial intelligence); least squares approximations; neural nets; stochastic processes; adaptive approach; optimal data reduction; recursive least squares learning method; single-layer perceptron; vector stochastic process; Convergence; Eigenvalues and eigenfunctions; Learning systems; Least squares methods; Neural networks; Neurons; Principal component analysis; Resonance light scattering; Stochastic processes; Vectors
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    1992 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP-92)
  • Conference_Location
    San Francisco, CA
  • ISSN
    1520-6149
  • Print_ISBN
    0-7803-0532-9
  • Type
    conf
  • DOI
    10.1109/ICASSP.1992.226061
  • Filename
    226061