• DocumentCode
    290529
  • Title
    Reducing the computational requirement of the orthogonal least squares algorithm
  • Author
    Chng, E.S.; Chen, S.; Mulgrew, B.
  • Author_Institution
    Dept. of Electr. Eng., Edinburgh Univ., UK
  • Volume
    iii
  • fYear
    1994
  • fDate
    19-22 Apr 1994
  • Abstract
    The orthogonal least squares (OLS) algorithm is an efficient implementation of the forward regression procedure for subset model selection. The ability to find good subset parameters with only a linear increase in computational complexity makes this method attractive for practical implementations. We examine the computational requirement of the OLS algorithm to reduce a model of K terms to a subset model of R terms when the number of training data available is N. We show that in the case where N≫K, we can reduce the computational requirement by introducing a unitary transformation on the problem.
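    The idea in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' published implementation: `ols_forward_select` performs greedy forward selection by the error reduction ratio, and the hypothetical helper `compress` illustrates the unitary-transformation idea — a QR factorization of [P | y] preserves all inner products, so when N≫K the selection can run on a (K+1)-row problem instead of an N-row one.

    ```python
    import numpy as np

    def ols_forward_select(P, y, R):
        """Greedy OLS forward selection: choose R columns of P (N x K),
        at each step picking the candidate whose component orthogonal to
        the already-selected terms maximizes the error reduction ratio
        (w.y)^2 / ((w.w)(y.y))."""
        _, K = P.shape
        selected, basis = [], []
        yTy = y @ y
        for _ in range(R):
            best_k, best_err, best_w = None, -1.0, None
            for k in range(K):
                if k in selected:
                    continue
                w = P[:, k].astype(float).copy()
                for wj in basis:  # Gram-Schmidt against chosen terms
                    w -= (wj @ P[:, k]) / (wj @ wj) * wj
                wTw = w @ w
                if wTw < 1e-12:  # numerically dependent candidate
                    continue
                err = (w @ y) ** 2 / (wTw * yTy)
                if err > best_err:
                    best_k, best_err, best_w = k, err, w
            selected.append(best_k)
            basis.append(best_w)
        return selected

    def compress(P, y):
        """Illustrative unitary reduction (an assumption, not the paper's
        exact construction): QR-factorize [P | y] so that the triangular
        factor, with K+1 rows, has exactly the same Gram matrix as the
        original N-row data. Selection depends only on inner products,
        so it is unchanged."""
        _, Rf = np.linalg.qr(np.column_stack([P, y]))
        return Rf[:, :-1], Rf[:, -1]
    ```

    Because the error reduction ratio involves only inner products, running `ols_forward_select` on the compressed (K+1)-row data selects the same subset as running it on the full N-row data, at a fraction of the per-step cost when N≫K.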
  • Keywords
    computational complexity; least squares approximations; parameter estimation; prediction theory; statistical analysis; OLS algorithm; computation requirement; computational complexity; forward regression procedure; nonlinear predictors; orthogonal least squares algorithm; subset model selection; subset parameters; training data; unitary transformation; Degradation; Equations; Least squares methods; Linear regression; Predictive models; Reflection; Systems engineering and theory; Testing; Training data; Vectors;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    1994 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP-94)
  • Conference_Location
    Adelaide, SA
  • ISSN
    1520-6149
  • Print_ISBN
    0-7803-1775-0
  • Type
    conf
  • DOI
    10.1109/ICASSP.1994.389973
  • Filename
    389973