• DocumentCode
    384271
  • Title
    PCA in autocorrelation space
  • Author
    Popovici, Vlad; Thiran, Jean-Philippe
  • Author_Institution
    Signal Processing Institute, Swiss Federal Institute of Technology, Lausanne, Switzerland
  • Volume
    2
  • fYear
    2002
  • fDate
    2002
  • Firstpage
    132
  • Abstract
    The use of higher-order autocorrelations as features for pattern classification has usually been restricted to second or third order because of the high computational cost. Since the autocorrelation space is high-dimensional, we are interested in reducing the dimensionality of the feature vectors for the benefit of the pattern classification task. An established technique for this is Principal Component Analysis (PCA), which, however, cannot be applied directly in the autocorrelation space. In this paper we develop a new method for performing PCA in autocorrelation space without explicitly computing the autocorrelations. Connections with nonlinear PCA and possible extensions are also discussed. (An illustrative sketch of the underlying idea is given after this record.)
  • Keywords
    correlation methods; higher order statistics; pattern classification; principal component analysis; vectors; autocorrelation space; classification rate; feature vector dimensionality reduction; high dimensional space; multi-order autocorrelation vectors; Autocorrelation; Computational efficiency; Covariance matrix; High performance computing; Pattern recognition; Signal processing; Space technology; Topology
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 16th International Conference on Pattern Recognition, 2002
  • ISSN
    1051-4651
  • Print_ISBN
    0-7695-1695-X
  • Type
    conf
  • DOI
    10.1109/ICPR.2002.1048255
  • Filename
    1048255
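  • Illustrative_Sketch
    The abstract describes performing PCA in the (implicit) autocorrelation feature space without ever forming the high-dimensional autocorrelation vectors, and points to a connection with nonlinear PCA. Below is a minimal sketch of that general idea, kernel PCA driven entirely by a Gram matrix of pairwise inner products, under stated assumptions: the kernel here is a homogeneous polynomial stand-in chosen only to keep the example runnable, not the paper's autocorrelation inner product, and the helper names (autocorrelation_kernel, kernel_pca) are hypothetical.

    import numpy as np

    def autocorrelation_kernel(x, y, order=2):
        """Stand-in kernel (assumption): a homogeneous polynomial form used in
        place of the paper's inner product between higher-order
        autocorrelation feature vectors."""
        return float(np.dot(x, y)) ** order

    def kernel_pca(X, kernel, n_components=2):
        """PCA in an implicit feature space via the centered Gram matrix
        (the general nonlinear-PCA construction the abstract alludes to)."""
        n = X.shape[0]
        # Gram matrix of pairwise kernel evaluations
        K = np.array([[kernel(xi, xj) for xj in X] for xi in X])
        # Center the data in feature space
        one_n = np.ones((n, n)) / n
        Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
        # Eigendecomposition; keep the leading components
        eigvals, eigvecs = np.linalg.eigh(Kc)
        idx = np.argsort(eigvals)[::-1][:n_components]
        eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]
        # Scale coefficients so the feature-space principal axes have unit norm
        alphas = eigvecs / np.sqrt(np.maximum(eigvals, 1e-12))
        # Projections of the training samples onto the principal components
        return Kc @ alphas

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.standard_normal((20, 16))   # 20 signals of length 16
        Z = kernel_pca(X, lambda a, b: autocorrelation_kernel(a, b, order=3))
        print(Z.shape)                      # (20, 2)

    Only the kernel would need to change to work with inner products of multi-order autocorrelation vectors; the Gram-matrix construction itself is generic.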