• DocumentCode
    65952
  • Title

    Likelihood-Ratio-Based Verification in High-Dimensional Spaces

  • Author

    Hendrikse, Anne ; Veldhuis, Raymond ; Spreeuwers, Luuk

  • Author_Institution
    Signals & Syst. Group, Univ. of Twente, Overijssel, Netherlands
  • Volume
    36
  • Issue
    1
  • fYear
    2014
  • fDate
    Jan. 2014
  • Firstpage
    127
  • Lastpage
    139
  • Abstract
    Increasing the dimensionality of data sets often causes estimation problems, collectively known as the curse of dimensionality. One such problem in second-order statistics (SOS) estimation is that covariance matrices estimated from high-dimensional data are not of full rank, so their inversion, needed, for example, in verification systems based on the likelihood ratio, is an ill-posed problem known as the singularity problem. A classical solution is to project the data onto a lower-dimensional subspace using principal component analysis (PCA), under the assumption that any further estimation on the dimension-reduced data is free from the effects of high dimensionality. Using theory on SOS estimation in high-dimensional spaces, we show that the PCA solution is far from optimal in verification systems when high dimensionality is the sole source of error. At moderate dimensionality it is already outperformed by solutions based on Euclidean distances, and it breaks down completely when the dimensionality becomes very high. We propose a new method, the fixed-point eigenwise correction, which does not have these disadvantages and performs close to optimal.
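    A minimal sketch of the singularity problem described in the abstract (an illustration, not the paper's method): with fewer samples than dimensions, the sample covariance matrix is rank deficient, so the inverse required by a likelihood-ratio verifier does not exist; the classical PCA workaround projects the data onto the leading principal components first. The sample counts and subspace dimension below are illustrative choices.

    ```python
    import numpy as np

    # Fewer samples than dimensions: the sample covariance is singular.
    rng = np.random.default_rng(0)
    n_samples, n_dims = 50, 200              # n_samples < n_dims
    X = rng.standard_normal((n_samples, n_dims))

    S = np.cov(X, rowvar=False)              # (200, 200) sample covariance
    rank = np.linalg.matrix_rank(S)
    # Mean subtraction costs one degree of freedom, so rank <= n_samples - 1,
    # far below n_dims: S has no inverse for a likelihood-ratio verifier.
    assert rank <= n_samples - 1

    # Classical workaround from the abstract: project onto the leading
    # principal components before any further estimation.
    eigvals, eigvecs = np.linalg.eigh(S)     # eigenvalues in ascending order
    k = 10                                   # illustrative subspace dimension
    P = eigvecs[:, -k:]                      # top-k principal directions
    X_reduced = X @ P                        # (50, 10) dimension-reduced data
    S_reduced = np.cov(X_reduced, rowvar=False)
    # The reduced covariance is full rank, so its inverse exists.
    assert np.linalg.matrix_rank(S_reduced) == k
    ```

    The abstract's point is that this workaround, while restoring invertibility, is far from optimal when high dimensionality is the sole source of error; the proposed fixed-point eigenwise correction addresses that instead.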
  • Keywords
    covariance matrices; geometry; knowledge verification; principal component analysis; PCA; covariance matrices; dimension-reduced data; Euclidean distances; fixed-point eigenwise correction; high-dimensional data; high-dimensional spaces; ill-posed problem; likelihood-ratio-based verification; lower dimensional subspace; principal component analysis; second-order statistics estimation; singularity problem; Covariance matrices; Eigenvalues and eigenfunctions; Estimation; Euclidean distance; Principal component analysis; Training; Training data; High-dimensional verification; Marčenko-Pastur equation; eigenvalue bias correction; eigenwise correction; Euclidean distance; fixed-point eigenvalue correction; principal component analysis; variance correction
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Publisher
    IEEE
  • ISSN
    0162-8828
  • Type

    jour

  • DOI
    10.1109/TPAMI.2013.93
  • Filename
    6517177