• DocumentCode
    91952
  • Title
    Nonlinear Projection Trick in Kernel Methods: An Alternative to the Kernel Trick
  • Author
    Nojun Kwak
  • Author_Institution
    Grad. Sch. of Convergence Sci. & Technol., Seoul Nat. Univ., Seoul, South Korea
  • Volume
    24
  • Issue
    12
  • fYear
    2013
  • fDate
    Dec. 2013
  • Firstpage
    2113
  • Lastpage
    2119
  • Abstract
    In kernel methods such as kernel principal component analysis (PCA) and support vector machines, the so-called kernel trick is used to avoid direct calculations in a high (virtually infinite) dimensional kernel space. In this brief, based on the fact that the effective dimensionality of a kernel space is less than the number of training samples, we propose an alternative to the kernel trick that explicitly maps the input data into a reduced-dimensional kernel space. This mapping is easily obtained by the eigenvalue decomposition of the kernel matrix. The proposed method is named the nonlinear projection trick, in contrast to the kernel trick. With this technique, the applicability of kernel methods is widened to arbitrary algorithms that do not use the dot product. The equivalence between the kernel trick and the nonlinear projection trick is shown for several conventional kernel methods. In addition, we extend PCA-L1, which uses the L1-norm instead of the L2-norm (or dot product), to a kernel version and show the effectiveness of the proposed approach.
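    The construction described above can be sketched in a few lines of Python: eigendecompose the (centered) kernel matrix and use the scaled eigenvectors as explicit coordinates, so that dot products of the mapped points reproduce the kernel. This is an illustrative sketch based only on the abstract; the function name nonlinear_projection, the tolerance tol, and the RBF kernel in the usage example are assumptions, not the paper's reference implementation.

    import numpy as np

    def nonlinear_projection(K, tol=1e-10):
        # Map training samples into a reduced-dimensional kernel space via the
        # eigenvalue decomposition of the (centered) kernel matrix K (n x n).
        n = K.shape[0]
        H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
        Kc = H @ K @ H                        # centered kernel matrix
        eigvals, eigvecs = np.linalg.eigh(Kc)
        keep = eigvals > tol                  # effective dimensionality < n
        lam, U = eigvals[keep], eigvecs[:, keep]
        # Row i of Y is the explicit image of sample i; Y @ Y.T reproduces Kc,
        # so any algorithm (dot-product based or not) can run directly on Y.
        Y = U * np.sqrt(lam)
        return Y, U, lam

    # Usage with an RBF kernel (assumed here purely for illustration).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 3))
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / 2.0)
    Y, U, lam = nonlinear_projection(K)
    H = np.eye(50) - np.ones((50, 50)) / 50
    assert np.allclose(Y @ Y.T, H @ K @ H, atol=1e-6)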
  • Keywords
    eigenvalues and eigenfunctions; matrix algebra; principal component analysis; support vector machines; PCA-L1; eigenvalue decomposition; kernel matrix; kernel principal component analysis; kernel trick; nonlinear projection trick; reduced dimensional kernel space; support vector machine; Eigenvalues and eigenfunctions; Kernel; Principal component analysis; Support vector machines; Training; Training data; Vectors; Dimensionality reduction; KPCA-L1; kernel PCA (KPCA); kernel methods; nonlinear projection trick; support vector machines;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Neural Networks and Learning Systems
  • Publisher
    IEEE
  • ISSN
    2162-237X
  • Type
    jour
  • DOI
    10.1109/TNNLS.2013.2272292
  • Filename
    6584012