• DocumentCode
    3593253
  • Title
    Improved Proximal Support Vector Machine via Generalized Eigenvalues
  • Author
    Ye, Qiaolin; Ye, Ning
  • Author_Institution
    Sch. of Inf. Technol., Nanjing Forestry Univ., Nanjing, China
  • Volume
    1
  • fYear
    2009
  • Firstpage
    705
  • Lastpage
    709
  • Abstract
    GEPSVM [1, 2, 3] does not need to solve a quadratic programming problem, as SVM does, yet it achieves test set correctness comparable to that of SVM. Despite these successes, GEPSVM may perform poorly when the generalized eigen-equation is ill-conditioned; moreover, it is sensitive to data noise. To address these problems, we propose two algorithms in this paper: IGEPSVM and IDGEPSVM. Computational results on public datasets from the UCI repository [4] indicate that the proposed IGEPSVM overcomes the singularity problem arising in GEPSVM, while IDGEPSVM, in the presence of data noise, obtains better test set correctness than GEPSVM with comparable training time. Both algorithms obtain the two nonparallel planes by solving simple (standard) eigenvalue problems instead of generalized eigenvalue problems.
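    The abstract's core idea can be illustrated with a minimal sketch of the GEPSVM-style construction it builds on: each class's proximal plane is the eigenvector of a generalized eigenvalue problem, with a small Tikhonov term guarding against the singularity the abstract mentions. This is an illustrative NumPy/SciPy sketch of the general technique, not the paper's exact IGEPSVM/IDGEPSVM formulation; the function name, data, and `delta` parameter are assumptions for the example.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    rng = np.random.default_rng(0)
    A = rng.normal(0.0, 1.0, (50, 2)) + np.array([2.0, 2.0])   # class +1 points
    B = rng.normal(0.0, 1.0, (40, 2)) - np.array([2.0, 2.0])   # class -1 points

    def proximal_plane(close_pts, far_pts, delta=1e-4):
        """GEPSVM-style plane w.x + b = 0: close to `close_pts`, far from `far_pts`.

        Illustrative sketch only; `delta` is an assumed Tikhonov regularizer
        that keeps the matrices nonsingular (the ill-conditioning issue the
        abstract refers to).
        """
        # Augment with a ones column so z = [w; b] encodes the plane.
        Ec = np.hstack([close_pts, np.ones((close_pts.shape[0], 1))])
        Ef = np.hstack([far_pts, np.ones((far_pts.shape[0], 1))])
        G = Ec.T @ Ec + delta * np.eye(Ec.shape[1])
        H = Ef.T @ Ef + delta * np.eye(Ef.shape[1])
        # Generalized eigenproblem G z = lambda H z; smallest eigenvalue's
        # eigenvector minimizes ||Ec z||^2 / ||Ef z||^2.
        vals, vecs = eigh(G, H)
        z = vecs[:, 0]
        return z[:-1], z[-1]  # w, b

    w1, b1 = proximal_plane(A, B)  # plane near class A
    w2, b2 = proximal_plane(B, A)  # plane near class B
    ```

    A point is then assigned to whichever plane it is closer to. The proposed algorithms in the paper replace the generalized eigenproblem `eigh(G, H)` above with a standard eigenvalue problem.
    
    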
  • Keywords
    eigenvalues and eigenfunctions; learning (artificial intelligence); matrix algebra; pattern classification; support vector machines; GEPSVM classifier; data classification problem; data noise; generalized eigenvalue problem; matrix algebra; proximal support vector machine; training time; Eigenvalues and eigenfunctions; Electronic mail; Forestry; Information technology; Least squares methods; Nonlinear equations; Quadratic programming; Support vector machine classification; Support vector machines; Testing;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Computational Sciences and Optimization, 2009. CSO 2009. International Joint Conference on
  • Print_ISBN
    978-0-7695-3605-7
  • Type
    conf
  • DOI
    10.1109/CSO.2009.295
  • Filename
    5193791