• DocumentCode
    1944262
  • Title
    Optimizing SVR Hyperparameters via Fast Cross-Validation using AOSVR
  • Author
    Karasuyama, Masayuki ; Nakano, Ryohei

  • Author_Institution
    Nagoya Inst. of Technol., Nagoya
  • fYear
    2007
  • fDate
    12-17 Aug. 2007
  • Firstpage
    1186
  • Lastpage
    1191
  • Abstract
    The performance of support vector regression (SVR) depends heavily on its hyperparameters, such as the insensitive zone thickness, the penalty factor, and kernel parameters. A method called MCV-SVR was previously proposed, which optimizes SVR hyperparameters so that cross-validation (CV) error is minimized. However, the computational cost of CV is usually high. In this paper we apply accurate online support vector regression (AOSVR) to the MCV-SVR cross-validation procedure. AOSVR enables an efficient update of a trained SVR function when a sample is removed from the training data. We show that AOSVR dramatically accelerates MCV-SVR. Moreover, our experiments using real-world data show that our faster MCV-SVR achieves better generalization than other existing methods such as Bayesian SVR or practical hyperparameter-setting heuristics.
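    The core idea the abstract describes — choosing SVR hyperparameters by minimizing cross-validation error — can be sketched in a few lines. The sketch below is not the paper's AOSVR decremental update (which avoids retraining from scratch when a sample is removed); as an assumption, it substitutes a simple kernel-smoothing regressor and naive leave-one-out retraining, so it only illustrates the CV-error-minimization loop that AOSVR accelerates. All names (`nw_predict`, `loo_cv_error`) are illustrative, not from the paper.

    ```python
    import math

    def nw_predict(x, X, y, gamma):
        # Nadaraya-Watson kernel smoother as a stand-in for a trained SVR function
        w = [math.exp(-gamma * (x - xi) ** 2) for xi in X]
        s = sum(w)
        return sum(wi * yi for wi, yi in zip(w, y)) / s

    def loo_cv_error(X, y, gamma, eps):
        # Leave-one-out CV: hold out each sample and "retrain" on the rest.
        # AOSVR's contribution is doing this removal as a cheap incremental
        # update instead of refitting from scratch each time.
        err = 0.0
        for i in range(len(X)):
            Xi, yi = X[:i] + X[i + 1:], y[:i] + y[i + 1:]
            pred = nw_predict(X[i], Xi, yi, gamma)
            err += max(0.0, abs(pred - y[i]) - eps)  # epsilon-insensitive loss
        return err / len(X)

    # MCV idea: search hyperparameters (kernel width gamma, insensitivity eps)
    # for the pair minimizing the cross-validation error.
    X = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
    y = [math.sin(x) for x in X]
    best = min(
        ((g, e) for g in (0.1, 1.0, 10.0) for e in (0.01, 0.1)),
        key=lambda ge: loo_cv_error(X, y, ge[0], ge[1]),
    )
    print(best)
    ```

    MCV-SVR performs this minimization with gradient-based search rather than a grid, and the paper's speedup comes from replacing the inner retraining step with AOSVR's exact incremental/decremental updates.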
  • Keywords
    optimisation; support vector machines; SVR hyperparameter; accurate online support vector regression; fast cross-validation; optimization; Acceleration; Bayesian methods; Computational efficiency; Kernel; Neural networks; Optimization methods; Pattern classification; Support vector machine classification; Support vector machines; Training data;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Neural Networks, 2007. IJCNN 2007. International Joint Conference on
  • Conference_Location
    Orlando, FL
  • ISSN
    1098-7576
  • Print_ISBN
    978-1-4244-1379-9
  • Electronic_ISBN
    1098-7576
  • Type
    conf
  • DOI
    10.1109/IJCNN.2007.4371126
  • Filename
    4371126