• DocumentCode
    396741
  • Title
    Model selection for k-nearest neighbors regression using VC bounds
  • Author
    Cherkassky, Vladimir ; Ma, Yunqian ; Tang, Jun
  • Author_Institution
    Dept. of Electr. & Comput. Eng., Minnesota Univ., USA
  • Volume
    2
  • fYear
    2003
  • fDate
    20-24 July 2003
  • Firstpage
    1143
  • Abstract
    We discuss analytic model selection for the k-nearest neighbors (k-nn) regression method using VC generalization bounds. Whereas existing implementations of k-nn regression estimate the model complexity as n/k, where n is the number of samples, we propose a new model complexity estimate. Using the proposed complexity index as the VC-dimension in VC bounds yields a new analytic method for model selection. Empirical results for low-dimensional and high-dimensional data sets indicate that the proposed approach provides accurate model selection that is consistently better than with the previously used complexity measure. In fact, the prediction accuracy of the proposed analytic method is similar to that of the resampling (cross-validation) approach for optimal selection of k.
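    The general scheme the abstract describes can be sketched as follows. This is an illustrative sketch, not the paper's method: it uses the *classical* complexity estimate h = n/k (which the abstract says the authors improve upon), and the penalization factor assumes the form of the practical VC bound popularized by Cherkassky and Mulier, R(h) ≤ R_emp(h) / (1 − √(p − p ln p + ln n / (2n)))₊ with p = h/n. The helper names (`knn_fit`, `vc_penalized_risk`, `select_k`) are hypothetical.

    ```python
    import numpy as np

    def knn_fit(X, y, k):
        # Predictions at the training points from the k nearest
        # neighbors (including the point itself); 1-D inputs for simplicity.
        d = np.abs(X[:, None] - X[None, :])
        idx = np.argsort(d, axis=1)[:, :k]
        return y[idx].mean(axis=1)

    def vc_penalized_risk(emp_risk, h, n):
        # Assumed practical VC bound (Cherkassky-Mulier form):
        #   R <= R_emp / (1 - sqrt(p - p*ln(p) + ln(n)/(2n)))_+ ,  p = h/n
        p = h / n
        denom = 1.0 - np.sqrt(max(p - p * np.log(p), 0.0) + np.log(n) / (2 * n))
        return np.inf if denom <= 0 else emp_risk / denom

    def select_k(X, y, ks):
        # Pick k minimizing the VC-penalized empirical (squared-loss) risk,
        # with the classical complexity estimate h = n/k from the abstract.
        n = len(y)
        best_k, best_risk = None, np.inf
        for k in ks:
            emp = np.mean((y - knn_fit(X, y, k)) ** 2)
            risk = vc_penalized_risk(emp, n / k, n)
            if risk < best_risk:
                best_k, best_risk = k, risk
        return best_k

    # Toy 1-D regression problem (synthetic data, for illustration only).
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, 60)
    y = np.sin(2 * np.pi * X) + 0.2 * rng.standard_normal(60)
    k_star = select_k(X, y, range(1, 21))
    ```

    Note that k = 1 gives zero empirical risk but complexity h = n, so the penalization factor blows up and small k is rejected analytically, without any resampling.
    
    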
  • Keywords
    learning (artificial intelligence); regression analysis; VC generalization bounds; analytic method; complexity index; cross-validation approach; k-nearest neighbors regression; k-nn regression; model selection approach; optimal selection; prediction accuracy; resampling approach; Accuracy; Learning systems; Loss measurement; Multidimensional systems; Nearest neighbor searches; Parameter estimation; Predictive models; Risk analysis; Training data; Virtual colonoscopy;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the International Joint Conference on Neural Networks, 2003
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7898-9
  • Type
    conf
  • DOI
    10.1109/IJCNN.2003.1223852
  • Filename
    1223852