Title of article :
Asymptotic normality of support vector machine variants and other regularized kernel methods
Author/Authors :
Hable, Robert
Issue Information :
Biannual publication, serial year 2012
Pages :
26
From page :
92
To page :
117
Abstract :
In nonparametric classification and regression problems, regularized kernel methods, in particular support vector machines, attract much attention in theoretical and applied statistics. In an abstract sense, regularized kernel methods (simply called SVMs here) can be seen as regularized M-estimators for a parameter in a (typically infinite-dimensional) reproducing kernel Hilbert space. For smooth loss functions $L$, it is shown that the difference between the estimator, i.e. the empirical SVM $f_{L,D_n,\lambda_{D_n}}$, and the theoretical SVM $f_{L,P,\lambda_0}$ is asymptotically normal with rate $\sqrt{n}$. That is, $\sqrt{n}\,\bigl(f_{L,D_n,\lambda_{D_n}} - f_{L,P,\lambda_0}\bigr)$ converges weakly to a Gaussian process in the reproducing kernel Hilbert space. As is common in real applications, the choice of the regularization parameter $\lambda_{D_n}$ in $f_{L,D_n,\lambda_{D_n}}$ may depend on the data. The proof is done by an application of the functional delta-method and by showing that the SVM-functional $P \mapsto f_{L,P,\lambda}$ is suitably Hadamard-differentiable.
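The abstract's central object, the empirical SVM $f_{L,D_n,\lambda}$, can be made concrete by instantiating the regularized M-estimation problem with the smooth least-squares loss, in which case it reduces to kernel ridge regression. The following is a minimal sketch of that special case; the Gaussian kernel, bandwidth, and regularization parameter are illustrative choices, not taken from the paper:

```python
# Hedged sketch: a regularized kernel M-estimator ("empirical SVM" in the
# paper's terminology) with the smooth least-squares loss, i.e. kernel
# ridge regression. All numerical choices below are illustrative.
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    # Gram matrix of the Gaussian RBF kernel k(x, y) = exp(-gamma * |x - y|^2)
    d2 = (X[:, None, :] - Y[None, :, :]) ** 2
    return np.exp(-gamma * d2.sum(axis=2))

def empirical_svm(X, y, lam, gamma=1.0):
    # By the representer theorem, the minimizer of
    #   (1/n) * sum_i (y_i - f(x_i))^2 + lam * ||f||_H^2
    # lies in span{k(x_i, .)}; the normal equations give the coefficients
    # alpha = (K + n * lam * I)^{-1} y.
    n = len(y)
    K = gaussian_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return lambda Xnew: gaussian_kernel(Xnew, X, gamma) @ alpha

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

f_hat = empirical_svm(X, y, lam=1e-3)   # the empirical SVM for this sample
grid = np.linspace(-1, 1, 5)[:, None]
print(f_hat(grid))                      # fitted values on a small grid
```

As the sample size $n$ grows (with a suitable sequence of regularization parameters), the fitted function approaches its population counterpart, which is the convergence the paper quantifies via the $\sqrt{n}$-rate.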
Keywords :
Nonparametric regression , Support vector machine , Regularized kernel method , Asymptotic normality , Hadamard-differentiability , Functional delta-method
Journal title :
Journal of Multivariate Analysis
Serial Year :
2012
Record number :
1565714