DocumentCode :
2151861
Title :
Sparsity-regularized support vector machine with stationary mixing input sequence
Author :
Ding, Yi ; Tang, Yi
Author_Institution :
Wuhan Vocational Coll. of Software & Eng., Wuhan, China
fYear :
2010
fDate :
11-14 July 2010
Firstpage :
195
Lastpage :
200
Abstract :
It has been shown that a sparse target can be well learned by l1-regularized learning methods when samples are independent and identically distributed (i.i.d.). In this paper we go beyond this classical framework by bounding the generalization error and excess risk of the l1-regularized support vector machine (l1-SVM) for stationary β-mixing observations. Utilizing a previously introduced technique that constructs a sequence of independent blocks close in distribution to the original samples, such bounds are developed via the Rademacher average technique. The results partly answer an open question of whether the Rademacher average technique can be extended to handle dependent observations.
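The l1-SVM analyzed in the paper minimizes the empirical hinge loss plus an l1 penalty, which encourages sparse weight vectors. Below is a minimal, self-contained sketch of that objective trained by plain subgradient descent; all parameter values, the toy data, and the solver choice are illustrative assumptions, not details from the paper.

```python
import numpy as np

def l1_svm_subgradient(X, y, lam=0.1, lr=0.01, epochs=200):
    """Minimize (1/n) * sum_i hinge(y_i * <w, x_i>) + lam * ||w||_1
    by subgradient descent (illustrative solver, not the paper's)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        margins = y * (X @ w)
        active = margins < 1  # samples violating the margin contribute to the hinge subgradient
        grad = -(y[active, None] * X[active]).sum(axis=0) / n
        grad += lam * np.sign(w)  # subgradient of the l1 penalty
        w -= lr * grad
    return w

# Toy i.i.d. data with a sparse true weight vector (only 2 informative features)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
w_true = np.zeros(10)
w_true[:2] = [2.0, -1.5]
y = np.sign(X @ w_true)

w_hat = l1_svm_subgradient(X, y)
```

On such data the l1 penalty keeps the weights on the eight uninformative features small relative to the two informative ones, which is the sparsity-recovery behavior whose generalization bounds the paper extends from i.i.d. to β-mixing inputs.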
Keywords :
generalisation (artificial intelligence); support vector machines; excess risks; generalization errors; l1-regularized learning methods; sparse target; sparsity-regularized support vector machine; stationary mixing input sequence; Rademacher average technique; Robustness; Excess risk; Rademacher average; Stationary β-mixing sequence; l1-regularized support vector machine;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Wavelet Analysis and Pattern Recognition (ICWAPR), 2010 International Conference on
Conference_Location :
Qingdao
Print_ISBN :
978-1-4244-6530-9
Type :
conf
DOI :
10.1109/ICWAPR.2010.5576330
Filename :
5576330