DocumentCode
3601248
Title
Sparse Representation in Kernel Machines
Author
Hongwei Sun; Qiang Wu
Author_Institution
Sch. of Math. Sci., Univ. of Jinan, Jinan, China
Volume
26
Issue
10
fYear
2015
Firstpage
2576
Lastpage
2582
Abstract
We study the properties of least squares kernel regression with ℓ1 coefficient regularization. The kernel can be flexibly chosen to be either positive definite or indefinite. Asymptotic learning rates are derived under a smoothness condition on the kernel. The sparse representation of the solution is characterized theoretically. Empirical simulations and real-data applications indicate that both good learning performance and a sparse representation can be achieved.
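The abstract's model is least squares regression over a kernel expansion f(x) = Σ_j c_j k(x_j, x) with an ℓ1 penalty on the coefficient vector c, which does not require the kernel to be positive definite. As a purely illustrative sketch (not the paper's algorithm or experiments), the problem min_c (1/n)‖y − Kc‖² + λ‖c‖₁ can be solved by plain iterative soft-thresholding; the function names, step-size choice, and toy data below are assumptions for demonstration only.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: prox operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_kernel_regression(K, y, lam, n_iter=5000):
    """ISTA sketch for min_c (1/n)||y - K c||^2 + lam * ||c||_1.

    K : (n, n) Gram matrix k(x_i, x_j); may be indefinite, since it is
        used only as a design matrix for the coefficients c.
    Returns a (typically sparse) coefficient vector c.
    """
    n = K.shape[0]
    # Lipschitz constant of the gradient of the smooth term: (2/n) * ||K||_2^2
    L = 2.0 * np.linalg.norm(K, 2) ** 2 / n
    step = 1.0 / L
    c = np.zeros(n)
    for _ in range(n_iter):
        grad = (2.0 / n) * K.T @ (K @ c - y)   # gradient of the squared loss
        c = soft_threshold(c - step * grad, step * lam)
    return c

# Toy usage: noisy sine data with a polynomial kernel (illustrative only)
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=60)
y = np.sin(np.pi * x) + 0.1 * rng.standard_normal(60)
K = (1.0 + np.outer(x, x)) ** 3
c = l1_kernel_regression(K, y, lam=0.05)
print("nonzero coefficients:", np.count_nonzero(c), "of", len(c))
```

The ℓ1 penalty drives many coefficients exactly to zero, so the fitted function f(x) = Σ_j c_j k(x_j, x) uses only a small subset of the training points, which is the sparse representation the abstract refers to.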
Keywords
learning (artificial intelligence); least squares approximations; regression analysis; ℓ1 coefficient regularization; asymptotic learning rates; kernel machines; least squares kernel regression; smoothness condition; sparse representation; Approximation methods; Compressed sensing; Kernel; Learning systems; Mathematical model; Polynomials; Probability distribution; ℓ1 regularization; indefinite kernel; kernel machine; learning theory; regression; sparsity
fLanguage
English
Journal_Title
IEEE Transactions on Neural Networks and Learning Systems
Publisher
IEEE
ISSN
2162-237X
Type
jour
DOI
10.1109/TNNLS.2014.2375209
Filename
7024168