Title :
Robust nonparametric regression by controlling sparsity
Author :
Mateos, Gonzalo; Giannakis, Georgios B.
Author_Institution :
Dept. of ECE, Univ. of Minnesota, 200 Union Street SE, Minneapolis, MN 55455, USA
Abstract :
Nonparametric methods are widely applicable to statistical learning problems, since they rely on few modeling assumptions. In this context, the fresh look advocated here brings together benefits from variable selection and compressive sampling to robustify nonparametric regression against outliers. A variational counterpart to least-trimmed squares regression is shown to be closely related to an ℓ0-(pseudo)norm-regularized estimator that encourages sparsity in a vector explicitly modeling the outliers. This connection suggests efficient (approximate) solvers based on convex relaxation, which lead naturally to a variational M-type estimator equivalent to Lasso. Outliers are identified by judiciously tuning the regularization parameters, which amounts to controlling the sparsity of the outlier vector along the whole robustification path of Lasso solutions. An improved estimator with reduced bias is obtained after replacing the ℓ0-(pseudo)norm with a nonconvex surrogate, as corroborated via simulated tests on robust thin-plate smoothing splines.
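Illustration (not part of the original record): a minimal Python sketch of the sparse-outlier modeling idea summarized above, specialized to plain linear regression rather than the thin-plate splines studied in the paper. The function names, the fixed tuning value lam, and the alternating least-squares/soft-thresholding solver are illustrative assumptions, not the authors' implementation or their robustification-path tuning procedure.

import numpy as np

def soft_threshold(r, tau):
    # Elementwise soft-thresholding operator (prox of the ell-1 norm).
    return np.sign(r) * np.maximum(np.abs(r) - tau, 0.0)

def robust_regression_sparse_outliers(X, y, lam, n_iter=100):
    # Block-coordinate descent for the convex problem
    #   min_{beta, o}  ||y - X beta - o||_2^2 + lam * ||o||_1,
    # where the sparse vector o explicitly models the outliers.
    # Nonzero entries of o flag the corresponding samples as outliers.
    n, p = X.shape
    o = np.zeros(n)
    beta = np.zeros(p)
    for _ in range(n_iter):
        # Regression step: least squares on the outlier-corrected data.
        beta, *_ = np.linalg.lstsq(X, y - o, rcond=None)
        # Outlier step: soft-threshold the residuals (Lasso subproblem in o).
        o = soft_threshold(y - X @ beta, lam / 2.0)
    return beta, o

# Toy usage: a linear fit contaminated by a few gross outliers.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.uniform(-1, 1, 50)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.standard_normal(50)
y[:5] += 8.0                      # inject outliers in the first 5 samples
beta, o = robust_regression_sparse_outliers(X, y, lam=1.0)
print(beta)                       # close to [1, 2] despite the outliers
print(np.nonzero(o)[0])           # indices flagged as outliers

Larger values of lam declare fewer samples as outliers; sweeping lam traces out the robustification path of Lasso solutions mentioned in the abstract.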
Keywords :
Approximation methods; Context; Estimation; Linear regression; Robustness; Training; Tuning; Lasso; nonparametric regression; outlier rejection; sparsity
Conference_Title :
2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Prague, Czech Republic
Print_ISBN :
978-1-4577-0538-0
ISSN :
1520-6149
DOI :
10.1109/ICASSP.2011.5947199