Title of article :
Regularized least-squares regression: Learning from a sequence
Author/Authors :
Farahmand, Amir-massoud and Szepesvári, Csaba
Issue Information :
Periodical with serial issue number, year 2012
Pages :
13
From page :
493
To page :
505
Abstract :
We analyze the rate of convergence of the estimation error in regularized least-squares regression when the data is exponentially β-mixing. The results are proven under the assumption that the metric entropy of the balls in the chosen function space grows at most polynomially. In order to prove our main result, we also derive a relative deviation concentration inequality for β-mixing processes, which might be of independent interest. The other major techniques that we use are the independent-blocks technique and the peeling device. An interesting aspect of our analysis is that in order to obtain fast rates we have to make the block sizes dependent on the layer of peeling. With this approach, up to a logarithmic factor, we recover the optimal minimax rates available for the i.i.d. case. In particular, our rate asymptotically matches the optimal rate of convergence when the regression function belongs to a Sobolev space.
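The abstract does not fix a concrete function space, so the following is a minimal illustrative sketch only: it computes a regularized least-squares (kernel ridge) estimator over a Gaussian-kernel RKHS on data generated by an AR(1) chain, a standard example of an exponentially β-mixing process. The names gaussian_gram, kernel_ridge_fit, and kernel_ridge_predict, the kernel choice, and all parameter values are assumptions for illustration, not the paper's construction or analysis.

import numpy as np

def gaussian_gram(A, B, bandwidth):
    # Gram matrix of a Gaussian kernel between the rows of A and B.
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * bandwidth**2))

def kernel_ridge_fit(X, y, lam, bandwidth=0.5):
    # Regularized least-squares over an RKHS:
    #   min_f (1/n) * sum_i (f(X_i) - y_i)^2 + lam * ||f||_K^2,
    # whose minimizer has the closed form alpha = (K + n*lam*I)^{-1} y.
    n = len(y)
    K = gaussian_gram(X, X, bandwidth)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def kernel_ridge_predict(X_train, alpha, X_new, bandwidth=0.5):
    return gaussian_gram(X_new, X_train, bandwidth) @ alpha

# Dependent covariates from a stationary AR(1) chain (exponentially
# beta-mixing), echoing the setting of learning from a single trajectory.
rng = np.random.default_rng(0)
n = 400
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + rng.normal(scale=0.5)
X = x[:, None]
y = np.sin(3 * x) + rng.normal(scale=0.1, size=n)

alpha = kernel_ridge_fit(X, y, lam=1e-3)
y_hat = kernel_ridge_predict(X, alpha, X)
print("training MSE:", np.mean((y_hat - y) ** 2))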
Keywords :
Dependent stochastic processes, Convergence rate, Regularized least-squares regression
Journal title :
Journal of Statistical Planning and Inference
Serial Year :
2012
Record number :
2221757