Title :
Analysis of Regularized Least Square Algorithms with Beta-Mixing Input Sequences
Author :
Li, Luoqing ; Zou, Bin
Author_Institution :
Hubei Univ., Wuhan
Abstract :
Generalization performance is a key property of learning machines. It has been shown previously by Vapnik, Cucker, Smale, et al. that the empirical risks of learning machines based on i.i.d. sequences converge uniformly to their expected risks as the number of samples approaches infinity. This paper considers regularization schemes associated with the least square loss and reproducing kernel Hilbert spaces, and develops a theoretical analysis of the generalization performance of regularized least squares in reproducing kernel Hilbert spaces for supervised learning with beta-mixing input sequences.
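The regularization scheme the abstract refers to is the standard regularized least-squares (kernel ridge) problem in an RKHS. A minimal sketch, assuming a Gaussian kernel and the usual representer-theorem closed form (the function names and parameter choices here are illustrative, not from the paper):

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    """Gaussian (RBF) kernel matrix between row-sample arrays X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def fit_rls(X, y, lam=0.1, sigma=1.0):
    """Regularized least squares in an RKHS:
        minimize (1/n) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.
    By the representer theorem the minimizer is
        f(x) = sum_i alpha_i K(x, x_i),  alpha = (K + n*lam*I)^{-1} y.
    """
    n = len(y)
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return alpha

def predict(alpha, X_train, X_new, sigma=1.0):
    """Evaluate the fitted RKHS function at new points."""
    return gaussian_kernel(X_new, X_train, sigma) @ alpha
```

The paper's contribution concerns the statistical analysis of this estimator when the inputs form a beta-mixing (weakly dependent) sequence rather than an i.i.d. sample; the optimization itself is unchanged.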
Keywords :
Hilbert spaces; generalisation (artificial intelligence); learning (artificial intelligence); least squares approximations; beta-mixing input sequence; generalization performance; regularized least square algorithm; reproducing kernel Hilbert space; supervised learning machine; Algorithm design and analysis; Computer science; Hilbert space; Kernel; Least squares methods; Machine learning; Mathematics; Performance analysis; Probability distribution; Resonance light scattering;
Conference_Titel :
Third International Conference on Natural Computation (ICNC 2007)
Conference_Location :
Haikou
Print_ISBN :
978-0-7695-2875-5
DOI :
10.1109/ICNC.2007.237