DocumentCode :
1583041
Title :
Analysis of Regularized Least Square Algorithms with Beta-Mixing Input Sequences
Author :
Li, Luoqing ; Zou, Bin
Author_Institution :
Hubei Univ., Wuhan
Volume :
1
fYear :
2007
Firstpage :
89
Lastpage :
93
Abstract :
Generalization performance is an important property of learning machines. It has been shown previously by Vapnik, Cucker and Smale, et al. that the empirical risks of learning machines based on i.i.d. sequences converge uniformly to their expected risks as the number of samples approaches infinity. This paper considers regularization schemes associated with the least square loss and reproducing kernel Hilbert spaces. It develops a theoretical analysis of the generalization performance of regularized least squares on reproducing kernel Hilbert spaces for supervised learning with beta-mixing input sequences.
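For reference, the regularization scheme with least square loss over a reproducing kernel Hilbert space that the abstract refers to is commonly written as below; this is the standard Tikhonov formulation with sample $\mathbf{z} = \{(x_i, y_i)\}_{i=1}^{m}$ and regularization parameter $\lambda > 0$, and the paper's exact notation may differ.

% Standard regularized least squares estimator over an RKHS H_K
% (illustrative notation, not necessarily the paper's own symbols)
\[
f_{\mathbf{z},\lambda} \;=\; \operatorname*{arg\,min}_{f \in \mathcal{H}_K}
\; \frac{1}{m}\sum_{i=1}^{m} \bigl(f(x_i) - y_i\bigr)^2 \;+\; \lambda \, \|f\|_{K}^{2}
\]

The paper's contribution is to bound the gap between the empirical and expected risks of this estimator when the inputs form a beta-mixing (rather than i.i.d.) sequence.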
Keywords :
Hilbert spaces; generalisation (artificial intelligence); learning (artificial intelligence); least squares approximations; beta-mixing input sequence; generalization performance; regularized least square algorithm; reproducing kernel Hilbert space; supervised learning machine; Algorithm design and analysis; Computer science; Hilbert space; Kernel; Least squares methods; Machine learning; Mathematics; Performance analysis; Probability distribution; Resonance light scattering;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Natural Computation, 2007. ICNC 2007. Third International Conference on
Conference_Location :
Haikou
Print_ISBN :
978-0-7695-2875-5
Type :
conf
DOI :
10.1109/ICNC.2007.237
Filename :
4344160