DocumentCode :
1373592
Title :
On Complexity Issues of Online Learning Algorithms
Author :
Yao, Yuan
Author_Institution :
Key Lab. of Machine Perception (MOE), Peking Univ., Beijing, China
Volume :
56
Issue :
12
fYear :
2010
Firstpage :
6470
Lastpage :
6481
Abstract :
In this paper, some new probabilistic upper bounds are presented for the online learning algorithm proposed in , and more generally for linear stochastic approximations in Hilbert spaces. These upper bounds not only recover almost sure convergence but also relax the square-summability condition on the step size that appeared in earlier work. Furthermore, two probabilistic upper bounds are given for an averaging process, both of which achieve the same rate with respect to sample size as "batch learning" algorithms, and one of which is tight in both the sample size and the regularization parameter.
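Example (not from the paper): a minimal sketch of the kind of regularized online learning rule in a reproducing kernel Hilbert space that the abstract refers to, written as a kernelized stochastic-gradient update. All concrete choices here (Gaussian kernel, bandwidth, the target function, λ = 0.01, and the step size γ_t = t^{-1/2}) are illustrative assumptions; the exponent 1/2 is chosen so that Σγ_t² diverges, i.e., the step sizes are not square summable, which is the regime the relaxed bounds are said to cover.

```python
import numpy as np

# Illustrative parameters (assumptions, not taken from the paper).
T = 2000          # number of online samples
lam = 0.01        # regularization parameter lambda
theta = 0.5       # gamma_t = t^{-theta}; theta = 1/2 means
                  # sum_t gamma_t^2 = sum_t 1/t diverges (not square summable)
sigma = 0.5       # Gaussian kernel bandwidth

rng = np.random.default_rng(0)
xs = np.empty(T)  # observed inputs (kernel expansion centers)
cs = np.empty(T)  # expansion coefficients of the current iterate f_t

for t in range(1, T + 1):
    # One noisy sample from a hypothetical regression target.
    x = rng.uniform(-1.0, 1.0)
    y = np.sin(np.pi * x) + 0.1 * rng.standard_normal()

    # Evaluate the current predictor f_t(x) = sum_i c_i K(x_i, x).
    if t > 1:
        k = np.exp(-((xs[:t - 1] - x) ** 2) / (2.0 * sigma ** 2))
        fx = float(cs[:t - 1] @ k)
    else:
        fx = 0.0

    # Stochastic-approximation step:
    #   f_{t+1} = f_t - gamma_t * ((f_t(x) - y) K_x + lam * f_t)
    gamma = t ** (-theta)
    cs[:t - 1] *= (1.0 - gamma * lam)   # shrink old coefficients (regularization)
    xs[t - 1] = x
    cs[t - 1] = gamma * (y - fx)        # add the new kernel section

# Evaluate the final iterate on a grid against the noise-free target.
grid = np.linspace(-1.0, 1.0, 50)
preds = np.array([
    float(cs @ np.exp(-((xs - g) ** 2) / (2.0 * sigma ** 2))) for g in grid
])
mse = float(np.mean((preds - np.sin(np.pi * grid)) ** 2))
```

The averaging process mentioned in the abstract would additionally maintain the running mean of the iterates f_1, …, f_T (Polyak–Ruppert style) rather than returning only the last one; it is omitted here to keep the sketch short.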
Keywords :
Hilbert spaces; computational complexity; learning (artificial intelligence); stochastic processes; Hilbert spaces; batch learning algorithms; complexity issues; linear stochastic approximations; online learning algorithms; probabilistic upper bounds; regularization parameter; square summable condition; Approximation methods; Complexity theory; Convergence; Hilbert space; Probabilistic logic; Upper bound; Averaging process; online learning; regularization; reproducing kernel Hilbert space; stochastic approximation;
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2010.2079010
Filename :
5625646