DocumentCode :
3282814
Title :
Stochastic boosting for large-scale image classification
Author :
Junbiao Pang ; Qingming Huang ; Baocai Yin ; Lei Qin ; Dan Wang
Author_Institution :
Coll. of Comput. Sci. & Technol., Beijing Univ. of Technol., Beijing, China
fYear :
2013
fDate :
15-18 Sept. 2013
Firstpage :
3274
Lastpage :
3277
Abstract :
Boosting has been extensively used in image processing. Much work focuses on the design or use of boosting, but training boosting on large-scale datasets tends to be overlooked. To handle the large-scale problem, we present stochastic boosting (StocBoost), which relies on stochastic gradient descent (SGD) and uses one sample at each iteration. To understand the efficacy of StocBoost, the convergence of the training algorithm is theoretically analyzed. Experimental results show that StocBoost is faster than batch methods and is also comparable with the state of the art.
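The abstract's core idea, boosting trained by SGD with a single sample per iteration, can be illustrated with a minimal sketch. This is not the authors' algorithm; it is an assumed toy variant in which the weights of a fixed pool of decision stumps are updated by single-sample SGD on the logistic loss. All names (`stochastic_boost`, `make_stumps`) and parameter choices are hypothetical.

```python
import numpy as np

def make_stumps(X, n_thresholds=5):
    """Build a fixed pool of decision stumps h(x) = sign(x[j] - t)."""
    stumps = []
    for j in range(X.shape[1]):
        for t in np.quantile(X[:, j], np.linspace(0.1, 0.9, n_thresholds)):
            stumps.append((j, t))
    return stumps

def stump_outputs(X, stumps):
    """Evaluate every stump on every sample -> (n_samples, n_stumps) in {-1, +1}."""
    H = np.empty((X.shape[0], len(stumps)))
    for k, (j, t) in enumerate(stumps):
        H[:, k] = np.where(X[:, j] > t, 1.0, -1.0)
    return H

def stochastic_boost(X, y, n_iters=5000, lr=0.05, seed=0):
    """SGD over weak-learner weights: one random sample per iteration,
    minimizing the logistic loss log(1 + exp(-y * F(x))),
    where F(x) = sum_k alpha_k * h_k(x)."""
    rng = np.random.default_rng(seed)
    stumps = make_stumps(X)
    H = stump_outputs(X, stumps)
    alpha = np.zeros(len(stumps))              # weak-learner weights
    for _ in range(n_iters):
        i = rng.integers(len(y))               # draw a single sample (the SGD step)
        margin = y[i] * (H[i] @ alpha)
        grad = -y[i] * H[i] / (1.0 + np.exp(margin))  # gradient of logistic loss
        alpha -= lr * grad
    return stumps, alpha

def predict(X, stumps, alpha):
    return np.sign(stump_outputs(X, stumps) @ alpha)
```

Because each iteration touches only one sample, the per-step cost is independent of the dataset size, which is the property that makes an SGD-style update attractive for large-scale training compared to batch boosting.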
Keywords :
gradient methods; image classification; learning (artificial intelligence); stochastic processes; SGD; StocBoost; image processing; large-scale datasets; large-scale image classification; stochastic boosting; stochastic gradient descent; Boosting; Classification; Large scale problem; Stochastic gradient descent;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Image Processing (ICIP), 2013 20th IEEE International Conference on
Conference_Location :
Melbourne, VIC
Type :
conf
DOI :
10.1109/ICIP.2013.6738674
Filename :
6738674