DocumentCode
626671
Title
Salient object cutout using Google images
Author
Hongyuan Zhu ; Jianfei Cai ; Jianmin Zheng ; Jianxin Wu ; Nadia Thalmann
Author_Institution
Sch. of Comput. Eng., Nanyang Technol. Univ., Singapore, Singapore
fYear
2013
fDate
19-23 May 2013
Firstpage
905
Lastpage
908
Abstract
Given an arbitrary image supplied by a user, automatically cutting out the object-of-interest is challenging because little is known in advance about either the object or the background. Saliency detection techniques can provide rough cues about the object-of-interest, since they highlight high-contrast or high-attention regions and pixels. However, the resulting saliency map is often noisy, and applying it directly to segmentation frequently produces erroneous results. Motivated by recent progress in image co-segmentation and Internet image retrieval, in this paper we propose to use the input image as a query to Google Images and then employ the top-returned images to build knowledge about the object-of-interest in the input image. In particular, we develop a lightweight algorithm that learns an object-of-interest model from the retrieved images and uses it to enhance the saliency map of the input image. The enhanced saliency map then initializes a graph cut that extracts the object-of-interest. Experiments on the McGill dataset and multiple challenging cases demonstrate the effectiveness of our method in producing a clean cutout.
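The core step described in the abstract, learning an appearance model from the retrieved images and using it to enhance the input image's saliency map, can be sketched roughly as follows. This is a minimal illustration, not the authors' algorithm: it assumes a simple quantized RGB color histogram as the appearance model, back-projects it onto the input image, and linearly blends the result with the original saliency map (the function names and the blending weight `alpha` are illustrative choices).

```python
import numpy as np

def color_histogram(img, bins=8):
    # Quantize 8-bit RGB values into bins**3 buckets and return a
    # normalized histogram (a crude appearance model of the object).
    q = (img // (256 // bins)).astype(int)
    idx = q[..., 0] * bins * bins + q[..., 1] * bins + q[..., 2]
    hist = np.bincount(idx.ravel(), minlength=bins ** 3).astype(float)
    return hist / hist.sum()

def enhance_saliency(saliency, img, retrieved_imgs, bins=8, alpha=0.5):
    # Average the histograms of the retrieved images into one model,
    # back-project it onto the input image, and blend with the saliency map.
    model = np.mean([color_histogram(r, bins) for r in retrieved_imgs], axis=0)
    q = (img // (256 // bins)).astype(int)
    idx = q[..., 0] * bins * bins + q[..., 1] * bins + q[..., 2]
    prob = model[idx]                  # per-pixel model likelihood
    prob = prob / (prob.max() + 1e-12) # normalize to [0, 1]
    return alpha * saliency + (1 - alpha) * prob
```

In the paper's pipeline, the enhanced map would then seed a graph cut (e.g., as the unary term) to produce the final cutout; the sketch above only covers the saliency-enhancement stage.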
Keywords
Internet; graph theory; image retrieval; image segmentation; object detection; Google images; Internet image retrieval techniques; graph cut; image co-segmentation; lightweight algorithm; object-of-interest; rough information; saliency detection techniques; saliency map; salient object cutout; frequency estimation; Google; histograms; image color analysis
fLanguage
English
Publisher
ieee
Conference_Titel
2013 IEEE International Symposium on Circuits and Systems (ISCAS)
Conference_Location
Beijing
ISSN
0271-4302
Print_ISBN
978-1-4673-5760-9
Type
conf
DOI
10.1109/ISCAS.2013.6571994
Filename
6571994
Link To Document