Title :
Fast concurrent object localization and recognition
Author :
Yeh, Teng-Hao ; Lee, John Jaehwan ; Darrell, Trevor
Author_Institution :
EECS, MIT, Cambridge, MA, USA
Abstract :
Object localization and recognition are important problems in computer vision. However, in many applications, exhaustive search over all object models and image locations is computationally prohibitive. While several methods have been proposed to make either recognition or localization more efficient, few have dealt with both tasks simultaneously. This paper proposes an efficient method for concurrent object localization and recognition based on a data-dependent multi-class branch-and-bound formalism. Existing bag-of-features recognition techniques that can be expressed as weighted combinations of feature counts can be readily adapted to our method. We present experimental results that demonstrate the merit of our algorithm in terms of recognition accuracy, localization accuracy, and speed, compared to baseline approaches including exhaustive search, the implicit-shape model (ISM), and efficient sub-window search (ESS). Moreover, we develop two extensions to consider non-rectangular bounding regions (composite boxes and polygons) and demonstrate their ability to achieve higher recognition scores compared to traditional rectangular bounding boxes.
Keywords :
object recognition; tree searching; bag-of-features recognition; computer vision; efficient sub-window search; exhaustive search; fast concurrent object localization; image location; implicit-shape model; multiclass branch-and-bound formalism; nonrectangular bounding region; object model; recognition score; Application software; Computer vision; Data structures; Electronic switching systems; Face detection; Histograms; Image segmentation; Large-scale systems; Object detection; Vocabulary;
Conference_Titel :
2009 IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2009)
Conference_Location :
Miami, FL
Print_ISBN :
978-1-4244-3992-8
DOI :
10.1109/CVPR.2009.5206805