DocumentCode :
2380158
Title :
Articulated object tracking by rendering consistent appearance parts
Author :
Pezzementi, Zachary ; Voros, Sandrine ; Hager, Gregory D.
Author_Institution :
Laboratory for Computational Science and Robotics (LCSR), Johns Hopkins University, 3400 North Charles Street, Baltimore, MD 21218, USA
fYear :
2009
fDate :
12-17 May 2009
Firstpage :
3940
Lastpage :
3947
Abstract :
We describe a general methodology for tracking 3-dimensional objects in monocular and stereo video that makes use of GPU-accelerated filtering and rendering in combination with machine learning techniques. The method operates on targets consisting of kinematic chains with known geometry. The tracked target is divided into one or more areas of consistent appearance. The appearance of each area is represented by a classifier trained to assign a class-conditional probability to image feature vectors. A search is then performed on the configuration space of the target to find the maximum likelihood configuration. In the search, candidate hypotheses are evaluated by rendering a 3D model of the target object and measuring its consistency with the class probability map. The method is demonstrated for tool tracking on videos from two surgical domains, as well as in a human hand-tracking task.
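The hypothesis-evaluation step summarized in the abstract can be illustrated with a minimal sketch: a silhouette of the target rendered at a candidate kinematic configuration is scored against the per-pixel class probability map, and the highest-scoring configuration is kept. The names below (render_fn, log_likelihood, best_configuration), the pixel-wise Bernoulli scoring rule, and the exhaustive loop over candidates are illustrative assumptions, not the paper's actual GPU-accelerated implementation or its configuration-space search.

import numpy as np

def log_likelihood(prob_map, mask, eps=1e-6):
    # Assumed scoring rule: pixels covered by the rendered silhouette should
    # have high foreground probability, uncovered pixels low probability.
    p = np.clip(prob_map, eps, 1.0 - eps)
    return np.sum(np.where(mask, np.log(p), np.log(1.0 - p)))

def best_configuration(prob_map, render_fn, candidates):
    # render_fn is a hypothetical renderer: it takes a kinematic configuration
    # and returns a boolean silhouette image the size of prob_map.
    best_cfg, best_score = None, -np.inf
    for cfg in candidates:
        mask = render_fn(cfg)
        score = log_likelihood(prob_map, mask)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

In the paper the candidate configurations come from a search over the joint parameters of the known kinematic chain rather than a fixed list, but the consistency measure between a rendered model and the class probability map follows the same idea.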
Keywords :
Histograms; Humans; Image edge detection; Information geometry; Kinematics; Rendering (computer graphics); Robots; Solid modeling; Surgery; Target tracking;
fLanguage :
English
Publisher :
ieee
Conference_Title :
2009 IEEE International Conference on Robotics and Automation (ICRA '09)
Conference_Location :
Kobe, Japan
ISSN :
1050-4729
Print_ISBN :
978-1-4244-2788-8
Electronic_ISBN :
1050-4729
Type :
conf
DOI :
10.1109/ROBOT.2009.5152374
Filename :
5152374