DocumentCode :
2401668
Title :
Unsupervised learning of human perspective context using ME-DT for efficient human detection in surveillance
Author :
Li, Liyuan ; Leung, Maylor K H
Author_Institution :
Inst. for Infocomm Res., Singapore
fYear :
2008
fDate :
23-28 June 2008
Firstpage :
1
Lastpage :
8
Abstract :
A novel, automated technique for learning the human perspective context (HPC) of a scene is proposed in this paper. It is found that two models are required to describe the HPC for camera tilt angles ranging from 0° to 50°. For a given scene, the tilt angle can be inferred from the observed human shapes and head/foot positions. A novel ME-DT (model estimation - data tuning) algorithm is then proposed to learn the human perspective context from live data with varying degrees of uncertainty. The uncertainty may arise from variations in individual human heights and poses, as well as from segmentation/recognition errors. ME-DT not only estimates the model parameters from the training data but also tunes the data to achieve a better head-foot correlation. The human perspective context provides a feasible constraint on the scales, positions, and orientations of humans in the scene. Applying this constraint to HOG-based human detection yields a large reduction in the number of detection windows and improved performance compared to conventional methods.
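The abstract above sketches the ME-DT idea (alternate between estimating a perspective model and tuning noisy head/foot observations) and its use for restricting HOG detection windows. The paper's actual perspective models and update rules are not reproduced in this record, so the Python sketch below is only a hypothetical illustration under assumed simplifications: a linear head-row/foot-row relation, a least-squares model-estimation step, and a data-tuning step that nudges outlying samples toward the current fit. The names me_dt_fit and window_heights, and all parameter values, are invented for the example.

import numpy as np

def me_dt_fit(foot_y, head_y, n_iters=10, tune_rate=0.5):
    """Toy ME-DT loop: alternate a least-squares model estimate (ME)
    with a data-tuning (DT) step that pulls high-residual samples
    toward the current head-foot model.  The linear form
    head_y = a * foot_y + b is an assumption for illustration only."""
    foot_y = np.asarray(foot_y, dtype=float)
    head_y = np.asarray(head_y, dtype=float).copy()
    a, b = 0.0, 0.0
    for _ in range(n_iters):
        # ME step: fit head_y ~ a * foot_y + b by least squares.
        A = np.vstack([foot_y, np.ones_like(foot_y)]).T
        (a, b), *_ = np.linalg.lstsq(A, head_y, rcond=None)
        # DT step: move samples with large residuals part-way toward
        # the model to improve the head-foot correlation.
        pred = a * foot_y + b
        resid = head_y - pred
        outliers = np.abs(resid) > 2.0 * resid.std()
        head_y[outliers] -= tune_rate * resid[outliers]
    return a, b

def window_heights(a, b, foot_rows):
    """Expected person height (in pixels) at each candidate foot row,
    usable to restrict HOG detection-window scales per image row."""
    foot_rows = np.asarray(foot_rows, dtype=float)
    return foot_rows - (a * foot_rows + b)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    foot = rng.uniform(200, 470, size=200)           # foot rows of observed people
    head = 0.6 * foot - 20 + rng.normal(0, 5, 200)   # synthetic head rows + noise
    a, b = me_dt_fit(foot, head)
    print("fitted model:", a, b)
    print("expected heights at rows 250/350/450:",
          window_heights(a, b, [250, 350, 450]))

In this toy setup, the fitted model constrains which window heights are plausible at each image row, so a sliding-window detector only needs to evaluate a narrow band of scales per position rather than the full scale pyramid.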
Keywords :
estimation theory; object detection; unsupervised learning; video surveillance; head-foot correlation; human detection; human perspective context; model estimation-data tuning algorithm; unsupervised learning; video surveillance; Cameras; Context modeling; Foot; Humans; Layout; Parameter estimation; Shape; Surveillance; Uncertainty; Unsupervised learning;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Computer Vision and Pattern Recognition, 2008. CVPR 2008. IEEE Conference on
Conference_Location :
Anchorage, AK
ISSN :
1063-6919
Print_ISBN :
978-1-4244-2242-5
Electronic_ISBN :
1063-6919
Type :
conf
DOI :
10.1109/CVPR.2008.4587725
Filename :
4587725