DocumentCode
3458264
Title
Hands by hand: Crowd-sourced motion tracking for gesture annotation
Author
Spiro, Ian; Taylor, Graham; Williams, George; Bregler, Christoph
Author_Institution
Dept. of Comput. Sci., New York Univ., New York, NY, USA
fYear
2010
fDate
13-18 June 2010
Firstpage
17
Lastpage
24
Abstract
We describe a method for using crowd-sourced labor to track motion and ultimately annotate the gestures of humans in video. Our chosen deployment platform, Amazon Mechanical Turk, divides labor into HITs (Human Intelligence Tasks). Given the informational density of video, our task is potentially larger than a traditional HIT, which involves processing a block of text or a single image. We exploit redundancies in video data so that workers' efforts are, in effect, multiplied: only a fraction of frames need to be annotated by hand, yet we still achieve complete coverage of all video frames. This is accomplished through HITs built on a novel user interface, combined with automatic techniques such as template tracking and affinity propagation clustering. In a case study, we show how to annotate a video database of political speeches with 2D positions and 3D hand pose configurations. This data is then used for preliminary analytical tasks.
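The abstract's central idea, that hand annotations on a sparse set of keyframes can be extended to full coverage of the video, can be illustrated with a minimal sketch. This is not the authors' code: the paper uses template tracking, while the stand-in below uses simple linear interpolation between worker-annotated keyframes; the function name and data layout are assumptions for illustration only.

```python
def interpolate_track(keyframes, n_frames):
    """Fill in a 2D hand track for every frame from sparse annotations.

    keyframes: dict mapping frame index -> (x, y) worker annotation.
    Returns a list of (x, y) positions for frames 0..n_frames-1,
    linearly interpolated between annotated keyframes and held
    constant before the first / after the last keyframe.
    """
    anchors = sorted(keyframes.items())
    track = []
    for f in range(n_frames):
        # nearest annotated keyframe at or before f (or the first one)
        prev = max((a for a in anchors if a[0] <= f),
                   key=lambda a: a[0], default=anchors[0])
        # nearest annotated keyframe at or after f (or the last one)
        nxt = min((a for a in anchors if a[0] >= f),
                  key=lambda a: a[0], default=anchors[-1])
        if prev[0] == nxt[0]:
            track.append(prev[1])
        else:
            # blend the two annotations in proportion to distance
            t = (f - prev[0]) / (nxt[0] - prev[0])
            x = prev[1][0] + t * (nxt[1][0] - prev[1][0])
            y = prev[1][1] + t * (nxt[1][1] - prev[1][1])
            track.append((x, y))
    return track
```

For example, annotating only frames 0 and 4 still yields positions for all five frames, with frame 2 falling midway between the two annotations; the paper's template tracker plays this role with actual image evidence rather than interpolation.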
Keywords
gesture recognition; motion estimation; video signal processing; Amazon Mechanical Turk; affinity propagation clustering; crowd-sourced motion tracking; gesture annotation; human intelligence tasks; political speeches; template tracking; user interface; video data; video database; Computer science; Humans; Image databases; Labeling; Natural language processing; Shape; Spatial databases; Speech; Tracking; User interfaces
fLanguage
English
Publisher
ieee
Conference_Titel
2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
Conference_Location
San Francisco, CA
ISSN
2160-7508
Print_ISBN
978-1-4244-7029-7
Type
conf
DOI
10.1109/CVPRW.2010.5543191
Filename
5543191
Link To Document