DocumentCode
2986355
Title
ViHASi: Virtual human action silhouette data for the performance evaluation of silhouette-based action recognition methods
Author
Ragheb, Hossein ; Velastin, Sergio ; Remagnino, Paolo ; Ellis, Tim
Author_Institution
Digital Imaging Research Centre, Kingston University London, London
fYear
2008
fDate
7-11 Sept. 2008
Firstpage
1
Lastpage
10
Abstract
In this paper we introduce a large body of virtual human action silhouette (ViHASi) data that we have recently generated for the purpose of evaluating silhouette-based human action recognition methods. This synthetic multi-camera video data-set consists of 20 action classes, 9 actors and up to 40 synchronized perspective cameras, and has recently been made available online for other researchers to download. To demonstrate the usefulness of the ViHASi data, we apply an existing action recognition method that is simple and relatively fast. Moreover, to deal with long video sequences containing several action samples, we introduce and test a practical temporal segmentation algorithm that is tightly coupled with the action recognition method used. The experimental methodologies outlined here provide a route towards quantitatively comparing silhouette-based action recognition methods.
Keywords
image recognition; image segmentation; image sensors; image sequences; ViHASi; silhouette-based action recognition methods; synthetic multi-camera video data-set; temporal segmentation algorithm; video sequences; virtual human action silhouette data; Cameras; Digital images; Hidden Markov models; Humans; Image recognition; Image segmentation; Motion detection; Testing; Video sequences; Virtual environment; Silhouette; action recognition; performance evaluation; temporal segmentation; virtual action;
fLanguage
English
Publisher
ieee
Conference_Titel
Distributed Smart Cameras, 2008. ICDSC 2008. Second ACM/IEEE International Conference on
Conference_Location
Stanford, CA
Print_ISBN
978-1-4244-2664-5
Electronic_ISBN
978-1-4244-2665-2
Type
conf
DOI
10.1109/ICDSC.2008.4635730
Filename
4635730
Link To Document