Title :
Spatio-temporal Depth Cuboid Similarity Feature for Activity Recognition Using Depth Camera
Author :
Lu Xia ; Aggarwal, J.K.
Author_Institution :
Dept. of ECE, Univ. of Texas at Austin, Austin, TX, USA
Abstract :
Local spatio-temporal interest points (STIPs) and the features derived from them in RGB videos have proven successful for activity recognition, as they can handle cluttered backgrounds and partial occlusions. In this paper, we propose their counterpart in depth videos and demonstrate its efficacy for activity recognition. We present a filtering method to extract STIPs from depth videos (called DSTIPs) that effectively suppresses noisy measurements. Furthermore, we build a novel depth cuboid similarity feature (DCSF) that describes the local 3D depth cuboid around each DSTIP with an adaptable support size. We evaluate this feature on the activity recognition task using the public MSRAction3D and MSRDailyActivity3D datasets as well as our own dataset. Experimental evaluation shows that the proposed approach outperforms state-of-the-art activity recognition algorithms on depth videos and that the framework is more widely applicable than existing approaches. We also provide detailed comparisons with other features and an analysis of parameter choices as guidance for applications.
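(The abstract does not spell out the DSTIP filtering scheme itself. As a rough, hedged illustration of the general idea of spatio-temporal interest point detection on a depth video, the sketch below uses a classic Dollár-style separable response, a spatial Gaussian combined with a temporal Gabor quadrature pair, rather than the paper's DSTIP filter; the array shapes, sigma/tau values, and threshold are assumptions for illustration only.)

```python
import numpy as np
from scipy.ndimage import gaussian_filter, convolve, maximum_filter

def detect_stips(depth_video, sigma=2.0, tau=1.5, threshold=1e-3):
    """Detect spatio-temporal interest points in a depth video.

    depth_video: float array of shape (T, H, W), e.g. depth in metres.
    Returns an array of (t, y, x) coordinates of local response maxima.
    NOTE: this is a generic cuboid-style detector for illustration,
    not the DSTIP filtering method proposed in the paper.
    """
    # Spatial smoothing of each frame (Gaussian in y and x only).
    smoothed = gaussian_filter(depth_video, sigma=(0, sigma, sigma))

    # Temporal 1-D Gabor quadrature pair applied along the time axis.
    half = 3 * int(np.ceil(tau))
    t = np.arange(-half, half + 1)
    omega = 4.0 / tau
    env = np.exp(-t**2 / (2 * tau**2))
    g_even = (np.cos(2 * np.pi * omega * t) * env)[:, None, None]
    g_odd = (np.sin(2 * np.pi * omega * t) * env)[:, None, None]

    r_even = convolve(smoothed, g_even, mode="nearest")
    r_odd = convolve(smoothed, g_odd, mode="nearest")
    response = r_even**2 + r_odd**2          # motion-energy response

    # Keep points that are local maxima of the response and above threshold.
    local_max = maximum_filter(response, size=(5, 9, 9)) == response
    return np.argwhere(local_max & (response > threshold))  # rows: (t, y, x)
```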
Keywords :
cameras; feature extraction; filtering theory; image denoising; image recognition; video signal processing; 3D depth cuboid; DCSF; DSTIP; MSRDailyActivity3D datasets; RGB videos; STIPs; activity recognition algorithm; cluttered backgrounds; depth camera; depth video; filtering method; local spatiotemporal interest points; noisy measurement suppression; partial occlusions; public MSRAction3D dataset; spatiotemporal depth cuboid similarity feature extraction; Cameras; Detectors; Feature extraction; Histograms; Noise; Three-dimensional displays; Videos; Kinect; Spatio temporal interest point; activity recognition; depth image;
Conference_Titel :
Computer Vision and Pattern Recognition (CVPR), 2013 IEEE Conference on
Conference_Location :
Portland, OR
DOI :
10.1109/CVPR.2013.365