Title :
Grasp type revisited: A modern perspective on a classical feature for vision
Author :
Yezhou Yang;Cornelia Fermüller;Yi Li;Yiannis Aloimonos
Author_Institution :
Computer Vision Lab, University of Maryland, College Park, USA
Date :
6/1/2015
Abstract :
The grasp type provides crucial information about human action. However, recognizing the grasp type from unconstrained scenes is challenging because of large variations in appearance, occlusions, and geometric distortions. In this paper, we first present a convolutional neural network to classify functional hand grasp types. Experiments on a public static-scene hand dataset validate the good performance of the presented method. We then present two applications that utilize grasp type classification: (a) inference of human action intention and (b) fine-level manipulation action segmentation. Experiments on both tasks demonstrate the usefulness of the grasp type as a cognitive feature for computer vision. This study shows that the grasp type is a powerful symbolic representation for action understanding, and thus opens new avenues for future research.
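Illustration (not from the paper): the abstract describes a convolutional neural network that maps a hand image region to a functional grasp-type label. The following is a minimal PyTorch sketch of such a classifier, intended only to make the setup concrete; the layer sizes, the 96x96 input patch size, and the assumed six grasp classes are illustrative assumptions, not the authors' actual architecture or class taxonomy.

# Minimal sketch (assumption): a small CNN for grasp-type classification from
# cropped hand patches. This is NOT the authors' architecture; the patch size
# (96x96 RGB) and the number of grasp classes (6) are illustrative choices.
import torch
import torch.nn as nn

NUM_GRASP_TYPES = 6  # assumed number of functional grasp classes

class GraspTypeCNN(nn.Module):
    def __init__(self, num_classes: int = NUM_GRASP_TYPES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, padding=2),   # hand patch -> feature maps
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                               # 96 -> 48
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                               # 48 -> 24
            nn.Conv2d(64, 128, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                               # 24 -> 12
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 12 * 12, 256),
            nn.ReLU(inplace=True),
            nn.Linear(256, num_classes),                   # logits over grasp types
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = GraspTypeCNN()
    patch = torch.randn(1, 3, 96, 96)  # one cropped hand patch (assumed size)
    logits = model(patch)
    print(logits.shape)                # torch.Size([1, 6])

In the paper's applications, such per-frame grasp-type predictions would then serve as a symbolic feature for downstream reasoning, e.g., inferring action intention or segmenting a manipulation action into finer stages.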
Keywords :
"Grasping","Training","Videos","Computer vision","Neural networks","Visualization"
Conference_Title :
2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
ISSN :
1063-6919
DOI :
10.1109/CVPR.2015.7298637