Title :
Head Pose Estimation Using Sparse Representation
Author :
Ma, Bingpeng ; Wang, Tianjiang
Author_Institution :
Sch. of Comput. Sci., Huazhong Univ. of Sci. & Technol., Wuhan, China
Abstract :
This paper proposes a novel method that uses sparse representation to improve the performance of head pose estimation. The Sparse Representation Classifier (SRC) has been applied to face recognition and related problems. In this paper, we first show that SRC is effective for head pose estimation. We then propose the Block-based Sparse Representation Classifier (BSRC) to reduce the influence of the background in multi-view face images. The motivation for BSRC is that, since face shapes differ across arbitrary head poses, the background in a multi-view face image is difficult to eliminate by traditional means such as feature extraction. As a result, the features of the multi-view face contain background noise, which degrades the performance of head pose estimation. In this paper, a face region is transformed into several images, each with a different block discarded, under the assumption that the discarded block may be background at a specific pose. In this way, there is at least one image in which the influence of the background is greatly reduced. Specifically, SRC is applied to each image, and the final class label of the input face region is decided by maximizing the Sparsity Concentration Index (SCI). Experimental results on a head pose database show the effectiveness of BSRC.
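The abstract only names the building blocks (SRC classification by class-wise residuals, and SCI as a confidence score), so the following is a minimal sketch of that machinery, not the authors' implementation: it approximates the ℓ1-minimization with plain ISTA, uses the standard SCI definition, and all function names, the regularization weight `lam`, and the iteration count are illustrative assumptions.

```python
import numpy as np

def ista(A, y, lam=0.01, n_iter=500):
    """Approximate min 0.5*||Ax - y||^2 + lam*||x||_1 by iterative soft-thresholding."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L      # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

def sci(x, labels, k):
    """Sparsity Concentration Index: 1 if all mass lies in one class, 0 if spread evenly."""
    total = np.abs(x).sum()
    if total == 0:
        return 0.0
    best = max(np.abs(x[labels == c]).sum() for c in range(k))
    return (k * best / total - 1.0) / (k - 1.0)

def src_classify(A, labels, y, k):
    """SRC: sparse-code y over the training dictionary A (columns = training
    samples, labels[i] = class of column i), then pick the class whose
    coefficients reconstruct y with the smallest residual."""
    x = ista(A, y)
    residuals = [np.linalg.norm(y - A[:, labels == c] @ x[labels == c])
                 for c in range(k)]
    return int(np.argmin(residuals)), sci(x, labels, k)
```

In the block-based scheme described above, `src_classify` would be run once per block-discarded image, and the prediction from the image with the largest SCI would be kept as the final pose label.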
Keywords :
face recognition; image representation; pose estimation; block-based sparse representation classifier; face recognition; head pose estimation; multiview face images; sparsity concentration index; Application software; Computer applications; Computer science; Face detection; Face recognition; Head; Image databases; Layout; Shape; System testing;
Conference_Titel :
Computer Engineering and Applications (ICCEA), 2010 Second International Conference on
Conference_Location :
Bali Island
Print_ISBN :
978-1-4244-6079-3
Electronic_ISBN :
978-1-4244-6080-9
DOI :
10.1109/ICCEA.2010.226