DocumentCode :
1799060
Title :
Facial expression tracking from head-mounted, partially observing cameras
Author :
Romera-Paredes, Bernardino ; Cha Zhang ; Zhengyou Zhang
Author_Institution :
Univ. Coll. London, London, UK
fYear :
2014
fDate :
14-18 July 2014
Firstpage :
1
Lastpage :
6
Abstract :
Head-mounted displays (HMDs) have attracted growing interest recently. They can enable people to communicate with each other from anywhere, at any time. However, since most HMDs today are equipped only with cameras pointing outwards, the remote party cannot see the user wearing the HMD. In this paper, we present a system for facial expression tracking based on head-mounted, inward-looking cameras, so that the user can be represented by an animated avatar at the remote party. The main challenge is that the cameras can observe only partial faces, since they are very close to the face. We experiment with multiple machine learning algorithms to estimate facial expression parameters from training data collected with the assistance of a Kinect depth sensor. Our results show that we can reliably track people's facial expressions even from the cameras' very limited view angles.
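The abstract describes supervised regression from partial-face camera observations to facial expression parameters, with labels supplied by a Kinect depth sensor. The following is a minimal sketch of that idea, not the paper's actual method: it fits a closed-form ridge regressor mapping image features to expression parameters, with synthetic data standing in for the camera images and Kinect-derived labels (all sizes and the linear model are assumptions for illustration).

```python
import numpy as np

# Hypothetical sketch: map partial-face image features to facial
# expression parameters (e.g. blendshape weights) by ridge regression.
# Synthetic data stands in for inward-looking camera frames and
# Kinect-collected training labels; dimensions are illustrative only.
rng = np.random.default_rng(0)

n_train, n_feat, n_params = 200, 64, 8        # assumed sizes
W_true = rng.normal(size=(n_feat, n_params))  # unknown mapping to recover

X = rng.normal(size=(n_train, n_feat))        # partial-face features
Y = X @ W_true + 0.01 * rng.normal(size=(n_train, n_params))  # Kinect labels

lam = 1e-2  # ridge penalty for stability under limited views
W = np.linalg.solve(X.T @ X + lam * np.eye(n_feat), X.T @ Y)

# Predict expression parameters for a new partial observation
x_new = rng.normal(size=(1, n_feat))
params = x_new @ W
print(params.shape)
```

The paper itself compares several learning algorithms (its keywords mention convolutional neural networks); a linear regressor is used here only to keep the sketch self-contained.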
Keywords :
cameras; helmet mounted displays; learning (artificial intelligence); HMD; Kinect depth sensor; facial expression parameters; facial expression tracking; head-mounted displays; head-mounted inward looking cameras; head-mounted partially observing cameras; multiple machine learning algorithms; sensor; Cameras; Machine learning algorithms; Neural networks; Shape; Skin; Three-dimensional displays; Training; Facial expression tracking; Kinect; convolutional neural networks; head-mounted display; machine learning;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Multimedia and Expo (ICME), 2014 IEEE International Conference on
Conference_Location :
Chengdu
Type :
conf
DOI :
10.1109/ICME.2014.6890278
Filename :
6890278