DocumentCode :
2215045
Title :
What perceptible information can be implemented in talking head animations
Author :
Kuratate, Takaaki ; Masuda, Saeko ; Vatikiotis-Bateson, Eric
Author_Institution :
Inf. Sci. Div., ATR Int., Kyoto, Japan
fYear :
2001
fDate :
2001
Firstpage :
430
Lastpage :
435
Abstract :
In this paper, we describe our talking head animation system. We scan and analyze high-resolution static 3D faces from multiple subjects and extract deformation characteristics by principal component analysis. The extracted components are similar across subjects and are used to reconstruct arbitrary facial postures for face animation driven by 3D kinematic data. We also build a simple lip model to estimate lip surfaces that are not visible to our 3D scanner. We present results of the 3D face analysis, including the effects of varying the number of principal components used to animate a talking head.
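The abstract outlines a PCA-based pipeline: registered 3D scans are decomposed into a mean shape plus principal deformation components, and arbitrary postures are rebuilt from component weights. The sketch below is only an illustrative Python rendering of that general idea, not the authors' implementation; the function names, array shapes, and random stand-in data are assumptions for demonstration.

```python
# Illustrative sketch (not from the paper): PCA over flattened 3D face scans,
# then reconstruction of a facial posture from a small number of components.
import numpy as np

def fit_face_pca(scans: np.ndarray, n_components: int):
    """scans: (n_postures, n_vertices * 3) array of registered facial postures."""
    mean_face = scans.mean(axis=0)
    centered = scans - mean_face
    # SVD of the centered data gives the principal deformation directions (rows of vt).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:n_components]        # (n_components, n_vertices * 3)
    weights = centered @ components.T     # per-posture component weights
    return mean_face, components, weights

def reconstruct_posture(mean_face, components, weights):
    """Rebuild one facial posture from its component weights."""
    return mean_face + weights @ components

# Example with random stand-in data (a real input would be aligned 3D scans):
rng = np.random.default_rng(0)
scans = rng.normal(size=(40, 300))        # 40 postures, 100 vertices x 3 coords
mean_face, comps, w = fit_face_pca(scans, n_components=8)
posture0 = reconstruct_posture(mean_face, comps, w[0])
print(posture0.shape)                     # (300,)
```

Varying `n_components` in such a sketch mirrors the paper's experiment on how the number of principal components affects the animated result.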
Keywords :
computer animation; face recognition; 3D face analysis; 3D kinematic data; arbitrary facial postures; deformation characteristics; face animation; high resolution static 3D faces; lip model; lip surfaces; multiple subjects; perceptible information; principal component analysis; talking head animations; Data mining; Facial animation; Home computing; Humans; Kinematics; Magnetic heads; Motion control; Muscles; Speech synthesis; Surface reconstruction;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the 10th IEEE International Workshop on Robot and Human Interactive Communication, 2001
Conference_Location :
Bordeaux, Paris, France
Print_ISBN :
0-7803-7222-0
Type :
conf
DOI :
10.1109/ROMAN.2001.981942
Filename :
981942