DocumentCode
312250
Title
Characterizing audiovisual information during speech
Author
Vatikiotis-Bateson, E.; Munhall, K.G.; Kasahara, Y.; Garcia, F.; Yehia, H.
Author_Institution
ATR Human Inf. Process. Res. Labs., Kyoto, Japan
Volume
3
Year
1996
Date
3-6 Oct 1996
Firstpage
1485
Abstract
Several analyses relating facial motion to perioral muscle behavior and speech acoustics are described. The results suggest that linguistically relevant visual information is distributed over large regions of the face and can be modeled from the same control source as the acoustics.
Keywords
physiology; speech intelligibility; speech processing; audiovisual information; control source; facial motion; linguistically relevant visual information; perioral muscle behavior; speech acoustics; Acceleration; Acoustics; Amplitude estimation; Apertures; Electromyography; Motion estimation; Muscles; Shape; Speech; Tracking
Language
English
Publisher
ieee
Conference_Title
Proceedings of the Fourth International Conference on Spoken Language (ICSLP 96), 1996
Conference_Location
Philadelphia, PA
Print_ISBN
0-7803-3555-4
Type
conf
DOI
10.1109/ICSLP.1996.607897
Filename
607897