DocumentCode :
270134
Title :
Affect burst recognition using multi-modal cues
Author :
Türker, Bekir Berker ; Marzban, Sara ; Erzin, E. ; Yemez, Y. ; Sezgin, T.M.
Author_Institution :
Muhendislik Fak., Koc Univ., İstanbul, Turkey
fYear :
2014
fDate :
23-25 April 2014
Firstpage :
1608
Lastpage :
1611
Abstract :
Affect bursts, which are nonverbal expressions of emotion in conversations, play a critical role in analyzing affective states. Although a number of methods exist for affect burst detection and recognition using only audio information, little effort has been spent on combining cues in a multi-modal setup. We suggest that facial gestures constitute a key component in characterizing affect bursts, and hence have the potential to enable more robust affect burst detection and recognition. We take a data-driven approach, characterizing affect bursts using Hidden Markov Models (HMMs), and employ a multimodal decision fusion scheme that combines cues from audio and facial gestures to classify affect bursts. We demonstrate the contribution of facial gestures to affect burst recognition by conducting experiments on an audiovisual database which comprises speech and facial motion data from various dyadic conversations.
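The decision fusion described in the abstract can be illustrated with a minimal sketch. Assuming each modality's HMM classifier produces per-class log-likelihood scores, a weighted late-fusion rule combines them before the final decision. The class labels, scores, and `audio_weight` parameter below are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of decision-level (late) fusion: each modality's
# classifier scores every affect-burst class, and the scores are
# combined with a modality weight before picking the winning class.

def fuse_decisions(audio_scores, visual_scores, audio_weight=0.6):
    """Return the class with the highest weighted sum of per-class
    log-likelihoods from the audio and visual modalities."""
    fused = {}
    for label in audio_scores:
        fused[label] = (audio_weight * audio_scores[label]
                        + (1.0 - audio_weight) * visual_scores[label])
    return max(fused, key=fused.get)

# Illustrative scores: both modalities favor "laughter".
audio = {"laughter": -10.2, "sigh": -11.5, "gasp": -13.0}
video = {"laughter": -8.9, "sigh": -12.1, "gasp": -12.4}
print(fuse_decisions(audio, video))  # laughter
```

In practice the modality weight would be tuned on held-out data, since audio and facial cues are not equally reliable for every affect burst type.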
Keywords :
emotion recognition; face recognition; hidden Markov models; HMM; affect burst detection; affect burst recognition; audio information; audiovisual database; dyadic conversations; emotion expressions; facial gestures; facial motion data; multimodal cues; multimodal decision fusion scheme; speech data; Markov processes; signal processing; speech; speech recognition; affect burst; multimodal recognition
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2014 22nd Signal Processing and Communications Applications Conference (SIU)
Conference_Location :
Trabzon
Type :
conf
DOI :
10.1109/SIU.2014.6830552
Filename :
6830552