DocumentCode :
3153146
Title :
Classification of emotional content of sighs in dyadic human interactions
Author :
Gupta, Rahul ; Lee, Chi-Chun ; Narayanan, Shrikanth
Author_Institution :
Signal Anal. & Interpretation Lab. (SAIL), Univ. of Southern California, Los Angeles, CA, USA
fYear :
2012
fDate :
25-30 March 2012
Firstpage :
2265
Lastpage :
2268
Abstract :
Emotions are an important part of human communication and are expressed both verbally and non-verbally. Common nonverbal vocalizations such as laughter, cries, and sighs carry important emotional content in conversations. Sighs are often associated with negative emotion. In this work, we show that emotional sighs occur at both ends of the valence axis (positive-emotion vs. negative-emotion sighs) in spontaneous affective dialogs and that they have distinct multimodal characteristics. Classification results show that it is possible to differentiate between the two types of emotionally valenced sighs using a combination of acoustic and gestural features, with an overall unweighted accuracy of 58.26%.
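A minimal sketch of the kind of pipeline the abstract and keywords describe (support vector machine classification with multimodal fusion of acoustic and gestural features, scored by unweighted accuracy). This is not the authors' implementation: the feature dimensions, fusion strategy (simple concatenation), and synthetic data are assumptions for illustration only.

```python
# Hedged sketch: feature-level fusion of acoustic and gestural descriptors,
# a linear SVM, and unweighted (per-class averaged) accuracy. All feature
# dimensions and the random data below are placeholders, not the paper's setup.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import balanced_accuracy_score

rng = np.random.default_rng(0)
n_sighs = 200
acoustic = rng.normal(size=(n_sighs, 20))  # e.g. pitch/energy statistics per sigh (placeholder)
gestural = rng.normal(size=(n_sighs, 10))  # e.g. head/hand motion statistics (placeholder)
labels = rng.integers(0, 2, size=n_sighs)  # 0 = negative-valence sigh, 1 = positive-valence sigh

# Early (feature-level) fusion: concatenate the two modalities per sigh instance.
features = np.hstack([acoustic, gestural])

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", class_weight="balanced"))
pred = cross_val_predict(clf, features, labels, cv=5)

# Unweighted accuracy = mean of per-class recalls (balanced accuracy in scikit-learn).
print("unweighted accuracy: %.2f%%" % (100 * balanced_accuracy_score(labels, pred)))
```

On random labels this sketch hovers around 50%; the paper's reported 58.26% unweighted accuracy reflects the real acoustic-plus-gestural features, not this toy setup.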
Keywords :
speaker recognition; acoustic features; distinct multimodal characteristics; dyadic human interactions; gestural features; human communication; negative emotion; nonverbal vocalizations; sigh emotional content classification; vocalization recognition; Accuracy; Acoustics; Databases; Emotion recognition; Humans; Speech; Support vector machines; Nonverbal vocalizations; multimodal fusion; support vector machine;
fLanguage :
English
Publisher :
ieee
Conference_Title :
2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Kyoto
ISSN :
1520-6149
Print_ISBN :
978-1-4673-0045-2
Electronic_ISSN :
1520-6149
Type :
conf
DOI :
10.1109/ICASSP.2012.6288365
Filename :
6288365