Title :
Multimodal arousal rating using unsupervised fusion technique
Author :
Wei-Chen Chen ; Po-Tsun Lai ; Yu Tsao ; Chi-Chun Lee
Author_Institution :
Dept. of Electr. Eng., Nat. Tsing Hua Univ., Hsinchu, Taiwan
Abstract :
Arousal is essential in understanding human behavior and decision-making. In this work, we present a multimodal arousal rating framework that incorporates a minimal set of vocal and non-verbal behavior descriptors. The rating framework and fusion techniques are unsupervised in nature to ensure that the approach remains readily applicable and interpretable. Our proposed multimodal framework improves the correlation with human judgment from 0.66 (vocal-only) to 0.68 (multimodal); analysis shows that a supervised fusion framework does not improve the correlation further. Lastly, interesting empirical evidence demonstrates that the signal-based quantification of arousal achieves higher agreement with each individual rater than the raters achieve among themselves. This further supports the view that machine-based rating is a viable way of measuring humans' subjective internal states through objective observation of behavior features.
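The abstract does not spell out the fusion rule, so the following is only a minimal sketch of one plausible unsupervised scheme: z-normalize each modality's arousal scores and average them, then compare Pearson correlation with human judgment before and after fusion. The data, the equal-weight average, and all variable names here are illustrative assumptions, not the authors' exact method.

```python
import numpy as np
from scipy.stats import pearsonr, zscore

def unsupervised_fusion(vocal_scores, nonverbal_scores):
    """Fuse per-modality arousal scores without any training labels:
    z-normalize each stream and take their unweighted average.
    (Equal weighting is an assumption made for this sketch.)"""
    return (zscore(vocal_scores) + zscore(nonverbal_scores)) / 2.0

# Hypothetical data: mean annotator ratings plus noisy per-modality machine scores.
rng = np.random.default_rng(0)
human = rng.normal(size=100)                       # averaged human arousal ratings
vocal = human + rng.normal(scale=0.8, size=100)    # vocal-only machine scores
visual = human + rng.normal(scale=1.0, size=100)   # non-verbal (visual) machine scores

fused = unsupervised_fusion(vocal, visual)

print("vocal-only correlation:", pearsonr(vocal, human)[0])
print("multimodal correlation:", pearsonr(fused, human)[0])
```

With synthetic data like this, the fused score typically correlates better with the human ratings than either single modality, which mirrors the kind of gain (0.66 to 0.68) reported in the abstract.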
Keywords :
behavioural sciences computing; signal processing; unsupervised learning; correlation judgment; human judgment; machine-based rating; multimodal arousal rating framework; nonverbal behavior descriptors; signal-based quantification; subjective humans internal states; supervised fusion framework; unsupervised fusion technique; Correlation; Databases; Face; Psychology; Robustness; Speech; affective computing; arousal rating; behavioral signal processing; multimodal signal processing;
Conference_Title :
2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
South Brisbane, QLD
DOI :
10.1109/ICASSP.2015.7178982