Title :
Multi-modal analysis of human computer interaction using automatic inference of aural expressions in speech
Author :
Shikler, Tal Sobol
Author_Institution :
Dept. of Ind. Eng. & Manage., Ben-Gurion Univ. of the Negev, Beer-Sheva
Abstract :
This paper presents a multi-modal analysis of human-computer interactions based on automatic inference of expressions in speech. It describes an automatic inference system that recognizes aural expressions of emotions, complex mental states and mixtures of expressions. The implementation is based on the observation that different vocal features distinguish different expressions. The system was trained on an English database (MindReading) and then applied to a Hebrew multi-modal database of naturally evoked expressions (Doors). This paper describes the statistical and dynamic analysis of sustained interactions from the Doors database. The analysis is based on the correlation of the inferred expressions with events, physiological cues such as galvanic skin response, and behavioural cues. The presented analysis indicates that the vocal expressions of complex mental states such as thinking, certainty and interest are not necessarily unique to one language and culture. The system provides an analysis tool for sustained human-computer interactions.
Keywords :
emotion recognition; human computer interaction; inference mechanisms; speech processing; aural expressions; automatic inference; multi-modal analysis; speech; Automatic speech recognition; Engineering management; Feature extraction; Frequency; Human factors; Industrial engineering; Laboratories; Spatial databases; Speech analysis;
Conference_Title :
2008 IEEE International Conference on Systems, Man and Cybernetics (SMC 2008)
Conference_Location :
Singapore
Print_ISBN :
978-1-4244-2383-5
Electronic_ISSN :
1062-922X
DOI :
10.1109/ICSMC.2008.4811309