Title :
Emotion in user interface, voice interaction system
Author :
Kostov, V. ; Fukuda, S.
Author_Institution :
Dept. of Production, Inf. & Syst. Eng., Tokyo Metropolitan Inst. of Technol., Japan
Abstract :
An approach towards a personalized voice-emotion user interface, independent of the speaker's age, sex, or language, is presented. An extensive set of carefully chosen utterances provided a speech database for investigating acoustic similarities among eight emotional states: (unemotional) neutral, anger, sadness, happiness, disgust, surprise, stress/trouble, and fear. Based on those results, a voice interaction system (VIS) capable of sensing the user's emotional message was developed. To detect emotions, several primary parameters of human speech were analyzed: pitch, formants, tempo (rhythm), and the power of the human voice. First, the speaker's basic voice characteristics were extracted (pitch and/or formants in neutral speech, normal speech rate, and neutral speech power), and based on those parameters the emotional message of the subject's utterance was successfully extracted. The VIS interacts with the user, changing its response according to the user's utterances.
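The baseline-normalization step described in the abstract — extracting a speaker's neutral-speech characteristics first, then interpreting an utterance relative to them — might be sketched as follows. All class names, feature choices, and thresholds here are illustrative assumptions for exposition, not the authors' implementation:

```python
from dataclasses import dataclass

@dataclass
class VoiceBaseline:
    """Speaker-specific neutral-speech characteristics (assumed feature set)."""
    pitch_hz: float   # mean fundamental frequency in neutral speech
    rate_sps: float   # normal speech rate, in syllables per second
    power_db: float   # mean speech power in neutral speech

def relative_features(baseline: VoiceBaseline,
                      pitch_hz: float, rate_sps: float, power_db: float) -> dict:
    """Normalize an utterance's raw parameters against the speaker's
    neutral baseline, yielding speaker-relative deviations."""
    return {
        "pitch_ratio": pitch_hz / baseline.pitch_hz,
        "rate_ratio": rate_sps / baseline.rate_sps,
        "power_delta_db": power_db - baseline.power_db,
    }

def classify(features: dict) -> str:
    """Toy rule-based mapping from deviations to an emotion label;
    thresholds are invented for illustration, not taken from the paper."""
    if features["pitch_ratio"] > 1.3 and features["power_delta_db"] > 3.0:
        return "anger"
    if features["pitch_ratio"] < 0.9 and features["rate_ratio"] < 0.9:
        return "sadness"
    return "neutral"
```

Because only deviations from each speaker's own neutral baseline are used, the same rules could in principle apply across speakers of different age, sex, or language, which is the personalization idea the abstract describes.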
Keywords :
human factors; natural language interfaces; speech recognition; speech-based user interfaces; emotion; human speech; personalized voice-emotion user interface; speech database; voice interaction system; Acoustical engineering; Databases; Emotion recognition; Engines; Face; Facial animation; Humans; Production systems; Speech analysis; User interfaces;
Conference_Titel :
2000 IEEE International Conference on Systems, Man, and Cybernetics
Conference_Location :
Nashville, TN
Print_ISBN :
0-7803-6583-6
DOI :
10.1109/ICSMC.2000.885947