Title :
Emotion Animation of Embodied Conversational Agents with Contextual Control Model
Author :
Xiaobu Yuan; R. Vijayarangan
Author_Institution :
School of Computer Science, University of Windsor, Windsor, ON, Canada
Abstract :
This paper presents part of an ongoing project on software customization, with a focus on emotion animation for the development of embodied conversational agents. It first highlights an interactive approach to online software customization, and then proposes a modified POMDP (partially observable Markov decision process) model that introduces the system's response time into the control of dialogue management. By integrating response time into the reward calculation, a novel algorithm is created to direct conversation in different contextual control modes. These modes and their dynamic changes further provide hints for determining the emotion used to animate agents' facial expressions and voice tones. Experimental results demonstrate that the proposed method not only yields better performance in intention discovery, but also makes embodied conversational agents more appealing through emotion animation at run time.
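To make the abstract's mechanism concrete, below is a minimal Python sketch of the two ideas it describes: folding the system's response time into the POMDP reward, and mapping contextual control modes to emotion labels for animation. It assumes Hollnagel-style COCOM modes (strategic, tactical, opportunistic, scrambled); all function names, thresholds, and weights are illustrative assumptions, not the paper's actual algorithm.

    # Hypothetical sketch: response-time-adjusted POMDP reward and a
    # control-mode -> emotion lookup. Names, weights, and thresholds
    # are illustrative assumptions, not the paper's algorithm.
    from enum import Enum

    class ControlMode(Enum):
        STRATEGIC = "strategic"
        TACTICAL = "tactical"
        OPPORTUNISTIC = "opportunistic"
        SCRAMBLED = "scrambled"

    # Assumed mapping from control mode to an emotion label that the
    # animation layer can render as facial expression and voice tone.
    MODE_EMOTION = {
        ControlMode.STRATEGIC: "confident",
        ControlMode.TACTICAL: "attentive",
        ControlMode.OPPORTUNISTIC: "uncertain",
        ControlMode.SCRAMBLED: "confused",
    }

    def adjusted_reward(base_reward: float, response_time: float,
                        time_weight: float = 0.1) -> float:
        """Fold response time into the POMDP reward: slower system
        responses reduce the effective reward (assumed linear penalty)."""
        return base_reward - time_weight * response_time

    def select_mode(avg_response_time: float) -> ControlMode:
        """Pick a contextual control mode from the recent average
        response time (thresholds in seconds, purely illustrative)."""
        if avg_response_time < 1.0:
            return ControlMode.STRATEGIC
        if avg_response_time < 3.0:
            return ControlMode.TACTICAL
        if avg_response_time < 6.0:
            return ControlMode.OPPORTUNISTIC
        return ControlMode.SCRAMBLED

In use, a dialogue manager would call adjusted_reward when updating POMDP values, then pass MODE_EMOTION[select_mode(...)] to the facial-animation and voice components, so that changes in control mode drive changes in the agent's displayed emotion.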
Keywords :
Markov processes; computer animation; emotion recognition; face recognition; interactive systems; multi-agent systems; POMDP model; agent facial expression animation; contextual control model; dialogue management; embodied conversational agent; emotion animation; interactive approach; partially observable Markov decision process; response time integration; reward calculation; software customization; voice tone; Computational modeling; Context; Facial animation; Internet; Software; Time factors
Conference_Title :
2013 IEEE International Conference on Green Computing and Communications (GreenCom) and IEEE Internet of Things (iThings) and IEEE Cyber, Physical and Social Computing (CPSCom)
Conference_Location :
Beijing, China
DOI :
10.1109/GreenCom-iThings-CPSCom.2013.123