Title :
An Expressive Virtual Audience with Flexible Behavioral Styles
Author :
Kang, Ni ; Brinkman, Willem-Paul ; van Riemsdijk, M. Birna ; Neerincx, Mark A.
Author_Institution :
Dept. of Intell. Syst., Delft Univ. of Technol., Delft, Netherlands
Abstract :
Currently, expressive virtual humans are used in psychological research, training, and psychotherapy. However, the behavior of these virtual humans is usually scripted and therefore cannot be modified freely at runtime. To address this, we created a virtual audience with parameterized behavioral styles. This paper presents a parameterized audience model based on probabilistic models abstracted from the observation of real human audiences (n = 16). The audience's behavioral style is controlled by model parameters that define the virtual humans' moods, attitudes, and personalities. Employing these parameters as predictors, the audience model significantly predicts audience behavior. To investigate whether people can recognize the designed behavioral styles generated by this model, 12 audience styles were evaluated by two groups of participants. One group (n = 22) was asked to describe the virtual audience freely, and the other group (n = 22) was asked to rate the audiences on eight dimensions. The results indicated that people could recognize different audience attitudes and even perceive different degrees of certain audience attitudes. In conclusion, the audience model can generate expressive behavior conveying different attitudes by modulating model parameters.
Keywords :
natural language interfaces; virtual reality; audience attitudes; expressive virtual audience; expressive virtual humans; flexible behavioral styles; model parameters; parameterized audience model; parameterized behavioral styles; probabilistic models; psychological research; psychotherapy; real human audiences; virtual audience; Attitude control; Encoding; Energy measurement; Energy states; Mood; Speech; Videos; Expressive listening behavior; parameterized audience model; public speaking; virtual agents;
Journal_Title :
IEEE Transactions on Affective Computing
DOI :
10.1109/TAFFC.2013.2297104