DocumentCode :
652757
Title :
Mutual Behaviors during Dyadic Negotiation: Automatic Prediction of Respondent Reactions
Author :
Sunghyun Park; Stefan Scherer; Jonathan Gratch; Peter Carnevale; Louis-Philippe Morency
Author_Institution :
Inst. for Creative Technol., Univ. of Southern California, Playa Vista, CA, USA
fYear :
2013
fDate :
2-5 Sept. 2013
Firstpage :
423
Lastpage :
428
Abstract :
In this paper, we analyze face-to-face negotiation interactions with the goal of predicting the respondent's immediate reaction (i.e., accept or reject) to a negotiation offer. Supported by the theory of social rapport, we focus on mutual behaviors, which are defined as nonverbal characteristics that occur due to interactional influence. These patterns include behavioral symmetry (e.g., synchronized smiles) as well as asymmetry (e.g., opposite postures) between the two negotiators. In addition, we put emphasis on finding audio-visual mutual behaviors that can be extracted automatically, with the vision of a real-time decision support tool. We introduce a dyadic negotiation dataset consisting of 42 face-to-face interactions and show experiments confirming the importance of multimodal and mutual behaviors.
Keywords :
behavioural sciences computing; decision support systems; interactive systems; real-time systems; social sciences computing; audio-visual mutual behaviors; dyadic negotiation; face-to-face negotiation interactions; real-time decision support tool; respondent reactions; social rapport; Accuracy; Acoustics; Context; Face; Proposals; Speech; Visualization; multimodal; mutual behavior; negotiation; nonverbal; reaction prediction;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII)
Conference_Location :
Geneva, Switzerland
ISSN :
2156-8103
Type :
conf
DOI :
10.1109/ACII.2013.76
Filename :
6681467