Title :
A comparison of acoustic-prosodic entrainment in face-to-face and remote collaborative learning dialogues
Author :
Lubold, Nichola ; Pon-Barry, Heather
Author_Institution :
CIDSE, Arizona State Univ., Tempe, AZ, USA
Abstract :
Today, people are just as likely to hold a business meeting remotely as face-to-face. Individuals earn college degrees remotely, and sick patients can visit the doctor from home. Given this popularity, it is especially important to recognize that remote settings pose communication challenges not present in face-to-face settings: visual cues such as facial expressions and body language are degraded or absent. In this paper, we examine how remote settings affect spoken dialogue compared with face-to-face settings. We focus on entrainment, a conversational phenomenon in which individuals adapt to each other over the course of an interaction. Specifically, we investigate acoustic-prosodic entrainment, where individuals become more similar in their pitch, loudness, or speaking rate. We explore three different measures of acoustic-prosodic entrainment, comparing remote and face-to-face settings on a turn-by-turn basis. Our results indicate that the two settings differ across forms of entrainment, suggesting that the presence or absence of visual cues such as facial expressions and body language affects the degree of entrainment.
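The abstract mentions turn-by-turn measures of acoustic-prosodic entrainment. As an illustration only (this is not the authors' code, and the three measures in the paper may differ), one common turn-level measure, synchrony, can be sketched as the correlation between a speaker's turn-level prosodic feature and the partner's immediately preceding turn. The pitch values below are hypothetical placeholders.

```python
# Minimal sketch (not the paper's implementation): "synchrony" as the
# Pearson correlation between the mean pitch of each turn and the mean
# pitch of the turn that immediately preceded it.

def pearson(xs, ys):
    # Plain Pearson correlation coefficient over two equal-length lists.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def synchrony(turn_pitches):
    """turn_pitches: mean pitch (Hz) per turn, speakers alternating A, B, A, B, ...

    Correlates each turn's pitch with the pitch of the preceding turn,
    so a high value means speakers track each other's pitch turn by turn.
    """
    prev = turn_pitches[:-1]  # pitch of the turn that just ended
    curr = turn_pitches[1:]   # pitch of the responding turn
    return pearson(prev, curr)

# Hypothetical dialogue: two speakers whose pitches gradually converge.
pitches = [220.0, 180.0, 215.0, 190.0, 210.0, 200.0, 205.0, 203.0]
r = synchrony(pitches)
```

A correlation-style statistic like this always falls in [-1, 1]; other entrainment measures in the literature instead compare feature differences between adjacent turns (proximity) or their change over time (convergence).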
Keywords :
computer aided instruction; distance learning; speech processing; acoustic-prosodic entrainment; body language; face-to-face dialogues; facial expressions; remote collaborative learning dialogues; spoken dialogue; visual cues; Acoustic measurements; Collaboration; Convergence; Correlation; Jitter; Speech; Visualization;
Conference_Title :
Spoken Language Technology Workshop (SLT), 2014 IEEE
DOI :
10.1109/SLT.2014.7078589