DocumentCode :
2336846
Title :
Hearing how you touch: Real-time synthesis of contact sounds for multisensory interaction
Author :
Liu, Juan ; Ando, Hiroshi
Author_Institution :
Universal Media Res. Center, Nat. Inst. of Inf. & Commun. Technol. (NICT), Kyoto
fYear :
2008
fDate :
25-27 May 2008
Firstpage :
275
Lastpage :
280
Abstract :
This paper describes a multisensory interface in which continuous contact sound effects of a user's actions on virtual objects are synthesized and rendered together with haptic and visual stimuli. The auditory output is generated by convolving the force profile with the impulse response of an object, and is thus tightly coupled with the haptic output. Since each action, such as striking, scratching, and inhibiting, affects the vibration modes distinctively, the nondeterministic nature of the user's real-time interactions requires a more responsive approach than that used for animations. We developed a technique to adjust parameters and drive the corresponding modal sound models with low latency (1 ms). The implemented system produces realistic sound effects and smooth transitions between various actions.
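The synthesis method described in the abstract, a force profile convolved with a modal impulse response, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the mode frequencies, dampings, and amplitudes below are hypothetical values chosen for demonstration.

```python
import numpy as np

def modal_impulse_response(freqs, dampings, amps, sr=44100, dur=0.5):
    """Impulse response of a modal object: a sum of exponentially
    damped sinusoids, one per vibration mode."""
    t = np.arange(int(sr * dur)) / sr
    h = np.zeros_like(t)
    for f, d, a in zip(freqs, dampings, amps):
        h += a * np.exp(-d * t) * np.sin(2 * np.pi * f * t)
    return h

def contact_sound(force, h):
    """Contact sound = contact force profile convolved with the
    object's impulse response."""
    return np.convolve(force, h)

# Hypothetical 3-mode object and a brief force pulse approximating a strike.
sr = 44100
h = modal_impulse_response(
    freqs=[440.0, 1130.0, 2470.0],
    dampings=[8.0, 12.0, 20.0],
    amps=[1.0, 0.5, 0.25],
    sr=sr,
)
impact = np.hanning(64)            # short, smooth force profile
sound = contact_sound(impact, h)   # audio samples at rate sr
```

A continuous action such as scratching would instead feed a longer, time-varying force signal into the same convolution, which is why the force and sound outputs stay tightly coupled.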
Keywords :
transient response; user interfaces; virtual reality; contact sounds; haptic stimuli; impulse response; multisensory interaction; multisensory interface; real-time synthesis; realistic sound effects; virtual objects; visual stimuli; Acoustics; Animation; Auditory system; Computational modeling; Deformable models; Haptic interfaces; Rendering (computer graphics); Signal synthesis; Timbre; Virtual reality; Contact sound synthesis; Modal vibration; Multisensory interaction; Physical based modeling; Virtual reality;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Human System Interactions, 2008 Conference on
Conference_Location :
Krakow
Print_ISBN :
978-1-4244-1542-7
Electronic_ISBN :
978-1-4244-1543-4
Type :
conf
DOI :
10.1109/HSI.2008.4581448
Filename :
4581448