• DocumentCode
    3715345
  • Title
    Perceptual coherence as an analytic for procedural music and audio mappings in virtual space
  • Author
    Rob Hamilton
  • Author_Institution
    Center for Computer Research in Music and Acoustics, Stanford University
  • fYear
    2015
  • fDate
    3/1/2015 12:00:00 AM
  • Firstpage
    1
  • Lastpage
    6
  • Abstract
    Real-time data generated by virtual actors and their mediated interactions in simulated space can be repurposed to dynamically generate sound and music. Procedural audio and music systems afford interaction designers, composers, and sound artists the opportunity to create tight couplings between the visual and auditory modalities. Designing procedural mapping schemata can become problematic when players or observers are presented with audio-visual events within novel environments in which the validity of their prior knowledge and learned expectations about sound, image, and interactivity is called into question. This paper presents the results of a user study measuring users' perceptions of audio-visual crossmodal correspondences between low-level attributes of motion and sound. Study results were analyzed using the Bradley-Terry statistical model, effectively calculating the relative contribution of each crossmodal attribute within each attribute pairing to the perceived coherence or 'fit' between audio and visual data.
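    Illustrative note: the Bradley-Terry analysis named in the abstract estimates a latent preference strength p_i for each compared item from pairwise-preference counts, under the model P(i preferred over j) = p_i / (p_i + p_j). A minimal Python sketch of such a fit is given below, using the standard MM updates (Hunter, 2004); the attribute names and win counts are hypothetical placeholders for illustration, not data from the study.

        import numpy as np

        def bradley_terry(wins, n_iters=500, tol=1e-9):
            """Fit Bradley-Terry strengths via MM updates (Hunter, 2004).

            wins[i, j] = number of trials in which item i was preferred
            over item j. Returns strengths p (normalized to sum to 1)
            such that P(i preferred over j) = p[i] / (p[i] + p[j]).
            """
            n = wins.shape[0]
            p = np.ones(n) / n
            totals = wins + wins.T              # comparisons per pair
            w = wins.sum(axis=1)                # total wins per item
            for _ in range(n_iters):
                # MM update: p_i <- W_i / sum_j [ n_ij / (p_i + p_j) ]
                denom = (totals / (p[:, None] + p[None, :])).sum(axis=1)
                p_new = w / denom
                p_new /= p_new.sum()
                if np.abs(p_new - p).max() < tol:
                    break
                p = p_new
            return p

        # Hypothetical pairwise-preference counts for three crossmodal
        # attribute mappings (names and numbers invented for illustration).
        attributes = ["pitch/height", "loudness/size", "brightness/speed"]
        wins = np.array([[ 0, 12,  9],
                         [ 8,  0, 14],
                         [11,  6,  0]])
        for name, strength in zip(attributes, bradley_terry(wins)):
            print(f"{name}: {strength:.3f}")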
  • Keywords
    "Videos","Avatars","Visualization","Instruments","Music","Coherence","Observers"
  • Publisher
    IEEE
  • Conference_Titel
    2015 IEEE 2nd VR Workshop on Sonic Interactions for Virtual Environments (SIVE)
  • Type
    conf
  • DOI
    10.1109/SIVE.2015.7361288
  • Filename
    7361288