• DocumentCode
    2018052
  • Title
    MotionLab sonify: a framework for the sonification of human motion data
  • Author
    Effenberg, Alfred; Melzer, Joachim; Weber, Andreas; Zinke, Arno
  • Author_Institution
    Deutsche Sporthochschule Köln, Germany
  • fYear
    2005
  • fDate
    6-8 July 2005
  • Firstpage
    17
  • Lastpage
    23
  • Abstract
    Sonification of human movement offers new kinds of information for supporting motor learning in sports and rehabilitation. Although motor learning is dominated by vision, auditory perception offers uniquely fine temporal resolution as well as an enormous integrative capacity; both are important features for the perception of human movement patterns. But how can the auditory system be addressed adequately? A sonification based on kinematic movement data can mediate structural features of movement, such as movement polyrhythms, via the auditory system, while a sonification of dynamic movement data makes muscle forces approximately audible. Here, a flexible framework for the sonification of human movement data is presented, capable of processing standard kinematic motion capture data as well as derived quantities such as force data. Force data are computed by inverse dynamics algorithms and can be used as input parameters for real-time sonification. Simultaneous visualization is provided using OpenGL.
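  • Example_Sketch
    The abstract outlines a parameter-mapping pipeline (kinematic or force data in, real-time audio parameters out) without implementation detail. Below is a minimal, hypothetical Python sketch of one such mapping, turning a sampled joint-angle trajectory into MIDI-like note events; the names sonify_joint and NoteEvent, the chosen pitch and velocity ranges, and the synthetic gait-like data are illustrative assumptions, not taken from the paper's actual framework.

    import math
    from dataclasses import dataclass
    from typing import List, Sequence

    @dataclass
    class NoteEvent:
        """A MIDI-like note event: onset time in seconds, pitch 0-127, velocity 0-127."""
        time: float
        pitch: int
        velocity: int

    def angular_velocity(angles: Sequence[float], dt: float) -> List[float]:
        """Estimate angular velocity (rad/s) from sampled joint angles by finite differences."""
        return [(angles[i + 1] - angles[i]) / dt for i in range(len(angles) - 1)]

    def sonify_joint(angles: Sequence[float], dt: float,
                     pitch_range=(48, 84), vel_range=(30, 110)) -> List[NoteEvent]:
        """Map a joint-angle trajectory to note events: joint angle drives pitch,
        angular speed drives loudness (note velocity)."""
        lo_a, hi_a = min(angles), max(angles)
        omegas = angular_velocity(angles, dt)
        max_omega = max(abs(w) for w in omegas) or 1.0
        events = []
        for i, w in enumerate(omegas):
            # Normalize angle and speed to [0, 1], then scale into the MIDI ranges.
            na = (angles[i] - lo_a) / (hi_a - lo_a) if hi_a > lo_a else 0.5
            nw = abs(w) / max_omega
            pitch = round(pitch_range[0] + na * (pitch_range[1] - pitch_range[0]))
            velocity = round(vel_range[0] + nw * (vel_range[1] - vel_range[0]))
            events.append(NoteEvent(time=i * dt, pitch=pitch, velocity=velocity))
        return events

    if __name__ == "__main__":
        # Synthetic knee-angle trajectory (radians) sampled at 100 Hz, one 1 Hz cycle.
        dt = 0.01
        angles = [0.6 + 0.5 * math.sin(2 * math.pi * i * dt) for i in range(100)]
        for event in sonify_joint(angles, dt)[:5]:
            print(event)

    In a real-time setting such events would be streamed to a MIDI synthesizer as the motion data arrives rather than printed, and the same mapping scheme could be applied to force or torque traces obtained from inverse dynamics.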
  • Keywords
    acoustic signal processing; data visualisation; hearing; image motion analysis; MIDI; MotionLab sonify; OpenGL; auditory perception; auditory system; force data; human motion; human movement pattern; inverse dynamics; kinematic motion capture; kinematic movement; motor learning; polyrhythms; sonification; Ear; Heuristic algorithms; Humans; Kinematics; Motion Capture; Motor drives; Muscles; Process design; Time factors
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the Ninth International Conference on Information Visualisation (IV 2005)
  • ISSN
    1550-6037
  • Print_ISBN
    0-7695-2397-8
  • Type
    conf
  • DOI
    10.1109/IV.2005.84
  • Filename
    1509054