• DocumentCode
    652845
  • Title
    Towards an Affective Brain-Computer Interface Monitoring Musical Engagement
  • Author
    Leslie, Grace; Ojeda, Alejandro; Makeig, Scott

  • Author_Institution
    Dept. of Music, Univ. of California San Diego, La Jolla, CA, USA
  • fYear
    2013
  • fDate
    2-5 Sept. 2013
  • Firstpage
    871
  • Lastpage
    875
  • Abstract
    A non-invasive way to monitor a music listener's level of engagement could provide a valuable tool for music classification, technology, and therapy. To investigate whether musical engagement can be monitored, we developed an experimental protocol using the mobile brain/body imaging (MoBI) paradigm in which participants make expressive rhythmic arm gestures to encourage and/or index musical engagement. Participants communicate the felt pulse of the music they are hearing via simple rhythmic U-shaped back-and-forth hand/arm 'conducting' gesture cycles that animate, in real time, the mirroring movement of a spot of light on a video display in front of them. Participants are asked to imagine that this display is also being viewed remotely by a deaf friend to whom they are attempting to communicate the feeling of the music they are hearing. In an Engaged condition, listeners are encouraged to engage themselves fully in this musical/emotional communication task. In a Not Engaged condition, a concurrent internal arithmetic distractor task is introduced to induce less fully engaged listening. Here, we report results of training a classifier using a frequency-based common spatial patterns (FBCSP) approach to distinguish the Engaged and Not Engaged conditions from concurrently recorded EEG data. The approach gave 67% classification accuracy across subjects (versus 50% chance) and 85% accuracy within subjects, cross-validated using a blockwise paradigm.
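    The FBCSP approach named in the abstract can be illustrated with a minimal sketch: band-pass the EEG trials in several frequency bands, learn common spatial patterns (CSP) filters per band, extract log-variance features, and classify. The code below is not the authors' implementation; the band choices, nearest-mean classifier, and synthetic data are illustrative assumptions.

```python
# Minimal frequency-based common spatial patterns (FBCSP) sketch in numpy.
# Everything here (bands, classifier, synthetic trials) is an assumption
# for illustration, not the paper's actual pipeline.
import numpy as np

def bandpass(trials, lo, hi, fs):
    """Crude band-pass: zero FFT bins outside [lo, hi] Hz.
    trials: array of shape (n_trials, n_channels, n_samples)."""
    n = trials.shape[-1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spec = np.fft.rfft(trials, axis=-1)
    spec[..., (freqs < lo) | (freqs > hi)] = 0
    return np.fft.irfft(spec, n=n, axis=-1)

def csp_filters(class_a, class_b, n_pairs=2):
    """CSP spatial filters via whitening + eigendecomposition of the
    per-class average covariance matrices."""
    def avg_cov(trials):
        return np.mean([t @ t.T / np.trace(t @ t.T) for t in trials], axis=0)
    Ca, Cb = avg_cov(class_a), avg_cov(class_b)
    evals, evecs = np.linalg.eigh(Ca + Cb)
    # Whitening transform P with P (Ca + Cb) P.T = I
    P = np.diag(1.0 / np.sqrt(np.maximum(evals, 1e-12))) @ evecs.T
    w, V = np.linalg.eigh(P @ Ca @ P.T)
    W = V.T @ P                       # rows are spatial filters
    # Keep the most discriminative filters (extreme eigenvalues)
    idx = np.r_[np.argsort(w)[:n_pairs], np.argsort(w)[-n_pairs:]]
    return W[idx]

def log_var_features(trials, W):
    """Normalized log-variance of spatially filtered trials."""
    Z = np.einsum('fc,tcs->tfs', W, trials)
    var = Z.var(axis=-1)
    return np.log(var / var.sum(axis=1, keepdims=True))

def train_fbcsp(trials_a, trials_b, bands, fs):
    model, feats_a, feats_b = [], [], []
    for lo, hi in bands:
        fa, fb = bandpass(trials_a, lo, hi, fs), bandpass(trials_b, lo, hi, fs)
        W = csp_filters(fa, fb)
        model.append(W)
        feats_a.append(log_var_features(fa, W))
        feats_b.append(log_var_features(fb, W))
    Fa, Fb = np.hstack(feats_a), np.hstack(feats_b)
    means = np.vstack([Fa.mean(0), Fb.mean(0)])   # nearest-mean classifier
    return model, means

def predict_fbcsp(trials, model, means, bands, fs):
    F = np.hstack([log_var_features(bandpass(trials, lo, hi, fs), W)
                   for (lo, hi), W in zip(bands, model)])
    d = ((F[:, None, :] - means[None]) ** 2).sum(-1)
    return d.argmin(1)                # 0 = class A, 1 = class B

# Synthetic demo: each class has boosted variance on a different channel.
rng = np.random.default_rng(0)
fs, n_ch, n_samp = 128, 4, 256
def make_trials(ch):
    x = rng.standard_normal((30, n_ch, n_samp))
    x[:, ch] *= 3.0
    return x
A, B = make_trials(0), make_trials(1)
bands = [(4, 8), (8, 13), (13, 30)]               # theta, alpha, beta
model, means = train_fbcsp(A[:20], B[:20], bands, fs)
pred_a = predict_fbcsp(A[20:], model, means, bands, fs)
pred_b = predict_fbcsp(B[20:], model, means, bands, fs)
acc = np.concatenate([pred_a == 0, pred_b == 1]).mean()
```

On such strongly separable synthetic data the held-out accuracy is near ceiling; the paper's 67%/85% figures come from real EEG under a blockwise cross-validation, which is a much harder setting.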
  • Keywords
    biomedical MRI; brain-computer interfaces; electroencephalography; gesture recognition; handicapped aids; human computer interaction; image classification; music; video signal processing; EEG data; FBCSP; MoBI; affective brain-computer interface; blockwise paradigm; concurrent internal arithmetic distractor task; deaf friend; emotional communication task; frequency-based common spatial patterns; machine learning; mobile brain-body imaging paradigm; music classification; music technology; music therapy; musical communication task; musical engagement monitoring; rhythmic U-shaped back-and-forth arm conducting gesture cycles; rhythmic U-shaped back-and-forth hand conducting gesture cycles; video display; Accuracy; Brain modeling; Electroencephalography; Mathematical model; Monitoring; Music; Sensors; EEG; brain-computer interface; emotion; engagement; listening; machine learning; music;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII)
  • Conference_Location
    Geneva
  • ISSN
    2156-8103
  • Type
    conf
  • DOI
    10.1109/ACII.2013.163
  • Filename
    6681555