  • DocumentCode
    3694985
  • Title
    Towards a synchronised Grammars framework for adaptive musical human-robot collaboration
  • Author
    Miguel Sarabia; Kyuhwa Lee; Yiannis Demiris
  • Author_Institution
    Personal Robotics Lab, Department of Electrical and Electronic Engineering, Imperial College London, United Kingdom
  • fYear
    2015
  • Firstpage
    715
  • Lastpage
    721
  • Abstract
    We present an adaptive musical collaboration framework for interaction between a human and a robot. The aim of our work is to develop a system that receives real-time feedback from the user and learns the user's music progression style over time. To tackle this problem, we represent a song as a hierarchically structured sequence of music primitives. By exploiting the sequential constraints of these primitives, inferred from the structural information combined with user feedback, we show that a robot can play music in accordance with the user's anticipated actions. We use Stochastic Context-Free Grammars augmented with knowledge of the user's learnt preferences. We provide synthetic experiments as well as a pilot study with a Baxter robot and a tangible music table. The synthetic results demonstrate the synchronisation and adaptivity features of our framework, and the pilot study suggests these are applicable to creating an effective musical collaboration experience.
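
    The paper's record does not include code; the Python sketch below is one minimal, hypothetical way to realise the mechanism the abstract describes: a song generated as a hierarchy of music primitives by a stochastic context-free grammar, with production probabilities shifted by user feedback. The grammar symbols, the scalar reward, and the multiplicative update are illustrative assumptions, not the authors' implementation.

    ```python
    # Hypothetical sketch of an SCFG for music primitives with
    # feedback-adapted rule probabilities (not the paper's code).
    import random

    # Each non-terminal maps to weighted productions; lower-case symbols
    # are terminal "music primitives" the robot would actually play.
    rules = {
        "Song":   [(1.0, ("Intro", "Verse", "Chorus"))],
        "Intro":  [(1.0, ("arpeggio",))],
        "Verse":  [(0.5, ("melody_a",)), (0.5, ("melody_b",))],
        "Chorus": [(0.6, ("riff",)), (0.4, ("riff", "Chorus"))],
    }

    def expand(symbol):
        """Sample a derivation: expand recursively until only primitives remain."""
        if symbol not in rules:                  # terminal primitive
            return [symbol]
        weights, productions = zip(*rules[symbol])
        chosen = random.choices(productions, weights=weights)[0]
        return [p for s in chosen for p in expand(s)]

    def reinforce(symbol, production, reward, lr=0.2):
        """Shift probability mass towards productions the user liked."""
        reweighted = [(w * (1.0 + lr * reward) if exp == production else w, exp)
                      for w, exp in rules[symbol]]
        total = sum(w for w, _ in reweighted)
        rules[symbol] = [(w / total, exp) for w, exp in reweighted]

    # Play one derivation, then pretend the user approved of melody_a.
    print(expand("Song"))
    reinforce("Verse", ("melody_a",), reward=+1.0)
    print(rules["Verse"])        # Verse now favours melody_a
    ```

    In a full system along the lines the abstract sketches, the reward would come from real-time user feedback (here, via the tangible music table) and the grammar would encode the hierarchical structure of an actual song; the simple multiplicative reweighting above merely stands in for the learnt-preference augmentation of the grammar.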
  • Keywords
    "Robots","Grammar","Probability distribution","Collaboration","Synchronization","Real-time systems","Prediction algorithms"
  • Publisher
    IEEE
  • Conference_Titel
    2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)
  • Type
    conf
  • DOI
    10.1109/ROMAN.2015.7333649
  • Filename
    7333649