DocumentCode :
2547894
Title :
Particle-filter based audio-visual beat-tracking for music robot ensemble with human guitarist
Author :
Itohara, Tatsuhiko ; Otsuka, Takuma ; Mizumoto, Takeshi ; Ogata, Tetsuya ; Okuno, Hiroshi G.
Author_Institution :
Grad. Sch. of Inf., Kyoto Univ., Kyoto, Japan
fYear :
2011
fDate :
25-30 Sept. 2011
Firstpage :
118
Lastpage :
124
Abstract :
This paper presents an audio-visual beat-tracking method for ensemble robots with a human guitarist. Beat-tracking, or estimation of the tempo and beat times of music, is critical to high-quality musical ensemble performance. Since a human plays the guitar with off-beats, back-beats, and syncopation, the main problems in beat-tracking of a human's guitar playing are twofold: tempo changes and varying note lengths. Most conventional methods have not addressed human guitar playing and therefore fail to adapt to one or both of these problems. To solve both problems simultaneously, our method uses not only audio but also visual features. We extract audio features with Spectro-Temporal Pattern Matching (STPM) and visual features with optical flow, mean shift, and the Hough transform. Our beat-tracking method estimates tempo and beat times using a particle filter; both the acoustic features of guitar sounds and the visual features of arm motions are represented as particles. Each particle is weighted based on the prior distributions of the audio and visual features, respectively. Experimental results confirm that our integrated audio-visual approach is robust against tempo changes and varying note lengths. They also show that the estimation convergence rate depends only weakly on the number of particles. The real-time factor is 0.88 with 200 particles, which shows that our method works in real time.
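The abstract describes tempo and beat-time estimation with a particle filter. The following is a minimal, self-contained sketch of the generic predict/weight/resample cycle applied to tempo tracking from noisy inter-beat-interval observations; it is an illustration of the particle-filter idea only, not the paper's algorithm (the paper additionally fuses STPM audio features and optical-flow/arm-motion visual features, which are omitted here). All function and parameter names are hypothetical.

```python
import math
import random

def particle_filter_tempo(observations, n_particles=200, drift=0.02,
                          obs_sigma=0.05, seed=0):
    """Estimate a slowly drifting tempo (BPM) from noisy inter-beat
    intervals (seconds) with a basic bootstrap particle filter.
    Illustrative only; not the method from the paper."""
    rng = random.Random(seed)
    # Initialize particles uniformly over plausible inter-beat intervals
    # (0.3 s to 1.0 s, i.e. roughly 60-200 BPM).
    particles = [rng.uniform(0.3, 1.0) for _ in range(n_particles)]
    for z in observations:
        # Predict: perturb each particle to model slow tempo drift.
        particles = [p + rng.gauss(0.0, drift) for p in particles]
        # Weight: Gaussian likelihood of the observed interval.
        weights = [math.exp(-((z - p) ** 2) / (2 * obs_sigma ** 2))
                   for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # Resample proportionally to the weights.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    # Point estimate: mean of the particle cloud, converted to BPM.
    mean_ibi = sum(particles) / n_particles
    return 60.0 / mean_ibi

# Noisy observations of a 0.5 s inter-beat interval (about 120 BPM).
tempo = particle_filter_tempo([0.49, 0.51, 0.50, 0.52, 0.48, 0.50])
```

The abstract's observation that convergence depends only weakly on the particle count corresponds here to `n_particles`: the cloud concentrates around the true interval after a few observations even with modest particle numbers.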
Keywords :
Hough transforms; acoustic signal processing; audio signal processing; audio-visual systems; feature extraction; human-robot interaction; image matching; image sequences; music; musical acoustics; musical instruments; particle filtering (numerical methods); Hough transform; acoustic feature; arm motions; audio feature extraction; back beat; beat times; estimation convergence rate; guitar sounds; human guitarist; integrated audio-visual approach; mean shift; music robot ensemble; note length variation; optical flow; particle filter based audio-visual beat tracking; spectrotemporal pattern matching; syncopation; tempo changes; tempo estimation; visual features; Argon; Noise; Robots;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2011 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Conference_Location :
San Francisco, CA
ISSN :
2153-0858
Print_ISBN :
978-1-61284-454-1
Type :
conf
DOI :
10.1109/IROS.2011.6094773
Filename :
6094773