DocumentCode
3225927
Title
Conducting digitally stored music by computer vision tracking
Author
Behringer, Reinhold
Author_Institution
Leeds Metropolitan Univ., UK
fYear
2005
fDate
30 Nov.-2 Dec. 2005
Abstract
In this paper, a method for controlling electronic digital music instruments is proposed, based on visual capture of a conductor's baton and hand motion. The approach is suited to mixed ensembles of human musicians and electronic instruments. Well-established computer vision methods are used to track the motion of the baton and to deduce musical parameters for sound creation (volume, pitch, expression) or cues for the time-synchronized replay of previously recorded music notation sequences (beat, tempo, expression). Combined with acoustic signal processing, this method can enable the automatic playing of a computer-based instrument in an orchestra, in which the conductor conducts both this instrument and the human musicians, allowing intuitive control of timing and expression towards a unique interpretation. The paper introduces the concept and discusses its feasibility.
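The core idea of the abstract lends itself to a compact illustration. The following is a minimal sketch, not the paper's implementation: it assumes a brightly coloured baton tip (here a yellow marker with purely illustrative HSV bounds and a default webcam index), tracks it with standard OpenCV colour thresholding, and emits a beat cue whenever the tip's vertical motion reverses from downward to upward, estimating tempo from the inter-beat interval. The colour-blob tracker and the turning-point beat heuristic are assumptions standing in for whichever computer vision methods the paper actually uses.

```python
# Illustrative sketch only: colour-blob baton tracking and a simple
# beat/tempo cue from vertical turning points of the tip trajectory.
import time
import cv2
import numpy as np

LOWER_HSV = np.array([20, 120, 120])   # assumed hue range for a yellow baton tip
UPPER_HSV = np.array([35, 255, 255])

def tip_position(frame):
    """Return (x, y) of the largest blob matching the baton-tip colour, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    c = max(contours, key=cv2.contourArea)
    m = cv2.moments(c)
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]

def main():
    cap = cv2.VideoCapture(0)          # default webcam; adjust index as needed
    prev_y, prev_vy = None, 0.0
    last_beat = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        pos = tip_position(frame)
        if pos is not None:
            _, y = pos
            if prev_y is not None:
                vy = y - prev_y        # image y grows downwards
                # beat cue: tip reverses from moving down to moving up
                if prev_vy > 0 and vy <= 0:
                    now = time.time()
                    if last_beat is not None:
                        bpm = 60.0 / (now - last_beat)
                        print(f"beat  tempo ~ {bpm:.1f} BPM")
                    last_beat = now
                prev_vy = vy
            prev_y = y
        cv2.imshow("baton tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:   # Esc to quit
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```

The printed tempo estimate stands in for whatever the full system would forward to a sequencer or synthesizer to pace the replay of stored music notation.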
Keywords
acoustic signal processing; computer vision; electronic music; music; musical acoustics; musical instruments; tracking; automatic playing; computer vision tracking; computer-based instrument; electronic digital music instrument; electronic instruments; human musicians; Acoustic signal processing; Automatic control; Computer vision; Conductors; Digital control; Humans; Instruments; Motion control; Music; Tracking
fLanguage
English
Publisher
ieee
Conference_Title
First International Conference on Automated Production of Cross Media Content for Multi-Channel Distribution (AXMEDIS 2005)
Print_ISBN
0-7695-2348-X
Type
conf
DOI
10.1109/AXMEDIS.2005.14
Filename
1592100