DocumentCode :
547326
Title :
A study of the integrated automated emotion music with the motion gesture synthesis
Author :
Huang, Chih-Fang ; Luo, Yin-Jyun
Author_Institution :
Dept. of Inf. Commun., Yuan Ze Univ., Chungli, Taiwan
Volume :
3
fYear :
2011
fDate :
10-12 June 2011
Firstpage :
267
Lastpage :
272
Abstract :
Automated music composition and algorithmic composition are based on logic operations with music parameter settings chosen according to the desired music style or emotion. Computer-generated music can be integrated with other domains, such as intermedia arts with music technology, using proper mapping techniques. This paper mainly discusses the possibility of integrating both automatic composition and motion devices with an Emotional Identification System (EIS) using emotion classification and parameters. The corresponding music pattern and motion path can be driven simultaneously via the cursor movement in the Emotion Map (EM), which varies with time. An interactive music-motion platform is established accordingly.
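The abstract does not give implementation details, so the following is only a minimal sketch of the kind of mapping it describes, assuming a 2D valence-arousal Emotion Map whose cursor coordinates drive both music parameters and a motion-gesture target. All names, axes, and parameter ranges (MusicParams, MotionTarget, the tempo and angle scales) are illustrative assumptions, not the authors' method.

```python
# Minimal sketch (not from the paper): map a cursor position on a 2D
# valence-arousal Emotion Map to music parameters and a motion target.
# All parameter names and ranges here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class MusicParams:
    tempo_bpm: float   # faster tempo for higher arousal
    mode: str          # major for positive valence, minor for negative
    velocity: int      # MIDI-style loudness, scaled with arousal

@dataclass
class MotionTarget:
    pan_deg: float     # hypothetical gesture axis driven by valence
    tilt_deg: float    # hypothetical gesture axis driven by arousal

def map_emotion(valence: float, arousal: float) -> tuple[MusicParams, MotionTarget]:
    """Map Emotion Map coordinates in [-1, 1] to music and motion parameters."""
    valence = max(-1.0, min(1.0, valence))
    arousal = max(-1.0, min(1.0, arousal))
    music = MusicParams(
        tempo_bpm=90 + 50 * arousal,                 # assumed 40-140 BPM range
        mode="major" if valence >= 0 else "minor",
        velocity=int(64 + 48 * arousal),
    )
    motion = MotionTarget(pan_deg=45 * valence, tilt_deg=30 * arousal)
    return music, motion

if __name__ == "__main__":
    # Example: a cursor in the "happy/excited" quadrant of the Emotion Map.
    print(map_emotion(0.7, 0.8))
```

In this reading, a time-varying cursor trajectory on the EM would simply call a mapping like this at each time step, so the generated music pattern and the motion path change together.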
Keywords :
music; algorithmic composition; emotion map; emotion music; emotional identification system; interactive music-motion platform; motion gesture synthesis; music composition; Mood; Neck; Presses; Rhythm; Robot motion;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Computer Science and Automation Engineering (CSAE), 2011 IEEE International Conference on
Conference_Location :
Shanghai
Print_ISBN :
978-1-4244-8727-1
Type :
conf
DOI :
10.1109/CSAE.2011.5952678
Filename :
5952678