Title :
Mood Classification from Musical Audio Using User Group-Dependent Models
Author :
Lee, Kyogu ; Cho, Minsu
Author_Institution :
Dept. of Digital Contents Convergence, Seoul Nat. Univ., Seoul, South Korea
Abstract :
In this paper, we propose a music mood classification system that reflects a user's profile, based on the belief that music mood perception is subjective and can vary with the user's profile, such as age or gender. To this end, we first define a set of generic mood descriptors. Second, we construct several user profiles according to age and gender. We then collect musical items for each group and train statistical models for each group separately. Using the two resulting user models, we verify our hypothesis that user profiles play an important role in mood perception by showing that both models achieve higher classification accuracy when the test data and the mood model come from the same group. Applying our system to automatic playlist generation, we also demonstrate that accounting for differences in mood perception between user groups has a significant effect on computing music similarity.
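The group-dependent evaluation described in the abstract (each group's model scores higher on its own group's test data than on the other group's) can be illustrated with a minimal sketch. Everything below is hypothetical: the synthetic 2-D "audio features", the nearest-centroid classifier, and the specific mood and group labels stand in for the paper's actual feature extraction and statistical models, which are not specified here.

```python
import random

random.seed(0)

MOODS = ["happy", "sad", "calm", "angry"]   # hypothetical generic mood descriptors
GROUPS = ["young", "older"]                 # hypothetical age-based user groups

# Each group maps moods to different feature regions, mimicking
# group-dependent mood perception.
PERCEPTION = {"young": [0, 1, 2, 3], "older": [1, 2, 3, 0]}

def make_data(group, n=200):
    """Synthetic labeled feature points for one user group."""
    data = []
    for _ in range(n):
        mood = random.choice(MOODS)
        c = PERCEPTION[group][MOODS.index(mood)]
        point = (c + random.gauss(0, 0.3), c + random.gauss(0, 0.3))
        data.append((point, mood))
    return data

def train_centroids(data):
    """Per-mood mean feature vector: a toy stand-in for a statistical model."""
    sums, counts = {}, {}
    for (x, y), mood in data:
        sx, sy = sums.get(mood, (0.0, 0.0))
        sums[mood] = (sx + x, sy + y)
        counts[mood] = counts.get(mood, 0) + 1
    return {m: (sx / counts[m], sy / counts[m]) for m, (sx, sy) in sums.items()}

def classify(model, point):
    """Nearest-centroid mood prediction."""
    return min(model, key=lambda m: (model[m][0] - point[0]) ** 2
                                    + (model[m][1] - point[1]) ** 2)

def accuracy(model, data):
    return sum(classify(model, p) == mood for p, mood in data) / len(data)

models = {g: train_centroids(make_data(g)) for g in GROUPS}
tests = {g: make_data(g, n=100) for g in GROUPS}

# Matched: model and test data from the same group; mismatched: crossed.
matched = sum(accuracy(models[g], tests[g]) for g in GROUPS) / 2
mismatched = sum(accuracy(models[g], tests[h])
                 for g in GROUPS for h in GROUPS if g != h) / 2
print(matched > mismatched)
```

Under these toy assumptions, the matched-group accuracy exceeds the mismatched-group accuracy, mirroring the comparison the abstract uses to support the hypothesis.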
Keywords :
audio signal processing; cognition; music; pattern classification; statistical analysis; automatic playlist generation; data classification; mood perception; music mood classification; music similarity computing; musical audio; statistical models; user group-dependent models; user profile; Accuracy; Acoustics; Computational modeling; Data models; Feature extraction; Mood; Vectors;
Conference_Titel :
2011 10th International Conference on Machine Learning and Applications and Workshops (ICMLA)
Conference_Location :
Honolulu, HI
Print_ISBN :
978-1-4577-2134-2
DOI :
10.1109/ICMLA.2011.96