DocumentCode :
1766933
Title :
A new information fusion approach for recognition of music-induced emotions
Author :
Naji, M. ; Firoozabadi, Mohammad ; Azadfallah, Parviz
Author_Institution :
Dept. of Biomed. Eng., Islamic Azad Univ., Dezful, Iran
fYear :
2014
fDate :
1-4 June 2014
Firstpage :
205
Lastpage :
208
Abstract :
In the present paper, a new information fusion approach based on 3-channel forehead biosignals (from the left temporalis, frontalis, and right temporalis muscles) and the electrocardiogram is adopted to classify music-induced emotions in the arousal-valence space. The fusion strategy combines feature-level fusion with naive-Bayes decision-level fusion. Optimal feature subsets were derived using a consistency-based feature evaluation index and the sequential forward floating selection technique. An average classification accuracy of 89.24% was achieved, corresponding to a valence classification accuracy of 94.86% and an average arousal classification accuracy of 94.06%.
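The abstract does not spell out the exact pipeline, so the following is a minimal Python sketch (assuming NumPy and scikit-learn) of one way to combine feature-level fusion of the forehead channels with naive-Bayes decision-level fusion across the two modalities. The feature dimensions, data, and classifier settings are placeholders, and the paper's consistency-based SFFS feature-selection step is omitted.

# Illustrative sketch only: hybrid feature-level + naive-Bayes decision-level fusion.
# All data below are synthetic placeholders, not the study's features or results.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
n_trials = 120

# Placeholder feature matrices: one row per music-listening trial.
forehead_feats = rng.normal(size=(n_trials, 9))   # e.g. 3 forehead channels x 3 features, concatenated (feature-level fusion)
ecg_feats = rng.normal(size=(n_trials, 4))        # e.g. ECG/HRV-derived features
labels = np.tile(np.arange(4), n_trials // 4)     # 4 arousal-valence quadrants

# Train one naive-Bayes classifier per modality on the first 100 trials.
nb_forehead = GaussianNB().fit(forehead_feats[:100], labels[:100])
nb_ecg = GaussianNB().fit(ecg_feats[:100], labels[:100])

# Decision-level fusion with the naive-Bayes product rule:
# multiply per-modality class posteriors and pick the most probable quadrant.
p_forehead = nb_forehead.predict_proba(forehead_feats[100:])
p_ecg = nb_ecg.predict_proba(ecg_feats[100:])
fused = p_forehead * p_ecg
predictions = fused.argmax(axis=1)

accuracy = (predictions == labels[100:]).mean()
print(f"Held-out accuracy on synthetic random data (near chance by design): {accuracy:.2f}")

In the actual study, feature selection (SFFS guided by a consistency index) would be applied to each modality's feature set before the classifiers are trained; the product-rule combination shown here is one standard form of naive-Bayes decision-level fusion.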
Keywords :
electrocardiography; emotion recognition; medical signal processing; muscle; music; sensor fusion; signal classification; 3-channel forehead biosignals; arousal-valence space; average arousal classification accuracy; consistency-based feature evaluation index; electrocardiogram; feature-level fusion; frontalis muscles; fusion strategy; information fusion approach; left temporalis muscles; music-induced emotion classification; music-induced emotion recognition; naive-Bayes decision-level fusion; optimal feature subsets; right temporalis muscles; sequential forward floating selection technique; valence classification accuracy; Accuracy; Electrocardiography; Electroencephalography; Emotion recognition; Feature extraction; Forehead; Support vector machines;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
2014 IEEE-EMBS International Conference on Biomedical and Health Informatics (BHI)
Conference_Location :
Valencia
Type :
conf
DOI :
10.1109/BHI.2014.6864340
Filename :
6864340