Title :
A novel fusion-based method for expression-invariant gender classification
Author :
Lu, Li ; Shi, Pengfei
Author_Institution :
Inst. of Image Process. & Pattern Recognition, Shanghai Jiaotong Univ., Shanghai
Abstract :
In this paper, we propose a novel fusion-based gender classification method that compensates for facial expression even when the training samples contain only neutral expressions. We conduct an experimental investigation to evaluate the significance of different facial regions for gender classification, and the three most significant regions are used in our fusion-based method. Classification is performed by support vector machines on features extracted with two-dimensional principal component analysis (2DPCA). Experiments show that our fusion-based method compensates for facial expressions and achieves the highest correct classification rate of 95.33%.
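The abstract's pipeline pairs 2DPCA feature extraction with per-region SVM classification. Below is a minimal NumPy sketch of the 2DPCA step only, following the standard formulation (project each image matrix onto the top eigenvectors of the image scatter matrix); the paper's exact region cropping, fusion rule, and SVM settings are not specified in this record, so the image sizes and component count here are illustrative assumptions.

```python
import numpy as np

def two_d_pca_fit(images, n_components):
    """Fit 2DPCA: return the mean image and the w x d projection matrix
    formed from the top eigenvectors of the image scatter matrix."""
    images = np.asarray(images, dtype=float)      # shape (M, h, w)
    mean = images.mean(axis=0)
    centered = images - mean
    # Image scatter matrix G = (1/M) * sum_i A_i^T A_i, shape (w, w)
    G = np.einsum('mij,mik->jk', centered, centered) / len(images)
    eigvals, eigvecs = np.linalg.eigh(G)          # ascending eigenvalues
    X = eigvecs[:, ::-1][:, :n_components]        # top-d eigenvectors
    return mean, X

def two_d_pca_transform(images, mean, X):
    """Project each h x w image to an h x d feature matrix Y = A X."""
    return (np.asarray(images, dtype=float) - mean) @ X

# Toy demo: 20 random 16x12 "face region" patches, keeping d = 3 components.
rng = np.random.default_rng(0)
imgs = rng.normal(size=(20, 16, 12))
mean, X = two_d_pca_fit(imgs, n_components=3)
feats = two_d_pca_transform(imgs, mean, X)        # shape (20, 16, 3)
print(feats.shape)
```

The resulting feature matrices would then be flattened and fed to one SVM per facial region, with the three regional decisions fused into the final gender label; those stages are omitted from this sketch.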
Keywords :
face recognition; feature extraction; principal component analysis; support vector machines; 2D principal component analysis; expression-invariant gender classification; facial expression; facial regions; fusion-based gender classification; face detection; image processing; mouth; nose; pattern recognition; performance evaluation; gender classification;
Conference_Titel :
Acoustics, Speech and Signal Processing, 2009. ICASSP 2009. IEEE International Conference on
Conference_Location :
Taipei
Print_ISBN :
978-1-4244-2353-8
ISSN :
1520-6149
DOI :
10.1109/ICASSP.2009.4959771