DocumentCode
21698
Title
Neutral Face Classification Using Personalized Appearance Models for Fast and Robust Emotion Detection
Author
Chiranjeevi, Pojala ; Gopalakrishnan, Viswanath ; Moogi, Pratibha
Author_Institution
Samsung R&D Inst., Bangalore Pvt Ltd., Bangalore, India
Volume
24
Issue
9
fYear
2015
fDate
Sept. 2015
Firstpage
2701
Lastpage
2711
Abstract
Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised learning-based facial expression recognition methods. This is due to the fact that supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, and so on, in a limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in typical applications like video chat or photo album/web browsing. Detecting the neutral state at an early stage, and thereby bypassing those frames in emotion classification, saves computational power. In this paper, we propose a light-weight neutral versus emotion classification engine, which acts as a pre-processor to traditional supervised emotion classification approaches. It dynamically learns the neutral appearance at key emotion (KE) points using a statistical texture model, constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on a statistical texture model. Robustness to dynamic shift of KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information about the directionality of the specific facial action units acting on the respective KE point. The proposed method, as a result, improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.
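The core idea in the abstract — comparing appearance patches around key emotion (KE) points against a per-user neutral reference model, and flagging a frame as neutral when the match is strong — can be illustrated with a minimal sketch. This is not the authors' implementation: the LBP histogram variant, histogram-intersection similarity, patch size, and threshold are all illustrative assumptions, and KE point locations are assumed to come from an external facial landmark detector.

```python
import numpy as np

def lbp_histogram(patch):
    """Normalized 8-neighbour Local Binary Pattern histogram of a grayscale patch.
    (Illustrative LBP variant; the paper's exact descriptor may differ.)"""
    h, w = patch.shape
    center = patch[1:-1, 1:-1]
    codes = np.zeros((h - 2, w - 2), dtype=np.int32)
    # Each of the 8 neighbours contributes one bit of the LBP code.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neigh = patch[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (neigh >= center).astype(np.int32) << bit
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / hist.sum()

def histogram_similarity(h1, h2):
    """Histogram intersection in [0, 1]; 1.0 means identical distributions."""
    return float(np.minimum(h1, h2).sum())

def is_neutral(frame, ke_points, neutral_model, patch_size=16, threshold=0.7):
    """Classify a frame as neutral if patches around the KE points match the
    per-user neutral appearance model (here: reference LBP histograms).
    `ke_points` are (x, y) landmark coordinates; `threshold` is an assumption."""
    half = patch_size // 2
    sims = []
    for (x, y), ref_hist in zip(ke_points, neutral_model):
        p = frame[y - half:y + half, x - half:x + half]
        sims.append(histogram_similarity(lbp_histogram(p), ref_hist))
    return float(np.mean(sims)) >= threshold
```

In use, the neutral model would be built once per user from a few reference neutral frames; each incoming frame is then tested with `is_neutral`, and only frames that fail the test are forwarded to the full supervised emotion classifier.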
Keywords
computer vision; emotion recognition; face recognition; image classification; image motion analysis; image texture; learning (artificial intelligence); object detection; ER accuracy; KE points; affine distortion; appearance variability; computational complexity; emotion detection; facial expression recognition; key emotion points; neutral face classification; personalized appearance models; statistical texture model; supervised learning; user head motions; accuracy; computational modeling; face; mouth; robustness; shape; action units; constrained local model; local binary pattern histogram; neutral vs. emotion classification; Procrustes analysis; statistical model; structural similarity
fLanguage
English
Journal_Title
IEEE Transactions on Image Processing
Publisher
ieee
ISSN
1057-7149
Type
jour
DOI
10.1109/TIP.2015.2421437
Filename
7084162
Link To Document