Title :
Face Recognition Challenge: Object Recognition Approaches for Human/Avatar Classification
Author :
Yamasaki, T.; Chen, T.
Author_Institution :
Sch. of Electr. & Comput. Eng., Dept. Inf. & Commun. Eng., Cornell Univ., Ithaca, NY, USA
Abstract :
Recently, a novel "completely automated public Turing test to tell computers and humans apart" (CAPTCHA) system has been proposed, in which users are asked to separate natural human faces from the artificial faces of virtual-world avatars. The system is based on the assumption that computers cannot separate the two, while the task is easy for humans. Conventional digital-forensics approaches to distinguishing natural images from computer-graphics images are mostly based on statistical analysis of the images, such as noise characteristics of CMOS image sensors or Bayer matrix estimation. In contrast, this paper uses face-recognition- and object-classification-based approaches. The experiments show that our approaches work surprisingly well, yielding more than 99% accuracy. Our object-classification-based approach can also indicate how likely an input image is to be regarded as a human or avatar face.
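Illustration (not from the paper): a minimal sketch of an object-classification-style human/avatar face classifier, assuming a HOG descriptor plus a linear SVM as a stand-in pipeline; the abstract does not specify the features or classifier actually used, so all names and parameters here are illustrative. The signed SVM decision value plays the role of the "how likely is this a human/avatar face" score mentioned in the abstract.

```python
# Minimal sketch (assumption): HOG features + linear SVM for human-vs-avatar
# face classification. The paper's exact features/classifier are not given
# in the abstract; this only illustrates the general approach.
import numpy as np
from skimage.feature import hog
from skimage.transform import resize
from sklearn.svm import LinearSVC

def hog_descriptor(face_image, size=(64, 64)):
    """Resize a grayscale face crop and compute a HOG descriptor."""
    face_image = resize(face_image, size, anti_aliasing=True)
    return hog(face_image, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

def train_classifier(face_images, labels):
    """Train a linear SVM; labels: 1 = human face, 0 = avatar face."""
    X = np.stack([hog_descriptor(img) for img in face_images])
    clf = LinearSVC(C=1.0)
    clf.fit(X, np.asarray(labels))
    return clf

def human_likeness_score(clf, face_image):
    """Signed distance to the SVM hyperplane: larger values read as
    'more human-like', more negative as 'more avatar-like'."""
    return float(clf.decision_function(hog_descriptor(face_image)[None, :])[0])
```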
Keywords :
Bayes methods; CMOS image sensors; Turing machines; avatars; computer graphics; digital forensics; face recognition; image classification; matrix algebra; object recognition; statistical analysis; Bayer matrix estimation; CAPTCHA; CMOS image sensors; artificial faces; completely automated public Turing test to tell computers and humans apart; computer graphics images; digital forensics approaches; face recognition challenge; human-avatar classification; image statistical analysis; natural images; object classification-based approaches; object recognition approaches; virtual world avatars; Accuracy; Avatars; Humans; Machine learning; Support vector machine classification; Training data; face classification; object recognition;
Conference_Titel :
2012 11th International Conference on Machine Learning and Applications (ICMLA)
Conference_Location :
Boca Raton, FL
Print_ISBN :
978-1-4673-4651-1
DOI :
10.1109/ICMLA.2012.188