Integrated Person Identification and Emotion Recognition from Facial Images
Dadet PRAMADIHANTO, Yoshio IWAI, Masahiko YACHIDA
pp. 421-429
DOI: 10.5687/iscie.14.421

Abstract
The deformation of a face caused by expression change varies among individuals. In this paper we propose an integration of face identification and facial expression recognition to cope with these individual differences in expression change. A face is modeled as a graph whose nodes represent facial feature points. This model is used for automatic detection of the face and its feature points, and for tracking the feature points by flexible feature matching. Face identification is performed by comparing the graph representing the input image with individual face models. Facial expression is modeled by relating the motion of the facial feature points to expression change using B-spline curves. Individual and average expression models are generated and then used to recognize facial expressions and the degree of expression change. The expression model used for facial expression recognition is selected according to the result of face identification.
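The following is a minimal sketch, not the authors' implementation, of the idea of modeling the relationship between a feature point's motion and the expression degree with a B-spline curve and then using that curve to estimate the degree of an observed expression change. All names and training values here are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import splrep, splev

# Hypothetical training data: expression degree (0 = neutral, 1 = full expression)
# versus the vertical displacement of one mouth-corner feature point (pixels).
degree = np.linspace(0.0, 1.0, 6)
displacement = np.array([0.0, 1.2, 2.8, 4.9, 6.5, 7.3])

# Fit a cubic B-spline giving displacement as a function of expression degree.
tck = splrep(degree, displacement, k=3, s=0)

def estimate_degree(observed_disp, grid=np.linspace(0.0, 1.0, 201)):
    """Invert the spline numerically: return the expression degree whose
    predicted displacement is closest to the observed displacement."""
    predicted = splev(grid, tck)
    return grid[np.argmin(np.abs(predicted - observed_disp))]

# Estimate the degree of expression for an observed feature-point motion.
print(estimate_degree(3.5))
```

In the same spirit, separate curves could be fitted per individual (individual models) or over pooled data (an average model), with the result of face identification deciding which curve is consulted.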