Study of Emotion Recognition based on Facial Image for Emotional Rehabilitation Biofeedback

정서재활 바이오피드백을 위한 얼굴 영상 기반 정서인식 연구

  • 고광은 (School of Electrical and Electronics Engineering, Chung-Ang University) ;
  • 심귀보 (School of Electrical and Electronics Engineering, Chung-Ang University)
  • Received : 2010.06.10
  • Accepted : 2010.07.20
  • Published : 2010.10.01


To recognize human emotion from facial images, we first need to extract emotional features from the face using a feature extraction algorithm, and then classify the emotional state using a pattern classification method. The Active Appearance Model (AAM) is a well-known method for representing non-rigid objects such as faces and facial expressions. The Bayesian network is a probability-based classifier that can represent the probabilistic relationships among a set of facial features. In this paper, we propose a facial feature extraction method that combines AAM with the Facial Action Coding System (FACS) to automatically model and extract facial emotional features. To recognize the facial emotion, we use Dynamic Bayesian Networks (DBNs) to model and understand the temporal phases of facial expressions in image sequences. The recognition results can then be used for biofeedback-based rehabilitation of the emotionally disabled.
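The abstract pairs AAM/FACS-based feature extraction with a Dynamic Bayesian Network over image sequences. As a minimal illustration of the temporal-inference step only, the sketch below treats a DBN with a single hidden emotion variable, which reduces to a hidden Markov model, and runs the standard forward algorithm over a sequence of discretized action-unit (AU) observations. All labels, probabilities, and the `infer_emotion` function are illustrative assumptions, not the paper's actual model or parameters.

```python
# Illustrative only: a one-variable DBN (i.e., an HMM) over emotion states,
# observed through which FACS action unit dominates each frame.
# Numbers are made up for the example (AU12 = lip corner puller, a smile cue;
# AU5 = upper lid raiser, a surprise cue).

EMOTIONS = ["neutral", "happy", "surprise"]

prior = {"neutral": 0.6, "happy": 0.2, "surprise": 0.2}

# Temporal model: an emotion tends to persist from one frame to the next.
trans = {
    "neutral":  {"neutral": 0.8, "happy": 0.1, "surprise": 0.1},
    "happy":    {"neutral": 0.2, "happy": 0.7, "surprise": 0.1},
    "surprise": {"neutral": 0.2, "happy": 0.1, "surprise": 0.7},
}

# Observation model: P(dominant AU | emotion).
emit = {
    "neutral":  {"none": 0.8, "AU12": 0.1, "AU5": 0.1},
    "happy":    {"none": 0.2, "AU12": 0.7, "AU5": 0.1},
    "surprise": {"none": 0.2, "AU12": 0.1, "AU5": 0.7},
}

def infer_emotion(observations):
    """Forward algorithm: filtered posterior over emotions at the last frame."""
    alpha = {e: prior[e] * emit[e][observations[0]] for e in EMOTIONS}
    for obs in observations[1:]:
        alpha = {
            e: emit[e][obs] * sum(alpha[p] * trans[p][e] for p in EMOTIONS)
            for e in EMOTIONS
        }
    total = sum(alpha.values())
    return {e: a / total for e, a in alpha.items()}

posterior = infer_emotion(["none", "AU12", "AU12", "AU12"])
print(max(posterior, key=posterior.get))  # prints "happy"
```

The paper's DBN is richer than this single-chain reduction (it models relationships among multiple facial features), but the same forward-style message passing underlies inference over the temporal phases of an expression.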


Supported by : National Research Foundation of Korea (NRF)


References
  1. P. Ekman, W. V. Friesen, Facial Action Coding System: A Technique for the Measurement of Facial Movement. Consulting Psychologists Press, Palo Alto, 1978.
  2. T. F. Cootes and C. J. Taylor, Statistical Models of Appearance for Computer Vision, 2004.
  3. X. Xie and K.-M. Lam, "Facial expression recognition based on shape and texture," Pattern Recognition, vol. 42, pp. 1003-1011, May 2009.
  4. G. Donato, M. S. Barlett, J. C. Hager, P. Ekman, and T. J. Sejnowski, "Classifying facial actions," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 21, no. 10, pp. 974-989, Oct. 1999.
  5. E. Smith, M. S. Barlett, and J. R. Movellan, "Computer recognition of facial actions: a study of co-articulation effects," Proceedings of the Eighth Annual Joint Symposium on Neural Computation, 2001.
  6. A. Kapoor, Y. Qi, and R. W. Picard, "Fully automatic upper facial action recognition," Proceedings of the IEEE International Workshop on Analysis and Modeling of Faces and Gestures, pp. 195-202, 2003.
  7. Y. Zhang and Q. Ji, "Active and dynamic information fusion for facial expression understanding from image sequences," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 5, pp. 699-714, May 2005.
  8. Y. Tong, W. Liao, and Q. Ji, "Facial action unit recognition by exploiting their dynamic and semantic relationship," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, no. 10, pp. 1683-1699, Oct. 2007.
  9. M. M. Nordstrom, M. Larsen, J. Sierakowski, and M. B. Stegmann, "The IMM face database: an annotated dataset of 240 face images," Informatics and Mathematical Modelling, Technical University of Denmark, 2004.
  10. C. Huang and Y. Huang, "Facial expression recognition using model-based feature extraction and action parameters classification," J. Visual Comm. and Image Representation, vol. 8, no. 3, pp. 278-290, Sep. 1997.
  11. I. Kotsia and I. Pitas, "Facial expression recognition in image sequences using geometric deformation features and support vector machines," IEEE Transactions on Image Processing, vol. 16, no. 1, pp. 172-187, Jan. 2007.

Cited by

  1. A Study on Emotion Recognition Systems based on the Probabilistic Relational Model Between Facial Expressions and Physiological Responses, vol. 19, no. 6, 2013.