Posture features and emotion predictive models for affective posture recognition


  • Kim Jin-Ok (Division of Mobile Contents, College of International Culture and Information, Daegu Haany University)
  • Received : 2011.08.04
  • Accepted : 2011.11.03
  • Published : 2011.12.31

Abstract

The main research issue in affective computing is to give a machine the ability to recognize a person's emotion and to react to it appropriately. Efforts in that direction have mainly focused on facial and vocal cues; postures have only recently been considered as well. This paper aims to discriminate emotions from posture by identifying and measuring the saliency of the posture features that play a role in affective expression. To do so, affective postures are first collected from human subjects using a motion capture system, and the emotional features of each posture are then described in terms of spatial features. Using standard statistical techniques, we verify that there is a statistically significant correlation between the emotion intended by the acting subjects and the emotion perceived by the observers. Discriminant analysis is used to build affective posture predictive models and to measure the saliency of the proposed set of posture features in discriminating between six basic emotional states. The proposed features and models are evaluated using the correlation between the actors' and observers' posture sets. Quantitative experimental results show that the proposed set of features discriminates well between emotions and that the resulting predictive models perform well.
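The discriminant-analysis step summarized above can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the number of spatial posture features, the synthetic motion-capture data, and the class means are all invented for the example, and classic linear discriminant analysis with a pooled within-class covariance stands in for the paper's predictive model over the six basic emotions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: six basic emotions and 8 spatial posture features
# per sample (e.g. arm extension, head height) -- placeholders, not the
# paper's actual feature set.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]
N_FEATURES = 8
N_PER_CLASS = 40

# Synthetic stand-in for motion-captured actor postures: each emotion gets
# its own mean feature vector plus shared isotropic noise.
means = rng.normal(0.0, 2.0, size=(len(EMOTIONS), N_FEATURES))
X = np.vstack([m + rng.normal(0.0, 1.0, size=(N_PER_CLASS, N_FEATURES))
               for m in means])
y = np.repeat(np.arange(len(EMOTIONS)), N_PER_CLASS)

def fit_lda(X, y):
    """Estimate class means, priors, and the pooled within-class covariance."""
    classes = np.unique(y)
    mu = np.array([X[y == k].mean(axis=0) for k in classes])
    priors = np.array([(y == k).mean() for k in classes])
    # LDA assumes one covariance shared by all classes: pool the
    # per-class scatter matrices.
    Sw = sum(np.cov(X[y == k], rowvar=False) * ((y == k).sum() - 1)
             for k in classes)
    Sw /= len(y) - len(classes)
    return mu, priors, np.linalg.inv(Sw)

def predict_lda(X, mu, priors, Sw_inv):
    """Assign each sample to the class with the largest discriminant score:
    delta_k(x) = x' Sw_inv mu_k - 0.5 mu_k' Sw_inv mu_k + log prior_k."""
    quad = np.einsum("kd,de,ke->k", mu, Sw_inv, mu)
    scores = X @ Sw_inv @ mu.T - 0.5 * quad + np.log(priors)
    return np.argmax(scores, axis=1)

mu, priors, Sw_inv = fit_lda(X, y)
acc = (predict_lda(X, mu, priors, Sw_inv) == y).mean()
print(f"training accuracy on synthetic postures: {acc:.2f}")
```

Because the synthetic class means are well separated relative to the noise, the fitted discriminant separates the six classes almost perfectly; on real actor postures the separation (and hence the saliency of each feature) would be the quantity under study.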

