Sound-based Emotion Estimation and Growing HRI System for an Edutainment Robot

에듀테인먼트 로봇을 위한 소리기반 사용자 감성추정과 성장형 감성 HRI시스템

  • 김종철 (KT Central R&D Laboratory, Exploration Division)
  • 박귀홍 (KT Central R&D Laboratory, Exploration Division)
  • Received : 2009.09.23
  • Accepted : 2010.01.14
  • Published : 2010.02.26

Abstract

This paper presents a sound-based emotion estimation method and a growing HRI (human-robot interaction) system for the Mon-E robot. The emotion estimation method uses musical elements based on the laws of harmony and counterpoint: emotion is estimated from sound using information about chord, tempo, volume, harmonics, and compass (pitch range). The estimated emotions cover twelve standard emotions: Ekman's six basic emotions (anger, disgust, fear, happiness, sadness, surprise) and their six opposites (calmness, love, confidence, unhappiness, gladness, comfortableness). The growing HRI system analyzes sensing information, the estimated emotion, and the service log of the edutainment robot, and commands the robot's behavior accordingly. The system consists of an emotion client and an emotion server. The emotion client estimates the emotion from sound, transmits the estimated emotion and sensing information to the emotion server, and delivers the server's responses to the main program of the robot. The emotion server updates the HRI rule table using the information transmitted from the emotion client and sends HRI responses back to the client. The proposed system was applied to the Mon-E robot and can supply friendly HRI services to users.
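The mapping from musical elements to one of the twelve emotions can be pictured as a small rule table. The following is a minimal sketch of that idea only; the feature names, thresholds, and rules are hypothetical illustrations, not the paper's actual table.

```python
# Hypothetical sketch: mapping coarse musical features (chord quality,
# tempo, volume) to one of the twelve emotions named in the abstract.
# The rules and thresholds below are illustrative, not the paper's.

EMOTIONS = [
    # Ekman's six basic emotions
    "anger", "disgust", "fear", "happiness", "sadness", "surprise",
    # their six opposites, as listed in the abstract
    "calmness", "love", "confidence", "unhappiness", "gladness", "comfortableness",
]

def estimate_emotion(chord: str, tempo_bpm: float, volume_db: float) -> str:
    """Toy rule table: bright/fast sounds map to positive emotions,
    dark/loud sounds to negative ones."""
    if chord == "major":
        return "happiness" if tempo_bpm >= 120 else "calmness"
    if chord == "minor":
        return "anger" if volume_db > -10 else "sadness"
    return "surprise"  # unrecognized chord quality: treat as unexpected sound
```

In the described architecture, such a table would live on the emotion server and be updated as the robot accumulates sensing information and service logs, which is what makes the HRI system "growing".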

Keywords

References

  1. T. Fong, I. Nourbakhsh, and K. Dautenhahn, "A Survey of Socially Interactive Robots: Concepts, Design, and Applications", CMU-RI-TR-02-29, 2002.
  2. C. Breazeal, "Social Interactions in HRI: The Robot View", IEEE Trans. on Systems, Man, and Cybernetics, Part C, Vol.34, No.2, pp.181-185, 2004. https://doi.org/10.1109/TSMCC.2004.826268
  3. T. Shibata, T. Tashima, and K. Tanie, "Emergence of Emotional Behavior through Physical Interaction between Human and Robot", in Proc. of IEEE Int'l Conf. on Robotics and Automation, USA, pp.2868-2873, May 1999.
  4. R. C. Arkin, M. Fujita, T. Takagi, and R. Hasegawa, "An Ethological and Emotional Basis for Human-Robot Interaction", Robotics and Autonomous Systems, Vol.42, pp.191-201, 2003. https://doi.org/10.1016/S0921-8890(02)00375-5
  5. Yo-Chan Kim, Hyuk-Tae Kwon, Wan-Chul Yoon, and Jong-Cheol Kim, "Designing Emotional and Interactive Behaviors for an Entertainment Robot", Proc. of Int'l Conf. on Human-Computer Interaction, LNCS 5611, pp.321-330, 2009.
  6. 이태근, 이동욱, 소병욱, 이호길, "Emotion System of an Android Robot for Human-Friendly Interaction", Proc. of KFIS Spring Conf., Vol.17, pp.95-98, 2007.
  7. 김형록, 김영민, 박종찬, 박경숙, 강태운, 권동수, "A Reactive Emotion Generation Model for Service Robots", Journal of Korea Robotics Society, Vol.2, No.2, 2007.
  8. P. Ekman and W. V. Friesen, "Facial Action Coding System: Investigator's Guide", Consulting Psychologists Press, 1978.
  9. 문병현, 심귀보, "Emotion Expression System for an Entertainment Robot Based on Voice Signals", Proc. of KIIS Fall Conf., Vol.18, No.2, 2008.
  10. 박천수, 류정우, 손주찬, "Robot Emotion Technology", Electronics and Telecommunications Trends, Vol.22, No.2, pp.1-9, 2007.
  11. 박면웅, 안승민, 하성도, 정도언, 류인균, "Development of an Emotional Content Recommendation System for Emotion and Mood State Transition", Korean Journal of the Science of Emotion and Sensibility, Vol.10, No.1, pp.1-11, 2007.