
The Implementation and Analysis of Facial Expression Customization for a Social Robot

  • Received : 2022.12.27
  • Reviewed : 2023.02.07
  • Published : 2023.05.31

Abstract

Social robots, which are mainly used by individuals, place greater emphasis on human-robot relationships (HRR) than other types of robots do. Emotional expression in robots is one of the key factors that give HRR its value, and emotions are expressed mainly through the face. However, because of cultural and preference differences, the robot facial expressions that users want differ subtly from user to user. We expected that a robot facial expression customization tool could mitigate these differences and consequently improve HRR. To test this, we created a robot facial expression customization tool and a prototype robot, and implemented an emotion engine suitable for generating robot facial expressions in a dynamic human-robot interaction setting. In our experiments, users agreed that a customized version of the robot has a more positive effect on HRR than a predefined version. We also offer recommendations for future improvement of the robot facial expression customization process.
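The abstract describes an emotion engine that generates robot facial expressions and a tool that lets users customize them. The paper does not spell out the mechanism here, but one common design for such an engine is to place each basic emotion at a point in a 2D valence-arousal affect space and blend user-customized expression keyframes around the robot's current affect state. The sketch below is a hypothetical illustration under that assumption; the emotion coordinates, parameter names (`mouth_curve`, `eye_open`), and keyframe values are all invented for the example, not taken from the paper.

```python
import math

# Hypothetical per-user keyframes: each basic emotion maps to a point in a
# 2D valence-arousal affect space and a set of facial parameters (0..1)
# that the customization tool would let the user edit.
KEYFRAMES = {
    "joy":      (( 0.8,  0.5), {"mouth_curve": 0.9, "eye_open": 0.7}),
    "sadness":  ((-0.7, -0.4), {"mouth_curve": 0.1, "eye_open": 0.4}),
    "surprise": (( 0.2,  0.9), {"mouth_curve": 0.5, "eye_open": 1.0}),
}

def blend_expression(valence, arousal):
    """Inverse-distance-weighted blend of keyframe parameters at an affect point."""
    weights = {}
    for name, ((v, a), _) in KEYFRAMES.items():
        d = math.hypot(valence - v, arousal - a)
        if d < 1e-9:                     # exactly on a keyframe: return it as-is
            return dict(KEYFRAMES[name][1])
        weights[name] = 1.0 / d
    total = sum(weights.values())
    params = {}
    for name, (_, frame) in KEYFRAMES.items():
        for key, value in frame.items():
            params[key] = params.get(key, 0.0) + (weights[name] / total) * value
    return params
```

Because the blend varies continuously with the affect point, the face transitions smoothly as the engine's emotional state drifts during interaction, while still hitting the user's customized keyframes exactly at each basic emotion.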

Acknowledgments

This paper was partly supported by a National Research Foundation of Korea (NRF) grant funded by the Korea Government (MSIT) (NRF-2020R1F1A1066397) and by the Technology Innovation Program (20015056, Commercialization design and development of an Intelligent Product-Service System for personalized full silver life cycle care) funded by the Ministry of Trade, Industry & Energy (MOTIE, Korea).
