Effects of LED on Emotion-Like Feedback of a Single-Eyed Spherical Robot

  • Onchi, Eiji (Graduate School of Comprehensive Human Sciences, University of Tsukuba) ;
  • Cornet, Natanya (Department of Industrial Design, Eindhoven University of Technology) ;
  • Lee, SeungHee (Faculty of Art and Design, University of Tsukuba)
  • Received : 2021.06.28
  • Accepted : 2021.08.23
  • Published : 2021.09.30

Abstract

Non-verbal communication plays an important role in human interaction: it provides a layer of information that complements the message being transmitted, and it is not limited to human speakers. In human-robot communication, increasing the animacy of a robotic agent through non-verbal cues can aid the expression of abstract concepts such as emotions. Given the physical limitations of artificial agents, robots can use light and movement to express equivalent emotional feedback. This study analyzes the effects of the LED and motion animations of a spherical robot on the emotion the robot is perceived to express. A within-subjects experiment was conducted at the University of Tsukuba in which participants rated 28 video samples of the robot interacting with a person, with the robot displaying different motions with and without light animations. The results indicated that adding LED animations changes the emotional impression of the robot along the valence, arousal, and dominance dimensions. Furthermore, participants associated different situations with the robot's behavior. These stimuli can be used to modulate the intensity of the expressed emotion and enhance the interaction experience. This work opens the possibility of designing more affective robots using simple feedback.
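To make the light-based feedback idea concrete, the sketch below is a minimal illustration (not the authors' implementation) of how a target emotion, expressed as valence and arousal in the circumplex sense, might be translated into an LED color and pulse rate. The function name `led_animation`, the warm/cool hue split, and the brightness and pulse-rate formulas are all assumptions made for illustration only.

```python
# Hypothetical sketch: map a point in the valence-arousal plane to LED
# parameters, in the spirit of the light animations described above.
import colorsys

def led_animation(valence: float, arousal: float) -> dict:
    """Return illustrative LED parameters for an emotion given as
    valence and arousal, each in [-1, 1]. All mappings are assumptions."""
    # Assumption: pleasant emotions lean toward a warm hue (yellow, ~0.15),
    # unpleasant ones toward a cool hue (blue, ~0.60).
    hue = 0.15 if valence >= 0 else 0.60
    # Assumption: arousal drives brightness and blink frequency.
    brightness = 0.4 + 0.6 * (arousal + 1) / 2   # 0.4 .. 1.0
    pulse_hz = 0.5 + 2.5 * (arousal + 1) / 2     # 0.5 .. 3.0 Hz
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, brightness)
    return {"rgb": (round(r * 255), round(g * 255), round(b * 255)),
            "pulse_hz": round(pulse_hz, 2)}

# Example: a calm, pleasant expression vs. an agitated, unpleasant one.
print(led_animation(valence=0.8, arousal=-0.5))  # warm, dimmer, slow pulse
print(led_animation(valence=-0.7, arousal=0.9))  # cool, bright, fast pulse
```

Under this kind of mapping, the same motion animation could be paired with different light parameters to modulate the intensity of the emotion being conveyed, which is the role the study attributes to the LED animations.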

Acknowledgement

This study was supported by the Ministry of Education, Culture, Sports, Science and Technology (MEXT). The authors would like to thank the members of the Kansei Design Lee Lab for their support in conducting the experiment.
