• Title/Summary/Keyword: Robot Emotion

Search Results: 134

The Implementation and Analysis of Facial Expression Customization for a Social Robot (소셜 로봇의 표정 커스터마이징 구현 및 분석)

  • Jiyeon Lee;Haeun Park;Temirlan Dzhoroev;Byounghern Kim;Hui Sung Lee
    • The Journal of Korea Robotics Society / v.18 no.2 / pp.203-215 / 2023
  • Social robots, which are mainly used by individuals, emphasize the importance of human-robot relationships (HRR) more than other types of robots do. Emotional expression in robots is one of the key factors that imbue HRR with value; emotions are mainly expressed through the face. However, because of cultural and preference differences, the desired robot facial expressions differ subtly depending on the user. It was expected that a robot facial expression customization tool might mitigate such difficulties and consequently improve HRR. To prove this, we created a robot facial expression customization tool and a prototype robot. We implemented an emotion engine suitable for generating robot facial expressions in a dynamic human-robot interaction setting. We conducted experiments, and the users agreed that a customized version of the robot has a more positive effect on HRR than a predefined version. Moreover, we suggest recommendations for future improvements of the robot facial expression customization process.

A Study on Infra-Technology of RCP Interaction System

  • Kim, Seung-Woo;Choe, Jae-Il
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2004.08a / pp.1121-1125 / 2004
  • RT (Robot Technology) has been developed as a next-generation future technology. According to a 2002 technical report from the Mitsubishi R&D center, the IT (Information Technology) and RT (Robot Technology) fusion market will grow to five times the size of the current IT market by the year 2015. Moreover, a recent IEEE report predicts that most people will have a robot within the next ten years. The RCP (Robotic Cellular Phone), a CP (Cellular Phone) offering personal robot services, will be an intermediate high-tech personal machine between the one-CP-per-person and one-robot-per-person generations. The RCP infrastructure consists of $RCP^{Mobility}$, $RCP^{Interaction}$, and $RCP^{Integration}$ technologies. For $RCP^{Mobility}$, human-friendly motion automation and personal services with walking and arm capability are developed. $RCP^{Interaction}$ is achieved by modeling an emotion-generating engine, and $RCP^{Integration}$, which recognizes environmental and self conditions, is developed. By joining intelligent algorithms and the CP communication network with these three base modules, an RCP system is constructed. This paper focuses in particular on the RCP interaction system. $RCP^{Interaction}$ (Robotic Cellular Phone for Interaction) is to be developed as an emotional-model CP, as shown in figure 1. $RCP^{Interaction}$ refers to the sensitivity expression and communication-link technology of the CP. It is an interface technology between human and CP through various emotional models. The interactive emotion functions are designed through differing patterns of vibrator beat frequencies, and a feeling system is created by smell-injection switching control. Just as music influences a person, one can feel a variety of emotions from the vibrator's beats by converting musical chord frequencies into vibrator beat frequencies. This paper presents the definition, basic theory, and experimental results of the RCP interaction system, and we confirm its good performance through the experimental results.
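
The chord-to-vibration idea mentioned in the abstract can be sketched as a simple octave-folding map: an audible note frequency is halved repeatedly until it falls into a tactile range a phone vibrator can render. The tactile cutoff and the note frequencies below are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch: fold musical-note frequencies down into a tactile
# band by repeated octave division. The 8 Hz cutoff is an assumption.

def note_to_beat_hz(note_hz: float, tactile_max_hz: float = 8.0) -> float:
    """Halve the frequency (drop octaves) until it fits the tactile band."""
    hz = note_hz
    while hz > tactile_max_hz:
        hz /= 2.0  # dropping an octave preserves the pitch class
    return hz

# A C-major chord (C4, E4, G4) mapped to slow vibrator beat rates:
chord = {"C4": 261.63, "E4": 329.63, "G4": 392.00}
beats = {name: round(note_to_beat_hz(f), 2) for name, f in chord.items()}
```

Because octave folding preserves the pitch class, the relative spacing of the chord tones survives in the beat rates, which is one plausible way a chord could remain recognizable as a vibration pattern.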

Kansei Evaluation by a Remote-Controlled Robot Designed for Viewing Art Exhibits

  • Okazaki, Akira;Igarashi, Hiroya;Maeyama, Shoichi;Harada, Akira
    • Proceedings of the Korean Society for Emotion and Sensibility Conference / 2000.04a / pp.301-305 / 2000
  • The present study is part of the Special Research Project for the Construction of a Kansei Sensory Evaluation Model currently underway at the University of Tsukuba. In this study, a robot was operated by remote control at an actual art museum as a preliminary experiment, and the results were used to consider how people might view exhibits. In a previous study, a standard lens and a wide-angle lens were used to analyze differences in sensory-based movements, while VRML was used to analyze differences in these movements between a virtual and an actual museum. In the present study, the currently unavoidable time delay in remote operation placed some restrictions on the degree of freedom with which exhibits could be viewed, but it was apparent that sensory evaluation could be possible depending on the search behavior and viewing time. Furthermore, specific viewing behaviors using the robot were observed, suggesting that new Kansei sensory perceptions were derived from these behaviors.

Biosign Recognition based on the Soft Computing Techniques with application to a Rehab-type Robot

  • Lee, Ju-Jang
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2001.10a / pp.29.2-29 / 2001
  • For the design of human-centered systems in which a human and a machine such as a robot form a human-in-the-loop system, a human-friendly interaction/interface is essential. Human-friendly interaction is possible when the system is capable of recognizing human biosigns such as EMG signals, hand gestures, and facial expressions, so that human intention and/or emotion can be inferred and used as a proper feedback signal. In this talk, we report our experiences in applying soft computing techniques, including fuzzy logic, ANN, GA, and rough set theory, for efficiently recognizing various biosigns and for effective inference. More specifically, we first observe the characteristics of various forms of biosigns and propose a new way of extracting a feature set for such signals. Then we show a standardized procedure for inferring intention or emotion from the signals. Finally, we present application examples for our rehabilitation robot.

Comparison of EEG Topography Labeling and Annotation Labeling Techniques for EEG-based Emotion Recognition (EEG 기반 감정인식을 위한 주석 레이블링과 EEG Topography 레이블링 기법의 비교 고찰)

  • Ryu, Je-Woo;Hwang, Woo-Hyun;Kim, Deok-Hwan
    • The Journal of Korean Institute of Next Generation Computing / v.15 no.3 / pp.16-24 / 2019
  • Recently, research on EEG-based emotion recognition has attracted great interest in the human-robot interaction field. In this paper, we propose a labeling method using image-based EEG topography, instead of evaluating emotions through the self-assessment and annotation labeling methods used in MAHNOB-HCI. The proposed method evaluates emotion with a machine learning model trained on EEG signals transformed into topographical images. In experiments using the MAHNOB-HCI database, we compared the performance of SVM and kNN models trained with EEG topography labeling. The accuracy of the proposed method was 54.2% with SVM and 57.7% with kNN.
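
The topography-labeling pipeline described above can be sketched minimally: per-channel band power is painted onto a 2D scalp grid to form an image, and a classifier (here a 1-nearest-neighbour stand-in for the paper's kNN) operates on the flattened image. The channel positions, grid size, and class labels are invented for illustration, not taken from the paper.

```python
# Hypothetical 5x5 "scalp grid" positions for a handful of 10-20 channels.
CHANNEL_POS = {"Fp1": (0, 1), "Fp2": (0, 3), "C3": (2, 1),
               "C4": (2, 3), "O1": (4, 1), "O2": (4, 3)}

def to_topography(band_power):
    """Paint each channel's band power onto its cell of a 5x5 image."""
    img = [[0.0] * 5 for _ in range(5)]
    for ch, p in band_power.items():
        r, c = CHANNEL_POS[ch]
        img[r][c] = p
    return img

def _flatten(img):
    return [v for row in img for v in row]

def knn_predict(train_imgs, train_labels, query_img):
    """1-nearest-neighbour on flattened topography images."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(_flatten(a), _flatten(b)))
    best = min(range(len(train_imgs)), key=lambda i: dist2(train_imgs[i], query_img))
    return train_labels[best]

# Two invented training maps (frontal vs. occipital activity) and a query:
frontal = to_topography({"Fp1": 1.0, "Fp2": 1.0})
occipital = to_topography({"O1": 1.0, "O2": 1.0})
label = knn_predict([frontal, occipital], ["high-arousal", "low-arousal"],
                    to_topography({"Fp1": 0.9, "Fp2": 0.8, "O1": 0.1}))
```

A real pipeline would interpolate between electrode positions and use a trained SVM/kNN over many trials; this sketch only shows the image-as-label representation idea.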

A Study on the Depiction of Concept in User Centered for the Supporting Shopping Robot Design Development (쇼핑 서비스 지원로봇 디자인개발을 위한 사용자 중심의 컨셉 도출 연구)

  • Jang, Young-Joo
    • Science of Emotion and Sensibility / v.9 no.spc3 / pp.287-297 / 2006
  • The human craving for a wealthier life and changes in consumption culture have created mega discount stores, so convenience of use is required when people shop there. There is some inconvenience in that customers have to find goods and purchase them by themselves. This is the fundamental motivation for this study of a robot design able to resolve such problems when shopping at mega discount stores. This study proposes guidelines for the design of a robot supporting shopping services, based on research material from an analysis of consumer behavior and a questionnaire.

A Robotic System with Behavioral Intervention facilitating Eye Contact and Facial Emotion Recognition of Children with Autism Spectrum Disorders (자폐 범주성 장애 아동의 눈맞춤과 얼굴표정읽기 기능향상을 위한 행동 중재용 로봇시스템)

  • Yun, Sang-Seok;Kim, Hyuksoo;Choi, JongSuk;Park, Sung-Kee
    • The Journal of Korea Robotics Society / v.10 no.2 / pp.61-69 / 2015
  • In this paper, we propose and examine the feasibility of a robot-assisted behavioral intervention system designed to strengthen the positive responses of children with autism spectrum disorder (ASD) when learning social skills. Based on well-known behavioral treatment protocols, the robot offers therapeutic training elements for eye contact and emotion reading in child-robot interaction, and it accomplishes pre-allocated meaningful acts as a coping strategy by estimating the level of the children's reactivity from reliable recognition modules. Furthermore, to save labor and attract children's interest, we implemented a robotic stimulation configuration with semi-autonomous actions capable of inducing intimacy and tension in children during instructional trials. With these configurations, by evaluating the ability to recognize human activity and by showing improved reactivity in social training, we verified that the proposed system has positive effects on social development, targeted at preschoolers with a high functioning level.

Emotional Model Focused on Robot's Familiarity to Human

  • Choi, Tae-Yong;Kim, Chang-Hyun;Lee, Ju-Jang
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2005.06a / pp.1025-1030 / 2005
  • This paper deals with the emotional model of a software robot. The software robot requires several capabilities, such as sensing, perceiving, acting, communicating, surviving, and so on. There are already many studies of emotional models, such as KISMET and AIBO. A new emotional model using a modified friendship scheme is proposed in this paper. Quite often, the available emotional models have time-invariant human-response architectures. Conventional emotional models make the sociable robot get along with humans and obey human commands during robot operation. This behavior makes the robot very different from real pets. Similar to real pets, the proposed emotional model with the modified friendship capability has a time-varying property depending on the interaction between human and robot.
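
The time-varying property the abstract describes can be illustrated with a minimal familiarity state that grows with interaction and decays during absence, pet-like rather than command-driven. The update rule and the rate constants below are assumptions for illustration; the paper's actual friendship scheme is not specified here.

```python
# Minimal sketch of a time-varying familiarity state (invented rates).

class FamiliarityModel:
    def __init__(self, gain=0.2, decay=0.05):
        self.familiarity = 0.0           # 0 = stranger, 1 = close companion
        self.gain, self.decay = gain, decay

    def step(self, interacted: bool) -> float:
        if interacted:
            # Saturating growth: early interactions change more than later ones.
            self.familiarity += self.gain * (1.0 - self.familiarity)
        else:
            # Slow exponential forgetting while the human is away.
            self.familiarity *= (1.0 - self.decay)
        return self.familiarity

model = FamiliarityModel()
for _ in range(10):
    after_play = model.step(True)       # ten interaction episodes
for _ in range(10):
    after_neglect = model.step(False)   # ten episodes of absence
```

The key contrast with a time-invariant model is that the same stimulus produces different responses depending on the accumulated interaction history.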

Color and Blinking Control to Support Facial Expression of Robot for Emotional Intensity (로봇 감정의 강도를 표현하기 위한 LED 의 색과 깜빡임 제어)

  • Kim, Min-Gyu;Lee, Hui-Sung;Park, Jeong-Woo;Jo, Su-Hun;Chung, Myung-Jin
    • The HCI Society of Korea: Conference Proceedings / 2008.02a / pp.547-552 / 2008
  • Humans and robots will have a closer relationship in the future, and we can expect the interaction between them to become more intense. To take advantage of people's innate communication abilities, researchers have so far concentrated on facial expression. But for a robot to express emotional intensity, other modalities such as gesture, movement, sound, and color are also needed. This paper suggests that the intensity of emotion can be expressed with color and blinking, so that the result can be applied to LEDs. Color and emotion definitely have a relation; however, previous results are difficult to implement due to the lack of quantitative data. In this paper, we determined colors and blinking periods to express the six basic emotions (anger, sadness, disgust, surprise, happiness, fear). They were implemented on an avatar, and the intensities of the emotions were evaluated through a survey. We found that color and blinking helped to express the intensity of emotion for sadness, disgust, and anger. For fear, happiness, and surprise, color and blinking did not play an important role; however, we may improve them by adjusting the color or blinking.
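
One simple way to realize the color-plus-blinking scheme above is to scale an emotion's base LED color and blink period by an intensity value. The base colors and the linear scaling below are illustrative assumptions, not the paper's measured values.

```python
# Invented base mapping: emotion -> (RGB at full intensity, blink period in s).
BASE = {
    "anger":     ((255, 0, 0),   0.2),  # fast red blinking
    "sadness":   ((0, 0, 255),   1.5),  # slow blue blinking
    "happiness": ((255, 220, 0), 0.6),
}

def led_for(emotion, intensity):
    """Dim the color and lengthen the blink period as intensity drops."""
    (r, g, b), period = BASE[emotion]
    s = max(0.0, min(1.0, intensity))          # clamp to [0, 1]
    color = (int(r * s), int(g * s), int(b * s))
    # Weaker emotion -> slower, calmer blinking (period grows as s falls).
    return color, period / max(s, 1e-6)
```

With this rule, half-intensity anger yields a dimmer red and a blink twice as slow as full anger, which matches the intuition that intensity should modulate both channels at once.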

Emotion Recognition and Expression System of User using Multi-Modal Sensor Fusion Algorithm (다중 센서 융합 알고리즘을 이용한 사용자의 감정 인식 및 표현 시스템)

  • Yeom, Hong-Gi;Joo, Jong-Tae;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.18 no.1 / pp.20-26 / 2008
  • As intelligent robots and computers become more common these days, the interaction between intelligent robots (computers) and humans is becoming more and more important, and emotion recognition and expression are indispensable for that interaction. In this paper, we first extract emotional features from the speech signal and the facial image. Second, we apply both BL (Bayesian Learning) and PCA (Principal Component Analysis), and finally we classify five emotion patterns (normal, happy, anger, surprise, and sad). We also experiment with decision fusion and feature fusion to enhance the emotion recognition rate. The decision fusion method applies a fuzzy membership function to the result values of each recognition system, while the feature fusion method selects superior features through SFS (Sequential Forward Selection); the selected features are fed to a Neural Network based on an MLP (Multi Layer Perceptron) to classify the five emotion patterns. The recognized result is applied to a 2D facial shape to express the emotion.
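
The decision-fusion step described above can be sketched as a weighted combination of per-modality class scores: each recognizer (speech, face) emits a score per emotion, and the fused label is the argmax of the combined scores. The weights and score vectors below are illustrative stand-ins; the paper's actual fuzzy membership functions are not reproduced here.

```python
# The five emotion patterns from the abstract, in a fixed order.
EMOTIONS = ["normal", "happy", "anger", "surprise", "sad"]

def decision_fusion(speech_scores, face_scores, w_speech=0.4, w_face=0.6):
    """Combine per-modality class scores by weighted sum; return fused label."""
    fused = [w_speech * s + w_face * f
             for s, f in zip(speech_scores, face_scores)]
    return EMOTIONS[max(range(len(fused)), key=fused.__getitem__)]

# Speech weakly suggests "sad"; the face strongly suggests "happy".
label = decision_fusion([0.1, 0.2, 0.1, 0.1, 0.5],
                        [0.05, 0.7, 0.05, 0.1, 0.1])
```

A fuzzy-membership variant would replace the fixed weights with membership values computed from each recognizer's confidence, but the argmax-over-combined-scores structure is the same.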