• Title/Summary/Keyword: Emotional Robot


Robotics Projects at Pusan National University

  • Kwak, Seung-Chul;Sung, Ji-Hoon;Shim, In-Bo;Yoon, Joong-Sun
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 2004.08a
    • /
    • pp.814-819
    • /
    • 2004
  • Soft engineering, based on the symbiotic coexistence of humans, machines, and the environment, is a new engineering field that explores appropriate technology and appropriate ways of practicing engineering. To make the intent of soft engineering easier to explore, various robot projects conducted at Pusan National University are presented. Thought experiments, interactive e-learning, rapid prototyping engineering, biomimicry, tangibility, and ubiquity are the concepts to be explored. Thought-experiment projects are organized and performed, including a robot assembly game, a Turing test, and robotics in science fiction. A "junk robot project" and a "ubiquitous Pusan National University (u-PNU) project" have been organized. Bug robot, interactive robot, and interactive emotional robot projects are also introduced. Weekly science fiction films are shown and discussed.


Intelligent Countenance Robot, Humanoid ICHR (지능형 표정로봇, 휴머노이드 ICHR)

  • Byun, Sang-Zoon
    • Proceedings of the KIEE Conference
    • /
    • 2006.10b
    • /
    • pp.175-180
    • /
    • 2006
  • In this paper, we develop a type of humanoid robot that can express its emotions in response to human actions. To interact with humans, the developed robot has several abilities for expressing emotion: verbal communication through voice/image recognition, motion tracking, and facial expression using fourteen servo motors. The proposed humanoid robot system consists of a control board designed with an AVR90S8535 to control the servo motors, a framework equipped with the fourteen servo motors and two CCD cameras, and a personal computer to monitor its operation. The results of this research illustrate that our intelligent emotional humanoid robot is intuitive and friendly, so humans can interact with it very easily.

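To make the facial-expression mechanism above concrete, here is a minimal sketch of how an emotion label might be mapped to a pose for fourteen facial servo motors and streamed to a motor control board over a serial link. All names, angle values, and the serial command format are illustrative assumptions; the paper does not publish its firmware interface.

```python
# Hypothetical sketch: mapping emotions to 14 facial-servo poses.
# The angle tables and the serial command format are illustrative assumptions,
# not the protocol used by the ICHR control board.
import serial  # pyserial

NEUTRAL = [90] * 14  # one angle (degrees) per facial servo

# Assumed example poses for a few emotions.
EMOTION_POSES = {
    "joy":      [90, 70, 70, 110, 110, 90, 90, 60, 120, 90, 90, 90, 80, 100],
    "surprise": [60, 60, 60, 120, 120, 45, 135, 90, 90, 70, 110, 90, 60, 120],
    "sadness":  [110, 105, 105, 80, 80, 100, 80, 120, 60, 90, 90, 95, 100, 80],
}

def send_pose(port: serial.Serial, angles: list[int]) -> None:
    """Send one angle per servo as a simple line-based command (assumed format)."""
    for servo_id, angle in enumerate(angles):
        port.write(f"S{servo_id:02d}A{angle:03d}\n".encode("ascii"))

def express(port: serial.Serial, emotion: str) -> None:
    """Drive the face to the pose for the given emotion, or neutral if unknown."""
    send_pose(port, EMOTION_POSES.get(emotion, NEUTRAL))

if __name__ == "__main__":
    with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as board:  # assumed port
        express(board, "joy")
```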

Engine of computational Emotion model for emotional interaction with human (인간과 감정적 상호작용을 위한 '감정 엔진')

  • Lee, Yeon Gon
    • Science of Emotion and Sensibility
    • /
    • v.15 no.4
    • /
    • pp.503-516
    • /
    • 2012
  • In research on robots and software agents to date, computational emotion models have been system-dependent, so it is difficult to separate an emotion model from an existing system and reuse it in a new one. Therefore, I introduce the Engine of computational Emotion model (hereafter EE), which can be integrated with any robot or agent. The EE is an engine, i.e., software whose form is independent of inputs and outputs: it handles only the generation and processing of emotions (Emotion Generation), without the Input (Perception) and Output (Expression) phases. The EE can be interfaced with any inputs and outputs, and it produces emotions not only from emotion itself but also from a person's personality and emotions. In addition, the EE can reside in any robot or agent as a kind of software library, or be used as a separate system that communicates with them. In the EE, the emotions are the primary emotions: Joy, Surprise, Disgust, Fear, Sadness, and Anger. Each emotion is a vector consisting of a string and a coefficient; the EE receives these vectors from the input interface and sends them to the output interface. Each emotion is connected to a list of emotional experiences, and these lists, each entry consisting of a string and a coefficient, are used to generate and process emotional states. The emotional experiences are built from an emotion vocabulary covering the various emotional experiences of humans. The EE can be used to build interactive products that respond appropriately to human emotions. The significance of the study lies in developing a system that induces people to feel that a product sympathizes with them. The EE can therefore help provide efficient emotional-sympathy services for products in the HRI and HCI areas.

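The abstract describes emotions as (string, coefficient) vectors over six primary emotions, generated independently of perception and expression. The sketch below illustrates one way such an engine could be structured as a small library; the class and method names, the blending rule, and the personality weighting are assumptions, not the EE's published design.

```python
# Hypothetical sketch of an input/output-independent emotion engine in the
# spirit of the EE: it only generates and processes (emotion, coefficient)
# vectors; perception and expression are left to the host robot or agent.
from dataclasses import dataclass, field

PRIMARY_EMOTIONS = ("joy", "surprise", "disgust", "fear", "sadness", "anger")

@dataclass
class EmotionVector:
    label: str          # one of the primary emotions
    coefficient: float  # intensity in [0, 1]

@dataclass
class EmotionEngine:
    # Personality biases per emotion and per-step decay are assumed representations.
    personality: dict[str, float] = field(
        default_factory=lambda: {e: 1.0 for e in PRIMARY_EMOTIONS})
    state: dict[str, float] = field(
        default_factory=lambda: {e: 0.0 for e in PRIMARY_EMOTIONS})
    decay: float = 0.9

    def update(self, inputs: list[EmotionVector]) -> list[EmotionVector]:
        """Blend incoming emotion vectors into the internal state and emit it."""
        for e in PRIMARY_EMOTIONS:
            self.state[e] *= self.decay
        for vec in inputs:
            if vec.label in self.state:
                self.state[vec.label] = min(
                    1.0, self.state[vec.label]
                    + vec.coefficient * self.personality[vec.label])
        return [EmotionVector(e, round(c, 3)) for e, c in self.state.items() if c > 0.0]

# Usage: the host system supplies vectors from its own perception layer.
engine = EmotionEngine()
print(engine.update([EmotionVector("joy", 0.6), EmotionVector("surprise", 0.2)]))
```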

Characterizing Strategy of Emotional sympathetic Robots in Animation and Movie - Focused on Appearance and Behavior tendency Analysis - (애니메이션 및 영화에 등장하는 정서교감형 로봇의 캐릭터라이징 전략 - 외형과 행동 경향성 분석을 중심으로 -)

  • Ryu, Beom-Yeol;Yang, Se-Hyeok
    • Cartoon and Animation Studies
    • /
    • s.48
    • /
    • pp.85-116
    • /
    • 2017
  • The purpose of this study is to analyze the conditions under which robots depicted in cinematographic works such as animations and movies sympathize and form an attachment with the main character, and to organize characterizing strategies for emotionally sympathetic robots. With the development of technology, artificial intelligence and robots are no longer considered the stuff of science fiction but are treated as realistic issues. This author therefore assumes that the expressive characteristics of emotionally sympathetic robots created in cinematographic works can serve as meaningful factors in the expressive embodiment of the human-friendly service robots that will be widely distributed, that is, in establishing the features of such characters; this research was begun to lay the groundwork for that. As subjects of analysis, this researcher chose robot characters whose emotional intimacy with the main character is clearly observable among those found in movies and animations produced after 1920, when the modern concept of the robot was introduced. To understand the robots' appearance and behavioral tendencies, this study (1) classified the robots' external impressions into five types (human-like, cartoon, tool-like, artificial being, pet or creature) and (2) classified behavioral tendencies, considered the outward embodiment of personality, using DiSC, a tool for diagnosing behavioral patterns. It was observed that robots with high emotional intimacy are all strongly independent in their duties and show great emotional acceptance. The 'Influence' and 'Steadiness' types show great emotional acceptance, the Influence type tends to be highly independent, and the 'Conscientiousness' type tends to show less emotional acceptance and independence overall. Yet, according to the analysis of external impressions, appearance factors have hardly any significant relationship with emotional sympathy. This implies that, much as in forming interpersonal relationships in reality, emotional sympathy grounded in communication matters more than first impressions for a robot to achieve strong emotional sympathy. Lastly, studying robot characters requires consilient competence that spans different areas. This author also found that design factors or personality factors alone are not sufficient to characterize robot characters or to fully analyze the vast amount of information involved in sympathizing with humans. Nevertheless, this researcher concludes this thesis as a foundation for such work, expecting that the broad artistic value of animation can be put to good use in developing robots, which must be studied interdisciplinarily.

Development of FACS-based Android Head for Emotional Expressions (감정표현을 위한 FACS 기반의 안드로이드 헤드의 개발)

  • Choi, Dongwoon;Lee, Duk-Yeon;Lee, Dong-Wook
    • Journal of Broadcast Engineering
    • /
    • v.25 no.4
    • /
    • pp.537-544
    • /
    • 2020
  • This paper proposes the creation of an android robot head based on the facial action coding system (FACS) and the generation of emotional expressions using FACS. The term android robot refers to a robot with a human-like appearance; such robots have artificial skin and muscles. To express emotions, the location and number of artificial muscles had to be determined, so it was necessary to anatomically analyze the motions of the human face with FACS. In FACS, expressions are composed of action units (AUs), which serve as the basis for determining the location and number of artificial muscles in the robot. The android head developed in this study has servo motors and wires corresponding to 30 artificial muscles, and it is covered with artificial skin in order to produce facial expressions. Spherical joints and springs were used to develop micro-eyeball structures, and the arrangement of the 30 servo motors was based on an efficient wire-routing design. The developed android head has 30 DOFs and can express 13 basic emotions. The recognition rate of these basic emotional expressions was evaluated with spectators at an exhibition.
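
Since the abstract treats an expression as a set of FACS action units realized by 30 wire-driven servo motors, a small sketch of that mapping may help. The AU-to-servo table, intensity scaling, and function names below are illustrative assumptions rather than the paper's actual calibration data.

```python
# Hypothetical sketch: composing a facial expression from FACS action units (AUs)
# and translating AU intensities into servo offsets for a 30-motor android head.
# The AU-to-servo table and gain values are assumptions for illustration only.

# Each AU drives one or more of the 30 servos with a gain (degrees per unit intensity).
AU_TO_SERVOS = {
    "AU1":  {0: 15.0, 1: 15.0},      # inner brow raiser
    "AU4":  {2: -20.0, 3: -20.0},    # brow lowerer
    "AU12": {10: 25.0, 11: 25.0},    # lip corner puller (smile)
    "AU15": {12: -20.0, 13: -20.0},  # lip corner depressor
}

# An expression is a dict of AU intensities in [0, 1] (assumed encoding).
EXPRESSIONS = {
    "happiness": {"AU12": 1.0, "AU1": 0.3},
    "sadness":   {"AU1": 0.7, "AU4": 0.4, "AU15": 0.8},
}

def expression_to_servo_offsets(expression: dict[str, float]) -> dict[int, float]:
    """Accumulate servo offsets (degrees from neutral) implied by the active AUs."""
    offsets: dict[int, float] = {i: 0.0 for i in range(30)}
    for au, intensity in expression.items():
        for servo_id, gain in AU_TO_SERVOS.get(au, {}).items():
            offsets[servo_id] += gain * intensity
    return offsets

print(expression_to_servo_offsets(EXPRESSIONS["sadness"]))
```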

Psychomotorik-based Play Activities for Children by In-home Social Robot (어린이를 위한 소셜 로봇의 심리운동 기반 놀이 활동 개발)

  • Kim, Da-Young;Choi, Jihwan;Kim, Juhyun;Kim, Min-Gyu;Chung, Jae Hee;Seo, Kap-Ho;Lee, WonHyong
    • The Journal of Korea Robotics Society
    • /
    • v.17 no.4
    • /
    • pp.447-454
    • /
    • 2022
  • This paper presents psychomotorik-based play activities executed by a social robot at home, which support children's social and emotional development. Based on the theory and practice of psychomotorik therapy, the play activities were implemented in close collaboration between psychomotorik experts, service designers, and robotics engineers. The designed play activities are classified into four categories according to the main areas of child development. A robotic system that can express verbal and nonverbal behaviors was developed not only to play games with children but also to keep children continuously interested during the play activities. Finally, the psychomotorik-based play service scenario and the interactive robot system were validated by an expert group from the domain of child psychotherapy. The evaluation results showed that, from the experts' point of view, the play service and the robot system were appropriately developed for children.

A Preliminary Study for Emotional Expression of Software Robot -Development of Hangul Processing Technique for Inference of Emotional Words- (소프트웨어 로봇의 감성 표현을 위한 기반연구 - 감성어 추론을 위한 한글 처리 기술 개발 -)

  • Song, Bok-Hee;Yun, Han-Kyung
    • Proceedings of the Korea Contents Association Conference
    • /
    • 2012.05a
    • /
    • pp.3-4
    • /
    • 2012
  • User-centered man-machine interface technology has advanced considerably through the combination of user interface techniques and ergonomics, and this progress continues. Information is currently conveyed through sound, text, or images, but research on conveying information from an emotional perspective has not been very active. In particular, in the field of Human Computer Interaction, emotional research on conveying voice and facial expressions is still at an early stage; emoticons and flashcons are used to deliver emotion, but they remain unnatural and mechanical. This study is foundational work that enables a computer or application software to provide human-friendly interaction with the user through its own virtual agent (Software Robot, Sobot). We develop techniques for extracting, classifying, and processing emotional words in Hangul so that artificial emotion can be infused into the information the computer delivers, thereby improving users' emotional satisfaction.

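The abstract describes extracting and classifying emotional words from Hangul text so that a software robot (Sobot) can respond affectively. The sketch below shows one plausible lexicon-based approach; the tiny word list, the emotion categories, and the crude stemming are assumptions, not the paper's actual dictionary or algorithm.

```python
# Hypothetical sketch: lexicon-based extraction and classification of Korean
# emotional words. The lexicon and the regex tokenization are illustrative
# assumptions; a real system would rely on proper morphological analysis.
import re
from collections import Counter

EMOTION_LEXICON = {
    "기쁘다": "joy", "행복하다": "joy", "즐겁다": "joy",
    "슬프다": "sadness", "우울하다": "sadness",
    "화나다": "anger", "짜증나다": "anger",
    "무섭다": "fear", "놀랍다": "surprise",
}

def extract_emotional_words(text: str) -> Counter:
    """Count emotion categories whose lexicon entries appear (by stem) in the text."""
    counts: Counter = Counter()
    tokens = re.findall(r"[가-힣]+", text)  # keep only Hangul word runs
    for token in tokens:
        for word, emotion in EMOTION_LEXICON.items():
            stem = word[:-1]  # crude stem: drop the final '다' (assumption)
            if token.startswith(stem):
                counts[emotion] += 1
    return counts

print(extract_emotional_words("오늘은 정말 기쁘고 행복하지만 조금 무섭기도 하다"))
```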

A Study on Infra-Technology of RCP Interaction System

  • Kim, Seung-Woo;Choe, Jae-Il
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 2004.08a
    • /
    • pp.1121-1125
    • /
    • 2004
  • RT (Robot Technology) has been developed as a next-generation future technology. According to a 2002 technical report from the Mitsubishi R&D center, the fusion of IT (Information Technology) and RT will grow into a market five times larger than the current IT market by the year 2015. Moreover, a recent IEEE report predicts that most people will have a robot within the next ten years. The RCP (Robotic Cellular Phone), a CP (Cellular Phone) offering personal robot services, will be an intermediate high-tech personal machine between the one-CP-per-person and one-robot-per-person generations. The RCP infrastructure consists of RCP-Mobility, RCP-Interaction, and RCP-Integration technologies. For RCP-Mobility, human-friendly motion automation and personal services with walking and arm manipulation are developed. RCP-Interaction is achieved by modeling an emotion-generating engine, and RCP-Integration, which recognizes environmental and self conditions, is developed. By joining intelligent algorithms and the CP communication network with these three base modules, an RCP system is constructed. This paper focuses in particular on the RCP interaction system. RCP-Interaction (Robotic Cellular Phone for Interaction) is developed as an emotional-model CP, as shown in Figure 1. RCP-Interaction refers to the sensitivity expression and communication-link technology of the CP; it is an interface technology between the human and the CP based on various emotional models. The interactive emotion functions are designed through differing patterns of vibrator beat frequencies and a feeling system created by a smell-injection switching control. Just as music influences a person, one can feel a variety of emotions from the vibrator's beats by converting musical chord frequencies into vibrator beat frequencies. This paper presents the definition, basic theory, and experimental results of the RCP interaction system, and the experimental results confirm its good performance.

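The abstract's key interaction idea is converting musical chord frequencies into vibrator beat frequencies so the phone's vibration conveys emotion. Below is a minimal sketch of one such conversion; the octave-shifting rule, the 10-50 Hz target band, and the chord tables are assumptions, since the paper's exact mapping is not given here.

```python
# Hypothetical sketch: map musical chord note frequencies (Hz) down into a
# vibrator-friendly beat-frequency band by repeated octave division.
# The 10-50 Hz target band and the chord tables are illustrative assumptions.

VIB_MIN_HZ, VIB_MAX_HZ = 10.0, 50.0  # assumed usable vibration range

CHORDS = {
    "C_major": [261.63, 329.63, 392.00],  # C4, E4, G4
    "A_minor": [220.00, 261.63, 329.63],  # A3, C4, E4
}

def to_vibration_band(freq_hz: float) -> float:
    """Halve the frequency (drop octaves) until it falls within the vibration band."""
    while freq_hz > VIB_MAX_HZ:
        freq_hz /= 2.0
    return max(freq_hz, VIB_MIN_HZ)

def chord_to_beat_pattern(chord: str) -> list[float]:
    """Return one beat frequency per chord tone, preserving their ratios where possible."""
    return [round(to_vibration_band(f), 2) for f in CHORDS[chord]]

print(chord_to_beat_pattern("C_major"))  # e.g. [32.7, 41.2, 49.0]
```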

Is Robot Alive? : Young Children's Perception of a Teacher Assistant Robot in a Classroom (로봇은 살아 있을까? : 우리 반 교사보조로봇에 대한 유아의 인식)

  • Hyun, Eun-Ja;Son, Soo-Ryun
    • Korean Journal of Child Studies
    • /
    • v.32 no.4
    • /
    • pp.1-14
    • /
    • 2011
  • The purpose of this study was to investigate young children's perceptions of a teacher assistant robot, IrobiQ, in a kindergarten classroom. The subjects were 23 six-year-olds attending G kindergarten in E city, Korea, where the teacher assistant robot had been in operation since October 2008. Each child responded to questions assessing his or her perception of IrobiQ's identity in four domains: its biological, intellectual, emotional, and social identity. Some questions asked the child to affirm or deny certain characteristics of the robot, and the others asked for the reasons behind the answer given. The results indicated that while the majority of children considered IrobiQ not a biological entity but a machine, they thought it could have emotions and be their playmate. The implications of these results are twofold: first, they force us to reconsider the traditional ontological categories applied to intelligent service robots in order to understand human-robot interaction; second, they open up an ecological perspective on the design of teacher assistant robots for use with young children in early childhood education settings.