• Title/Abstract/Keyword: Robot Emotion

134 search results (processing time: 0.023 s)

Life-like Facial Expression of Mascot-Type Robot Based on Emotional Boundaries

  • 박정우;김우현;이원형;정명진
    • Journal of Korea Robotics Society / Vol. 4, No. 4 / pp.281-288 / 2009
  • Nowadays, many robots have evolved to imitate human social skills so that sociable interaction with humans is possible. Socially interactive robots require abilities different from those of conventional robots. For instance, human-robot interactions are accompanied by emotion, much like human-human interactions, so emotional expression is very important for robots. This is particularly true of facial expressions, which play an important role among non-verbal forms of communication. In this paper, we introduce a method for creating lifelike facial expressions in robots using the variation of the affect values that constitute the robot's emotions, based on emotional boundaries. The proposed method was examined in experiments with two facial robot simulators.

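The boundary idea in the abstract above, where continuously varying affect values cross emotional boundaries to select the displayed expression, can be sketched as follows. The affect dimensions (valence, arousal), the thresholds, and the labels are illustrative assumptions, not values from the paper:

```python
# Hypothetical sketch of emotional boundaries: the robot's affect state
# (valence, arousal) varies continuously, and boundary lines in the affect
# plane decide which discrete facial expression is shown.

def expression_from_affect(valence, arousal):
    """Map a 2-D affect value to a discrete expression label."""
    if arousal > 0.5:
        return "happy" if valence > 0 else "angry"
    if arousal < -0.5:
        return "sad" if valence < 0 else "relaxed"
    return "neutral"  # affect values inside the central boundary region

def blend_weight(value, boundary, width=0.2):
    """Soft weight near a boundary so expressions transition smoothly
    instead of snapping, which is what makes the face look lifelike."""
    d = (value - boundary) / width
    return min(1.0, max(0.0, 0.5 + 0.5 * d))
```

Near a boundary, `blend_weight` gives an intermediate value that can mix two expressions, so small affect variations produce visible but gradual facial change.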

Emotional Interface Technologies for Service Robot

  • 양현승;서용호;정일웅;한태우;노동현
    • Journal of Korea Robotics Society / Vol. 1, No. 1 / pp.58-65 / 2006
  • The emotional interface is an essential technology for a robot to provide proper service to the user. In this research, we developed emotional components for the service robot: a neural-network-based facial expression recognizer, emotion expression technologies based on 3D graphical facial expression and joint movements that consider the user's reaction, and behavior selection technology for emotion expression. We used our humanoid robots AMI and AMIET as test-beds for the emotional interface, and studied the emotional interaction between a service robot and a user by integrating the developed technologies. The emotional interface technology enhances the friendliness of interaction with the service robot. It can also increase the diversity of services and the added value of the robot for humans, promote market growth, and contribute to the popularization of robots.


Intelligent Countenance Robot, Humanoid ICHR

  • 변상준
    • Proceedings of the KIEE Conference / 2006 KIEE Conference (Junior College Education Committee) / pp.175-180 / 2006
  • In this paper, we develop a type of humanoid robot which can express its emotion in response to human actions. To interact with humans, the developed robot has several abilities for expressing its emotion: verbal communication with humans through voice/image recognition, motion tracking, and facial expression using fourteen servo motors. The proposed humanoid robot system consists of a control board designed with an AVR90S8535 to control the servo motors, a framework equipped with the fourteen servo motors and two CCD cameras, and a personal computer to monitor its operation. The results of this research show that our intelligent emotional humanoid robot is intuitive and friendly, so humans can interact with it very easily.

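The fourteen-servo face above is driven from an AVR control board. As background, hobby servos are typically commanded with a 50 Hz PWM signal whose pulse width encodes the joint angle; this small sketch maps an angle to a pulse width using the common 1.0-2.0 ms convention, a generic assumption rather than a detail taken from the paper's firmware:

```python
# Generic hobby-servo convention (assumed, not from the paper): a pulse of
# roughly 1000 us holds the horn at 0 degrees and 2000 us at full travel,
# repeated every 20 ms (50 Hz).

def angle_to_pulse_us(angle_deg, min_us=1000, max_us=2000, max_deg=180):
    """Map a joint angle in degrees to a servo pulse width in microseconds,
    clamping the command to the servo's mechanical range."""
    angle_deg = max(0.0, min(max_deg, angle_deg))
    return min_us + (max_us - min_us) * angle_deg / max_deg
```

On an AVR, a value like this would typically be loaded into a timer compare register; the conversion itself is the same regardless of the microcontroller.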

Mechanism Design of the Interactive Emotional Robot

  • 김연훈;윤석준;이동연;곽윤근
    • Proceedings of the Korean Society for Emotion and Sensibility Conference / Fall Conference 2001 / pp.233-238 / 2001
  • The mechanism design of the interactive emotional robot has been carried out. A two-wheeled inverted-pendulum mechanism was adopted to improve mobility and to produce an innately clumsy, uniaxial bicycle-like motion. Even though the system is unstable in itself, the robot is expected to move freely in a plane, keeping an upright position on only two wheels. The two motors attached to the head can produce 4 motion sets and the two wheel motors can produce 8, so 32 independent motion sets are available for the robot to communicate emotions with humans. The equations of motion of the robot were derived based on nonholonomic dynamics, and the power required at the wheels' rotational axes was found by simulation.

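The motion-set count in the abstract above is a simple product: 4 head motion sets combined independently with 8 wheel motion sets give 4 x 8 = 32. The labels below are illustrative placeholders, not the paper's motion vocabulary:

```python
from itertools import product

# 4 motion sets from the two head motors, 8 from the two wheel motors
# (labels are assumptions for illustration).
head_motions = ["nod", "shake", "tilt-left", "tilt-right"]
wheel_motions = ["stop", "forward", "backward", "spin-left",
                 "spin-right", "arc-left", "arc-right", "rock"]

# Every (head, wheel) pair is one expressive motion of the whole robot.
combined = list(product(head_motions, wheel_motions))
assert len(combined) == 32
```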

Emotional Robotics based on iT_Media

  • Yoon, Joong-Sun;Yoh, Myeung-Sook;Cho, Bong-Kug
    • Proceedings of the Institute of Control, Robotics and Systems Conference / ICCAS 2004 / pp.804-809 / 2004
  • Intelligence is thought to be related to interaction rather than to deep but passive thinking. An interactive tangible medium, "iT_Media", is proposed to explore these issues, and personal robotics is a major area in which to investigate these ideas. A new design methodology for personal and emotional robotics is proposed. The sciences of the artificial and of intelligence are reviewed: a short history of artificial intelligence is presented in terms of logic, heuristics, and mobility; a science of intelligence is presented in terms of imitation and understanding; and intelligence issues and measures for robotics are described. A design methodology for personal robots based on the science of emotion is then investigated across three aspects of design: visceral, behavioral, and reflective. We also discuss affect and emotion in robots, robots that sense emotion, robots that induce emotion in people, and the implications and ethical issues of emotional robots. Personal robotics for the elderly seems to be a major area in which to explore these ideas.


Speech Emotion Recognition on a Simulated Intelligent Robot

  • 장광동;김남;권오욱
    • MALSORI (Journal of the Korean Society of Phonetic Sciences) / No. 56 / pp.173-183 / 2005
  • We propose a speech emotion recognition method for an affective human-robot interface. In the proposed method, emotion is classified into 6 classes: angry, bored, happy, neutral, sad, and surprised. Features for an input utterance are extracted from statistics of phonetic and prosodic information. Phonetic information includes log energy, shimmer, formant frequencies, and Teager energy; prosodic information includes pitch, jitter, duration, and rate of speech. Finally, a pattern classifier based on Gaussian support vector machines decides the emotion class of the utterance. We recorded speech commands and dialogs uttered 2 m away from the microphones in 5 different directions. Experimental results show that the proposed method yields 48% classification accuracy, while human classifiers achieve 71%.

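The feature scheme above summarizes frame-level phonetic and prosodic tracks into per-utterance statistics before classification. The paper classifies with Gaussian-kernel SVMs; as a dependency-free stand-in, this sketch pairs the same statistics idea with a nearest-centroid rule, so the classifier here is an assumption, not the paper's model:

```python
import statistics

def utterance_features(pitch_track, energy_track):
    """Summarize frame-level pitch and energy tracks into a fixed-length
    feature vector of per-utterance statistics (mean and spread), in the
    spirit of the prosodic/phonetic statistics the paper describes."""
    return [
        statistics.mean(pitch_track), statistics.pstdev(pitch_track),
        statistics.mean(energy_track), statistics.pstdev(energy_track),
    ]

def nearest_centroid(feature, centroids):
    """Pick the emotion whose class centroid is closest in feature space.
    The paper uses Gaussian SVMs instead; this is a simpler stand-in."""
    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(feature, centroids[label]))
    return min(centroids, key=dist)
```

A real system would add the remaining features (shimmer, jitter, formants, Teager energy, duration, speech rate) and train the SVM on labeled utterances.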

The Representation of Emotion in RIDE

  • 전성택;한재일
    • Proceedings of the Korea Contents Association Conference / Fall Conference 2004 / pp.489-492 / 2004
  • RIDE (Robot Intelligence with Digital Emotion), proposed as a technique for expressing the emotions of artificial intelligent agents, is a model that accommodates both the James-Lange and Cannon-Bard theories and incorporates an emotional memory that enables the associative effects of emotion raised in the Schachter-Singer theory. It also defines similarity and opposition between emotions so that emotional exchange between artificial agents is possible.

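The abstract above says RIDE defines similarity and opposition between emotions. One common way to realize such relations, not necessarily RIDE's, is a circumplex layout: each emotion gets an angle, similarity is the cosine of the angular difference, and an emotion's opposite sits 180 degrees away. The angles below are assumptions for illustration:

```python
import math

# Hypothetical circumplex placement of six basic emotions (degrees).
ANGLES = {"joy": 0, "surprise": 60, "fear": 120,
          "sadness": 180, "disgust": 240, "anger": 300}

def similarity(e1, e2):
    """Cosine similarity of two emotions' positions on the circle:
    1.0 for identical, 0.0 for orthogonal, -1.0 for opposite."""
    return math.cos(math.radians(ANGLES[e1] - ANGLES[e2]))

def opposite(e):
    """The emotion closest to 180 degrees away from e."""
    target = (ANGLES[e] + 180) % 360
    return min(ANGLES, key=lambda k: abs(ANGLES[k] - target))
```

Relations like these let one agent interpret or mirror another agent's emotion, which is the kind of inter-agent emotional exchange the abstract mentions.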

Speech Emotion Recognition by Speech Signals on a Simulated Intelligent Robot

  • 장광동;권오욱
    • Proceedings of the Korean Society of Phonetic Sciences Conference / Fall Conference 2005 / pp.163-166 / 2005
  • We propose a speech emotion recognition method for a natural human-robot interface. In the proposed method, emotion is classified into 6 classes: angry, bored, happy, neutral, sad, and surprised. Features for an input utterance are extracted from statistics of phonetic and prosodic information. Phonetic information includes log energy, shimmer, formant frequencies, and Teager energy; prosodic information includes pitch, jitter, duration, and rate of speech. Finally, a pattern classifier based on Gaussian support vector machines decides the emotion class of the utterance. We recorded speech commands and dialogs uttered 2 m away from the microphones in 5 different directions. Experimental results show that the proposed method yields 59% classification accuracy while human classifiers achieve about 50%, which confirms that the proposed method achieves performance comparable to a human.


Development of an Emotion Recognition Robot Using a Vision Method

  • 신영근;박상성;김정년;서광규;장동식
    • IE Interfaces / Vol. 19, No. 3 / pp.174-180 / 2006
  • This paper deals with a robot system that recognizes a human's expression from a detected face and then displays that emotion. The face detection method is as follows. First, the RGB color space is converted to the CIELab color space. Second, skin-candidate regions are extracted. Third, a face is detected from the geometrical relations of facial features using a face filter. The positions of the eyebrows, eyes, and mouth are then located as the basic data for expression recognition, and changes in these features are sent to the robot through serial communication. The robot operates its installed motors to display the human's expression. Experimental results on 10 persons show 78.15% accuracy.
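The first pipeline step above, converting RGB to CIELab before skin extraction, uses standard colorimetry. This sketch implements the usual sRGB-to-Lab formulas (D65 white point); the skin-candidate thresholds at the end are illustrative assumptions, since the paper's actual bounds are not given in the abstract:

```python
def srgb_to_lab(r, g, b):
    """Convert an 8-bit sRGB color to CIELab using the standard
    sRGB -> XYZ (D65) -> Lab formulas."""
    def lin(c):  # undo sRGB gamma
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = lin(r), lin(g), lin(b)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    def f(t):  # CIE Lab companding
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def is_skin_candidate(L, a, b):
    """Illustrative Lab-space skin test; the thresholds are assumptions,
    not the paper's values."""
    return 20 < L < 90 and 5 < a < 30 and 5 < b < 35
```

Working in Lab separates lightness (L) from chromaticity (a, b), which makes skin thresholds more robust to illumination changes than thresholds in raw RGB.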

Human-Sensitive Motion Interpretation of Emotional Robot "Rai"

  • 김연훈;이동연;김병수;곽윤근
    • Proceedings of the Korean Society for Emotion and Sensibility Conference / Spring Conference 2002 / pp.327-332 / 2002
  • We made a human-sensitive motion interpretation for the interactive emotional robot "Rai", whose mechanism design has been carried out and completed. The kinematic system of this emotional robot mainly consists of a body and a head. The body contains the control units, the communication modules, and the two wheels and driving motors, which produce motions like those of an inverted pendulum. The robot system is designed around the concept of human-friendly motion and reaction with humans in living-room and office environments, so various scenarios were constructed to enable emotional expression in those places. In particular, we interpreted the technically possible motions while accommodating the constructed scenarios, and performed experiments to confirm the feasibility of the motion interpretation.
