• Title/Summary/Keyword: Robot Gestures

An Emotional Gesture-based Dialogue Management System using Behavior Network (행동 네트워크를 이용한 감정형 제스처 기반 대화 관리 시스템)

  • Yoon, Jong-Won;Lim, Sung-Soo;Cho, Sung-Bae
    • Journal of KIISE: Software and Applications / v.37 no.10 / pp.779-787 / 2010
  • Since robots have recently come into wide use, research on human-robot communication is actively in progress. Typically, natural language processing or gesture generation has been applied to human-robot interaction. However, existing methods for communication between robots and humans are limited to static communication, so a method for more natural and realistic interaction is required. In this paper, an emotional gesture-based dialogue management system is proposed for sophisticated human-robot communication. The proposed system performs communication by using Bayesian networks and pattern matching, and generates the robot's emotional gestures in real time while the user communicates with it. Through emotional gestures, the robot can communicate with the user more efficiently and realistically. We used behavior networks as the gesture generation method to deal with dialogue situations that change dynamically. Finally, we designed a usability test to confirm the usefulness of the proposed system by comparing it with an existing dialogue system.
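The behavior-network gesture selection described above can be sketched roughly as follows; the behavior names, preconditions, and activation values are illustrative assumptions, not the paper's actual network:

```python
# Minimal behavior-network-style gesture selector (toy values, not the
# paper's trained model): each behavior fires only when its preconditions
# hold in the current dialogue state, and the most activated one wins.

class Behavior:
    def __init__(self, name, preconditions, activation):
        self.name = name                    # gesture this behavior produces
        self.preconditions = preconditions  # dialogue-state facts required
        self.activation = activation        # spreading-activation level

def select_gesture(behaviors, dialogue_state):
    """Pick the executable behavior (preconditions met) with highest activation."""
    executable = [b for b in behaviors if b.preconditions <= dialogue_state]
    if not executable:
        return None
    return max(executable, key=lambda b: b.activation).name

behaviors = [
    Behavior("nod", {"user_speaking"}, 0.6),
    Behavior("wave", {"user_greeting"}, 0.9),
    Behavior("shrug", {"unknown_intent"}, 0.4),
]
print(select_gesture(behaviors, {"user_greeting", "user_speaking"}))  # -> wave
```

In a full behavior network the activation values would also spread between behaviors as the dialogue evolves; here they are fixed for brevity.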

Recognition of Hand gesture to Human-Computer Interaction (손 동작을 통한 인간과 컴퓨터간의 상호 작용)

  • Lee, Lae-Kyoung;Kim, Sung-Shin
    • Proceedings of the KIEE Conference / 2000.07d / pp.2930-2932 / 2000
  • In this paper, a robust gesture recognition system is designed and implemented to explore communication methods between human and computer. Hand gestures in the proposed approach are used to communicate with a computer for actions with a high degree of freedom. The user does not need to wear any cumbersome devices like cyber-gloves. No assumption is made about whether the user is wearing any ornaments or whether the user is using left- or right-hand gestures. Image segmentation based upon skin color and a shape analysis based upon invariant moments are combined. The features are extracted and used as input vectors to a radial basis function network (RBFN). Our "Puppy" robot is employed as a testbed. Preliminary results on a set of gestures show recognition rates of about 87% in a real-time implementation.
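The classification stage above, invariant-moment features fed to an RBFN, can be sketched as a toy forward pass; the centers, weights, and two-dimensional features are made-up values, not the trained network from the paper:

```python
import math

# Toy RBF network forward pass: a Gaussian hidden layer over prototype
# centers followed by a linear output layer, one score per gesture class.
# All numbers below are illustrative assumptions.

def rbf_forward(x, centers, weights, sigma=1.0):
    """Return per-class scores for feature vector x."""
    hidden = [math.exp(-sum((xi - ci) ** 2 for xi, ci in zip(x, c))
                       / (2 * sigma ** 2))
              for c in centers]
    return [sum(w * h for w, h in zip(class_w, hidden)) for class_w in weights]

centers = [(0.0, 0.0), (1.0, 1.0)]   # prototype invariant-moment vectors
weights = [(1.0, 0.0), (0.0, 1.0)]   # output weights, one row per class

scores = rbf_forward((0.9, 1.1), centers, weights)
print(scores.index(max(scores)))     # -> 1: input lies near the second prototype
```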

Life-like Facial Expression of Mascot-Type Robot Based on Emotional Boundaries (감정 경계를 이용한 로봇의 생동감 있는 얼굴 표정 구현)

  • Park, Jeong-Woo;Kim, Woo-Hyun;Lee, Won-Hyong;Chung, Myung-Jin
    • The Journal of Korea Robotics Society / v.4 no.4 / pp.281-288 / 2009
  • Nowadays, many robots have evolved to imitate human social skills so that sociable interaction with humans is possible. Socially interactive robots require abilities different from those of conventional robots. For instance, human-robot interactions are accompanied by emotion, similar to human-human interactions. A robot's emotional expression is thus very important for humans. This is particularly true for facial expressions, which play an important role among non-verbal forms of communication. In this paper, we introduce a method of creating lifelike facial expressions in robots using variations of the affect values that constitute the robot's emotions, based on emotional boundaries. The proposed method was examined in experiments with two facial-robot simulators.
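The idea of varying affect values within emotional boundaries can be illustrated with a toy update rule; the boundary ranges, emotion labels, and drift rate below are assumptions for the sketch, not the paper's model:

```python
# Toy "emotional boundary" update: the affect value drifts smoothly toward
# the target emotion's center but is clamped inside that emotion's boundary,
# so expression changes look gradual rather than abrupt. Illustrative values.

BOUNDS = {"neutral": (-0.2, 0.2), "happy": (0.2, 1.0), "sad": (-1.0, -0.2)}

def step_affect(value, target_emotion, rate=0.3):
    lo, hi = BOUNDS[target_emotion]
    target = (lo + hi) / 2.0
    value += rate * (target - value)   # smooth drift, not a jump
    return min(max(value, lo), hi)     # clamp to the emotion's boundary

v = 0.0
for _ in range(5):
    v = step_affect(v, "happy")
print(round(v, 2))
```

The clamped value would then drive the face actuators (or a simulator) each frame.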

Human Robot Interaction Using Face Direction Gestures

  • Kwon, Dong-Soo;Bang, Hyo-Choong
    • Institute of Control, Robotics and Systems (ICROS) Conference Proceedings / 2001.10a / pp.171.4-171 / 2001
  • This paper proposes a method of human-robot interaction (HRI) using face-direction gestures. A single CCD color camera is used to capture the face region, and the robot recognizes the face-direction gesture based on the facial features' positions. One can give commands such as stop, go, turn left, and turn right to the robot using face-direction gestures. Since the robot also has ultrasonic sensors, it can detect obstacles and determine a safe direction at the current position. By combining the user's command with the sensed obstacle configuration, the robot selects a safe and efficient motion direction. Simulation results show that HRI makes the robot's navigation more reliable.
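The fusion of the user's face-direction command with the sonar-detected obstacle configuration can be sketched as follows; the candidate headings and the clearance threshold are illustrative assumptions, not the paper's parameters:

```python
# Hedged sketch of command/obstacle fusion: among obstacle-free headings,
# choose the one closest to the face-direction command; stop if none is safe.

def choose_heading(command_deg, sonar, min_clear=0.5):
    """sonar: {heading_deg: clearance_m}; return the safe heading nearest the command."""
    safe = [h for h, d in sonar.items() if d >= min_clear]
    if not safe:
        return None  # stop: no safe direction at the current position
    return min(safe, key=lambda h: abs(h - command_deg))

# Ultrasonic readings: forward and slightly left are blocked.
sonar = {-90: 2.0, -45: 0.3, 0: 0.2, 45: 1.5, 90: 2.0}
print(choose_heading(0, sonar))   # -> 45: forward blocked, veer slightly right
```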

Hand Gesture Recognition Using an Infrared Proximity Sensor Array

  • Batchuluun, Ganbayar;Odgerel, Bayanmunkh;Lee, Chang Hoon
    • International Journal of Fuzzy Logic and Intelligent Systems / v.15 no.3 / pp.186-191 / 2015
  • The hand gesture is the most common tool used to interact with and control various electronic devices. In this paper, we propose a novel hand gesture recognition method using fuzzy-logic-based classification with a new type of sensor array. In some cases, the feature patterns of hand gesture signals cannot be uniquely distinguished and recognized when people perform the same gesture in different ways. Moreover, differences in hand shape and in the skeletal articulation of the arm influence the process. Manifold features were extracted, and efficient features, which make gestures distinguishable, were selected. However, similar feature patterns exist across different hand gestures, and fuzzy logic is applied to classify them. Fuzzy rules are defined based on the many feature patterns of the input signal. An adaptive neuro-fuzzy inference system was used to generate fuzzy rules automatically for classifying hand gestures using a low number of feature patterns as input. In addition, emotion expression was conducted after hand gesture recognition for the resultant human-robot interaction. Our proposed method was tested with many hand gesture datasets and validated with different evaluation metrics. Experimental results show that our method detects more hand gestures than other existing methods, with robust hand gesture recognition and corresponding emotion expressions, in real time.
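The fuzzy classification of overlapping feature patterns can be illustrated with a toy rule base; the membership functions, feature scale, and gesture names are invented for this sketch, whereas the paper learns its rules automatically with ANFIS:

```python
# Toy fuzzy-rule classifier over one scalar proximity-sensor feature:
# each gesture has a triangular membership function, and the gesture with
# the highest membership wins. All shapes and labels are assumptions.

def tri(x, a, b, c):
    """Triangular membership: rises from a to b, falls from b to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

RULES = {  # gesture -> membership function over the feature value
    "swipe_left":  lambda x: tri(x, 0.0, 0.2, 0.5),
    "swipe_right": lambda x: tri(x, 0.3, 0.6, 0.9),
    "push":        lambda x: tri(x, 0.7, 1.0, 1.3),
}

def classify(x):
    return max(RULES, key=lambda g: RULES[g](x))

print(classify(0.55))   # -> swipe_right
```

Note how 0.55 falls in the overlap of two triangles; the fuzzy memberships still rank one gesture above the other, which is the point of applying fuzzy logic to ambiguous patterns.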

Design of a Humanoid Robot Hand by Mimicking Human Hand's Motion and Appearance (인간손의 동작과 모양을 모방한 휴머노이드 로봇손 설계)

  • Ahn, Sang-Ik;Oh, Yong-Hwan;Kwon, Sang-Joo
    • Journal of Institute of Control, Robotics and Systems / v.14 no.1 / pp.62-69 / 2008
  • A specialized anthropomorphic robot hand, which can be attached to the biped humanoid robot MAHRU-R at KIST, has been developed. This built-in hand consists of three fingers and a thumb with four DOF (degrees of freedom) in total, and the finger mechanism is designed for stably grasping objects typical of humans' daily activities, such as sphere- and cylinder-shaped objects. The restriction of possible motions and the limitation on graspable objects arising from the reduced DOF are overcome by reflecting a typical human finger's motion profile in the design procedure. As a result, the developed hand imitates not only the human hand's shape but also its motion in a compact and efficient manner. This novel robot hand can also perform various human hand gestures naturally and grasp ordinary objects with both power- and precision-grasping capability.

Behavior-classification of Human Using Fuzzy-classifier (퍼지분류기를 이용한 인간의 행동분류)

  • Kim, Jin-Kyu;Joo, Young-Hoon
    • The Transactions of The Korean Institute of Electrical Engineers / v.59 no.12 / pp.2314-2318 / 2010
  • For human-robot interaction, a robot should recognize the meaning of human behavior. In the case of static behaviors such as facial expressions and sign language, the information contained in a single image is sufficient to deliver the meaning to the robot. In the case of dynamic behaviors such as gestures, however, the information of sequential images is required. This paper proposes behavior classification using a fuzzy classifier to deliver the meaning of dynamic behaviors to the robot. The proposed method extracts feature points from input images with a skeleton model, generates a vector space from a differential image of the extracted feature points, and uses this information as the learning data for the fuzzy classifier. Finally, we show the effectiveness and feasibility of the proposed method through experiments.
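The differential step above, turning successive skeleton feature points into motion vectors for the classifier, can be sketched as follows; the joint names and coordinates are illustrative assumptions:

```python
# Sketch of differential features: successive frames of skeleton keypoints
# are differenced into per-joint displacement vectors, which form the
# vector space fed to the classifier. Joints are assumed identical per frame.

def motion_features(frames):
    """frames: list of {joint: (x, y)}; return per-step displacement vectors."""
    feats = []
    for prev, curr in zip(frames, frames[1:]):
        vec = []
        for joint in sorted(curr):          # fixed joint order per vector
            dx = curr[joint][0] - prev[joint][0]
            dy = curr[joint][1] - prev[joint][1]
            vec.extend((dx, dy))
        feats.append(vec)
    return feats

frames = [{"hand": (0, 0), "elbow": (0, 1)},
          {"hand": (2, 0), "elbow": (0, 1)}]
print(motion_features(frames))   # hand moved +2 in x, elbow stayed still
```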

Agent Mobility in Human Robot Interaction

  • Nguyen, To Dong;Oh, Sang-Rok;You, Bum-Jae
    • Proceedings of the KIEE Conference / 2005.07d / pp.2771-2773 / 2005
  • In networked human-robot interaction, humans can access the services of a robot system through the network. Communication is done by interacting with the distributed sensors via voice or gestures, or by using a user network access device such as a computer or PDA. Service organization and exploration are very important for this distributed system. In this paper we propose a new agent-based framework to integrate the partners of this distributed system and help users explore the services effectively without complicated configuration. Our system consists of several robots, users, and distributed sensors. These partners are connected in a decentralized system with centralized control using agent-based technology. Several experiments were conducted successfully using our framework. The experiments show that this framework increases the availability of the system and reduces the time users and robots need to be connected to the network at the same time. The framework also provides coordination methods for the human-robot interaction system.

Functions and Driving Mechanisms for Face Robot Buddy (얼굴로봇 Buddy의 기능 및 구동 메커니즘)

  • Oh, Kyung-Geune;Jang, Myong-Soo;Kim, Seung-Jong;Park, Shin-Suk
    • The Journal of Korea Robotics Society / v.3 no.4 / pp.270-277 / 2008
  • The development of a face robot basically targets very natural human-robot interaction (HRI), especially emotional interaction, and so does the face robot introduced in this paper, named Buddy. Since Buddy was developed for a mobile service robot, it does not have a life-like face such as a human's or an animal's, but a typically robot-like face with hard skin, which may be suitable for mass production. In addition, its structure and mechanism should be simple and its production cost low enough. This paper introduces the mechanisms and functions of the mobile face robot Buddy, which can take on natural and precise facial expressions and make dynamic gestures driven by one laptop PC. Buddy can also perform lip-sync, eye contact, and face tracking for lifelike interaction. By adopting a customized emotional reaction decision model, Buddy can create its own personality, emotion, and motives using various sensor data inputs. Based on this model, Buddy can interact properly with users and perform real-time learning using personality factors. The interaction performance of Buddy is successfully demonstrated by experiments and simulations.

Gesture Interface for Controlling Intelligent Humanoid Robot (지능형 로봇 제어를 위한 제스처 인터페이스)

  • Bae, Ki-Tae;Kim, Man-Jin;Lee, Chil-Woo;Oh, Jae-Yong
    • Journal of Korea Multimedia Society / v.8 no.10 / pp.1337-1346 / 2005
  • In this paper, we describe an algorithm that can automatically recognize human gestures for human-robot interaction. In early works, many systems for recognizing human gestures worked only under restricted conditions. To eliminate these restrictions, we have proposed APM, a method that can represent 3D and 2D gesture information simultaneously and is less sensitive to noise or appearance characteristics. First, feature vectors are extracted using APM. The next step constructs a gesture space by analyzing the statistical information of the training images with PCA. Then, input images are compared to the model and individually symbolized to one portion of the model space. In the last step, the symbolized images are recognized with an HMM as one of the model gestures. The experimental results indicate that the proposed algorithm is efficient for gesture recognition and very convenient to apply to humanoid robots or intelligent interface systems.
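The symbolization step above, projecting features into the PCA gesture space and assigning the nearest model symbol before HMM recognition, can be sketched with toy values; the mean, basis, and prototypes below are assumptions, not the trained model:

```python
# Sketch of PCA-space symbolization: center a feature vector, project it
# onto the principal components, and emit the symbol of the nearest
# prototype in that space (the symbol sequence then feeds an HMM).

def project(x, mean, components):
    """Project x onto the principal components after centering by mean."""
    centered = [xi - mi for xi, mi in zip(x, mean)]
    return [sum(ci * xi for ci, xi in zip(comp, centered))
            for comp in components]

def symbolize(x, mean, components, prototypes):
    """Return the symbol whose prototype is nearest to x in gesture space."""
    z = project(x, mean, components)
    return min(prototypes,
               key=lambda s: sum((a - b) ** 2 for a, b in zip(prototypes[s], z)))

mean = [0.5, 0.5]
components = [[1.0, 0.0]]                    # single principal axis (toy basis)
prototypes = {"raise": [0.4], "lower": [-0.4]}
print(symbolize([1.0, 0.5], mean, components, prototypes))   # -> raise
```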
