Title/Summary/Keyword: HRI (Human-Robot Interaction)

Search results: 77

Analysis on Psychological and Educational Effects in Children and Home Robot Interaction (아동과 홈 로봇의 심리적.교육적 상호작용 분석)

  • Kim, Byung-Jun; Han, Jeong-Hye
    • Journal of The Korean Association of Information Education / v.9 no.3 / pp.501-510 / 2005
  • To facilitate interaction between home robots and humans, in-depth research on Human-Robot Interaction (HRI) is urgently needed. The purpose of this study was to examine how children interacted with a newly developed home robot named 'iRobi', in order to identify how the home robot affected their psychology and how effective learning through it was. Concerning the psychological effects, the children became familiar with the robot, found it possible to interact with it, and lost their initial anxiety. As to the learning effect, the group that studied with the home robot outperformed the groups using other learning media (books, WBI) in attention, learning interest, and academic achievement. Accordingly, the home robot could serve as a successful vehicle for the psychological and educational interaction of children.

Face Recognition Using Tensor Subspace Analysis in Robot Environments (로봇 환경에서 텐서 부공간 분석기법을 이용한 얼굴인식)

  • Kim, Sung-Suk; Kwak, Keun-Chang
    • The Journal of Korea Robotics Society / v.3 no.4 / pp.300-307 / 2008
  • This paper is concerned with face recognition for human-robot interaction (HRI) in robot environments. For this purpose, we use Tensor Subspace Analysis (TSA) to recognize the user's face through the robot camera while the robot performs various services in home environments. In this way, the spatial correlation between the pixels of an image can be naturally characterized by TSA. Here we utilize a face database collected in the u-robot test bed environment at ETRI. The presented method can serve as a core technique for HRI that enables natural interaction between humans and robots in home robot applications. The experimental results on the face database show that the presented method performs well in comparison with well-known methods such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) in distance-varying environments.

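
As a rough sketch of the tensor-subspace idea above: TSA keeps each face image as a matrix A and projects it bilinearly, Y = UᵀAV, before nearest-neighbor matching. The tiny matrices below are illustrative stand-ins, not projections actually learned by TSA:

```python
# Sketch: classify a face "image" by a bilinear projection Y = U^T A V
# followed by nearest-neighbor matching, as TSA-style methods do.
# U and V here are toy fixed matrices, not learned by TSA.

def matmul(A, B):
    """Plain-Python matrix multiply."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def project(A, U, V):
    """Bilinear projection U^T A V of image matrix A."""
    return matmul(matmul(transpose(U), A), V)

def distance(Y1, Y2):
    """Squared Frobenius distance between two projected images."""
    return sum((a - b) ** 2 for r1, r2 in zip(Y1, Y2) for a, b in zip(r1, r2))

def nearest(query, gallery):
    """Label of the gallery image closest to the query in the subspace."""
    Yq = project(query, U, V)
    return min(gallery, key=lambda item: distance(Yq, project(item[1], U, V)))[0]

# Toy 3x3 "images" and toy 3x2 projection matrices (assumed, not learned).
U = [[1, 0], [0, 1], [0, 0]]
V = [[1, 0], [0, 1], [0, 0]]
gallery = [("alice", [[9, 1, 0], [1, 9, 0], [0, 0, 1]]),
           ("bob",   [[0, 9, 1], [9, 0, 1], [1, 1, 0]])]
query = [[8, 2, 0], [2, 8, 1], [0, 1, 1]]
print(nearest(query, gallery))  # prints "alice"
```

Because the projection acts on the image matrix directly, rows and columns keep their spatial roles, which is the correlation-preserving property the abstract refers to.
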
Safe and Reliable Intelligent Wheelchair Robot with Human Robot Interaction

  • Hyuk, Moon-In; Hyun, Joung-Sang; Kwang, Kum-Young
    • ICROS (Institute of Control, Robotics and Systems) Conference Proceedings / 2001.10a / pp.120.1-120 / 2001
  • This paper proposes a prototype of a safe and reliable wheelchair robot with Human Robot Interaction (HRI). Since wheelchair users are usually disabled, the wheelchair robot must guarantee safe and reliable motion while considering the user's intention. A single color CCD camera is mounted to input the user's commands through human-friendly gestures, and an ultrasonic sensor array is used to sense the external environment. We use face and hand directional gestures as the user's commands. By combining the user's command with the sensed environment configuration, the planner of the wheelchair robot selects an optimal motion. We implement a prototype wheelchair robot, MR. HURI (Mobile Robot with Human Robot Interaction) ...

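
The planner's combination of gesture command and sensed environment might look like the rule-based arbitration below. The safety threshold and sensor layout are illustrative assumptions, not values from the paper:

```python
# Sketch: accept the user's gesture command only if the ultrasonic
# array reports enough clearance in that direction; otherwise fall
# back to a safe alternative or stop. Threshold is assumed.

SAFE_DISTANCE_CM = 60  # assumed clearance threshold

def plan_motion(gesture_command, sonar_cm):
    """gesture_command: 'forward' | 'left' | 'right'
    sonar_cm: direction -> measured distance in centimeters."""
    if sonar_cm.get(gesture_command, 0) >= SAFE_DISTANCE_CM:
        return gesture_command  # the user's intent is safe to execute
    # Otherwise pick the first clear direction, or stop if none is clear.
    clear = [d for d, dist in sonar_cm.items() if dist >= SAFE_DISTANCE_CM]
    return clear[0] if clear else "stop"

print(plan_motion("forward", {"forward": 30, "left": 120, "right": 90}))  # "left"
```

A real planner would weigh alternatives by cost rather than taking the first clear one, but the safety-overrides-intent structure is the same.
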
A Robust Fingertip Extraction and Extended CAMSHIFT based Hand Gesture Recognition for Natural Human-like Human-Robot Interaction (강인한 손가락 끝 추출과 확장된 CAMSHIFT 알고리즘을 이용한 자연스러운 Human-Robot Interaction을 위한 손동작 인식)

  • Lee, Lae-Kyoung; An, Su-Yong; Oh, Se-Young
    • Journal of Institute of Control, Robotics and Systems / v.18 no.4 / pp.328-336 / 2012
  • In this paper, we propose a robust fingertip extraction and extended Continuously Adaptive Mean Shift (CAMSHIFT) based hand gesture recognition method for natural, human-like HRI (Human-Robot Interaction). Firstly, for efficient and rapid hand detection, hand candidate regions are segmented by combining a robust $YC_bC_r$ skin color model with Haar-like feature based AdaBoost detection. Using the extracted hand candidate regions, we estimate the palm region and fingertip position from distance-transformation-based voting and the geometrical features of hands. From the hand orientation and palm center position, we find the optimal fingertip position and its orientation. Then, using the extended CAMSHIFT, we reliably track the 2D hand gesture trajectory with the extracted fingertip. Finally, we apply conditional density propagation (CONDENSATION) to recognize pre-defined temporal motion trajectories. Experimental results show that the proposed algorithm not only rapidly extracts the hand region, with an accurately extracted fingertip and its angle, but also robustly tracks the hand under different illumination, size, and rotation conditions. Using these results, we successfully recognize multiple hand gestures.
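
The first stage, $YC_bC_r$ skin-color segmentation, can be sketched per pixel as below. The RGB-to-YCbCr conversion follows the standard ITU-R BT.601 formulas; the Cb/Cr skin ranges are commonly used values, assumed rather than taken from the paper:

```python
# Sketch: classify a pixel as skin by converting RGB to YCbCr
# (ITU-R BT.601) and thresholding the chrominance channels.
# The Cb/Cr ranges are widely used defaults, not the paper's.

def rgb_to_ycbcr(r, g, b):
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b, cb_range=(77, 127), cr_range=(133, 173)):
    """True if the pixel's Cb and Cr fall inside the skin ranges."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return cb_range[0] <= cb <= cb_range[1] and cr_range[0] <= cr <= cr_range[1]

print(is_skin(220, 170, 140))  # typical skin tone -> True
print(is_skin(20, 60, 200))    # saturated blue -> False
```

Thresholding in Cb/Cr while ignoring luminance Y is what makes the model relatively robust to illumination, which the AdaBoost stage then refines.
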

Remote Control of a Mobile Robot Using Human Adaptive Interface (사용자 적응 인터페이스를 사용한 이동로봇의 원격제어)

  • Hwang, Chang-Soon; Lee, Sang-Ryong; Park, Keun-Young; Lee, Choon-Young
    • Journal of Institute of Control, Robotics and Systems / v.13 no.8 / pp.777-782 / 2007
  • Human-Robot Interaction (HRI) through a haptic interface plays an important role in controlling robot systems remotely. The expanded use of bio-signals in haptic interfaces is an emerging research area. To take the operator's state into account in HRI, we used bio-signals such as ECG and blood pressure in our proposed force-reflection interface. The variation of the operator's state is tracked by processing these bio-signals. The standard deviation of the R-R intervals and the blood pressure were used to adaptively adjust the force reflection generated from the environmental conditions. Our main idea is to change the pattern of force reflection according to the state of the human operator. A set of experiments shows promising results for our concept of a human-adaptive interface.
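
The adaptation step might be sketched as below: lower the force-reflection gain when the standard deviation of R-R intervals drops (reduced heart-rate variability is a common stress marker). The mapping and thresholds are illustrative assumptions, not the paper's calibration:

```python
# Sketch: scale the reflected-force gain by the operator's heart-rate
# variability. The 'calm' baseline and floor gain are assumed values.

from statistics import stdev

def force_gain(rr_intervals_ms, base_gain=1.0, calm_sd=50.0):
    """Soften force feedback as R-R variability falls below a calm baseline."""
    sd = stdev(rr_intervals_ms)
    if sd >= calm_sd:
        return base_gain                       # relaxed operator: full reflection
    return base_gain * max(0.3, sd / calm_sd)  # stressed operator: soften feedback

calm     = [820, 900, 760, 880, 800]   # varied R-R intervals (ms)
stressed = [700, 705, 698, 702, 701]   # nearly constant intervals
print(force_gain(calm), force_gain(stressed))  # 1.0 0.3
```

The floor of 0.3 keeps some environmental feedback even under high stress, so the operator never loses contact with the remote scene entirely.
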

Pictorial Model of Upper Body based Pose Recognition and Particle Filter Tracking (그림모델과 파티클필터를 이용한 인간 정면 상반신 포즈 인식)

  • Oh, Chi-Min; Islam, Md. Zahidul; Kim, Min-Wook; Lee, Chil-Woo
    • HCI Society of Korea Conference Proceedings / 2009.02a / pp.186-192 / 2009
  • In this paper, we present a recognition method for frontal upper-body poses. In HCI (Human-Computer Interaction) and HRI (Human-Robot Interaction), when an interaction is established the human usually faces the robot or computer and uses hand gestures, so we focus on frontal upper-body poses. There are two main difficulties. First, the human pose consists of many parts, which leads to a high DOF (Degree Of Freedom), making the pose difficult to model. Second, matching image features to the model information is difficult. We therefore use a Pictorial Model of the main poses, which occupy most of the space of frontal upper-body poses, and recognize them against a database of main poses. The parameters of the recognized main pose then seed a particle filter, which predicts the posterior distribution of the pose parameters and determines a more specific pose by updating the model parameters from the particle with the maximum likelihood. By recognizing main poses and tracking the specific pose in this way, we recognize frontal upper-body poses.

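
The predict/weight/select cycle of such a particle filter can be sketched in one dimension. This is a generic toy with an invented likelihood, not the paper's pictorial-model likelihood:

```python
# Sketch of one particle-filter step: predict particles with motion
# noise, weight them by an (assumed) likelihood, take the maximum-
# likelihood particle as the estimate, and resample for the next step.

import random

def particle_filter_step(particles, observe, motion_noise=0.5):
    # 1. Predict: diffuse each particle with motion noise.
    predicted = [p + random.gauss(0, motion_noise) for p in particles]
    # 2. Update: weight each particle by the likelihood of the observation.
    weights = [observe(p) for p in predicted]
    # 3. Estimate: the particle with maximum likelihood.
    best = max(zip(predicted, weights), key=lambda pw: pw[1])[0]
    # 4. Resample proportionally to weight for the next step.
    resampled = random.choices(predicted, weights=weights, k=len(particles))
    return best, resampled

random.seed(0)
true_pose = 3.0
likelihood = lambda p: 1.0 / (1.0 + (p - true_pose) ** 2)  # peaked at 3.0
particles = [random.uniform(0, 6) for _ in range(200)]
best, particles = particle_filter_step(particles, likelihood)
print(abs(best - true_pose) < 0.5)  # best particle lands near the peak
```

In the paper's setting the state would be the full vector of pose parameters and the likelihood a match score between the pictorial model and image features, but the cycle is the same.
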
Implementation of Home Service Robot System consisting of Object Oriented Slave Robots (객체 지향적 슬레이브 로봇들로 구성된 홈서비스 로봇 시스템의 구현)

  • Ko, Chang-Gun; Ko, Dae-Gun; Kwan, Hye-Jin; Park, Jung-Il; Lee, Suk-Gyu
    • Proceedings of the KIEE Conference / 2007.04a / pp.337-339 / 2007
  • This paper proposes a new paradigm for cooperation in a multi-robot system for home service. For localization, the master robot collects the location of each robot through communication between RFID tags on the floor and an RFID reader attached to the bottom of each robot. The master robot communicates with the slave robots via wireless LAN to check their motion and to command them based on the information they report. The operator may also send commands to the slave robots through the HRI (Human-Robot Interaction) screen on the master robot, which displays information from the slave robots. The cooperation of multiple robots enhances performance compared with a single robot.

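
With floor tags laid out on a known grid, reading a tag ID immediately yields the robot's cell, which is what makes this localization scheme simple. The tag numbering and cell size below are illustrative assumptions:

```python
# Sketch: RFID-based localization with floor tags on a known grid.
# Tags are assumed numbered row-major; spacing is assumed 0.5 m.

GRID_COLS = 10      # assumed: 10 tags per row, numbered row-major
CELL_SIZE_M = 0.5   # assumed spacing between floor tags

def tag_to_position(tag_id):
    """Convert a row-major tag ID to (x, y) coordinates in meters."""
    row, col = divmod(tag_id, GRID_COLS)
    return col * CELL_SIZE_M, row * CELL_SIZE_M

# Master robot collecting slave tag readings reported over wireless LAN:
reports = {"slave1": 23, "slave2": 7}
positions = {name: tag_to_position(tag) for name, tag in reports.items()}
print(positions)  # {'slave1': (1.5, 1.0), 'slave2': (3.5, 0.0)}
```

The accuracy is bounded by the tag spacing, so denser tags (or odometry between readings) would be needed for finer positioning.
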
An analysis of the component of Human-Robot Interaction for Intelligent room

  • Park, Jong-Chan; Kwon, Dong-Soo
    • ICROS (Institute of Control, Robotics and Systems) Conference Proceedings / 2005.06a / pp.2143-2147 / 2005
  • Human-Robot Interaction (HRI) has recently become one of the most important issues in the field of robotics. Understanding and predicting the intentions of human users is a major difficulty for robotic programs. In this paper we suggest an interaction method that allows the robot to execute the human user's desires in an intelligent-room domain, even when the user does not give a specific command for the action. To achieve this, we constructed a full system architecture for an intelligent room in which the following are present and sequentially interconnected: decision-making based on a Bayesian belief network, responding to human commands, and generating queries to remove ambiguities. The robot obtains all necessary information by analyzing the user's condition and the environmental state of the room. This information is then used to evaluate the probabilities at the output nodes of the Bayesian belief network, which is composed of nodes that include several states, together with the causal relationships between them. Our study shows that the suggested system and method improve a robot's ability to understand human commands, intuit human desires, and predict human intentions, resulting in a comfortable intelligent room for the human user.

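
The belief-network decision step reduces, at its smallest, to Bayes' rule over candidate services given an observed user state. The two-node network and all probabilities below are invented for illustration:

```python
# Sketch: posterior over desired services given one observation,
# P(service | obs) ∝ P(obs | service) P(service). A toy two-node
# network with made-up probabilities, not the paper's BBN.

prior = {"turn_on_light": 0.3, "play_music": 0.7}
likelihood = {                 # P(user_is_reading | service)
    "turn_on_light": 0.9,
    "play_music": 0.2,
}

def posterior(prior, likelihood):
    """Normalize prior-times-likelihood into a posterior distribution."""
    unnorm = {s: prior[s] * likelihood[s] for s in prior}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

post = posterior(prior, likelihood)
best = max(post, key=post.get)
print(best, round(post[best], 3))  # turn_on_light 0.659
```

When the posterior is too flat to pick a clear winner, the architecture's query-generation stage would ask the user a disambiguating question instead of acting.
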
Estimating Human Walking Pace and Direction Using Vibration Signals (진동감지를 이용한 사용자 걸음걸이 인식)

  • Jeong, Eunseok; Kim, DaeEun
    • Journal of Institute of Control, Robotics and Systems / v.20 no.5 / pp.481-485 / 2014
  • In service robots, human movements are analyzed using a variety of sensors. The vibration signals produced by a person walking provide useful information about that person's distance and movement direction. In this paper, we measure the intensity of the vibrations and detect both walking pace and direction. In our experiments, vibration signals detected by microphone sensors provided good estimates of the distance and direction of human movement. This can be applied to HRI (Human-Robot Interaction) technology.
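
The two estimates described above can be sketched as follows: pace from the interval between footstep peaks, and approach/retreat direction from the trend in peak intensity (closer steps vibrate harder). The signal and thresholds are synthetic, for illustration only:

```python
# Sketch: estimate walking pace from footstep-peak intervals and
# direction from the amplitude trend. Synthetic signal, assumed threshold.

def detect_peaks(signal, threshold):
    """Indices where the signal crosses above the threshold."""
    return [i for i in range(1, len(signal))
            if signal[i] >= threshold > signal[i - 1]]

def analyze(signal, sample_rate_hz, threshold=0.5):
    peaks = detect_peaks(signal, threshold)
    # Pace: steps per second from the mean inter-peak interval.
    intervals = [(b - a) / sample_rate_hz for a, b in zip(peaks, peaks[1:])]
    pace = 1.0 / (sum(intervals) / len(intervals))
    # Direction: growing peak amplitude suggests the walker is approaching.
    direction = "approaching" if signal[peaks[-1]] > signal[peaks[0]] else "receding"
    return pace, direction

# Synthetic 10 Hz signal: a footstep every 5 samples, amplitudes increasing.
sig = [0.0] * 30
for i, amp in zip(range(5, 30, 5), [0.6, 0.7, 0.8, 0.9, 1.0]):
    sig[i] = amp
pace, direction = analyze(sig, sample_rate_hz=10)
print(round(pace, 2), direction)  # 2.0 approaching
```

A real deployment would band-pass filter the microphone signal before peak detection, since footstep vibrations sit well below speech frequencies.
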