• Title/Summary/Keyword: human following robot


Sound localization for Teller Following of A dialog type Humanoid Robot (대화형 로봇의 화자 추종을 위한 sound localization)

  • Shim, H.M.;Lee, J.S.;Kwon, O.S.;Lee, E.H.;Hong, S.H.
    • Proceedings of the KIEE Conference
    • /
    • 2001.11c
    • /
    • pp.111-114
    • /
    • 2001
  • In this paper, we propose a speaker-following algorithm that uses sound localization for developing a dialog-type humanoid robot. Sound localization is studied to develop techniques for an efficient 3-D sound system based on the psychoacoustics of spatial hearing in multimedia or virtual-reality settings. When a robot talks with a human, it needs to follow the speaker both for an improved human interface and for adaptive noise canceling. We apply this algorithm to a robot system.
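
The abstract does not spell out the localization algorithm itself. As a point of reference only, the sketch below shows one common, minimal approach to speaker bearing estimation: time-difference-of-arrival (TDOA) between two microphones via cross-correlation. The sample rate, microphone spacing, and function names are illustrative assumptions, not the paper's psychoacoustic 3-D sound method.

```python
# Minimal sketch: speaker bearing from a two-microphone TDOA estimate.
# Assumes plane-wave arrival; constants below are illustrative only.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
MIC_SPACING = 0.2        # m, assumed distance between the two microphones
SAMPLE_RATE = 16000      # Hz, assumed

def estimate_bearing(left: np.ndarray, right: np.ndarray) -> float:
    """Return the speaker azimuth in radians from two synchronized mic signals."""
    # Cross-correlate the two channels and take the lag of maximum correlation.
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    tdoa = lag / SAMPLE_RATE
    # Clamp to the physically possible range before taking the arcsine.
    ratio = np.clip(tdoa * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return float(np.arcsin(ratio))
```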


Development of Human Following Method of Mobile Robot Using QR Code and 2D LiDAR Sensor (QR 2D 코드와 라이다 센서를 이용한 모바일 로봇의 사람 추종 기법 개발)

  • Lee, SeungHyeon;Choi, Jae Won;Van Dang, Chien;Kim, Jong-Wook
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.15 no.1
    • /
    • pp.35-42
    • /
    • 2020
  • In this paper, we propose a method that keeps the robot at a distance of 30 to 45 cm from the user, in consideration of each individual's personal space and comfort, by using a 2D LiDAR sensor (LDS-01) as a secondary sensor along with a QR code. First, the robot determines the brightness of the video and the presence of a QR code. If the scene is bright and a QR code indicating the person is visible, the scan range of the 2D LiDAR sensor is set based on the position of the QR code in the captured image so that the correct target is found and followed. On the other hand, when the robot cannot recognize the QR code because of low light, the target is followed using only the 2D LiDAR sensor together with a database of obstacles and human actions recorded before the experiment. As a result, our robot can follow the target person in four situations defined over nine locations with seven types of motion.
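
As a rough illustration of the distance-keeping idea described in this abstract (gate the LiDAR scan around the bearing implied by the QR code's image position, then hold the 30-45 cm band), here is a minimal Python sketch. The camera field of view, gate width, gains, and speed limits are assumptions, not the authors' implementation.

```python
# Minimal sketch: QR-gated 2D LiDAR distance keeping at 0.30-0.45 m.
import numpy as np

CAMERA_HFOV = np.radians(60.0)    # assumed horizontal field of view
IMAGE_WIDTH = 640                 # assumed image width in pixels
GATE_HALF_WIDTH = np.radians(10)  # assumed LiDAR window around the QR bearing

def qr_bearing(qr_center_x: float) -> float:
    """Approximate bearing of the QR code from its pixel column."""
    return (qr_center_x / IMAGE_WIDTH - 0.5) * CAMERA_HFOV

def follow_command(scan_angles, scan_ranges, qr_center_x):
    """Return (linear, angular) velocities keeping the person at 0.30-0.45 m."""
    bearing = qr_bearing(qr_center_x)
    mask = np.abs(np.asarray(scan_angles) - bearing) < GATE_HALF_WIDTH
    if not np.any(mask):
        return 0.0, 0.0                               # no LiDAR return in the gate: stop
    distance = float(np.min(np.asarray(scan_ranges)[mask]))
    if distance > 0.45:
        linear = min(0.3, 0.8 * (distance - 0.45))    # approach the person
    elif distance < 0.30:
        linear = max(-0.2, 0.8 * (distance - 0.30))   # back away
    else:
        linear = 0.0                                  # inside the band: hold position
    angular = -1.5 * bearing                          # turn toward the QR bearing
    return linear, angular
```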

Design and Control of Wire-driven Flexible Robot Following Human Arm Gestures (팔 동작 움직임을 모사하는 와이어 구동 유연 로봇의 설계 및 제어)

  • Kim, Sanghyun;Kim, Minhyo;Kang, Junki;Son, SeungJe;Kim, Dong Hwan
    • The Journal of Korea Robotics Society
    • /
    • v.14 no.1
    • /
    • pp.50-57
    • /
    • 2019
  • This work presents a design and control method for a wire-driven flexible robot arm that follows human gestures. When the robot arm is moved to a desired position, the required wire travel is calculated and the motors are rotated according to that length. The robotic arm is composed of two modular mechanisms whose motion resembles that of a real human arm. Two wires form a closed loop in each module, and universal joints attached to each disk create up, down, left, and right movements. To control the motors, an anti-windup PID controller is applied to limit the sudden changes usually caused by accumulated error in the integral term. In addition, a master/slave communication protocol and an operation program linking the six motors to the MYO and IMU sensor outputs were developed. This makes it possible to receive image information from the camera attached to the robot arm while simultaneously sending control commands to the robot at high speed.
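
The abstract names anti-windup PID control of the wire motors. Below is a minimal, generic sketch of an anti-windup PID using conditional integration; the gains, output limits, and the wire-length example are illustrative assumptions rather than the paper's tuned values.

```python
# Minimal anti-windup PID sketch (conditional integration with output clamping).
class AntiWindupPID:
    def __init__(self, kp, ki, kd, out_min=-1.0, out_max=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        derivative = (error - self.prev_error) / dt
        self.prev_error = error

        # Tentative integral update, then check for actuator saturation.
        candidate_integral = self.integral + error * dt
        unsat = self.kp * error + self.ki * candidate_integral + self.kd * derivative
        output = min(max(unsat, self.out_min), self.out_max)

        # Anti-windup: commit the integral only when the output is not saturated,
        # so accumulated error cannot keep growing against the actuator limit.
        if output == unsat:
            self.integral = candidate_integral
        return output

# Example: hold an (assumed) wire length of 0.10 m, starting from 0.0 m.
pid = AntiWindupPID(kp=2.0, ki=0.5, kd=0.05)
command = pid.update(setpoint=0.10, measurement=0.0, dt=0.01)
```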

Development of Human Following Method of Mobile Robot Using TRT Pose (TRT Pose를 이용한 모바일 로봇의 사람 추종 기법)

  • Choi, Jun-Hyeon;Joo, Kyeong-Jin;Yun, Sang-Seok;Kim, Jong-Wook
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.15 no.6
    • /
    • pp.281-287
    • /
    • 2020
  • In this paper, we propose a method for estimating the walking direction by which a mobile robot follows a person using TRT (TensorRT) Pose, a deep-learning-based motion recognition model. The mobile robot measures the person's movement by recognizing key points on the pelvis and determines the direction in which the person intends to move. Using this information and the distance between the robot and the human, the mobile robot can follow the person stably while keeping a safe distance. TRT Pose extracts only key-point information, which avoids privacy issues even though the camera on the mobile robot records video. To validate the proposed technique, experiments were carried out in which a person walks away from or toward the mobile robot in a zigzag pattern and the robot continuously follows the person at the prescribed distance.
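
As a loose illustration of the following scheme this abstract describes, the sketch below estimates the person's lateral motion from the midpoint of the left/right hip keypoints across two frames and keeps a prescribed distance. The keypoint format, the prescribed distance, and the gains are assumptions, not the paper's TRT Pose pipeline.

```python
# Minimal sketch: walking-direction cue from hip keypoints plus distance keeping.
import numpy as np

FOLLOW_DISTANCE = 1.0   # assumed prescribed following distance in metres

def hip_center(keypoints):
    """keypoints: dict with 'left_hip' and 'right_hip' as (x, y) image points."""
    return (np.asarray(keypoints["left_hip"]) + np.asarray(keypoints["right_hip"])) / 2.0

def follow_command(prev_keypoints, curr_keypoints, distance_to_person):
    """Return (linear, angular) velocities for a simple follower."""
    motion = hip_center(curr_keypoints) - hip_center(prev_keypoints)
    lateral = float(motion[0])                 # +x: person moved right in the image
    angular = -0.01 * lateral                  # steer toward the person's new position
    linear = 0.8 * (distance_to_person - FOLLOW_DISTANCE)
    return float(np.clip(linear, -0.3, 0.5)), float(np.clip(angular, -1.0, 1.0))
```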

An analysis of the component of Human-Robot Interaction for Intelligent room

  • Park, Jong-Chan;Kwon, Dong-Soo
    • Institute of Control, Robotics and Systems (ICROS) Conference Proceedings
    • /
    • 2005.06a
    • /
    • pp.2143-2147
    • /
    • 2005
  • Human-robot interaction (HRI) has recently become one of the most important issues in the field of robotics. Understanding and predicting the intentions of human users is a major difficulty for robotic programs. In this paper we suggest an interaction method that allows the robot to execute the human user's desires in an intelligent-room domain, even when the user does not give a specific command for the action. To achieve this, we constructed a full system architecture for an intelligent room in which the following components are present and sequentially interconnected: decision-making based on a Bayesian belief network, responding to human commands, and generating queries to remove ambiguities. The robot obtains all the necessary information by analyzing the user's condition and the environmental state of the room. This information is then used to evaluate the probabilities of the results at the output nodes of the Bayesian belief network, which is composed of nodes with several states and the causal relationships between them. Our study shows that the suggested system and method improve the robot's ability to understand human commands, intuit human desires, and predict human intentions, resulting in a comfortable intelligent room for the human user.
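
To make the belief-network idea concrete, here is a toy sketch in which the user's condition and the room state are evidence nodes and the desired action is the query node, evaluated by simple enumeration. The states and probability tables are invented purely for illustration and have no connection to the paper's network.

```python
# Toy discrete belief network: P(action | user condition, room state).
P_ACTION = {"turn_on_light": 0.3, "turn_on_tv": 0.3, "do_nothing": 0.4}

# P(evidence | action) for two observed variables, assumed conditionally independent.
P_USER = {  # user condition: "sitting_sofa" or "at_desk"
    "turn_on_light": {"sitting_sofa": 0.4, "at_desk": 0.6},
    "turn_on_tv":    {"sitting_sofa": 0.8, "at_desk": 0.2},
    "do_nothing":    {"sitting_sofa": 0.5, "at_desk": 0.5},
}
P_ROOM = {  # room state: "dark" or "bright"
    "turn_on_light": {"dark": 0.9, "bright": 0.1},
    "turn_on_tv":    {"dark": 0.5, "bright": 0.5},
    "do_nothing":    {"dark": 0.3, "bright": 0.7},
}

def posterior(user_state, room_state):
    """Posterior over actions given the observed user condition and room state."""
    joint = {a: P_ACTION[a] * P_USER[a][user_state] * P_ROOM[a][room_state]
             for a in P_ACTION}
    total = sum(joint.values())
    return {a: p / total for a, p in joint.items()}

# Example: the user is on the sofa and the room is dark.
print(posterior("sitting_sofa", "dark"))
```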


Human-Tracking Behavior of Mobile Robot Using Multi-Camera System in a Networked ISpace (공간지능화에서 다중카메라를 이용한 이동로봇의 인간추적행위)

  • Jin, Tae-Seok;Hashimoto, Hideki
    • The Journal of Korea Robotics Society
    • /
    • v.2 no.4
    • /
    • pp.310-316
    • /
    • 2007
  • This paper proposes a human-following behavior for a mobile robot, using an intelligent space (ISpace) to achieve this goal. An ISpace is a 3-D environment in which many sensors and intelligent devices are distributed. Mobile robots exist in this space as physical agents providing humans with services. A mobile robot is controlled to track a walking human using the distributed intelligent sensors as stably and precisely as possible. The moving object is assumed to be a point object and is projected onto an image plane to form a geometric constraint equation that provides position data for the object based on the kinematics of the intelligent space. Uncertainties in the position estimate caused by the point-object assumption are compensated using a Kalman filter. To generate the shortest-time trajectory for tracking the walking human, the linear and angular velocities are estimated and utilized. Computer simulation and experimental results of estimating and tracking the walking human with the mobile robot are presented.
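
One conventional way to realize the Kalman filtering and velocity estimation mentioned in this abstract is a constant-velocity filter over the person's ground-plane position; the sketch below shows that generic form with illustrative noise levels, not values from the paper.

```python
# Minimal constant-velocity Kalman filter over ground-plane position.
import numpy as np

def make_cv_filter(dt):
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)   # state: [x, y, vx, vy]
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)    # cameras measure position only
    Q = 0.01 * np.eye(4)                          # assumed process noise
    R = 0.05 * np.eye(2)                          # assumed measurement noise
    return F, H, Q, R

def kalman_step(x, P, z, F, H, Q, R):
    # Predict.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the camera measurement z = [x_meas, y_meas].
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Example: one predict/update cycle at an assumed 10 Hz camera rate.
F, H, Q, R = make_cv_filter(dt=0.1)
x, P = np.zeros(4), np.eye(4)
x, P = kalman_step(x, P, np.array([0.5, 0.2]), F, H, Q, R)
```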


Position Improvement of a Human-Following Mobile Robot Using Image Information of Walking Human (보행자의 영상정보를 이용한 인간추종 이동로봇의 위치 개선)

  • Jin, Tae-Seok;Lee, Dong-Heui;Lee, Jang-Myung
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.11 no.5
    • /
    • pp.398-405
    • /
    • 2005
  • The intelligent robots needed in the near future are human-friendly robots able to coexist with humans and support them effectively. To realize this, robots must recognize their position and posture in both known and unknown environments, and their localization should occur naturally. Estimating the robot's position while resolving uncertainty is one of the most important problems in mobile robot navigation. In this paper, we describe a method for localizing a mobile robot using image information of a moving object. The method combines the position observed by dead-reckoning sensors with the position estimated from images captured by a fixed camera. Using the a priori known path of the moving object in world coordinates and a perspective camera model, we derive geometric constraint equations that relate the image-frame coordinates of the moving object to the estimated robot position. A control method is also proposed to estimate the relative position and direction between the walking human and the mobile robot, and a Kalman filter scheme is used for the mobile robot localization. The performance is verified by computer simulation and experiment.
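
As a minimal illustration of the combination step this abstract describes, the sketch below fuses the dead-reckoned position with a camera-derived position as a covariance-weighted average, which is the core of a Kalman update with an identity measurement model. The covariances and example numbers are assumptions.

```python
# Minimal sketch: fuse odometry and camera position estimates.
import numpy as np

def fuse(odom_xy, odom_cov, cam_xy, cam_cov):
    """Return the fused robot position and its covariance."""
    odom_xy, cam_xy = np.asarray(odom_xy, float), np.asarray(cam_xy, float)
    K = odom_cov @ np.linalg.inv(odom_cov + cam_cov)   # gain toward the camera fix
    fused_xy = odom_xy + K @ (cam_xy - odom_xy)
    fused_cov = (np.eye(2) - K) @ odom_cov
    return fused_xy, fused_cov

# Example: odometry has drifted (large covariance), the camera fix is tighter.
odom = np.array([1.00, 2.00]); P_odom = 0.20 * np.eye(2)
cam  = np.array([1.10, 1.95]); P_cam  = 0.05 * np.eye(2)
print(fuse(odom, P_odom, cam, P_cam))
```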

Person-following of a Mobile Robot using a Complementary Tracker with a Camera-laser Scanner (카메라-레이저스캐너 상호보완 추적기를 이용한 이동 로봇의 사람 추종)

  • Kim, Hyoung-Rae;Cui, Xue-Nan;Lee, Jae-Hong;Lee, Seung-Jun;Kim, Hakil
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.20 no.1
    • /
    • pp.78-86
    • /
    • 2014
  • This paper proposes a method of tracking an object for a person-following mobile robot by combining a monocular camera and a laser scanner, where each sensor can supplement the weaknesses of the other sensor. For human-robot interaction, a mobile robot needs to maintain a distance between a moving person and itself. Maintaining distance consists of two parts: object tracking and person-following. Object tracking consists of particle filtering and online learning using shape features which are extracted from an image. A monocular camera easily fails to track a person due to a narrow field-of-view and influence of illumination changes, and has therefore been used together with a laser scanner. After constructing the geometric relation between the differently oriented sensors, the proposed method demonstrates its robustness in tracking and following a person with a success rate of 94.7% in indoor environments with varying lighting conditions and even when a moving object is located between the robot and the person.
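
A small sketch of what "constructing the geometric relation between the differently oriented sensors" can look like in practice: projecting 2D laser-scanner points into the camera image with assumed extrinsics and pinhole intrinsics. All calibration values below are placeholders, not the paper's calibration.

```python
# Minimal sketch: project 2D laser points into camera pixel coordinates.
import numpy as np

K = np.array([[525.0,   0.0, 320.0],
              [  0.0, 525.0, 240.0],
              [  0.0,   0.0,   1.0]])       # assumed pinhole intrinsics
# Assumed extrinsics: rotation mapping laser axes (x fwd, y left, z up)
# onto camera axes (x right, y down, z fwd), plus a small mounting offset.
R = np.array([[0.0, -1.0,  0.0],
              [0.0,  0.0, -1.0],
              [1.0,  0.0,  0.0]])
t = np.array([0.0, 0.10, 0.05])              # metres, placeholder offset

def laser_to_pixels(angles, ranges):
    """Project 2D scan points (bearing, range) into image pixel coordinates."""
    angles, ranges = np.asarray(angles, float), np.asarray(ranges, float)
    pts_laser = np.stack([ranges * np.cos(angles),
                          ranges * np.sin(angles),
                          np.zeros_like(ranges)], axis=1)
    pts_cam = (R @ pts_laser.T).T + t        # laser frame -> camera frame
    pix = (K @ pts_cam.T).T
    return pix[:, :2] / pix[:, 2:3]          # perspective divide (points in front only)
```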

Color Pattern Recognition and Tracking for Multi-Object Tracking in Artificial Intelligence Space (인공지능 공간상의 다중객체 구분을 위한 컬러 패턴 인식과 추적)

  • Tae-Seok Jin
    • Journal of the Korean Society of Industry Convergence
    • /
    • v.27 no.2_2
    • /
    • pp.319-324
    • /
    • 2024
  • In this paper, an Artificial Intelligence Space (AI-Space) for human-robot interfaces is presented, which can enable human-computer interfacing, networked camera conferencing, industrial monitoring, and service and training applications. We present a method for representing, tracking, and following objects (human, robot, chair) by fusing distributed multiple vision systems in the AI-Space. The article presents the integration of color distributions into particle filtering, which provides a robust tracking framework under ambiguous conditions. We propose to track the moving objects (human, robot, chair) by generating hypotheses not in the image plane but on a top-view reconstruction of the scene.
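
To give a concrete flavor of color-based particle filtering on a top-view reconstruction, the sketch below diffuses ground-plane hypotheses, weights them by a Bhattacharyya color-histogram similarity, and resamples. The histogram lookup `hist_at` is a placeholder standing in for the multi-camera observation system; none of this is the authors' code.

```python
# Minimal sketch: color-histogram particle filter over top-view (x, y) hypotheses.
import numpy as np

rng = np.random.default_rng(0)

def bhattacharyya(h1, h2):
    """Similarity between two normalized color histograms."""
    return float(np.sum(np.sqrt(h1 * h2)))

def particle_filter_step(particles, reference_hist, hist_at, motion_std=0.05):
    """particles: (N, 2) top-view positions; hist_at(x, y) -> normalized histogram."""
    # 1. Predict: random-walk diffusion of the (x, y) hypotheses.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # 2. Weight: color similarity between the reference and the observation.
    weights = np.array([bhattacharyya(reference_hist, hist_at(x, y))
                        for x, y in particles])
    weights = weights / np.sum(weights)
    # 3. Resample proportionally to the weights; report the weighted mean estimate.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    estimate = np.average(particles, axis=0, weights=weights)
    return particles[idx], estimate
```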

A vision based people tracking and following for mobile robots using CAMSHIFT and KLT feature tracker (캠시프트와 KLT특징 추적 알고리즘을 융합한 모바일 로봇의 영상기반 사람추적 및 추종)

  • Lee, S.J.;Won, Mooncheol
    • Journal of Korea Multimedia Society
    • /
    • v.17 no.7
    • /
    • pp.787-796
    • /
    • 2014
  • Many mobile robot navigation methods utilize laser scanners, ultrasonic sensors, vision cameras, and other sensors for obstacle detection and path following. Humans, however, rely only on visual information for navigation. In this paper, we study a mobile robot control method based only on camera vision. A Gaussian mixture model and a shadow-removal technique are used to separate the foreground from the background in the camera image. Based on the foreground information, the mobile robot combines the CAMSHIFT and KLT feature tracker algorithms to follow a person. The algorithm is verified by experiments in which a person is tracked and followed by the robot in a hallway.
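
The ingredients named in this abstract (Gaussian-mixture background subtraction with shadow handling, CAMSHIFT on a color back-projection, and KLT feature tracking) map onto standard OpenCV calls. The sketch below wires them together; the camera index, initial person box, histogram size, and thresholds are assumptions, and the fusion logic is not the authors' method.

```python
# Minimal OpenCV sketch: MOG2 foreground + CAMSHIFT + KLT feature tracking.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                         # assumed camera index
bg = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
track_box = (200, 100, 100, 200)                  # assumed initial person box (x, y, w, h)
term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)

# Build a hue histogram of the person region from the first frame.
ok, frame = cap.read()
x, y, w, h = track_box
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
hist = cv2.calcHist([hsv[y:y + h, x:x + w]], [0], None, [16], [0, 180])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
mask = np.zeros(prev_gray.shape, dtype=np.uint8)
mask[y:y + h, x:x + w] = 255
features = cv2.goodFeaturesToTrack(prev_gray, 50, 0.01, 5, mask=mask)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # GMM foreground mask; MOG2 marks shadows as 127, so threshold above that.
    fg = cv2.threshold(bg.apply(frame), 200, 255, cv2.THRESH_BINARY)[1]
    # CAMSHIFT on the hue back-projection, restricted to the foreground.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    backproj = cv2.bitwise_and(backproj, backproj, mask=fg)
    rot_rect, track_box = cv2.CamShift(backproj, track_box, term)
    # KLT: propagate corner features from the previous frame.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if features is not None and len(features) > 0:
        features, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, features, None)
        features = features[status.ravel() == 1].reshape(-1, 1, 2)
    prev_gray = gray
cap.release()
```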