• Title/Summary/Keyword: Robot Sensor


Predictive Control of an Efficient Human Following Robot Using Kinect Sensor (Kinect 센서를 이용한 효율적인 사람 추종 로봇의 예측 제어)

  • Heo, Shin-Nyeong;Lee, Jang-Myung
    • Journal of Institute of Control, Robotics and Systems / v.20 no.9 / pp.957-963 / 2014
  • This paper proposes a predictive control scheme for an efficient human-following robot using a Kinect sensor. In particular, the research focuses on detecting the foot end-point and foot vector instead of the human body, which can easily be occluded by obstacles. Recognition of the foot end-point by the Kinect sensor is reliable since the images of both feet can be utilized, which increases the probability of detecting the human motion. Depth-image features and a decision tree are utilized to estimate the foot end-point precisely. A tracking-point average algorithm is also adopted to estimate the foot location accurately. Using the continuous foot locations, the human motion trajectory is estimated to guide the mobile robot along a smooth path toward the human. Experiments verify that detecting the foot end-point is more reliable and efficient than detecting the human body. Finally, the tracking performance of the mobile robot is demonstrated with a human walking along an 'L'-shaped course.
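The tracking-point averaging and trajectory prediction described in this abstract can be sketched roughly as follows; the sliding-window mean, the linear extrapolation, and all names and parameters here are illustrative assumptions, not the paper's actual implementation.

```python
def smooth_foot_track(points, window=3):
    """Tracking-point average: smooth a noisy sequence of detected
    (x, y) foot end-points with a sliding-window mean."""
    smoothed = []
    for i in range(len(points)):
        lo = max(0, i - window + 1)
        xs = [p[0] for p in points[lo:i + 1]]
        ys = [p[1] for p in points[lo:i + 1]]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed

def predict_next(points):
    """Predict the next foot position by linear extrapolation of the
    last two smoothed points, to guide the following robot."""
    (x0, y0), (x1, y1) = points[-2], points[-1]
    return (2 * x1 - x0, 2 * y1 - y0)
```

A smoother path could be obtained by fitting a low-order polynomial to more than two points, but the two-point extrapolation keeps the idea visible.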

A Study on the Compensating of the Dead-reckoning Based on SLAM Using the Inertial Sensor (관성센서를 이용한 SLAM 기반의 위치 오차 보정 기법에 관한 연구)

  • Kang, Shin-Hyuk;Jang, Mun-Suck;Lee, Dong-Kwang;Lee, Eung-Hyuk
    • Journal of the Institute of Electronics Engineers of Korea SC / v.46 no.2 / pp.28-35 / 2009
  • Positioning is an essential technology for a mobile robot: it locates the robot and allows it to navigate to a desired position. A wheel-driven robot typically relies on odometry, but wheel slip during driving introduces an error that cannot be bounded by a constant value. In this paper, we present a way to minimize the positioning error by combining odometry with an inertial sensor, and we also show how the inertial sensor reduces the error in image-based SLAM.
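One common way to blend odometry with an inertial sensor, as this abstract describes, is a complementary filter on the heading: trust the integrated gyro rate over short intervals and the odometry heading over the long term. The weighting `alpha` and the function below are a minimal illustrative sketch, not the paper's method.

```python
def fuse_heading(odo_heading, gyro_rate, dt, prev_fused, alpha=0.98):
    """Complementary-filter step: integrate the gyro rate from the
    previous fused heading, then blend with the odometry heading to
    damp wheel-slip drift (alpha is an assumed weighting)."""
    gyro_heading = prev_fused + gyro_rate * dt
    return alpha * gyro_heading + (1 - alpha) * odo_heading
```

Calling this once per control cycle keeps short-term gyro accuracy while odometry slowly corrects accumulated gyro drift.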

Control of Mobile Robot Navigation Using Vision Sensor Data Fusion by Nonlinear Transformation (비선형 변환의 비젼센서 데이터융합을 이용한 이동로봇 주행제어)

  • Jin, Tae-Seok;Lee, Jang-Myung
    • Journal of Institute of Control, Robotics and Systems / v.11 no.4 / pp.304-313 / 2005
  • The robots that will be needed in the near future are human-friendly robots able to coexist with humans and support them effectively. To realize this, a robot needs to recognize its position and orientation for intelligent performance in an unknown environment, and mobile robots may navigate by means of monitoring systems such as sonar sensing or visual sensing. Note that in conventional fusion schemes the measurement depends on the current data sets only; therefore, more sensors are required to measure a given physical parameter or to improve the accuracy of the measurement. In this research, instead of adding more sensors to the system, the temporal sequence of the data sets is stored and utilized for accurate measurement. As a general approach to sensor fusion, a UT-based sensor fusion (UTSF) scheme using the Unscented Transformation (UT) is proposed for either joint or disjoint data structures and applied to landmark identification for mobile robot navigation. The theoretical basis is illustrated by examples, and the effectiveness is proved through simulations and experiments. The newly proposed UTSF scheme is applied to the navigation of a mobile robot in both structured and unstructured environments, and its performance is verified by computer simulation and experiment.
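The core of the Unscented Transformation mentioned above is to propagate a mean and variance through a nonlinear function via deterministically chosen sigma points rather than linearization. A minimal one-dimensional sketch (the scaling parameter `kappa` and the function name are illustrative, not taken from the paper):

```python
import math

def unscented_transform_1d(mean, var, f, kappa=2.0):
    """1-D unscented transform: build sigma points around the mean,
    push them through f, and recombine with standard UT weights."""
    n = 1
    spread = math.sqrt((n + kappa) * var)
    sigma = [mean, mean + spread, mean - spread]
    w0 = kappa / (n + kappa)
    wi = 1.0 / (2 * (n + kappa))
    weights = [w0, wi, wi]
    ys = [f(s) for s in sigma]
    y_mean = sum(w * y for w, y in zip(weights, ys))
    y_var = sum(w * (y - y_mean) ** 2 for w, y in zip(weights, ys))
    return y_mean, y_var
```

For a linear f the transform is exact, which is an easy sanity check; for nonlinear f it captures the transformed mean and variance to second order.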

Bayesian Sensor Fusion of Monocular Vision and Laser Structured Light Sensor for Robust Localization of a Mobile Robot (이동 로봇의 강인 위치 추정을 위한 단안 비젼 센서와 레이저 구조광 센서의 베이시안 센서융합)

  • Kim, Min-Young;Ahn, Sang-Tae;Cho, Hyung-Suck
    • Journal of Institute of Control, Robotics and Systems / v.16 no.4 / pp.381-390 / 2010
  • This paper describes a procedure for map-based localization of mobile robots using a sensor fusion technique in structured environments. Combining sensors with different characteristics and limited sensing capabilities is advantageous because the sensors complement and cooperate with each other to obtain better information on the environment. For robust self-localization of a mobile robot equipped with a monocular camera and a laser structured-light sensor, environment information acquired from the two sensors is fused by a Bayesian technique based on a probabilistic reliability function of each sensor, predefined through experiments. For self-localization with the monocular camera, the robot uses image features consisting of vertical edge lines extracted from the camera images as natural landmark points. The laser structured-light sensor instead uses geometric features, corners and planes extracted from range data at a constant height above the floor, as natural landmark shapes. Although each feature group alone is sometimes sufficient to localize the robot, all features from both sensors are used and fused simultaneously for reliable localization under various environmental conditions. To verify the advantage of multi-sensor fusion, a series of experiments is performed, and the results are discussed in detail.
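If each sensor's reliability is expressed as a Gaussian uncertainty, the Bayesian fusion of two independent pose estimates reduces to the standard product-of-Gaussians (inverse-variance weighting) rule. This is a generic sketch of that rule, not the paper's specific reliability functions:

```python
def bayes_fuse(est_a, var_a, est_b, var_b):
    """Fuse two independent Gaussian estimates of the same quantity
    (e.g. vision-based and laser-based pose): inverse variances act
    as reliability weights, and the fused variance shrinks."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var
```

Note that the fused variance is always smaller than either input variance, which is the quantitative payoff of using both sensors.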

A Remote Control of 6 d.o.f. Robot Arm Based on 2D Vision Sensor (2D 영상센서 기반 6축 로봇 팔 원격제어)

  • Hyun, Woong-Keun
    • The Journal of the Korea institute of electronic communication sciences / v.17 no.5 / pp.933-940 / 2022
  • In this paper, an algorithm was developed to recognize the 3D hand position through a 2D image sensor, and a system was implemented to remotely control a 6-d.o.f. robot arm with it. The system consists of a camera that acquires the hand position in 2D and a computer that controls the robot arm based on the recognized position. The image sensor recognizes the specific color of a glove worn on the operator's hand and outputs the recognized region and position as a rectangle enclosing the glove's color area. From the position and size of the detected rectangle, we derive the velocity vector of the end effector and control the robot arm. Through several experiments with the developed 6-axis robot, it was confirmed that remote control of the 6-d.o.f. robot arm was performed successfully.
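A plausible mapping from the detected glove rectangle to an end-effector velocity is to let the image-plane offset drive the lateral axes and the apparent size drive depth. All gains, reference values, and the function name below are illustrative assumptions, not the paper's calibration:

```python
def rect_to_velocity(cx, cy, area, img_w=640, img_h=480,
                     ref_area=5000.0, gain=0.002):
    """Map the detected rectangle's centre (cx, cy) and pixel area to
    an end-effector velocity command (vx, vy, vz)."""
    vx = gain * (cx - img_w / 2)            # offset right of centre -> move right
    vy = gain * (img_h / 2 - cy)            # image y grows downward -> flip sign
    vz = gain * (ref_area - area) / 100.0   # glove closer (larger area) -> retreat
    return vx, vy, vz
```

With the glove centred and at the reference distance, all three velocity components are zero, so the arm holds its pose.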

Design of a Six-axis Force/moment Sensor for Wrist Twist-exercise Rehabilitation Robot (손목회전운동 재활로봇을 위한 6축 힘/모멘트센서 설계)

  • Kim, Hyeon Min;Kim, Gab Soon
    • Journal of the Korean Society for Precision Engineering / v.30 no.5 / pp.529-536 / 2013
  • Most serious stroke patients suffer paralysis of the wrist and cannot use their hands freely, but wrist function can be recovered through rehabilitation exercise. Currently, professional rehabilitation therapists help stroke patients exercise their wrists in hospital; however, because therapists are far fewer in number than stroke patients, it is difficult to provide this rehabilitation to everyone. Therefore, a wrist twist-exercise rehabilitation robot that can measure the twisting force of the patient's wrist is needed. In this paper, a six-axis force/moment sensor was designed for such a robot. In tests, the interference error of the six-axis force/moment sensor was less than 0.85%, so the sensor is considered suitable for measuring the wrist twisting force of a patient.
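The interference (cross-talk) error quoted above is conventionally computed as the largest off-axis output, expressed as a percentage of the full-scale output on the loaded axis. A simplified sketch of that calculation (the function name and single-axis loading assumption are mine, not the paper's):

```python
def interference_error(response, applied_axis):
    """Given the six channel outputs when a single axis is loaded to
    full scale, return the worst off-axis output as a percentage of
    the loaded axis's full-scale output."""
    full_scale = abs(response[applied_axis])
    return max(abs(v) / full_scale * 100.0
               for i, v in enumerate(response) if i != applied_axis)
```

Under this definition, the paper's result means every cross-talk channel stayed below 0.85% of full scale.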

Human-oriented programming technology for articulated robots using a force/torque sensor

  • Kang, Hyo-Sig;Park, Jong-Oh;Baek, Yoon-Su
    • Institute of Control, Robotics and Systems Conference Proceedings / 1992.10b / pp.96-99 / 1992
  • Currently, there are various robot programming methods for articulated robots. Although each method has its merits and drawbacks, they share a common weak point for practical application, which becomes even more pronounced when the programming must capture the subtle feel of a human operator: human movement is synthetic, while robot programming is analytic. Present programming methods therefore have limits in reproducing such subtle robot movements. In this paper, we propose a direct robot programming method that generates robot programs from the force/torque vector the human operator applies to a force/torque sensor. The method reduces the effort required in robot programming.
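Lead-through programming of this kind typically works by converting the operator's applied wrench into an incremental robot motion and recording the visited poses as program points. The gain, deadband, and function name below are illustrative assumptions rather than the paper's controller:

```python
def admittance_step(pose, wrench, dt, gain=0.01, deadband=2.0):
    """One admittance-style update: each force/torque component above
    the deadband nudges the corresponding pose component; components
    below the deadband (sensor noise, hand tremor) are ignored."""
    step = [gain * f * dt if abs(f) > deadband else 0.0 for f in wrench]
    return [p + s for p, s in zip(pose, step)]
```

Calling this in a loop while the operator pushes on the sensor, and logging the returned poses, yields the recorded program.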


Fuzzy Inference Based Collision Free Navigation of a Mobile Robot using Sensor Fusion (퍼지추론기반 센서융합 이동로봇의 장애물 회피 주행기법)

  • Jin, Taeseok
    • Journal of the Korean Society of Industry Convergence / v.21 no.2 / pp.95-101 / 2018
  • This paper presents collision-free mobile robot navigation based on a fuzzy inference fusion model in unknown environments using multiple ultrasonic sensors. Six ultrasonic sensors are used for the collision avoidance approach, while a CCD camera sensor is used for the trajectory following approach. The fuzzy system is composed of three inputs, derived from the six distance sensors and the camera; two outputs, the left and right wheel velocities of the mobile robot; and three cost functions for the robot's movement: direction, obstacle avoidance, and rotation. To evaluate the proposed algorithm, we performed real experiments with a mobile robot equipped with ultrasonic sensors. The results show that the proposed algorithm can identify obstacles in unknown environments and guide the robot safely to the goal location.
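The flavor of such a fuzzy rule base can be shown with one triangular membership function and one rule per side: the nearer an obstacle on one side, the more the opposite wheel slows, steering the robot away. The memberships, rules, and speeds below are illustrative, not the paper's rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def avoid(left_dist, right_dist, base_speed=0.3):
    """Fuzzy-style avoidance: degree of 'obstacle near' on each side
    (distances in metres) scales down the opposite wheel's velocity,
    turning the robot away from the nearer obstacle."""
    near_l = tri(left_dist, -0.1, 0.0, 1.0)
    near_r = tri(right_dist, -0.1, 0.0, 1.0)
    v_left = base_speed * (1.0 - near_r)    # slow left wheel -> turn left
    v_right = base_speed * (1.0 - near_l)   # slow right wheel -> turn right
    return v_left, v_right
```

With obstacles far on both sides, both wheels run at base speed; a wall touching the left side stops the right wheel, pivoting the robot rightward.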

Localization and mapmaking of a mobile robot (이동 로봇의 위치추정과 지도작성)

  • Yun, Dong-Woo;Oh, Sung-Nam;Kim, Kab-Il;Son, Young-Ik
    • Proceedings of the KIEE Conference / 2007.04a / pp.352-354 / 2007
  • This paper presents a method to estimate the position of a mobile robot using a gyro sensor and accelerometers mounted on it, together with a mapmaking algorithm, based on contact sensors, for the indoor environment in which the robot moves. The robot's direction is estimated from the gyro sensor and the travelled distance from the accelerometers, from which its position can be estimated. Using the direction and distance values, vector-based mapmaking can be performed. Tactile sensors help the robot recognize the boundary of the indoor environment and determine the outer wall line of the map.
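The dead-reckoning step implied here integrates the gyro rate for heading and the accelerometer for speed, then advances the 2-D position. This flat-floor, bias-free sketch is an illustration of the principle, not the paper's estimator:

```python
import math

def dead_reckon(x, y, heading, gyro_rate, accel, v, dt):
    """One dead-reckoning update: heading from integrated gyro rate
    (rad/s), speed from integrated acceleration (m/s^2), position from
    the distance travelled along the current heading."""
    heading += gyro_rate * dt
    v += accel * dt
    d = v * dt                      # distance covered this step
    x += d * math.cos(heading)
    y += d * math.sin(heading)
    return x, y, heading, v
```

In practice both integrations drift, which is exactly why the paper adds tactile contact with walls to anchor the map's outer boundary.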


Designing Factory Safety Monitoring Robot Using Microsoft Robotic Studio

  • Loh, Byoung-Gook
    • International Journal of Safety / v.7 no.1 / pp.1-4 / 2008
  • An application of Microsoft Robotics Studio (MSRS) to the design of a factory safety monitoring robot is presented. The basic structure of MSRS and of the service, the key building block of MSRS, is introduced. The safety monitoring robot is controlled using four basic services: 1) the robot service, which communicates with the embedded microprocessor and the other services; 2) the sensor service, which notifies subscribing services of changes in the sensor value; 3) the motor service, which controls the power levels of the motors; and 4) the drive service, which maneuvers the robot. With the built-in capabilities of MSRS, control of the factory safety monitoring robot can be performed more easily.
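The "notify subscribers on sensor change" behavior of the sensor service above is a plain publish/subscribe pattern. This is a generic Python illustration of that pattern only; it is not the actual MSRS service API:

```python
class SensorService:
    """Minimal publish/subscribe service: subscribers register a
    callback and are notified only when the sensor value changes."""
    def __init__(self):
        self._value = None
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def update(self, value):
        if value != self._value:        # suppress duplicate readings
            self._value = value
            for cb in self._subscribers:
                cb(value)
```

A drive or motor service would subscribe to this and react to each change, mirroring the service composition the paper describes.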