• Title/Summary/Keyword: Robot Sensor

Design and fabrication of robot's finger 3-axis force sensor for grasping an unknown object (미지물체를 잡기 위한 로봇 손가락의 3축 힘감지센서 설계 및 제작)

  • 김갑순
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2002.05a
    • /
    • pp.229-232
    • /
    • 2002
  • This paper describes the development of a robot-finger 3-axis force sensor that detects Fx, Fy, and Fz simultaneously for stably grasping an unknown object. In order to safely grasp an unknown object, the robot's fingers should detect the force in the gripping direction and the force in the gravity direction, and perform force control using the detected forces. A 3-axis force sensor that detects Fx, Fy, and Fz simultaneously is needed to accurately detect the weight of an unknown object in the gravity direction. Thus, in this paper, a robot finger for stably grasping an unknown object is developed, and the 3-axis force sensor used to construct it is newly modeled using several parallel-plate beams and fabricated. The sensor is also calibrated and evaluated.

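The calibration mentioned in the abstract above typically amounts to mapping the raw strain-gauge bridge outputs to forces through a decoupling matrix. The sketch below is a minimal, hypothetical illustration of that idea in Python; the simulated voltages, the sensitivity values, and the function names are assumptions for illustration, not data from the paper.

```python
import numpy as np

# Hypothetical calibration: least-squares fit of a 3x3 decoupling matrix C
# so that force = C @ bridge_voltages for a 3-axis (Fx, Fy, Fz) finger sensor.

def fit_calibration(voltages, forces):
    """voltages: (N, 3) bridge outputs, forces: (N, 3) known applied loads."""
    # Solve V @ C.T ≈ F for C in the least-squares sense.
    C_T, *_ = np.linalg.lstsq(voltages, forces, rcond=None)
    return C_T.T

def measure_force(C, voltage_sample):
    """Convert one raw 3-channel bridge reading into (Fx, Fy, Fz)."""
    return C @ np.asarray(voltage_sample)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_C = np.array([[9.8, 0.2, 0.1],
                       [0.1, 10.2, 0.3],
                       [0.2, 0.1, 9.9]])          # assumed N/V sensitivities
    V = rng.uniform(-1.0, 1.0, size=(50, 3))      # simulated bridge voltages
    F = V @ true_C.T + rng.normal(0, 0.02, (50, 3))
    C = fit_calibration(V, F)
    print("Fx, Fy, Fz =", measure_force(C, [0.5, -0.1, 0.3]))
```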

Simulation of Mobile Robot Navigation based on Multi-Sensor Data Fusion by Probabilistic Model

  • Jin, Tae-seok
    • Journal of the Korean Society of Industry Convergence
    • /
    • v.21 no.4
    • /
    • pp.167-174
    • /
    • 2018
  • Exploration of an unknown environment is an important task in the development of mobile robots, and mobile robots are navigated by a number of methods using sensing systems such as sonar or vision. To fully exploit the strengths of both sonar and visual sensing, multi-sensor data fusion (MSDF) has become a useful method for navigation and collision avoidance in mobile robotics, and its applicability to map building and navigation has been exploited in recent years. This work is a preliminary step toward a multi-purpose autonomous carrier mobile robot intended to transport trolleys or heavy goods and to serve as a robotic nursing assistant in hospital wards. The aim of this paper is to present the use of multi-sensor data fusion of ultrasonic and IR sensors for mobile robot navigation, and to present an experimental mobile robot designed to operate autonomously within indoor environments. Simulation results with a mobile robot demonstrate the effectiveness of the discussed methods.
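
As a rough illustration of the kind of multi-sensor data fusion referred to above, the sketch below fuses an ultrasonic and an IR range reading of the same obstacle by inverse-variance weighting, a minimal probabilistic fusion rule. The noise levels and readings are assumed values, not parameters from the paper.

```python
import numpy as np

def fuse_ranges(readings, sigmas):
    """Inverse-variance (maximum-likelihood) fusion of independent range readings.

    readings: distances to the same obstacle from different sensors [m]
    sigmas:   corresponding measurement noise standard deviations [m]
    Returns the fused distance estimate and its standard deviation.
    """
    readings = np.asarray(readings, dtype=float)
    weights = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    fused = np.sum(weights * readings) / np.sum(weights)
    fused_sigma = np.sqrt(1.0 / np.sum(weights))
    return fused, fused_sigma

if __name__ == "__main__":
    # Assumed example: ultrasonic reads 1.02 m (sigma 0.03), IR reads 0.95 m (sigma 0.05).
    d, s = fuse_ranges([1.02, 0.95], [0.03, 0.05])
    print(f"fused distance = {d:.3f} m +/- {s:.3f} m")
```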

Design of a Robot's Hand with Two 3-Axis Force Sensor for Grasping an Unknown Object

  • Kim, Gab-Soon
    • International Journal of Precision Engineering and Manufacturing
    • /
    • v.4 no.3
    • /
    • pp.12-19
    • /
    • 2003
  • This paper describes the design of a robot hand with two fingers for stably grasping an unknown object, and the development of the 3-axis force sensor needed to construct the fingers. In order to safely grasp an unknown object using the robot's fingers, they should measure the forces in the gripping and gravity directions and control the measured forces. A 3-axis force sensor is required to accurately measure the weight of an unknown object in the gravity direction. Thus, in this paper, a robot hand with two fingers for stably grasping an unknown object is designed, and the 3-axis force sensor is newly modeled and fabricated using several parallel-plate beams.
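
One common way to use such finger force measurements for stable grasping, offered here only as a hedged sketch rather than the paper's control law, is to regulate the gripping force so that the friction it can generate exceeds the measured gravity-direction load. The friction coefficient, safety margin, and gain below are assumptions.

```python
def grip_force_step(f_grip_measured, f_gravity_measured,
                    mu=0.4, margin=1.5, gain=0.5):
    """One step of a simple proportional grip-force regulator (assumed law).

    f_grip_measured:    current normal (gripping) force from the finger sensor [N]
    f_gravity_measured: measured load along the gravity direction (Fz) [N]
    mu:                 assumed finger-object friction coefficient
    margin:             assumed safety factor against slip
    Returns the commanded change in gripping force [N].
    """
    # Minimum normal force so that friction (mu * N) holds the object, with margin.
    f_grip_required = margin * abs(f_gravity_measured) / mu
    error = f_grip_required - f_grip_measured
    return gain * error   # positive -> tighten, negative -> relax

if __name__ == "__main__":
    # Assumed readings: 3 N grip, 2 N object weight sensed along gravity.
    print("grip force adjustment:", grip_force_step(3.0, 2.0), "N")
```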

Indoor Localization of a Mobile Robot Using External Sensor (외부 센서를 이용한 이동 로봇 실내 위치 추정)

  • Ko, Nak-Yong;Kim, Tae-Gyun
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.16 no.5
    • /
    • pp.420-427
    • /
    • 2010
  • This paper describes a localization method for a mobile robot based on the Monte Carlo Localization (MCL) approach. The method uses range data measured from ultrasound-transmitting beacons whose locations are given a priori; an on-board ultrasound receiver detects the range to each beacon. The method requires several beacons, theoretically at least three, and proposes a sensor model for the range sensing based on statistical analysis of the sensor output. The experiment uses commercial beacons and a detector intended for trilateration localization. The performance of the proposed method is verified through a real implementation. In particular, it is shown that localization performance degrades as the sensor update rate decreases relative to the MCL algorithm's update rate. Although the method requires the exact locations of the beacons, it does not require a geometric map of the environment, and it is also applicable to simultaneous estimation of both the beacon and robot locations.
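
To make the beacon-range MCL idea concrete, here is a compact particle-filter sketch: particles are propagated with odometry, weighted by a Gaussian likelihood of the measured beacon ranges, and resampled. The beacon coordinates, noise levels, and particle count are assumed for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
BEACONS = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])   # assumed beacon map [m]
N = 500                                                     # number of particles

def predict(particles, dx, dy, motion_sigma=0.05):
    """Propagate particles by the odometry displacement plus Gaussian noise."""
    return particles + np.array([dx, dy]) + rng.normal(0, motion_sigma, particles.shape)

def update(particles, ranges, range_sigma=0.10):
    """Weight particles by a Gaussian likelihood of the measured beacon ranges."""
    d = np.linalg.norm(particles[:, None, :] - BEACONS[None, :, :], axis=2)  # (N, #beacons)
    w = np.exp(-0.5 * np.sum(((d - ranges) / range_sigma) ** 2, axis=1))
    w += 1e-300                      # avoid an all-zero weight vector
    return w / w.sum()

def resample(particles, weights):
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

if __name__ == "__main__":
    particles = rng.uniform(0, 5, size=(N, 2))               # uniform initial belief
    true_pose = np.array([2.0, 3.0])
    ranges = np.linalg.norm(BEACONS - true_pose, axis=1)     # simulated measurement
    particles = predict(particles, 0.0, 0.0)
    particles = resample(particles, update(particles, ranges))
    print("estimated pose:", particles.mean(axis=0))
```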

Control and Calibration for Robot Navigation based on Light's Panel Landmark (천장 전등패널 기반 로봇의 주행오차 보정과 제어)

  • Jin, Tae-Seok
    • Journal of the Korean Society of Industry Convergence
    • /
    • v.20 no.2
    • /
    • pp.89-95
    • /
    • 2017
  • In this paper, we suggest a method for a mobile robot to move safely from an initial position to a goal position in a wide environment such as a building. Estimating the position of a mobile robot in such an environment with odometry encoders alone is problematic: because of wheel slip, encoder measurement error accumulates over time, so the error must be compensated using another sensor. A vision sensor is used to correct the position of the mobile robot using the light panels regularly attached to the building's ceiling. For global path planning, the building's map is modeled as a graph, so Floyd's shortest-path algorithm can be applied to find the path. The effectiveness of the method is verified through simulations and experiments.
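
The global planning step described above, modeling the building map as a graph and applying Floyd's shortest-path algorithm, can be sketched as follows. The node labels and edge weights form an assumed toy map, not the building used in the paper.

```python
import math

def floyd_warshall(nodes, edges):
    """All-pairs shortest paths (Floyd-Warshall) with path reconstruction.

    nodes: list of node names
    edges: dict {(u, v): distance} for undirected corridors
    Returns (dist, next_hop) tables.
    """
    dist = {u: {v: (0.0 if u == v else math.inf) for v in nodes} for u in nodes}
    nxt = {u: {v: None for v in nodes} for u in nodes}
    for (u, v), w in edges.items():
        dist[u][v] = dist[v][u] = w
        nxt[u][v], nxt[v][u] = v, u
    for k in nodes:
        for i in nodes:
            for j in nodes:
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
                    nxt[i][j] = nxt[i][k]
    return dist, nxt

def path(nxt, u, v):
    """Reconstruct the node sequence from u to v."""
    if nxt[u][v] is None:
        return []
    route = [u]
    while u != v:
        u = nxt[u][v]
        route.append(u)
    return route

if __name__ == "__main__":
    # Assumed toy building graph: corridor junctions A-E with distances in meters.
    nodes = ["A", "B", "C", "D", "E"]
    edges = {("A", "B"): 4, ("B", "C"): 3, ("A", "D"): 7, ("D", "E"): 2, ("C", "E"): 5}
    dist, nxt = floyd_warshall(nodes, edges)
    print("A -> E:", path(nxt, "A", "E"), "length", dist["A"]["E"])
```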

Fuzzy Steering Controller for Outdoor Autonomous Mobile Robot using MR sensor (MR센서를 이용한 실외형 자율이동 로봇의 퍼지 조향제어기에 관한 연구)

  • Kim, Jeong-Heui;Son, Seok-Jun;Lim, Young-Chelo;Kim, Tae-Gon;Kim, Eui-Sun;Ryoo, Young-Jae
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.12 no.1
    • /
    • pp.27-32
    • /
    • 2002
  • This paper describes a fuzzy steering controller for an outdoor autonomous mobile robot using an MR (magneto-resistive) sensor. Using the magnetic field difference values (dBy, dBz) obtained from the MR sensor, we designed a fuzzy logic controller that keeps the robot on the road center and proposed a method to eliminate the Earth's magnetic field. To develop an autonomous mobile robot simulation program, we modeled the MR sensor, the mobile robot, and the coordinate transformation. A computer simulation of the robot, including mobile robot dynamics and steering, was used to verify the driving performance of the fuzzy logic controller, confirming the robustness of the proposed fuzzy controller.
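
A minimal sketch of the kind of fuzzy steering rule base described above, simplified to a single input: triangular membership functions over the field difference dBy and a weighted-average (Sugeno-style) defuzzification to a steering angle. The membership ranges, rule table, and output angles are assumptions for illustration, not the controller from the paper (which also uses dBz).

```python
def tri(x, a, b, c):
    """Triangular membership function with corners a <= b <= c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_steering(dBy):
    """Sugeno-style fuzzy steering from the lateral field difference dBy.

    Three assumed fuzzy sets on dBy (normalized to [-1, 1]) map to singleton
    steering angles in degrees; the output is the firing-strength-weighted mean.
    """
    rules = [
        (tri(dBy, -1.5, -1.0, 0.0), +20.0),   # dBy negative  -> steer one way
        (tri(dBy, -1.0,  0.0, 1.0),   0.0),   # dBy near zero -> go straight
        (tri(dBy,  0.0,  1.0, 1.5), -20.0),   # dBy positive  -> steer the other way
    ]
    num = sum(w * angle for w, angle in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0

if __name__ == "__main__":
    for d in (-0.8, -0.2, 0.0, 0.5):
        print(f"dBy = {d:+.1f}  ->  steering = {fuzzy_steering(d):+.1f} deg")
```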

Network Based Robot Simulator Implementing Uncertainties in Robot Motion and Sensing (로봇의 이동 및 센싱 불확실성이 고려된 네트워크 기반 로봇 시뮬레이션 프로그램)

  • Seo, Dong-Jin;Ko, Nak-Yong;Jung, Se-Woong;Lee, Jong-Bae
    • The Journal of Korea Robotics Society
    • /
    • v.5 no.1
    • /
    • pp.23-31
    • /
    • 2010
  • This paper suggests a multiple-robot simulator that considers the uncertainties in robot motion and sensing. A mobile robot moves with errors caused by uncertainties from actuators, wheels, electrical components, and the environment. In addition, the sensors attached to a mobile robot cannot produce accurate output because of uncertainties in the sensor itself and in the environment. These uncertainties in motion and sensing make it difficult for researchers to build mobile robot navigation algorithms; an algorithm that ignores unexpected uncertainties generally fails to control the robot in a real working environment, leading to trouble and damage. Thus, the authors propose a simulator model that includes robot motion and sensing uncertainties to help in building robust algorithms. Sensor uncertainties are applied to range sensors, which are widely used in mobile robot localization, obstacle detection, and map building. The paper shows the performance of the proposed simulator by comparing it with a simulator without any uncertainty.
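
The motion and sensing uncertainties such a simulator injects can be sketched, in a hedged way, as Gaussian noise added to the commanded wheel velocities and to ideal range readings. The noise magnitudes and the dropout model below are assumptions, not the simulator's actual error models.

```python
import numpy as np

rng = np.random.default_rng(2)

def noisy_motion(pose, v, w, dt, sigma_v=0.02, sigma_w=0.01):
    """Unicycle motion update with Gaussian noise on the commanded velocities.

    pose = (x, y, theta); v = linear velocity [m/s]; w = angular velocity [rad/s].
    """
    x, y, th = pose
    v_n = v + rng.normal(0, sigma_v)
    w_n = w + rng.normal(0, sigma_w)
    return (x + v_n * dt * np.cos(th),
            y + v_n * dt * np.sin(th),
            th + w_n * dt)

def noisy_range(true_distance, sigma=0.03, dropout_prob=0.02, max_range=4.0):
    """Range sensor model: Gaussian noise plus occasional max-range dropouts."""
    if rng.random() < dropout_prob:
        return max_range                      # missed echo -> report max range
    return float(np.clip(true_distance + rng.normal(0, sigma), 0.0, max_range))

if __name__ == "__main__":
    pose = (0.0, 0.0, 0.0)
    for _ in range(5):
        pose = noisy_motion(pose, v=0.3, w=0.1, dt=0.1)
    print("pose after 0.5 s:", tuple(round(p, 3) for p in pose))
    print("simulated range:", noisy_range(1.50))
```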

Vision Sensor and Ultrasonic Sensor Fusion Using Neural Network

  • Baek, Sang-Hoon;Oh, Se-Young
    • Proceedings of the Institute of Control, Robotics and Systems Conference
    • /
    • 2004.08a
    • /
    • pp.668-671
    • /
    • 2004
  • This paper proposes a new method of fusing an ultrasonic sensor and a vision sensor at the sensor level. A typical vision system finds the edges of objects, while a typical ultrasonic system finds the absolute distance between the robot and an object, so the method integrates data of two different types to produce a complete output for robot control. The paper proposes not only integrating different kinds of data but also fusing the information received from the different kinds of sensors. The advantages of this method are that the algorithm is simple to implement and that it can control the robot in real time.

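As an illustration of neural-network sensor fusion of the kind described above, the sketch below trains a tiny two-input MLP that fuses a vision edge-strength feature with a normalized ultrasonic distance into an obstacle probability. The synthetic data, network size, and training setup are assumptions, not the network used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed toy fusion task: from a vision edge-strength feature and a normalized
# ultrasonic distance, predict whether an obstacle is close.
X = rng.uniform(0, 1, size=(200, 2))                    # [edge_strength, distance_norm]
y = ((X[:, 0] > 0.5) & (X[:, 1] < 0.4)).astype(float)   # synthetic labels

# One-hidden-layer MLP trained with plain gradient descent.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.5

for _ in range(2000):
    h = np.tanh(X @ W1 + b1)                 # hidden activations
    p = sigmoid(h @ W2 + b2).ravel()         # fused obstacle probability
    grad_out = (p - y)[:, None] / len(X)     # cross-entropy gradient at the output
    grad_h = (grad_out @ W2.T) * (1 - h ** 2)
    W2 -= lr * h.T @ grad_out
    b2 -= lr * grad_out.sum(axis=0)
    W1 -= lr * X.T @ grad_h
    b1 -= lr * grad_h.sum(axis=0)

# Fused decision for one assumed sample: strong edge, near ultrasonic reading.
sample = np.array([[0.8, 0.2]])
prob = sigmoid(np.tanh(sample @ W1 + b1) @ W2 + b2)
print("obstacle probability:", float(prob[0, 0]))
```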

A Data Fusion Method of Odometry Information and Distance Sensor for Effective Obstacle Avoidance of an Autonomous Mobile Robot (자율이동로봇의 효율적인 충돌회피를 위한 오도메트리 정보와 거리센서 데이터 융합기법)

  • Seo, Dong-Jin;Ko, Nak-Yong
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.57 no.4
    • /
    • pp.686-691
    • /
    • 2008
  • This paper proposes the concept of "virtual sensor data" and its application to real-time obstacle avoidance. The virtual sensor data is a virtual distance that accounts for the movement of the obstacle as well as that of the robot. In practice, the virtual sensor data is calculated from the odometry data and the range sensor data, and it can be used in any method that relies on distance data for collision avoidance. Since the virtual sensor data considers the movement of both the robot and the obstacle, methods that use it produce smoother and safer collision-free motion.
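
The abstract does not give the exact formula, so the sketch below shows one possible interpretation of "virtual sensor data": the previous range return is re-expressed from the current pose using odometry, the remaining change is attributed to obstacle motion, and that motion is extrapolated a short horizon ahead. The geometry, bearing handling, and horizon are assumptions, not the paper's definition.

```python
import numpy as np

def obstacle_point(pose, bearing, distance):
    """World-frame position of a range return measured at a given robot pose."""
    x, y, th = pose
    a = th + bearing
    return np.array([x + distance * np.cos(a), y + distance * np.sin(a)])

def virtual_range(prev_pose, prev_range, curr_pose, curr_range,
                  bearing, dt, horizon):
    """One possible 'virtual sensor data' formulation (assumed, see lead-in).

    The previous return is re-expressed from the current pose using odometry;
    the remaining change is attributed to obstacle motion and extrapolated
    'horizon' seconds ahead to give a motion-aware virtual distance.
    """
    prev_pt = obstacle_point(prev_pose, bearing, prev_range)
    expected = np.linalg.norm(prev_pt - np.array(curr_pose[:2]))  # if obstacle static
    closing_rate = (curr_range - expected) / dt                   # obstacle's own motion
    return max(0.0, curr_range + closing_rate * horizon)

if __name__ == "__main__":
    # Assumed example: robot advanced 0.1 m along x while the obstacle also approached.
    prev_pose, curr_pose = (0.0, 0.0, 0.0), (0.1, 0.0, 0.0)
    print("virtual range:",
          virtual_range(prev_pose, 2.0, curr_pose, 1.85, bearing=0.0,
                        dt=0.1, horizon=0.3), "m")
```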

Landmark Detection Based on Sensor Fusion for Mobile Robot Navigation in a Varying Environment

  • Jin, Tae-Seok;Kim, Hyun-Sik;Kim, Jong-Wook
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.10 no.4
    • /
    • pp.281-286
    • /
    • 2010
  • We propose a space- and time-based sensor fusion method and a robust landmark detection algorithm based on sensor fusion for mobile robot navigation. To fully utilize the information from the sensors, this paper first proposes a new sensor-fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement. Exploration of an unknown environment is an important task for the new generation of mobile robots, which may navigate using a number of sensing systems such as sonar or vision. The newly proposed STSF (Space and Time Sensor Fusion) scheme is applied to landmark recognition for mobile robot navigation in unstructured as well as structured environments, and the experimental results demonstrate the performance of the landmark recognition.
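
A hedged sketch of the space-and-time fusion idea: range returns from the previous moment are transformed into the current robot frame using the odometry motion and then blended with the current returns. The gating distance and blending weight are assumed values, not parameters of the STSF scheme.

```python
import numpy as np

def to_current_frame(points_prev, dx, dy, dtheta):
    """Map points sensed in the previous robot frame into the current robot frame,
    given odometry motion (dx, dy, dtheta) expressed in the previous frame."""
    c, s = np.cos(dtheta), np.sin(dtheta)
    R_inv = np.array([[c, s], [-s, c]])          # rotation by -dtheta
    return (np.asarray(points_prev, dtype=float) - np.array([dx, dy])) @ R_inv.T

def stsf_fuse(points_prev, points_curr, dx, dy, dtheta, gate=0.10, w_prev=0.3):
    """Simplified space-and-time fusion: previous points are transformed into the
    current frame and, where they fall within 'gate' meters of a current point,
    blended with it; the rest are kept as extra evidence."""
    prev_in_curr = to_current_frame(points_prev, dx, dy, dtheta)
    curr = np.asarray(points_curr, dtype=float).copy()
    extras = []
    for p in prev_in_curr:
        d = np.linalg.norm(curr - p, axis=1)
        i = int(np.argmin(d))
        if d[i] < gate:
            curr[i] = (1 - w_prev) * curr[i] + w_prev * p   # temporal blending
        else:
            extras.append(p)
    return np.vstack([curr] + extras) if extras else curr

if __name__ == "__main__":
    prev_pts = [[1.0, 0.0], [2.0, 1.0]]          # landmarks seen at t-1 (previous frame)
    curr_pts = [[0.92, 0.01], [1.90, 1.02]]      # the same landmarks seen at t
    print(stsf_fuse(prev_pts, curr_pts, dx=0.1, dy=0.0, dtheta=0.0))
```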