• Title/Summary/Keyword: Robot Sensor

Search Results: 1,588

Mobile Robot Control with Image Tracking (영상 추적을 이용한 이동 로봇 제어)

  • Hong, Seon-Hack
    • Journal of the Institute of Electronics Engineers of Korea TE / v.42 no.4 / pp.33-40 / 2005
  • This paper presents stable path recognition using an ultrasonic sensor, which gathers data on the navigation environment, and a monocular image sensor, which generates self-localization information for the mobile robot. The proposed ultrasonic-sensor and vision-camera system recognizes the target and extracts the parameters needed to generate the world map and perform self-localization. An indoor mobile robot was developed and demonstrated to run stably in a corridor environment.

A New Hand-eye Calibration Technique to Compensate for the Lens Distortion Effect (렌즈왜곡효과를 보상하는 새로운 Hand-eye 보정기법)

  • Chung, Hoi-Bum
    • Proceedings of the KSME Conference / 2000.11a / pp.596-601 / 2000
  • In a robot/vision system, the vision sensor, typically a CCD array, is mounted on the robot hand. The problem of determining the relationship between the camera frame and the robot hand frame is referred to as hand-eye calibration. In the literature, various methods have been suggested for calibrating the camera and registering the sensor. Recently, a one-step approach that combines camera calibration and sensor registration was suggested by Horaud & Dornaika; in this approach, the camera's extrinsic parameters need not be determined at every robot configuration. In this paper, by modifying the camera model to include the lens-distortion effect in the perspective transformation matrix, a new one-step approach to hand-eye calibration is proposed.
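
The paper's key modification is folding the lens-distortion effect into the camera model. As a point of reference, here is a minimal sketch of how a radial-distortion term enters a standard pinhole projection (a common two-coefficient model; the function name and coefficient values are illustrative, not the paper's exact formulation):

```python
import math

def project_with_distortion(X, Y, Z, fx, fy, cx, cy, k1, k2=0.0):
    """Project a 3D camera-frame point through a pinhole model with a
    two-coefficient radial lens-distortion term."""
    x, y = X / Z, Y / Z                   # normalized pinhole coordinates
    r2 = x * x + y * y                    # squared radial distance
    d = 1.0 + k1 * r2 + k2 * r2 * r2      # radial distortion factor
    xd, yd = x * d, y * d                 # distorted normalized coordinates
    return fx * xd + cx, fy * yd + cy     # pixel coordinates

# With zero distortion the model reduces to the plain pinhole camera:
u0, v0 = project_with_distortion(0.1, 0.2, 1.0, 800, 800, 320, 240, k1=0.0)
# With k1 < 0 (barrel distortion), points are pulled toward the image center:
u1, v1 = project_with_distortion(0.1, 0.2, 1.0, 800, 800, 320, 240, k1=-0.2)
```

Calibrating hand-eye transforms against the undistorted model ignores exactly the residual this term captures, which is why the paper folds it into the perspective transformation matrix instead.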


A New Hand-eye Calibration Technique to Compensate for the Lens Distortion Effect (렌즈왜곡효과를 보상하는 새로운 hand-eye 보정기법)

  • Chung, Hoi-Bum
    • Journal of the Korean Society for Precision Engineering / v.19 no.1 / pp.172-179 / 2002
  • In a robot/vision system, the vision sensor, typically a CCD array, is mounted on the robot hand. The problem of determining the relationship between the camera frame and the robot hand frame is referred to as hand-eye calibration. In the literature, various methods have been suggested for calibrating the camera and registering the sensor. Recently, a one-step approach that combines camera calibration and sensor registration was suggested by Horaud & Dornaika; in this approach, the camera's extrinsic parameters need not be determined at every robot configuration. In this paper, by modifying the camera model to include the lens-distortion effect in the perspective transformation matrix, a new one-step approach to hand-eye calibration is proposed.

Development of 6-Axis Force/Moment Sensor Considered Adult Weight for a Humanoid Robot's Foot (성인 체중을 고려한 로봇의 지능형 발을 위한 6축 힘/모멘트센서 개발)

  • Kim, Gab-Soon;Yoon, Jung-Won
    • Journal of the Korean Society for Precision Engineering / v.24 no.7 s.196 / pp.90-97 / 2007
  • This paper describes the development of a 6-axis force/moment sensor, rated for adult weight, for the intelligent foot of a humanoid robot. In order to walk safely on uneven terrain, the foot should perceive the forces Fx, Fy, Fz and moments Mx, My, Mz applied to it and be controlled using those forces and moments. They should be measured by a 6-axis force/moment sensor attached to the foot, which combines an Fx sensor, Fy sensor, Fz sensor, Mx sensor, My sensor, and Mz sensor in one body. Each sensor requires a different rated load, because the forces and moments applied to the foot during walking differ; an important design goal is therefore to give each sensor a different rated load but the same rated output. In this paper, a 6-axis force/moment sensor (rated loads of 500 N for Fx and Fy, 1000 N for Fz, 18 N·m for Mx and My, and 8 N·m for Mz) for perceiving forces and moments in a humanoid robot's foot was developed using many parallel plate-beams (PPBs). The structure of the sensor was newly modeled, and its sensing elements (plate-beams) were designed using the ANSYS FEM (finite element method) software. The sensor was then fabricated by attaching strain gauges to the sensing elements, and a characterization test of the developed sensor was carried out. The rated outputs from the FEM analysis agree well with those from the test.
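
A multi-axis force/moment sensor of this kind is typically read out by multiplying its six strain-gauge bridge outputs by a 6×6 decoupling (calibration) matrix. A minimal sketch, using an illustrative diagonal matrix built from the rated loads quoted in the abstract and an assumed common rated output of 0.5 mV/V (the matrix values are not from the paper):

```python
def decouple(calib, volts):
    """Map six strain-gauge bridge outputs (V) to [Fx, Fy, Fz, Mx, My, Mz]
    through a 6x6 decoupling matrix (plain matrix-vector product)."""
    return [sum(c * v for c, v in zip(row, volts)) for row in calib]

# Rated loads from the abstract: Fx, Fy 500 N; Fz 1000 N; Mx, My 18 N·m; Mz 8 N·m.
rated = [500.0, 500.0, 1000.0, 18.0, 18.0, 8.0]
# Idealized diagonal calibration: full rated output (0.5 V here) -> rated load.
calib = [[rated[i] / 0.5 if i == j else 0.0 for j in range(6)] for i in range(6)]

loads = decouple(calib, [0.5, 0.0, 0.25, 0.0, 0.0, 0.5])
# loads -> [500.0, 0.0, 500.0, 0.0, 0.0, 8.0]
```

In a real sensor the off-diagonal entries are non-zero and capture the cross-axis interference that the paper's plate-beam design tries to minimize.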

Performance Comparison of Sensor-Programming Schemes According to the Shapes of Obstacles

  • Chung, Jong-In;Chae, Yi-Geun
    • International Journal of Internet, Broadcasting and Communication / v.13 no.3 / pp.56-62 / 2021
  • MSRDS (Microsoft Robotics Developer Studio) provides the ability to simulate autonomous robots, and its SPL (Simple Programming Language) provides many functions for sensor programming to control them. Sensor programming in SPL can be implemented in two types of scheme: the sensor-notification procedure and the while loop. After studying the advantages and disadvantages of each, we considered three programming schemes to control the robot's movement. We also created simulation environments to evaluate the performance of the three schemes in four different mazes. Each simulation environment consisted of a maze and a robot carrying the most powerful sensor, the LRF (Laser Range Finder). We measured the travel time and the robot actions (number of turns and number of collisions) needed to escape the maze, and compared the performance of the three schemes across the four mazes.
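
The two SPL schemes compared here differ mainly in control flow: a while loop polls the sensor on every iteration, whereas a notification scheme runs a handler only when the sensor reports. SPL itself is a .NET-based language, so the following Python sketch (all names hypothetical) only illustrates that contrast, not MSRDS APIs:

```python
# Polling (while-loop) style: the control loop asks the sensor each iteration.
def run_polling(read_range, act, steps):
    for _ in range(steps):
        r = read_range()
        act("turn" if r < 0.5 else "forward")

# Notification style: a subscribed handler fires only when the sensor reports.
class LrfNotifier:
    def __init__(self):
        self.handlers = []
    def subscribe(self, fn):
        self.handlers.append(fn)
    def publish(self, r):
        for fn in self.handlers:
            fn(r)

log = []
notifier = LrfNotifier()
notifier.subscribe(lambda r: log.append("turn" if r < 0.5 else "forward"))
notifier.publish(0.3)   # obstacle close -> handler decides to turn
notifier.publish(2.0)   # clear path -> handler drives forward
```

Polling gives the program deterministic timing at the cost of busy looping; notification reacts only to fresh data, which is the trade-off the paper's maze experiments measure.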

Detection of Implicit Walking Intention for Walking-assistant Robot Based on Analysis of Bio/Kinesthetic Sensor Signals (보행보조로봇을 위한 다중 생체/역학 센서의 신호 분석 및 사용자 의도 감지)

  • Jang, Eun-Hye;Chun, Byung-Tae;Chi, Su-Young;Lee, Jae-Yeon;Cho, Young-Jo
    • The Journal of Korea Robotics Society / v.5 no.4 / pp.294-301 / 2010
  • To produce a convenient robot for the aged and the lower-limb disabled, research is needed on detecting implicit walking intention and controlling the robot according to the user's intention. In this study, we developed a sensor module system to control a walking-assist robot using FSR sensors and a tilt sensor, and analyzed the signals acquired from the two sensor types. The sensor module system consisted of an assist-device control unit, a wired/wireless communication unit, an information collection unit, an information operation unit, and an information-processing PC that handles the integrated control of the assist device. The FSR sensors, attached to the user's palms and soles, sense force/pressure signals from these areas and are used to detect walking intention and walking state. The tilt sensor acquires roll and pitch signals from the lumbar region and reflects the pose of the upper body. By combining the FSR and tilt signals, we could recognize detailed walking intentions such as 'start walking', 'start of right or left foot forward', and 'stop walking'.
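
A detector of this kind can be approximated by simple rules over the two signal types: palm pressure plus forward lean signals walking, and an unloaded sole identifies the swinging foot. A hedged sketch; the thresholds and rule structure are illustrative, not the paper's classifier:

```python
def walking_intention(palm_fsr, left_sole, right_sole, pitch_deg,
                      fsr_on=0.2, lean_deg=5.0):
    """Rule-based sketch of intention detection from FSR and tilt signals.
    fsr_on (normalized pressure) and lean_deg (forward-lean threshold)
    are assumed values, not taken from the paper."""
    if palm_fsr > fsr_on and pitch_deg > lean_deg:
        # Gripping the handle and leaning forward -> intends to walk.
        if left_sole < fsr_on <= right_sole:
            return "start of left foot forward"    # left foot unloading
        if right_sole < fsr_on <= left_sole:
            return "start of right foot forward"   # right foot unloading
        return "start walking"
    return "stop walking"
```

Usage: `walking_intention(0.5, 0.05, 0.8, 10.0)` classifies a forward lean with an unloaded left sole as the left foot starting forward.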

Vision Based Sensor Fusion System of Biped Walking Robot for Environment Recognition (영상 기반 센서 융합을 이용한 이족로봇에서의 환경 인식 시스템의 개발)

  • Song, Hee-Jun;Lee, Seon-Gu;Kang, Tae-Gu;Kim, Dong-Won;Seo, Sam-Jun;Park, Gwi-Tae
    • Proceedings of the KIEE Conference / 2006.04a / pp.123-125 / 2006
  • This paper discusses a vision-based sensor-fusion system for biped robot walking. Most research on biped walking robots has focused on the walking algorithm itself; however, developing vision systems for them is an important and urgent issue, since biped walking robots are ultimately developed not only for research but to be used in real life. In this research, systems for environment recognition and tele-operation were developed for task assignment and execution by the biped robot, as well as for human-robot interaction (HRI). For carrying out tasks, an object-tracking system using a modified optical-flow algorithm and an obstacle-recognition system using enhanced template matching and a hierarchical support vector machine algorithm, fed by a wireless vision camera, are implemented and fused with the other sensors installed in the biped walking robot. Systems for robot manipulation and for communication with the user have also been developed.
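
As background for the obstacle-recognition component, plain template matching slides a small patch over the image and keeps the position with the lowest dissimilarity. A minimal sum-of-squared-differences version (the paper's "enhanced" matching and SVM stage are not reproduced here):

```python
def match_template(image, tmpl):
    """Exhaustive SSD template matching on 2D lists of gray values.
    Returns the top-left (row, col) of the best-matching window."""
    H, W = len(image), len(image[0])
    h, w = len(tmpl), len(tmpl[0])
    best, best_pos = None, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            ssd = sum((image[r + i][c + j] - tmpl[i][j]) ** 2
                      for i in range(h) for j in range(w))
            if best is None or ssd < best:   # keep the lowest dissimilarity
                best, best_pos = ssd, (r, c)
    return best_pos

# A 2x2 bright patch placed at row 2, col 3 of a dark 5x5 image:
img = [[0] * 5 for _ in range(5)]
img[2][3] = img[2][4] = img[3][3] = img[3][4] = 9
pos = match_template(img, [[9, 9], [9, 9]])
# pos -> (2, 3)
```

Exhaustive SSD is O(H·W·h·w), which is why practical systems add pyramids or pruning, the kind of enhancement the abstract alludes to.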


Self Localization of Mobile Robot Using Sonar Sensing and Map Building

  • Kim, Ji-Min;Lee, Ki-Seong;Jeong, Tae-Won
    • Proceedings of the Institute of Control, Robotics and Systems Conference / 2004.08a / pp.1931-1935 / 2004
  • Localization is a critical issue for a mobile robot, because it underlies the practical questions of what to do, where to move, and how to reach a goal. Many localization technologies (GPS, vision, sonar sensors, etc.) are already in use, but raising their accuracy brings increased hardware cost and power consumption. The core problem is therefore to develop a sensing algorithm that is accurate yet economical. We used an ultrasonic sensor to implement comparatively accurate localization at low cost: from the sensed data we build a grid map and estimate the position of the mobile robot. This paper presents that ultrasonic-sensor-based solution.
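
The grid map mentioned in the abstract is usually an occupancy grid: each sonar echo marks the cells along the beam as likely free and the cell at the echo distance as likely occupied. A deliberately simple ray-trace sketch (real sonar models widen the beam into a cone; all names and values here are illustrative):

```python
import math

def update_grid(grid, pose, bearing_deg, rng, cell=0.1):
    """Update a dict-based occupancy grid from one sonar reading.
    pose = (x, y, heading_rad); rng is the echo distance in meters."""
    x, y, th = pose
    a = th + math.radians(bearing_deg)        # beam direction in world frame
    for k in range(int(rng / cell)):          # cells before the echo: free
        cx = int((x + k * cell * math.cos(a)) / cell)
        cy = int((y + k * cell * math.sin(a)) / cell)
        grid[(cx, cy)] = min(grid.get((cx, cy), 0.5), 0.1)
    hx = int((x + rng * math.cos(a)) / cell)  # cell at the echo: occupied
    hy = int((y + rng * math.sin(a)) / cell)
    grid[(hx, hy)] = max(grid.get((hx, hy), 0.5), 0.9)

grid = {}
update_grid(grid, (0.0, 0.0, 0.0), 0.0, 0.5)  # echo at 0.5 m straight ahead
```

Localization then amounts to finding the robot pose whose predicted sonar readings best agree with this accumulated map.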


The Mobile Robot Localization Using a Single Sonar and Cylindrical Beacon (초음파 센서와 실린더형 등대를 이용한 이동 로봇의 위치 추정)

  • Beom, Hee-Rak;Cho, Hyung-Suck
    • Proceedings of the Korean Society of Precision Engineering Conference / 1993.10a / pp.570-574 / 1993
  • This paper proposes a new method of estimating the position and heading angle of a mobile robot moving on a flat surface. The proposed localization method uses two passive beacons and a single rotating ultrasonic sensor. The passive beacons consist of two cylinders with different diameters, which reflect the ultrasonic pulses coming from the sonar sensor mounted on the mobile robot. The geometric parameter set of the beacons is acquired from sonar scan data obtained at a single robot location using a new data-processing algorithm. From this parameter set, the position and heading angle of the mobile robot are determined directly. The performance and validity of the proposed method were evaluated using two beacons and a single sonar sensor attached to the pan-tilt device of a mobile robot, named LCAR, in our laboratory.
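
The underlying geometry is two-circle intersection: once the ranges to two beacons at known world positions are recovered from the scan, the robot's position lies where the two range circles cross, and the heading follows from a measured bearing. A geometry-only sketch under those assumptions (the paper itself extracts the ranges from the cylinders' differing diameters, which this sketch takes as given):

```python
import math

def localize(b1, b2, r1, r2, bearing1):
    """Estimate robot (x, y, heading) from ranges r1, r2 to beacons at
    known positions b1, b2, plus the measured bearing to beacon 1.
    Picks one of the two mirror-image circle intersections."""
    d = math.dist(b1, b2)
    a = (r1 * r1 - r2 * r2 + d * d) / (2 * d)   # offset along the baseline
    h = math.sqrt(max(r1 * r1 - a * a, 0.0))    # offset perpendicular to it
    ux, uy = (b2[0] - b1[0]) / d, (b2[1] - b1[1]) / d
    x = b1[0] + a * ux - h * uy                 # one of two intersections
    y = b1[1] + a * uy + h * ux
    heading = math.atan2(b1[1] - y, b1[0] - x) - bearing1
    return x, y, heading
```

Usage: a robot at (2, 1) between beacons at (0, 0) and (4, 0) measures equal ranges of √5 to both; `localize` recovers that position, and the heading follows from the bearing measurement.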


A Study on Odometry Error Compensation using Multisensor fusion for Mobile Robot Navigation (멀티센서 융합을 이용한 자율이동로봇의 주행기록계 에러 보상에 관한 연구)

  • Song, Sin-Woo;Park, Mun-Soo;Hong, Suk-Kyo
    • Proceedings of the KIEE Conference / 2001.11c / pp.288-291 / 2001
  • This paper presents effective odometry error compensation using multisensor fusion for accurate positioning of a mobile robot during navigation. During obstacle avoidance and wall following, position estimates obtained by odometry become unrealistic and useless because of accumulated errors. To measure the position and heading of the mobile robot accurately, an odometry sensor, a gyroscope, and an azimuth sensor are mounted on the robot, and a complementary filter is designed and implemented to compensate for the complementary drawbacks of each sensor and to fuse their information. The experimental results show that the multisensor fusion system estimates the robot's position and direction more accurately than odometry alone.
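
The complementary filter referenced here blends two heading sources with opposite failure modes: the integrated gyro rate is smooth but drifts, while the azimuth sensor is drift-free but noisy. A minimal one-axis sketch (the blend weight alpha and the drift-free-azimuth assumption are illustrative, not the paper's tuning):

```python
def complementary_filter(gyro_rates, azimuths, dt, alpha=0.98):
    """Fuse gyro rates (deg/s) with azimuth readings (deg):
    high-pass the integrated gyro, low-pass the azimuth."""
    heading = azimuths[0]
    out = []
    for w, az in zip(gyro_rates, azimuths):
        # Trust the gyro for fast changes, the azimuth for the long-term level.
        heading = alpha * (heading + w * dt) + (1 - alpha) * az
        out.append(heading)
    return out

# Constant 10 deg/s turn; the azimuth sensor reads the true heading:
true = [10 * 0.01 * (k + 1) for k in range(200)]
est = complementary_filter([10.0] * 200, true, dt=0.01)
```

After a short transient, the estimate tracks the true heading: the azimuth term continuously bleeds off whatever bias the gyro integration accumulates.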
