• Title/Summary/Keyword: robot systems


3-D vision sensor system for arc welding robot with coordinated motion by transputer system

  • Ishida, Hirofumi;Kasagami, Fumio;Ishimatsu, Takakazu
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 1993.10b
    • /
    • pp.446-450
    • /
    • 1993
  • In this paper, we propose an arc welding robot system in which two robots work in coordination and employ a vision sensor. One robot arm holds the welding target as a positioning device, while the other moves the welding torch. The vision sensor consists of two laser slit-ray projectors and one CCD TV camera and is mounted on top of one robot. It detects the three-dimensional shape of the groove on the target workpiece that needs to be welded, and the two robots move in coordination to trace the groove accurately. To realize fast image processing, five sets of high-speed parallel processing units (Transputers) are employed. The teaching tasks for the coordinated motions are simplified considerably thanks to this vision sensor. Experimental results demonstrate the applicability of our system.
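The groove measurement described above boils down to slit-ray triangulation: intersecting a camera ray with a calibrated laser plane. A minimal sketch (not the paper's implementation; the plane parameters and pixel ray are illustrative):

```python
import numpy as np

def triangulate_slit_point(pixel_ray, plane_normal, plane_d):
    """Intersect a camera ray through the optical center with the calibrated
    laser slit plane n . X = d to recover the 3-D surface point."""
    ray = np.asarray(pixel_ray, dtype=float)
    t = plane_d / np.dot(plane_normal, ray)  # ray parameter at the plane
    return t * ray

# Example: laser plane z = 0.5 m, ray through normalized pixel (0.1, 0.2, 1)
point = triangulate_slit_point([0.1, 0.2, 1.0], np.array([0.0, 0.0, 1.0]), 0.5)
```

Scanning the slit image column by column yields one such 3-D point per column, which traces the groove cross-section.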


Landmark Detection Based on Sensor Fusion for Mobile Robot Navigation in a Varying Environment

  • Jin, Tae-Seok;Kim, Hyun-Sik;Kim, Jong-Wook
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.10 no.4
    • /
    • pp.281-286
    • /
    • 2010
  • We propose a space- and time-based sensor fusion method and a robust landmark detection algorithm for mobile robot navigation. To fully utilize the information from the sensors, this paper first proposes a new sensor fusion technique in which data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement. Exploration of an unknown environment is an important task for the new generation of mobile robots, which may navigate by means of monitoring systems such as sonar or visual sensing. The newly proposed STSF (Space and Time Sensor Fusion) scheme is applied to landmark recognition for mobile robot navigation in both structured and unstructured environments, and the experimental results demonstrate its landmark recognition performance.
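The core of such space-and-time fusion is re-expressing past measurements in the current robot frame via the odometry increment before combining them. A sketch under that reading (the transform is standard rigid-body geometry; the fusion weight is an illustrative placeholder, not the STSF weighting):

```python
import numpy as np

def transform_to_current_frame(points_prev, dx, dy, dtheta):
    """Re-express 2-D sensor points taken at the previous pose in the current
    robot frame, given the odometry increment (dx, dy, dtheta)."""
    c, s = np.cos(dtheta), np.sin(dtheta)
    # subtract the translation, then rotate by -dtheta (row-vector convention)
    M_T = np.array([[c, -s],
                    [s,  c]])
    return (np.asarray(points_prev, dtype=float) - np.array([dx, dy])) @ M_T

def fuse(points_now, points_prev_aligned, w_now=0.7):
    """Confidence-weighted fusion of aligned past and present measurements."""
    return w_now * points_now + (1.0 - w_now) * points_prev_aligned
```

A point that was 1 m ahead before a 90-degree left turn, for example, ends up 1 m to the robot's right in the new frame.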

Human-Robot Interaction in Real Environments by Audio-Visual Integration

  • Kim, Hyun-Don;Choi, Jong-Suk;Kim, Mun-Sang
    • International Journal of Control, Automation, and Systems
    • /
    • v.5 no.1
    • /
    • pp.61-69
    • /
    • 2007
  • In this paper, we developed a reliable sound localization system, including a VAD (Voice Activity Detection) component that uses three microphones, as well as a face tracking system using a vision camera. Moreover, we propose a way to integrate the three systems for human-robot interaction, compensating for errors in speaker localization and effectively rejecting unnecessary speech or noise signals arriving from undesired directions. To verify the system's performance, we installed the proposed audio-visual system on a prototype robot called IROBAA (Intelligent ROBot for Active Audition) and demonstrated how the audio-visual components are integrated.
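The paper does not detail its VAD or localization algorithms; as a minimal stand-in, voice activity can be flagged by frame energy against a noise floor, and a far-field bearing recovered from the inter-microphone time delay. Both formulas below are textbook approximations, not the authors' method:

```python
import numpy as np

def voice_activity(frame, noise_rms, factor=3.0):
    """Flag a frame as speech when its RMS energy exceeds a multiple of
    the estimated noise floor (minimal energy-based VAD)."""
    rms = np.sqrt(np.mean(np.square(frame, dtype=float)))
    return rms > factor * noise_rms

def direction_from_delay(delay_s, mic_spacing_m, c=343.0):
    """Far-field source bearing (degrees) from the time delay between a
    microphone pair; c is the speed of sound in air."""
    return np.degrees(np.arcsin(np.clip(c * delay_s / mic_spacing_m, -1.0, 1.0)))
```

With three microphones, two such pairwise bearings can disambiguate front from back, which a single pair cannot.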

Trajectory Controller Design of Mobile Robot Systems based on Back-stepping Procedure (백스테핑을 이용한 이동 로봇의 경로 제어기의 설계)

  • 이기철;이성렬;류신형;고재원;박민용
    • Proceedings of the IEEK Conference
    • /
    • 2000.06e
    • /
    • pp.23-26
    • /
    • 2000
  • Wheel-driven mobile robot systems generally have nonholonomic constraints arising from their structure. These constraints are not integrable and cannot be written as time derivatives of functions of the generalized coordinates, so nonlinear approaches are required. In this paper, a trajectory controller for wheeled mobile robot systems is proposed that guarantees convergence to a reference trajectory. The controller is designed using the back-stepping scheme, which was recently introduced in nonlinear control theory. Its performance is verified via computer simulation, in which the controller is applied to a differentially driven robot and a car-like mobile robot under the assumption that a trajectory planner is given.
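A common form of back-stepping-derived tracking law for the unicycle (differentially driven) kinematics looks as follows; the specific controller and gains in the paper may differ, so treat the gains k_x, k_y, k_th here as illustrative:

```python
import numpy as np

def tracking_control(e_x, e_y, e_th, v_r, w_r, k_x=1.0, k_y=8.0, k_th=4.0):
    """Back-stepping-style tracking law for a unicycle-type mobile robot.
    (e_x, e_y, e_th) is the reference pose error expressed in the robot frame;
    (v_r, w_r) are the reference linear and angular velocities."""
    v = v_r * np.cos(e_th) + k_x * e_x
    w = w_r + v_r * (k_y * e_y + k_th * np.sin(e_th))
    return v, w
```

When the pose error is zero the commands reduce to the reference velocities, which is the feedforward property such trackers rely on.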


Development of Patrol Robot using DGPS and Curb Detection (DGPS와 연석추출을 이용한 순찰용 로봇의 개발)

  • Kim, Seung-Hun;Kim, Moon-June;Kang, Sung-Chul;Hong, Suk-Kyo;Roh, Chi-Won
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.13 no.2
    • /
    • pp.140-146
    • /
    • 2007
  • This paper describes the development of a mobile robot for patrol duty. We fuse differential GPS, angle sensor, and odometry data within an extended Kalman filter framework to localize the mobile robot in outdoor environments. An important feature of road environments is the presence of curbs, so we also propose an algorithm that finds curb positions in laser range finder data using the Hough transform. The mobile robot builds a map of the road curbs, which is used for tracking and localization. The patrol robot system consists of a mobile robot and a control station: the robot sends camera images to the control station, which receives and displays them. The system can be used in two modes, teleoperated or autonomous. In teleoperated mode, the operator commands the mobile robot based on the image data; in autonomous mode, the robot must track predefined waypoints on its own, for which we designed a path tracking controller. Road experiments confirm that the proposed algorithms perform properly in outdoor environments.
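The curb extraction step rests on the standard Hough transform: each laser point votes for all lines (rho, theta) passing through it, and the dominant bin gives the curb edge. A compact sketch (bin sizes are illustrative; the paper's parameters are not given):

```python
import numpy as np

def hough_dominant_line(points, rho_res=0.05, theta_bins=180):
    """Vote 2-D laser points into a (rho, theta) accumulator and return the
    dominant straight line -- e.g. the curb edge in a road scan."""
    thetas = np.linspace(0.0, np.pi, theta_bins, endpoint=False)
    # rho = x cos(theta) + y sin(theta), one row per point, one column per angle
    rhos = points[:, :1] * np.cos(thetas) + points[:, 1:] * np.sin(thetas)
    idx = np.round(rhos / rho_res).astype(int)
    offset = idx.min()
    idx -= offset  # shift so indices start at 0
    acc = np.zeros((idx.max() + 1, theta_bins), dtype=int)
    for col in range(theta_bins):
        np.add.at(acc[:, col], idx[:, col], 1)  # accumulate votes per angle
    r, t = np.unravel_index(np.argmax(acc), acc.shape)
    return (r + offset) * rho_res, thetas[t]
```

For a scan containing the vertical line x = 1 m this recovers rho = 1, theta = 0.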

Home Automation System using Intelligent Mobile Robot (지능형 이동 로봇을 이용한 홈오토메이션 시스템)

  • Ahn, Ho-Seok;Choi, Jin-Young
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.15 no.4
    • /
    • pp.486-491
    • /
    • 2005
  • This paper proposes a system model that is more efficient and active than conventional home automation systems, overcoming their limitations by using an intelligent mobile robot. The system employs a mobile robot specialized for the home environment, which moves around the home in place of a human. We call this model HAuPIRS (Home Automation system using PDA-based Intelligent Robot System). The HAuPIRS control architecture is composed of three parts: the User Level, the Cognitive Level, and the Executive Level. The system is easy to use and can readily be extended to new home appliances. We built the PBMoRo System (PDA-Based Mobile Robot System) on the HAuPIRS architecture and verified the efficiency of the system model.

Implementation of Path Finding Method using 3D Mapping for Autonomous Robotic (3차원 공간 맵핑을 통한 로봇의 경로 구현)

  • Son, Eun-Ho;Kim, Young-Chul;Chong, Kil-To
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.14 no.2
    • /
    • pp.168-177
    • /
    • 2008
  • Path finding is a key element in the navigation of a mobile robot. To find a path, a robot should know its position exactly, since position errors expose it to many dangerous situations: the robot could move in a wrong direction and be damaged by collision with surrounding obstacles. We propose a method for obtaining an accurate robot position. Localization of the mobile robot in its working environment is performed using a vision system and the Virtual Reality Modeling Language (VRML). The robot identifies landmarks located in the environment, and image processing and neural-network pattern matching techniques are applied to find its location. After this self-positioning procedure, the 2-D camera scene is overlaid onto a VRML scene. This paper describes how the self-positioning is realized and shows the overlay between the 2-D and VRML scenes. The suggested method defines a robot's path successfully: an experiment applying the algorithm to a mobile robot shows good path tracking.
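The paper matches landmarks with a neural-network pattern matcher; a far simpler stand-in that conveys the same localization idea is normalized cross-correlation template matching (everything here, including the brute-force search, is illustrative, not the authors' method):

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between an image patch and a landmark
    template; 1.0 means a perfect (up to brightness/contrast) match."""
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom else 0.0

def find_landmark(image, template):
    """Slide the template over the image; return the best-matching corner."""
    th, tw = template.shape
    best, best_pos = -2.0, (0, 0)
    for i in range(image.shape[0] - th + 1):
        for j in range(image.shape[1] - tw + 1):
            score = ncc(image[i:i + th, j:j + tw], template)
            if score > best:
                best, best_pos = score, (i, j)
    return best_pos, best
```

The recovered landmark position, together with the landmark's known pose in the VRML model, pins down the robot's position for the overlay step.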

Development of Command Signal Generating Method for Assistive Wearable Robot of the Human Upper Extremity (상지 근력지원용 웨어러블 로봇을 위한 명령신호 생성 기법 개발)

  • Lee, Hee-Don;Yu, Seung-Nam;Lee, Seung-Hoon;Jang, Jae-Ho;Han, Jung-Soo;Han, Chang-Soo
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.15 no.2
    • /
    • pp.176-183
    • /
    • 2009
  • This paper proposes a command signal generation method for a wearable robot that uses force as the input signal. The basic concept is to combine the natural, sophisticated intelligence of a human with the powerful motion capability of a robot. We define the command signal generation task so that the device operates together with the human body, paying attention to comfort and ease of wear. In this study, we built a basic exoskeleton experimental system to evaluate the HRI (Human Robot Interface), connecting the robot and the human through arm braces on both wrists and a weight harness on the torso. The HRI provides motion commands for the robot: it couples the human and the robot through multi-axis load cells that measure the relative force between them, and the control system calculates the end-effector trajectory from this force signal. We verify the performance of the proposed system through elbow E/F (Extension/Flexion), shoulder E/F, and shoulder Ab/Ad (Abduction/Adduction) motions.
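Mapping a wrist load-cell force to an end-effector motion command is commonly done admittance-style: ignore small forces, scale the rest into a velocity, and saturate. A sketch in that spirit (deadband, admittance gain, and velocity limit are illustrative values, not the paper's):

```python
import numpy as np

def force_to_velocity(force, deadband=2.0, admittance=0.01, v_max=0.25):
    """Map a load-cell force vector [N] to an end-effector velocity command
    [m/s]: deadband to reject sensor noise and wearer tremor, linear scaling
    beyond it, and a saturation limit for safety."""
    f = np.asarray(force, dtype=float)
    mag = np.linalg.norm(f)
    if mag <= deadband:
        return np.zeros_like(f)
    v = admittance * (mag - deadband) * (f / mag)  # along the force direction
    n = np.linalg.norm(v)
    return v if n <= v_max else v * (v_max / n)
```

Integrating this velocity over the control period yields the end-effector trajectory that the exoskeleton joints then follow via inverse kinematics.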

Development of an Emotion Recognition Robot using a Vision Method (비전 방식을 이용한 감정인식 로봇 개발)

  • Shin, Young-Geun;Park, Sang-Sung;Kim, Jung-Nyun;Seo, Kwang-Kyu;Jang, Dong-Sik
    • IE interfaces
    • /
    • v.19 no.3
    • /
    • pp.174-180
    • /
    • 2006
  • This paper deals with a robot system that recognizes a human's expression from a detected face and then displays the corresponding emotion. The face detection method is as follows: first, convert the RGB color space to the CIELab color space; second, extract the skin candidate region; third, detect a face through the geometric relations of facial features using a face filter. The positions of the eyes, nose, and mouth are then located, and changes in the eyebrows, eyes, and mouth are used as the preliminary data for expression recognition and sent to a robot through serial communication. The robot drives its installed motors to display the human's expression. Experimental results on 10 persons show 78.15% accuracy.
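The first two steps (RGB to CIELab conversion, then skin thresholding) can be sketched directly; the conversion below is the standard sRGB/D65 formula, while the a/b skin ranges are illustrative placeholders, since the paper does not state its thresholds:

```python
import numpy as np

def rgb_to_lab(rgb):
    """Convert an sRGB triple (0-255) to CIELab under a D65 white point."""
    c = np.asarray(rgb, dtype=float) / 255.0
    # undo the sRGB gamma to get linear RGB
    c = np.where(c > 0.04045, ((c + 0.055) / 1.055) ** 2.4, c / 12.92)
    M = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    xyz = (M @ c) / np.array([0.95047, 1.0, 1.08883])  # normalize by white
    f = np.where(xyz > 0.008856, np.cbrt(xyz), 7.787 * xyz + 16.0 / 116.0)
    L = 116.0 * f[1] - 16.0
    return L, 500.0 * (f[0] - f[1]), 200.0 * (f[1] - f[2])

def is_skin(rgb, a_range=(5.0, 25.0), b_range=(5.0, 30.0)):
    """Crude skin-candidate test in Lab space; ranges are illustrative only."""
    _, a, b = rgb_to_lab(rgb)
    return a_range[0] <= a <= a_range[1] and b_range[0] <= b <= b_range[1]
```

Working in Lab separates lightness (L) from chroma (a, b), which makes skin thresholds far less sensitive to illumination than thresholds in raw RGB.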

SCARA robot calibration on off-line programming (오프라인 프로그래밍에서 스카라 로봇의 보정)

  • Jung, Sung-Woo;Son, Kwon;Lee, Min-Chul;Choi, Jae-Won
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 1997.10a
    • /
    • pp.1832-1835
    • /
    • 1997
  • Off-line programming (OLP) systems are widely used, from assembly lines for small electronic products to huge offshore structures. Any OLP system has to be calibrated before on-line robot tasks are performed, because there are inherent differences between the CAD model in the OLP and the real robot workspace. This paper uses simple geometric expressions to propose a calibration method applicable to an OLP system for SCARA robots. A positioning task on a two-dimensional horizontal surface was used in the error analysis of a SCARA robot, and the analysis shows that the inaccuracy results from two error sources: non-zero offset angles of the two rotational joints at zero return, and differences in link lengths. Pen marks on a sheet of plotting paper are used to determine accurate data on the joint centers and link dimensions. The calculated offset angles and link lengths are fed back to the OLP system to calibrate the CAD model of the robot and its task environment.
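The two error sources identified above fit naturally into a planar 2R kinematic model with joint zero-offsets, which can then be fitted to the pen-mark measurements. A sketch of that model and its residual (the fitting setup is an assumption; the paper's closed-form geometric expressions are not reproduced here):

```python
import numpy as np

def scara_fk(th1, th2, l1, l2, off1=0.0, off2=0.0):
    """Planar 2R forward kinematics of a SCARA arm with link lengths l1, l2
    and zero-return offset errors off1, off2 on the rotational joints."""
    a1 = th1 + off1
    a2 = a1 + th2 + off2
    return np.array([l1 * np.cos(a1) + l2 * np.cos(a2),
                     l1 * np.sin(a1) + l2 * np.sin(a2)])

def residuals(params, joint_data, marks):
    """Stack the FK error over all recorded (joint angles, pen mark) pairs.
    params = (l1, l2, off1, off2); minimizing these residuals, e.g. with
    scipy.optimize.least_squares, recovers the calibration parameters."""
    l1, l2, o1, o2 = params
    return np.concatenate([scara_fk(t1, t2, l1, l2, o1, o2) - np.asarray(m)
                           for (t1, t2), m in zip(joint_data, marks)])
```

The fitted lengths and offsets are exactly the quantities the paper feeds back into the OLP's CAD model.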
