• Title/Summary/Keyword: Guided Robot

Search Results: 85

Axiomatic Design of Composite Double Arm Type Robot Hands and Wrists for Handling Large Glass Panel Displays (공리 설계를 적용한 대형 평판 디스플레이용 더블암형 복합재료 로봇 핸드 및 리스트)

  • 이창섭;이대길;최진경
    • Proceedings of the Korean Society For Composite Materials Conference
    • /
    • 2002.10a
    • /
    • pp.241-244
    • /
    • 2002
  • Recently, the size of glass panels has increased to $1250 mm{\times}1100 mm{\times}0.7 mm$, with a mass of 2.65 kg, which requires a much stiffer robot structure. In addition to high stiffness, the robot hands and wrists for glass panel handling should have a mirror surface finish on their outer surfaces to prevent particles and dust from adhering. The maximum height of the robot structure should not exceed 1500 mm because other automated guided vehicles (AGVs) and transfer equipment have been designed within this size limit. The difference between the maximum deflections of the four ends of the hands before and after loading the glass panel should be less than 2.0 mm. In this work, the robot hands and wrists for handling large glass panel displays were designed based on axiomatic design, using the finite element method along with an optimization routine.

  • PDF

An Image-Guided Robotic Surgery System for Spinal Fusion

  • Chung Goo Bong;Kim Sungmin;Lee Soo Gang;Yi Byung-Ju;Kim Wheekuk;Oh Se Min;Kim Young Soo;So Byung Rok;Park Jong Il;Oh Seong Hoon
    • International Journal of Control, Automation, and Systems
    • /
    • v.4 no.1
    • /
    • pp.30-41
    • /
    • 2006
  • The goal of this work is to develop and test a robot-assisted surgery system for spinal fusion. The system is composed of a robot, a surgical planning system, and a navigation system. It assists surgeons in inserting a pedicle screw during the spinal fusion procedure. Compared to conventional methods for spinal fusion, the proposed surgical procedure ensures minimal invasion and better accuracy by using the robot and image information. The robot positions and guides needles, drills, and other surgical instruments, or conducts automatic boring and screwing. Pre-operative CT images and intra-operative fluoroscopic images are integrated to provide the surgeon with information for surgical planning. Some experiments employing the developed robotic surgery system are conducted. The experimental results confirm that the system is not only able to guide the surgical tools by accurately pointing and orienting them to the specified location, but also successfully compensates for the movement of the patient due to respiration.

Development of An Image-Guided Robotic Surgery System for Spinal Fusion (영상 지원 척추 융합 수술 로봇 시스템의 개발)

  • Chung Goo-Bong;Lee Soo-Gang;Kim Sung-Min;Oh Se-Min;Yi Byung-Ju;Kim Young-Soo;Park Jong-Il;Oh Seong-Hoon;Kim Whee-Kuk
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2005.10a
    • /
    • pp.144-148
    • /
    • 2005
  • The goal of this work is to develop and test a robot-assisted surgery system for spinal fusion. The system is composed of a robot, a surgical planning system, and a navigation system. It assists surgeons in inserting a pedicle screw during the spinal fusion procedure. Compared to conventional methods for spinal fusion, the proposed surgical procedure ensures minimal invasion and better accuracy by using the robot and image information. The robot positions and guides needles, drills, and other surgical instruments, or conducts automatic boring and screwing. Pre-operative CT images and intra-operative fluoroscopic images are integrated to provide the surgeon with information for surgical planning. Several experiments employing the developed robotic surgery system are conducted. The experimental results confirmed that the system is not only able to guide the surgical tools by accurately pointing and orienting them to the specified location, but also successfully compensates for the movement of the patient due to respiration.

  • PDF

Object-Transportation Control of Cooperative AGV Systems Based on Virtual-Passivity Decentralized Control Algorithm

  • Suh, Jin-Ho;Lee, Young-Jin;Lee, Kwon-Soon
    • Journal of Mechanical Science and Technology
    • /
    • v.19 no.9
    • /
    • pp.1720-1730
    • /
    • 2005
  • Automated guided vehicles (AGVs) play an important role in advancing flexible manufacturing systems. In this paper, we propose a novel object-transportation control algorithm for cooperative AGV systems, applying decentralized control to multiple AGVs. Each AGV is subject to nonholonomic constraints and conveys a common object in a horizontal plane. Moreover, it is shown that the cooperative robot systems ensure stability and that the velocities of the augmented systems converge to a scaled multiple of each desired velocity field. The proposed virtual-passivity-based decentralized control algorithm via system augmentation is then applied to trace a circle. Finally, simulation and experimental results for object transportation by two AGV systems illustrate the validity of the proposed algorithm.
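
The abstract above does not give the control law itself. As a rough illustration of the decentralized idea (each AGV applying only local feedback to track a shared desired velocity field whose trace is a circle), one might sketch the following; the gains, the unit-mass point dynamics, and the limit-cycle field are all illustrative assumptions, not the paper's model:

```python
import numpy as np

# Hedged sketch, not the paper's algorithm: each AGV applies only local
# feedback to track a shared desired velocity field whose limit cycle is
# the unit circle. Gains, dynamics (unit-mass point model), and the
# field itself are illustrative assumptions.

def desired_velocity(p):
    """Circulating field with a unit-circle limit cycle: a tangential
    term plus a radial term that pulls trajectories toward radius 1."""
    r = np.linalg.norm(p)
    tangential = np.array([-p[1], p[0]])
    radial = (1.0 - r) * p / max(r, 1e-9)
    return tangential + radial

def simulate(p0, steps=3000, dt=0.01, k=20.0):
    """Decentralized tracking: u depends only on the AGV's own state."""
    p, v = np.array(p0, dtype=float), np.zeros(2)
    for _ in range(steps):
        u = -k * (v - desired_velocity(p))  # local velocity-field feedback
        v = v + dt * u
        p = p + dt * v
    return p, v

# Two AGVs start at different points; each settles near the circle
# without any communication between them.
for start in ([1.0, 0.0], [0.0, 2.0]):
    p, v = simulate(start)
    print(f"start={start} -> final radius {np.linalg.norm(p):.3f}")
```

Because each vehicle uses only its own state, adding more AGVs requires no change to the controller, which is the appeal of the decentralized formulation.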

A Study on Range Sensor for Autonomous Guided Vehicle using Milimeter Wave Sensor (밀리미터 파 센서를 이용한 무인 자동차용 거리 측정기에 대한 연구)

  • Do, Tae-Yong;Kim, Seong-Do;Chung, Myung-Jin;Park, Seung-Mo;Yang, Bae-Duck
    • Proceedings of the KIEE Conference
    • /
    • 1993.07a
    • /
    • pp.403-405
    • /
    • 1993
  • The ultrasonic sensors used in autonomous mobile robots and autonomous guided vehicles (AGVs) are not suitable for long-range measurement. As the performance of autonomous mobile robots and AGVs improves, the importance of range sensors for long-range measurement is increasing. In this paper, we introduce a range sensor for long-range measurement using a millimeter wave sensor and propose the structure of such a system.

  • PDF
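
The abstract does not state the modulation scheme. Assuming a linear FMCW sweep, which is common in millimeter-wave rangefinders, the range follows directly from the measured beat frequency; the bandwidth and sweep-time values below are illustrative only:

```python
# FMCW ranging sketch. The abstract does not specify the modulation;
# a linear FMCW sweep is assumed here since it is typical for
# millimeter-wave rangefinders. Parameter values are illustrative.
C = 3.0e8          # speed of light (m/s)
BANDWIDTH = 150e6  # sweep bandwidth B (Hz), assumed
SWEEP_TIME = 1e-3  # sweep duration T (s), assumed

def range_from_beat(f_beat_hz):
    """Target range R = c * f_b * T / (2 * B) for a linear sweep."""
    return C * f_beat_hz * SWEEP_TIME / (2.0 * BANDWIDTH)

# With these parameters, a 100 kHz beat frequency corresponds to 100 m,
# well beyond typical ultrasonic range.
print(range_from_beat(100e3))
```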

A novel visual servoing techniques considering robot dynamics (로봇의 운동특성을 고려한 새로운 시각구동 방법)

  • 이준수;서일홍;김태원
    • Proceedings of the Institute of Control, Robotics and Systems Conference
    • /
    • 1996.10b
    • /
    • pp.410-414
    • /
    • 1996
  • A visual servoing algorithm is proposed for a robot with a camera in hand. Specifically, novel image features are suggested by employing a viewing model of perspective projection to estimate the relative pitching and yawing angles between the object and the camera. To compensate for the dynamic characteristics of the robot, desired feature trajectories for the learning of visually guided line-of-sight robot motion are obtained by measuring features with the camera in hand, not over the entire workspace but on a single linear path along which the robot moves under the control of a commercially provided linear-motion function. Control actions of the camera are then approximately found by fuzzy-neural networks to follow such desired feature trajectories. To show the validity of the proposed algorithm, some experimental results are illustrated, in which a four-axis SCARA robot with a B/W CCD camera is used.

  • PDF
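
The exact image features are not given in the abstract. Under a plain pinhole model (an assumption here; the paper's features are more elaborate), the line-of-sight yaw and pitch toward an image point can be estimated from its pixel offset and the focal length:

```python
import math

# Hedged sketch: under a pinhole camera model (assumed), the
# line-of-sight yaw and pitch toward an image point follow from its
# pixel offset (u, v) relative to the principal point and the focal
# length expressed in pixels. The focal length value is illustrative.
def line_of_sight_angles(u, v, f_pixels=800.0):
    """Return (yaw, pitch) in radians toward pixel offset (u, v)."""
    yaw = math.atan2(u, f_pixels)    # horizontal angle
    pitch = math.atan2(v, f_pixels)  # vertical angle
    return yaw, pitch

# An object offset (80, -40) px from the principal point:
yaw, pitch = line_of_sight_angles(80.0, -40.0)
print(f"yaw={yaw:.4f} rad, pitch={pitch:.4f} rad")
```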

Vision-Based Robot Manipulator for Grasping Objects (물체 잡기를 위한 비전 기반의 로봇 메뉴플레이터)

  • Baek, Young-Min;Ahn, Ho-Seok;Choi, Jin-Young
    • Proceedings of the KIEE Conference
    • /
    • 2007.04a
    • /
    • pp.331-333
    • /
    • 2007
  • The robot manipulator is one of the important components in the service robot area. Until now, there has been a lot of research on robot manipulators that can imitate the functions of a human being by recognizing and grasping objects. In this paper, we present a robot arm based on an object-recognition vision system. We have implemented closed-loop control that uses feedback from visual information, and used a sonar sensor to improve the accuracy. We have placed a web camera on top of the hand to recognize objects. We also present some vision-based manipulation issues and our system's features.

  • PDF
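
A hedged sketch of the closed-loop idea above: image feedback steers the arm toward the object while a sonar range reading supplies depth. The gain, the pixel-to-meter scaling, and the simulated object are invented for illustration and are not from the paper:

```python
# Hedged sketch of the closed-loop idea: image feedback steers the arm
# toward the object while a sonar range reading supplies depth. The
# gain, pixel-to-meter scaling, and simulated object are all invented
# for illustration; they are not from the paper.

def visual_servo_step(pixel_error, sonar_range_m, gain=0.5):
    """Return an (x, y) velocity command from the current pixel error.
    Scaling by the sonar range converts image error to a metric move."""
    ex, ey = pixel_error
    meters_per_pixel = sonar_range_m / 500.0  # crude pinhole scaling (assumed)
    return (-gain * ex * meters_per_pixel, -gain * ey * meters_per_pixel)

# Simulated loop: the object starts 40 px right and 20 px above center;
# assume a unit time step and that 1 m of motion removes 1000 px of error.
err = [40.0, 20.0]
for _ in range(20):
    vx, vy = visual_servo_step(err, sonar_range_m=0.5)
    err[0] += vx * 1000.0
    err[1] += vy * 1000.0
print(f"residual pixel error: ({err[0]:.5f}, {err[1]:.5f})")
```

Each iteration halves the error under these assumed constants, so the loop converges geometrically, which is the usual behavior of a proportional image-based servo.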

Corridor Navigation of the Mobile Robot Using Image Based Control

  • Han, Kyu-Bum;Kim, Hae-Young;Baek, Yoon-Su
    • Journal of Mechanical Science and Technology
    • /
    • v.15 no.8
    • /
    • pp.1097-1107
    • /
    • 2001
  • In this paper, a wall-following navigation algorithm for a mobile robot using a mono vision system is described. The key points of the mobile robot navigation system are effective acquisition of environmental information and fast recognition of the robot's position. From this information, the mobile robot should be appropriately controlled to follow a desired path. For recognition of the relative position and orientation of the robot with respect to the wall, the features of the corridor structure are extracted using the mono vision system; the relative position of the robot with respect to the wall, namely the offset distance and steering angle, is then derived for a simple corridor geometry. To alleviate the computational burden of the image processing, a Kalman filter is used to reduce the search region in the image space for line detection. The robot is then controlled using this information to follow the desired path. The wall-following scheme, based on PD control, is composed of two parts, approaching control and orientation control, each performed through the steering and forward-driving motion of the robot. To verify the effectiveness of the proposed algorithm, real-time navigation experiments are performed. The experimental results verify the effectiveness and flexibility of the suggested algorithm in comparison with a pure encoder-guided mobile robot navigation system.

  • PDF
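
The two-part control described above, approaching control on the offset distance and orientation control on the steering angle, can be sketched as a pair of PD terms combined into one steering command; the gains and the desired offset below are illustrative assumptions, not the paper's values:

```python
# Hedged sketch of the two-part wall-following scheme: a PD term on the
# offset distance (approaching control) plus a PD term on the heading
# (orientation control). Gains and the desired offset are illustrative.

def wall_follow_steering(offset, offset_rate, heading, heading_rate,
                         desired_offset=0.5,
                         kp_d=1.2, kd_d=0.4, kp_a=2.0, kd_a=0.3):
    """Combine approaching and orientation PD terms (rad/s command).
    offset: lateral distance to the wall (m); heading: angle to the
    corridor direction (rad), both as the vision system would supply."""
    approach = kp_d * (desired_offset - offset) - kd_d * offset_rate
    orient = -kp_a * heading - kd_a * heading_rate
    return approach + orient

# Robot 0.8 m from the wall (target 0.5 m), angled 0.1 rad toward it:
cmd = wall_follow_steering(offset=0.8, offset_rate=-0.05,
                           heading=-0.1, heading_rate=0.0)
print(f"steering command: {cmd:.3f}")  # prints steering command: -0.140
```

Splitting the command this way lets the two behaviors be tuned separately: the approaching gains govern how aggressively the robot closes on the desired offset, while the orientation gains keep it aligned with the corridor.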

Analysis of User's Eye Gaze Distribution while Interacting with a Robotic Character (로봇 캐릭터와의 상호작용에서 사용자의 시선 배분 분석)

  • Jang, Seyun;Cho, Hye-Kyung
    • The Journal of Korea Robotics Society
    • /
    • v.14 no.1
    • /
    • pp.74-79
    • /
    • 2019
  • In this paper, we develop a virtual experimental environment to investigate users' eye gaze in human-robot social interaction and verify its potential for further studies. The system consists of a 3D robot character capable of hosting simple interactions with a user, and a gaze processing module recording which body part of the robot character, such as the eyes, mouth, or arms, the user is looking at, regardless of whether the robot is stationary or moving. To verify that the results acquired in this virtual environment are aligned with those of physically existing robots, we performed robot-guided quiz sessions with 120 participants and compared the participants' gaze patterns with those in previous works. The results include the following. First, when interacting with the robot character, the user's gaze pattern showed statistics similar to those of conversations between humans. Second, an animated mouth of the robot character received longer attention than a stationary one. Third, nonverbal interactions such as leakage cues were also effective in the interaction with the robot character, and the correct-answer ratios of the cued groups were higher. Finally, gender differences in the users' gaze were observed, especially in the frequency of mutual gaze.