• Title/Summary/Keyword: robotic vision

Search Results: 126

Development of a Robotic Transplanter for Bedding Plants (I)-Development of the Machine Vision System of a Robotic Transplanter- (육묘용 로봇 이식기의 개발(I)-로봇 이식기의 기계시각 시스템의 개발-)

  • 류관희;이희환;김기영;황호준
    • Proceedings of the Korean Society for Agricultural Machinery Conference / 1997.12a / pp.392-400 / 1997
  • This study was conducted to develop the machine vision system of a robotic transplanter for bedding plants. The specific objectives were 1) to obtain the coordinates of the healthy seedlings in a high-density plug tray, excluding empty cells and bad seedlings, and 2) to obtain the angle of the leaves of the healthy seedlings so that the gripper would not damage them. The results of this study are summarized as follows. (1) The machine vision system of a robotic transplanter was developed. (2) The success rates of detecting empty cells and bad seedlings in 72-cell and 128-cell plug trays were 98.8% and 94.9%, respectively. (3) The success rates of calculating the angle of the leaves in 72-cell and 128-cell plug trays were 93.5% and 91.0%, respectively.
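The abstract does not describe how the leaf angle is actually computed, so the following is only a hypothetical sketch of a common approach: taking the principal-axis orientation of a binary leaf mask from its second central image moments. The function name and the toy mask are assumptions for illustration.

```python
import numpy as np

def leaf_orientation_deg(mask):
    """Principal-axis angle (degrees) of a binary region,
    computed from second central image moments."""
    ys, xs = np.nonzero(mask)
    x, y = xs - xs.mean(), ys - ys.mean()          # centered pixel coords
    mu20, mu02 = (x * x).mean(), (y * y).mean()    # second central moments
    mu11 = (x * y).mean()
    return np.degrees(0.5 * np.arctan2(2 * mu11, mu20 - mu02))

# toy example: an elongated blob along the image diagonal
mask = np.zeros((50, 50), dtype=bool)
for i in range(40):
    mask[i + 5, i + 5] = True
print(round(leaf_orientation_deg(mask), 1))  # 45.0
```

A gripper controller could use such an angle to approach the seedling perpendicular to the leaves.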

DEVELOPMENT OF A MACHINE VISION SYSTEM FOR WEED CONTROL USING PRECISION CHEMICAL APPLICATION

  • Lee, Won-Suk;David C. Slaughter;D.Ken Giles
    • Proceedings of the Korean Society for Agricultural Machinery Conference / 1996.06c / pp.802-811 / 1996
  • Farmers need alternatives for weed control because of the desire to reduce the chemicals used in farming. However, conventional mechanical cultivation cannot selectively remove weeds located in the seedline between crop plants, and there are no selective herbicides for some crop/weed situations. Since hand labor is costly, an automated weed control system could be feasible. A robotic weed control system can also reduce or eliminate the need for chemicals. Currently, no such system exists for removing weeds located in the seedline between crop plants. The goal of this project is to build a real-time, machine vision weed control system that can detect crop and weed locations, remove weeds, and thin crop plants. To accomplish this objective, a real-time robotic system was developed to identify and locate outdoor plants using machine vision technology, pattern recognition techniques, knowledge-based decision theory, and robotics. The prototype weed control system is composed of a real-time computer vision system, a uniform illumination device, and a precision chemical application system. The prototype is mounted on the UC Davis Robotic Cultivator, which finds the center of the seedline of crop plants. Field tests showed that the robotic spraying system correctly targeted simulated weeds (metal coins of 2.54 cm diameter) with an average error of 0.78 cm and a standard deviation of 0.62 cm.

Real-time Robotic Vision Control Scheme Using Optimal Weighting Matrix for Slender Bar Placement Task (얇은 막대 배치작업을 위한 최적의 가중치 행렬을 사용한 실시간 로봇 비젼 제어기법)

  • Jang, Min Woo;Kim, Jae Myung;Jang, Wan Shik
    • Journal of the Korean Society of Manufacturing Technology Engineers / v.26 no.1 / pp.50-58 / 2017
  • This paper proposes a real-time robotic vision control scheme that uses a weighting matrix to efficiently process the vision data obtained while the robot moves to a target. The scheme is based on a vision system model that, unlike those of previous studies, can actively control the camera parameters and the robot's position change. The vision control algorithm involves parameter estimation, joint angle estimation, and weighting matrix models. To demonstrate the effectiveness of the proposed control scheme, this study compares two cases: applying the weighting matrix to the vision data obtained while the camera moves toward the target, and not applying it. Finally, the position accuracy of the two cases is compared by experimentally performing the slender bar placement task.
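The abstract does not spell out the paper's weighting-matrix model, but the general idea of applying a weighting matrix $W$ to parameter estimation can be sketched as a weighted least-squares solve, $\hat{x} = (A^{\top} W A)^{-1} A^{\top} W b$. The exponential weighting below, which emphasizes vision data gathered nearer the target, is purely an assumption for illustration.

```python
import numpy as np

def weighted_least_squares(A, b, w):
    """Solve argmin_x || W^(1/2) (A x - b) ||^2 via the normal equations,
    where W = diag(w)."""
    W = np.diag(w)
    return np.linalg.solve(A.T @ W @ A, A.T @ W @ b)

# toy data: fit y = c0 + c1*t, weighting later samples more heavily
t = np.linspace(0.0, 1.0, 10)
A = np.column_stack([np.ones_like(t), t])
b = 2.0 + 3.0 * t                 # noiseless samples of a known line
w = np.exp(3.0 * t)               # assumed: weight grows near the target
c = weighted_least_squares(A, b, w)
print(np.round(c, 3))             # ≈ [2. 3.]
```

With noiseless data any positive weighting recovers the true parameters; with noisy vision data the choice of weights governs which measurements dominate the estimate.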

Development of a Robotic Transplanter for Bedding Plants(III)-Development of a Robotic Transplanter (육묘용 로봇 이식기의 개발(III)-로봇이식기의 개발-)

  • 류관희;이희환;김기영;한재성
    • Proceedings of the Korean Society for Agricultural Machinery Conference / 1997.06c / pp.238-246 / 1997
  • This study was conducted to develop a robotic transplanter for bedding plants. The robotic transplanter consisted of a machine vision system, a manipulator, a gripper, and a plug tray transfer system. Its performance was tested and compared using two transplanting methods, one considering the leaf orientation of the seedlings and one not. The results of this study were as follows. (1) A Cartesian coordinate manipulator with 3 degrees of freedom was constructed for the robotic transplanter. The accuracy of position control was $\pm$1 mm. (2) The robotic transplanter, comprising the machine vision system, the manipulator, the gripper, and the transfer system, was developed and tested with a shovel-type finger. Without considering the orientation of the leaves, the success rates of transplanting healthy cucumber seedlings in 72-cell and 128-cell plug trays were 95.5% and 94.5%, respectively. Considering the orientation of the leaves, the success rates were 96.0% and 95.0%, respectively.

Development of a Robotic Transplanter for Bedding Plants(I) - Machine Vision System - (육묘용 로봇 이식기의 개발(I) - 기계시각 시스템 -)

  • 류관희;김기영;이희환;황호준
    • Journal of Biosystems Engineering / v.22 no.3 / pp.317-324 / 1997
  • This study was conducted to develop a machine vision system for a robotic transplanter for bedding plants. The specific objectives were 1) to obtain the coordinates of the healthy seedlings in a high-density plug tray, and 2) to obtain the angle of the leaves of the healthy seedlings so that the gripper would not damage them. The results of this study were summarized as follows. (1) The machine vision system of a robotic transplanter was developed. (2) The success rates of detecting empty cells and bad seedlings for 72-cell and 128-cell plug trays were 98.8% and 94.9%, respectively. (3) The success rates of calculating the angle of the leaves for 72-cell and 128-cell plug trays were 93.5% and 91.0%, respectively.

A Robotic Vision System for Turbine Blade Cooling Hole Detection

  • Wang, Jianjun;Tang, Qing;Gan, Zhongxue
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference / 2003.10a / pp.237-240 / 2003
  • Gas turbines are extensively used in flight propulsion, electrical power generation, and other industrial applications. During its life span, a turbine blade is taken out periodically for repair and maintenance, which includes re-coating the blade surface and re-drilling the cooling holes/channels. A successful laser re-drilling requires measuring a hole to within ${\pm}0.15mm$ in position and ${\pm}3^{\circ}$ in orientation. Detecting the position and orientation of gas turbine blade/vane cooling holes is thus a very important step in the vane/blade repair process, and the industry is in urgent need of an automated system to fulfill this task. This paper proposes approaches and algorithms to detect the cooling hole position and orientation using a vision system mounted on a robot arm. The channel orientation is determined by aligning the vision system with the channel axis. The opening position of the channel is the intersection between the channel axis and the surface around the channel opening. Experimental results have indicated that the concept of cooling hole identification is feasible: under the current test conditions, the cooling channel position is reproducibly detected to within ${\pm}0.15mm$ and the channel orientation to within ${\pm}3^{\circ}$. The average processing time to search for and identify the channel position and orientation is less than one minute.
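The abstract defines the opening position as the intersection of the channel axis with the surface around the opening. If the surface is modeled locally as a plane, that is a standard line-plane intersection; the geometry below is a made-up toy case, not the paper's data.

```python
import numpy as np

def line_plane_intersection(p0, d, q0, n):
    """Intersect the line p0 + t*d with the plane {x : (x - q0)·n = 0}.
    Assumes the line is not parallel to the plane (d·n != 0)."""
    t = np.dot(q0 - p0, n) / np.dot(d, n)
    return p0 + t * d

p0 = np.array([0.0, 0.0, 5.0])    # point on the channel axis (toy values)
d  = np.array([0.0, 0.0, -1.0])   # channel axis direction
q0 = np.array([0.0, 0.0, 0.0])    # point on the local blade surface
n  = np.array([0.0, 0.0, 1.0])    # local surface normal
opening = line_plane_intersection(p0, d, q0, n)
print(opening)  # [0. 0. 0.]
```

In practice the axis direction would come from the vision system's alignment with the channel, and the plane from a local fit to the surface around the opening.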

Towards a Ubiquitous Robotic Companion: Design and Implementation of Ubiquitous Robotic Service Framework

  • Ha, Young-Guk;Sohn, Joo-Chan;Cho, Young-Jo;Yoon, Hyun-Soo
    • ETRI Journal / v.27 no.6 / pp.666-676 / 2005
  • In recent years, motivated by the emergence of ubiquitous computing technologies, a new class of networked robots, ubiquitous robots, has been introduced. The Ubiquitous Robotic Companion (URC) is our conceptual vision of ubiquitous service robots that provide users with the services they need, anytime and anywhere, in ubiquitous computing environments. To realize the vision of URC, one of the essential requirements for robotic systems is to support ubiquity of services: that is, a robot service must always be available even when the service environment changes. Specifically, robotic systems need to interoperate automatically with the sensors and devices in the current service environment, rather than being statically preprogrammed for them. In this paper, the design and implementation of a semantic-based ubiquitous robotic space (SemanticURS) is presented. SemanticURS enables the automated integration of networked robots into ubiquitous computing environments by exploiting Semantic Web Services and AI-based planning technologies.

Development of a Robotic Transplanter Using Machine Vision for Bedding Plants (기계시각을 이용한 육묘용 로봇 이식기의 개발)

  • 류관희;김기영;이희환;한재성;황호준
    • Journal of Bio-Environment Control / v.6 no.1 / pp.55-65 / 1997
  • This study was conducted to develop a robotic transplanter for bedding plants. The robotic transplanter consisted of a machine vision system, a manipulator fitted with a specially designed gripper, and a plug tray transfer system. The results of this study were as follows. 1. A machine vision system for the robotic transplanter was developed. The success rates of detecting empty cells and bad seedlings in 72-cell and 128-cell plug trays for cucumber seedlings were 98.8% and 94.9%, respectively. The success rates of identifying leaf orientation for 72-cell and 128-cell plug trays were 93.5% and 91.0%, respectively. 2. A Cartesian coordinate manipulator with 3 degrees of freedom was constructed for the robotic transplanter. The accuracy of position control was $\pm$1 mm. 3. The robotic transplanter was tested with a shovel-type finger. Without considering leaf orientation, the success rates of transplanting healthy cucumber seedlings for 72-cell and 128-cell plug trays were 95.5% and 94.5%, respectively. Considering leaf orientation, the success rates were 96.0% and 95.0%, respectively.

A VISION SYSTEM IN ROBOTIC WELDING

  • Absi Alfaro, S. C.
    • Proceedings of the KWS Conference / 2002.10a / pp.314-319 / 2002
  • The Automation and Control Group at the University of Brasilia is developing an automatic welding station based on an industrial robot and a controllable welding machine. Several techniques were applied to improve the quality of the welded joints. This paper deals with the implementation of a laser-based computer vision system to guide the robotic manipulator during the welding process. Currently, the robot is taught to follow a prescribed trajectory, which is recorded and repeated over and over, relying on the repeatability specification from the robot manufacturer. The objective of the computer vision system is to monitor the actual trajectory followed by the welding torch and to evaluate deviations from the desired trajectory. The position errors are then transferred to a control algorithm in order to actuate the robotic manipulator and cancel the trajectory errors. The computer vision system consists of a CCD camera attached to the welding torch, a laser-emitting diode circuit, a PC-based frame grabber card, and a computer vision algorithm. The laser circuit establishes a sharp luminous reference line, whose images are captured by the video camera. The raw image data are then digitized and stored in the frame grabber card for further processing by specially written algorithms. These image-processing algorithms give the actual welding path, the relative position between the pieces, and the required corrections. Two case studies are considered: the first is the joining of two flat metal pieces; the second concerns joining a cylindrical piece to a flat surface. An implementation of this computer vision system using parallel processing is being studied.
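The deviation evaluation described above, comparing the measured torch position against the taught trajectory, amounts to taking the component of the position error perpendicular to the path. The abstract does not give the paper's formulation; this is a minimal sketch with invented numbers.

```python
import numpy as np

def trajectory_deviation(torch_pos, path_p0, path_dir):
    """Component of the torch position error perpendicular to the
    taught path (a line through path_p0 along path_dir)."""
    d = path_dir / np.linalg.norm(path_dir)
    v = torch_pos - path_p0
    return v - (v @ d) * d        # remove the along-path component

dev = trajectory_deviation(np.array([1.0, 0.3, 0.0]),   # measured torch position
                           np.array([0.0, 0.0, 0.0]),   # point on taught path
                           np.array([1.0, 0.0, 0.0]))   # taught path direction
print(dev)  # [0.  0.3 0. ]
```

The resulting vector is what a correction controller would drive toward zero.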

A Study on the Improvement of Pose Information of Objects by Using Trinocular Vision System (Trinocular Vision System을 이용한 물체 자세정보 인식 향상방안)

  • Kim, Jong Hyeong;Jang, Kyoungjae;Kwon, Hyuk-dong
    • Journal of the Korean Society of Manufacturing Technology Engineers / v.26 no.2 / pp.223-229 / 2017
  • Recently, robotic bin-picking tasks have drawn considerable attention because flexibility is required in robotic assembly tasks. Stereo camera systems have been widely used for robotic bin-picking, but they have two limitations: first, the computational burden of solving the correspondence problem on stereo images increases calculation time; second, errors in image processing and camera calibration reduce accuracy. Moreover, errors in the robot kinematic parameters directly affect robot gripping. In this paper, we propose a method of correcting the bin-picking error by using a trinocular vision system that consists of two stereo cameras and one hand-eye camera. First, the two stereo cameras, with their wide viewing angle, measure the object's pose roughly. Then, the third hand-eye camera approaches the object and corrects the previous measurement of the stereo camera system. Experimental results show the usefulness of the proposed method.
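The coarse stereo measurement mentioned above ultimately rests on triangulating a 3D point from rays cast by two calibrated cameras. The midpoint method below is one standard way to do this; the camera geometry is a toy example, not the paper's calibration.

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Midpoint triangulation: find the point halfway between the
    closest points of two camera rays o + t*d."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Normal equations for minimizing |(o1 + t1*d1) - (o2 + t2*d2)|^2
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    t1, t2 = np.linalg.solve(A, b)
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))

# two cameras at x = ±100 mm, both sighting a point at (0, 0, 500) mm
p = triangulate_midpoint(np.array([-100.0, 0.0, 0.0]), np.array([100.0, 0.0, 500.0]),
                         np.array([ 100.0, 0.0, 0.0]), np.array([-100.0, 0.0, 500.0]))
print(np.round(p))  # [  0.   0. 500.]
```

In a trinocular setup of the kind described, a coarse point like this would then be refined by the close-range hand-eye camera before gripping.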